Sample records for unique modeling techniques

  1. The application of a unique flow modeling technique to complex combustion systems

    NASA Astrophysics Data System (ADS)

    Waslo, J.; Hasegawa, T.; Hilt, M. B.

    1986-06-01

    This paper describes the application of a unique three-dimensional water flow modeling technique to the study of complex fluid flow patterns within an advanced gas turbine combustor. The visualization technique uses light scattering, coupled with real-time image processing, to determine flow fields. Additional image processing is used to make concentration measurements within the combustor.

  2. Equivalence and Differences between Structural Equation Modeling and State-Space Modeling Techniques

    ERIC Educational Resources Information Center

    Chow, Sy-Miin; Ho, Moon-ho R.; Hamaker, Ellen L.; Dolan, Conor V.

    2010-01-01

    State-space modeling techniques have been compared to structural equation modeling (SEM) techniques in various contexts but their unique strengths have often been overshadowed by their similarities to SEM. In this article, we provide a comprehensive discussion of these 2 approaches' similarities and differences through analytic comparisons and…

  3. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  4. Calibration of Complex Subsurface Reaction Models Using a Surrogate-Model Approach

    EPA Science Inventory

    Application of model assessment techniques to complex subsurface reaction models involves numerous difficulties, including non-trivial model selection, parameter non-uniqueness, and excessive computational burden. To overcome these difficulties, this study introduces SAMM (Simult...

  5. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  6. Conventional and Piecewise Growth Modeling Techniques: Applications and Implications for Investigating Head Start Children's Early Literacy Learning

    ERIC Educational Resources Information Center

    Hindman, Annemarie H.; Cromley, Jennifer G.; Skibbe, Lori E.; Miller, Alison L.

    2011-01-01

    This article reviews the mechanics of conventional and piecewise growth models to demonstrate the unique affordances of each technique for examining the nature and predictors of children's early literacy learning during the transition from preschool through first grade. Using the nationally representative Family and Child Experiences Survey…

  7. An Examination of Sampling Characteristics of Some Analytic Factor Transformation Techniques.

    ERIC Educational Resources Information Center

    Skakun, Ernest N.; Hakstian, A. Ralph

    Two population raw data matrices were constructed by computer simulation techniques. Each consisted of 10,000 subjects and 12 variables, and each was constructed according to an underlying factorial model consisting of four major common factors, eight minor common factors, and 12 unique factors. The computer simulation techniques were employed to…

  8. SELECTION AND CALIBRATION OF SUBSURFACE REACTIVE TRANSPORT MODELS USING A SURROGATE-MODEL APPROACH

    EPA Science Inventory

    While standard techniques for uncertainty analysis have been successfully applied to groundwater flow models, extension to reactive transport is frustrated by numerous difficulties, including excessive computational burden and parameter non-uniqueness. This research introduces a...

  9. Generalized image contrast enhancement technique based on Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1994-03-01

    This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.

  10. Reducing uncertainties in the velocities determined by inversion of phase velocity dispersion curves using synthetic seismograms

    NASA Astrophysics Data System (ADS)

    Hosseini, Seyed Mehrdad

    Characterizing the near-surface shear-wave velocity structure using Rayleigh-wave phase velocity dispersion curves is widespread in reservoir characterization, exploration seismology, earthquake engineering, and geotechnical engineering. This surface seismic approach provides a feasible, low-cost alternative to borehole measurements. Phase velocity dispersion curves from Rayleigh surface waves are inverted to yield the vertical shear-wave velocity profile. A significant problem with surface wave inversion is its intrinsic non-uniqueness, and although this problem is widely recognized, there have been no systematic efforts to develop approaches that reduce the pervasive uncertainty affecting the velocity profiles determined by the inversion. Non-uniqueness cannot be easily studied in a nonlinear inverse problem such as Rayleigh-wave inversion; the only way to understand its nature is numerical investigation, which is computationally expensive and inevitably time-consuming. Given the variety of parameters affecting surface wave inversion and the non-uniqueness they induce, a technique is needed that is not itself controlled by that non-uniqueness. An efficient and repeatable technique is proposed and tested to overcome the non-uniqueness problem: multiple inverted shear-wave velocity profiles are used in a wavenumber integration technique to generate synthetic time series resembling the geophone recordings. The similarity between synthetic and observed time series is used as an additional criterion alongside the similarity between the theoretical and experimental dispersion curves. The proposed method is shown to be effective through synthetic and real-world examples, in which the nature of the non-uniqueness is discussed and its existence is demonstrated. Using the proposed technique, inverted velocity profiles are estimated and the technique's effectiveness is evaluated: in the synthetic example, the final inverted velocity profile is compared with the initial target velocity model, and in the real-world example, the final inverted shear-wave velocity profile is compared with the velocity model from independent measurements in a nearby borehole. The real-world example shows that it is possible to overcome the non-uniqueness and identify a representative velocity profile for the site that also matches the borehole measurements well.
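
    The abstract does not name the similarity measure used to compare synthetic and observed time series; one common, simple choice is the zero-lag normalized cross-correlation, sketched below on purely illustrative traces (none of the signals or values come from the thesis):

```python
import numpy as np

def similarity(observed, synthetic):
    """Zero-lag normalized cross-correlation of two traces, in [-1, 1]."""
    o = observed - observed.mean()
    s = synthetic - synthetic.mean()
    return float(np.dot(o, s) / (np.linalg.norm(o) * np.linalg.norm(s)))

# illustrative traces: a damped arrival and two candidate synthetics
t = np.linspace(0.0, 1.0, 500)
obs = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t)
syn_close = obs + 0.05 * np.cos(2 * np.pi * 30 * t)  # near-correct velocity profile
syn_wrong = np.sin(2 * np.pi * 12 * t)               # mismatched velocity profile

# the better-matching synthetic scores higher, so the measure can rank
# competing inverted profiles alongside the dispersion-curve misfit
scores = similarity(obs, syn_close), similarity(obs, syn_wrong)
```

    A score near 1 flags a candidate velocity profile whose synthetic seismogram reproduces the recorded waveform, which is the extra discriminant the thesis proposes on top of dispersion-curve agreement.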

  11. Dragon pulse information management system (DPIMS): A unique model-based approach to implementing domain agnostic system of systems and behaviors

    NASA Astrophysics Data System (ADS)

    Anderson, Thomas S.

    2016-05-01

    The Global Information Network Architecture is an information technology based on Vector Relational Data Modeling, a unique computational paradigm, certified on the DoD network by the US Army as the Dragon Pulse Information Management System. It is a network-available environment for modeling models: models are configured using domain-relevant semantics, draw on network-available systems, sensors, databases, and services as loosely coupled component objects, and run as executable applications. Solutions are based on mission tactics, techniques, and procedures, and on subject-matter input. Three recent Army use cases are discussed: (a) an ISR system of systems; (b) modeling and simulation behavior validation; and (c) a networked digital library with behaviors.

  12. Generalized image contrast enhancement technique based on the Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1996-07-01

    This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
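
    The abstract's closing claim, that histogram equalization is a special case of the proposed mapping, refers to the classical technique sketched below. This is standard histogram equalization only, not the paper's seven-parameter Heinemann-based algorithm; the toy image is invented for illustration:

```python
import numpy as np

def histogram_equalize(img, levels=256):
    """Classical histogram equalization: map gray levels through the
    normalized cumulative histogram so output brightness spreads over
    the full dynamic range."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]                                   # normalize to [0, 1]
    lut = np.round(cdf * (levels - 1)).astype(img.dtype)
    return lut[img]                                  # apply lookup table

# tiny example: a low-contrast 8-bit image confined to gray levels 100-115
img = (np.arange(16, dtype=np.uint8).reshape(4, 4) + 100)
out = histogram_equalize(img)
```

    The generalized technique in the paper replaces this single CDF-derived lookup table with two nonlinear mapping functions tuned by perceptual (Heinemann) constants.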

  13. New commercial opportunities for advanced reproductive technologies in horses, wildlife, and companion animals.

    PubMed

    Long, C R; Walker, S C; Tang, R T; Westhusin, M E

    2003-01-01

    As advanced reproductive technologies become more efficient and repeatable in livestock and laboratory species, new opportunities will evolve to apply these techniques to alternative and non-traditional species. This will result in new markets requiring unique business models that address issues of animal welfare and consumer acceptance on a much different level than the livestock sector. Advanced reproductive technologies and genetic engineering will be applied to each species in innovative ways to provide breeders more alternatives for the preservation and propagation of elite animals in each sector. The commercialization of advanced reproductive techniques in these niche markets should be considered a useful tool for conservation of genetic material from endangered or unique animals as well as production of biomedical models of human disease. Copyright 2002 Elsevier Science Inc.

  14. Uncovering multiple pathways to substance use: a comparison of methods for identifying population subgroups.

    PubMed

    Dierker, Lisa; Rose, Jennifer; Tan, Xianming; Li, Runze

    2010-12-01

    This paper describes and compares a selection of available modeling techniques for identifying homogeneous population subgroups in the interest of informing targeted substance use intervention. We present a nontechnical review of the common and unique features of three methods: (a) trajectory analysis, (b) functional hierarchical linear modeling (FHLM), and (c) decision tree methods. Differences among the techniques are described, including required data features, strengths and limitations in terms of the flexibility with which outcomes and predictors can be modeled, and the potential of each technique for helping to inform the selection of targets and timing of substance intervention programs.

  15. String theory--the physics of string-bending and other electric guitar techniques.

    PubMed

    Grimes, David Robert

    2014-01-01

    Electric guitar playing is ubiquitous in practically all modern music genres. In the hands of an experienced player, electric guitars can sound as expressive and distinct as a human voice. Unlike other, more quantised instruments where pitch is a discrete function, guitarists can incorporate micro-tonality, and as a result vibrato and string-bending are idiosyncratic hallmarks of a player. Similarly, a wide variety of techniques unique to the electric guitar have emerged. While the mechano-acoustics of stringed instruments and vibrating strings are well studied, there has been comparatively little work dedicated to the underlying physics of unique electric guitar techniques and strings, or to the mechanical factors influencing vibrato, string-bending, fretting force, and whammy-bar dynamics. In this work, models for these processes are derived and the implications for guitar and string design discussed. The string-bending model is experimentally validated using a variety of strings, and vibrato dynamics are simulated. The implications of these findings for the configuration and design of guitars are also discussed.
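
    The paper's own derivations are not reproduced in the abstract; the sketch below only combines the two textbook ingredients such a string-bending model builds on: the ideal-string fundamental f = sqrt(T/mu)/(2L), and the extra tension dT = E*A*(dL/L) that the elastic stretch of a bend adds. All numeric values (scale length, string gauge, stretch fraction) are assumed, illustrative figures:

```python
import numpy as np

def string_frequency(L, T, mu):
    """Fundamental frequency (Hz) of an ideal string: length L (m),
    tension T (N), linear mass density mu (kg/m)."""
    return np.sqrt(T / mu) / (2.0 * L)

# assumed values for a plain steel high-E guitar string tuned to ~329.6 Hz
L = 0.648                     # scale length (m)
mu = 3.1e-4                   # linear mass density (kg/m)
f0 = 329.6                    # target open-string pitch (Hz)
T = (2.0 * L * f0) ** 2 * mu  # tension that produces f0 on this string

# a bend that stretches the string by 0.1% of its length raises the tension
E = 2.0e11                    # Young's modulus of steel (Pa)
A = np.pi * (0.23e-3) ** 2    # core cross-section, 0.46 mm diameter (m^2)
dT = E * A * 0.001            # dT = E * A * (dL / L), with dL/L = 0.001
f_bent = string_frequency(L, T + dT, mu)
```

    With these assumed numbers, the 0.1% stretch raises the pitch to roughly 415 Hz, about two semitones, which is the right order of magnitude for a large bend.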

  16. String Theory - The Physics of String-Bending and Other Electric Guitar Techniques

    PubMed Central

    Grimes, David Robert

    2014-01-01

    Electric guitar playing is ubiquitous in practically all modern music genres. In the hands of an experienced player, electric guitars can sound as expressive and distinct as a human voice. Unlike other, more quantised instruments where pitch is a discrete function, guitarists can incorporate micro-tonality, and as a result vibrato and string-bending are idiosyncratic hallmarks of a player. Similarly, a wide variety of techniques unique to the electric guitar have emerged. While the mechano-acoustics of stringed instruments and vibrating strings are well studied, there has been comparatively little work dedicated to the underlying physics of unique electric guitar techniques and strings, or to the mechanical factors influencing vibrato, string-bending, fretting force, and whammy-bar dynamics. In this work, models for these processes are derived and the implications for guitar and string design discussed. The string-bending model is experimentally validated using a variety of strings, and vibrato dynamics are simulated. The implications of these findings for the configuration and design of guitars are also discussed. PMID:25054880

  17. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting, and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space: joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  18. The Python Project: A Unique Model for Extending Research Opportunities to Undergraduate Students

    ERIC Educational Resources Information Center

    Harvey, Pamela A.; Wall, Christopher; Luckey, Stephen W.; Langer, Stephen; Leinwand, Leslie A.

    2014-01-01

    Undergraduate science education curricula are traditionally composed of didactic instruction with a small number of laboratory courses that provide introductory training in research techniques. Research on learning methodologies suggests this model is relatively ineffective, whereas participation in independent research projects promotes enhanced…

  19. Learning Compositional Simulation Models

    DTIC Science & Technology

    2010-01-01

    techniques developed by social scientists, economists, and medical researchers over the past four decades. Quasi-experimental designs (QEDs) are...statistical techniques from the social sciences known as quasi-experimental design (QED). QEDs allow a researcher to exploit unique characteristics...can be grouped under the rubric "quasi-experimental design" (QED), and they attempt to exploit inherent characteristics of observational data sets

  20. Operational Leadership in the Information Age: A New Model

    DTIC Science & Technology

    2000-02-08

    of operational leadership and offers the individual a tool for development as well as for analyzing unique leadership situations and thinking about the most appropriate balance of leadership styles and techniques.

  1. Update to core reporting practices in structural equation modeling.

    PubMed

    Schreiber, James B

    This paper is a technical update to "Core Reporting Practices in Structural Equation Modeling" [1]. As such, the content covered in this paper includes sample size, missing data, specification and identification of models, estimation method choices, fit and residual concerns, nested, alternative, and equivalent models, and unique issues within the SEM family of techniques. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Study of 3D bathymetry modelling using LAPAN Surveillance Unmanned Aerial Vehicle 02 (LSU-02) photo data with stereo photogrammetry technique, Wawaran Beach, Pacitan, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Sari, N. M.; Nugroho, J. T.; Chulafak, G. A.; Kushardono, D.

    2018-05-01

    The coastal zone is an ecosystem with unique objects and phenomena. Aerial photo data with very high spatial resolution covering coastal areas have extensive potential. One such source is LAPAN Surveillance UAV 02 (LSU-02) photo data, acquired in 2016 with a spatial resolution of up to 10 cm. This research aims to create an initial bathymetry model with the stereo photogrammetry technique using LSU-02 data. The bathymetry model was made by constructing a 3D model with the stereo photogrammetry technique, which utilizes the dense point cloud created from the overlap of those photos. The result shows that a 3D bathymetry model can be built with the stereo photogrammetry technique, as can be seen from the surface and the bathymetry transect profile.

  3. 4D-tomographic reconstruction of water vapor using the hybrid regularization technique with application to the North West of Iran

    NASA Astrophysics Data System (ADS)

    Adavi, Zohre; Mashhadi-Hossainali, Masoud

    2015-04-01

    Water vapor is considered one of the most important weather parameters in meteorology. Its non-uniform distribution, caused by atmospheric phenomena above the surface of the earth, depends on both space and time. Due to the limited spatial and temporal coverage of observations, estimating water vapor is still a challenge in meteorology and related fields such as positioning and geodetic techniques. Tomography is a method for modeling the spatio-temporal variations of this parameter: by analyzing the impact of the troposphere on Global Navigation Satellite System (GNSS) signals, inversion techniques are used to model the water vapor. Non-uniqueness and instability of the solution are the two characteristic features of this problem, and horizontal and/or vertical constraints are usually used to compute a unique solution. Here, a hybrid regularization method is used to compute a regularized solution. The adopted method is based on the Least-Squares QR (LSQR) and Tikhonov regularization techniques; it benefits from the advantages of both the iterative and direct techniques and is independent of initial values. Based on this property and an appropriate model resolution, the number of model elements not constrained by GPS measurements is first minimized; water vapor density is then estimated only at the voxels that are constrained by these measurements. In other words, no constraint is added to solve the problem. Reconstructed profiles of water vapor are validated using radiosonde measurements.
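
    As a minimal illustration of the damped-least-squares idea behind such a hybrid method (not the authors' implementation), SciPy's `lsqr` accepts a `damp` parameter that folds the Tikhonov term into the LSQR iteration. The toy design matrix below is random and merely stands in for the real geometry obtained by tracing GNSS signal paths through the voxel grid:

```python
import numpy as np
from scipy.sparse.linalg import lsqr

# toy tomography: A maps 25 voxel water-vapor densities to 40 slant delays
rng = np.random.default_rng(0)
A = rng.random((40, 25))
x_true = rng.random(25)
b = A @ x_true                 # noise-free synthetic observations

# damp > 0 minimizes ||A x - b||^2 + damp^2 ||x||^2 (Tikhonov-damped LSQR),
# stabilizing an otherwise ill-conditioned or non-unique inversion
x_reg = lsqr(A, b, damp=1e-4)[0]
```

    In a real setup `damp` trades data fit against solution size; choosing it (and restricting the estimate to GNSS-constrained voxels, as the abstract describes) is the substantive part of the method.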

  4. On the stochastic dissemination of faults in an admissible network

    NASA Technical Reports Server (NTRS)

    Kyrala, A.

    1987-01-01

    The dynamic distribution of faults in a general type network is discussed. The starting point is a uniquely branched network in which each pair of nodes is connected by a single branch. Mathematical expressions for the uniquely branched network transition matrix are derived to show that sufficient stationarity exists to ensure the validity of the use of the Markov Chain model to analyze networks. In addition the conditions for the use of Semi-Markov models are discussed. General mathematical expressions are derived in an examination of branch redundancy techniques commonly used to increase reliability.
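
    As a hedged illustration of the Markov chain idea only (the paper's transition matrices for uniquely branched networks are not given in the abstract), a tiny three-state fault model propagates a state-probability vector through an assumed transition matrix:

```python
import numpy as np

# illustrative states: 0 = healthy, 1 = degraded, 2 = failed (absorbing)
P = np.array([[0.95, 0.04, 0.01],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])   # rows sum to 1: one-step probabilities

state = np.array([1.0, 0.0, 0.0])    # start fully healthy
for _ in range(50):                  # 50 discrete time steps
    state = state @ P                # Markov chain update
```

    The stationarity argument in the paper is what licenses reusing a single matrix P at every step; adding redundant branches would show up here as smaller healthy-to-failed entries.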

  5. Cultural Models of Domestic Violence: Perspectives of Social Work and Anthropology Students

    ERIC Educational Resources Information Center

    Collins, Cyleste C.; Dressler, William W.

    2008-01-01

    This study employed a unique theoretical approach and a series of participant-based ethnographic interviewing techniques that are traditionally used in cognitive anthropology to examine and compare social work and anthropology students' cultural models of the causes of domestic violence. The study findings indicate that although social work…

  6. Investigation of the effects of external current systems on the MAGSAT data utilizing grid cell modeling techniques

    NASA Technical Reports Server (NTRS)

    Klumpar, D. M. (Principal Investigator)

    1982-01-01

    Progress made in reducing MAGSAT data and displaying magnetic field perturbations caused primarily by external currents is reported. A periodic and repeatable perturbation pattern is described that arises from external current effects but appears as unique signatures associated with upper middle latitudes on the Earth's surface. Initial testing of the modeling procedure that was developed to compute the magnetic fields at satellite orbit due to current distributions in the ionosphere and magnetosphere is also discussed. The modeling technique utilizes a linear current element representation of the large scale space current system.

  7. Application of Chemistry in Materials Research at NASA GRC

    NASA Technical Reports Server (NTRS)

    Kavandi, Janet L.

    2016-01-01

    An overview of NASA GRC materials development is presented: new materials enabled by new chemistries offering unique properties and chemical processing techniques. Durability of materials in harsh environments requires understanding and modeling of the chemical interaction of materials with the environment.

  8. Systems modeling and simulation applications for critical care medicine

    PubMed Central

    2012-01-01

    Critical care delivery is a complex, expensive, and error-prone medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) a pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study the interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718

  9. Calibrating White Dwarf Asteroseismic Fitting Techniques

    NASA Astrophysics Data System (ADS)

    Castanheira, B. G.; Romero, A. D.; Bischoff-Kim, A.

    2017-03-01

    The main goal of looking for intrinsic variability in stars is the unique opportunity to study their internal structure. Once we have extracted independent modes from the data, it appears to be a simple matter of comparing the period spectrum with those from theoretical model grids to learn the inner structure of that star. However, asteroseismology is much more complicated than this simple description. We must account not only for observational uncertainties in period determination, but most importantly for the limitations of the model grids, arising from uncertainties in the constitutive physics, and of the fitting techniques. In this work, we discuss results of numerical experiments in which we used different, independently calculated model grids (white dwarf cooling models from WDEC and fully evolutionary models from LPCODE-PUL) and fitting techniques to fit synthetic stars. The advantage of using synthetic stars is that we know the details of their interior structure, so we can assess how well our models and fitting techniques are able to recover the interior structure as well as the stellar parameters.

  10. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  11. Modeling and Simulation of Nanoindentation

    NASA Astrophysics Data System (ADS)

    Huang, Sixie; Zhou, Caizhi

    2017-11-01

    Nanoindentation is a hardness test method applied to small volumes of material which can provide some unique effects and spark many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  12. A system identification technique based on the random decrement signatures. Part 2: Experimental results

    NASA Technical Reports Server (NTRS)

    Bedewi, Nabih E.; Yang, Jackson C. S.

    1987-01-01

    Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The results of an experiment conducted on an offshore platform scale model to verify the validity of the technique and to demonstrate its application in damage detection are presented.
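
    A minimal sketch of the Random Decrement signature itself (the paper's subsequent least-squares fit for mass, damping, and stiffness matrices is not shown; the trigger rule, signal, and values here are illustrative assumptions):

```python
import numpy as np

def random_decrement(y, trigger, seg_len):
    """Average all segments that begin at an up-crossing of the trigger
    level; the random forced part of the response averages out, leaving
    an estimate of the free-decay (homogeneous) response."""
    starts = [i for i in range(1, len(y) - seg_len)
              if y[i] >= trigger and y[i - 1] < trigger]
    if not starts:
        raise ValueError("no trigger crossings found")
    return np.mean([y[i:i + seg_len] for i in starts], axis=0)

# illustrative data: a 2 Hz structural mode buried in measurement noise
rng = np.random.default_rng(1)
t = np.arange(0.0, 60.0, 0.01)                       # 100 Hz sampling
y = np.sin(2 * np.pi * 2.0 * t) + 0.5 * rng.standard_normal(t.size)
sig = random_decrement(y, trigger=y.std(), seg_len=200)
```

    Because every averaged segment starts at the same trigger level, the ensemble average isolates the deterministic decay that the system-identification step then fits.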

  13. Zebrafish Models of Human Leukemia: Technological Advances and Mechanistic Insights.

    PubMed

    Harrison, Nicholas R; Laroche, Fabrice J F; Gutierrez, Alejandro; Feng, Hui

    2016-01-01

    Insights concerning leukemic pathophysiology have been acquired in various animal models and further efforts to understand the mechanisms underlying leukemic treatment resistance and disease relapse promise to improve therapeutic strategies. The zebrafish (Danio rerio) is a vertebrate organism with a conserved hematopoietic program and unique experimental strengths suiting it for the investigation of human leukemia. Recent technological advances in zebrafish research including efficient transgenesis, precise genome editing, and straightforward transplantation techniques have led to the generation of a number of leukemia models. The transparency of the zebrafish when coupled with improved lineage-tracing and imaging techniques has revealed exquisite details of leukemic initiation, progression, and regression. With these advantages, the zebrafish represents a unique experimental system for leukemic research and additionally, advances in zebrafish-based high-throughput drug screening promise to hasten the discovery of novel leukemia therapeutics. To date, investigators have accumulated knowledge of the genetic underpinnings critical to leukemic transformation and treatment resistance and without doubt, zebrafish are rapidly expanding our understanding of disease mechanisms and helping to shape therapeutic strategies for improved outcomes in leukemic patients.

  14. Zebrafish Models of Human Leukemia: Technological Advances and Mechanistic Insights

    PubMed Central

    Harrison, Nicholas R.; Laroche, Fabrice J.F.; Gutierrez, Alejandro

    2016-01-01

    Insights concerning leukemic pathophysiology have been acquired in various animal models and further efforts to understand the mechanisms underlying leukemic treatment resistance and disease relapse promise to improve therapeutic strategies. The zebrafish (Danio rerio) is a vertebrate organism with a conserved hematopoietic program and unique experimental strengths suiting it for the investigation of human leukemia. Recent technological advances in zebrafish research including efficient transgenesis, precise genome editing, and straightforward transplantation techniques have led to the generation of a number of leukemia models. The transparency of the zebrafish when coupled with improved lineage-tracing and imaging techniques has revealed exquisite details of leukemic initiation, progression, and regression. With these advantages, the zebrafish represents a unique experimental system for leukemic research and additionally, advances in zebrafish-based high-throughput drug screening promise to hasten the discovery of novel leukemia therapeutics. To date, investigators have accumulated knowledge of the genetic underpinnings critical to leukemic transformation and treatment resistance and without doubt, zebrafish are rapidly expanding our understanding of disease mechanisms and helping to shape therapeutic strategies for improved outcomes in leukemic patients. PMID:27165361

  15. Optimization and analysis of large chemical kinetic mechanisms using the solution mapping method - Combustion of methane

    NASA Technical Reports Server (NTRS)

    Frenklach, Michael; Wang, Hai; Rabinowitz, Martin J.

    1992-01-01

    A method of systematic optimization, solution mapping, as applied to a large-scale dynamic model is presented. The basis of the technique is parameterization of model responses in terms of model parameters by simple algebraic expressions. These expressions are obtained by computer experiments arranged in a factorial design. The developed parameterized responses are then used in a joint multiparameter multidata-set optimization. A brief review of the mathematical background of the technique is given. The concept of active parameters is discussed. The technique is applied to determine an optimum set of parameters for a methane combustion mechanism. Five independent responses - comprising ignition delay times, pre-ignition methyl radical concentration profiles, and laminar premixed flame velocities - were optimized with respect to thirteen reaction rate parameters. The numerical predictions of the optimized model are compared to those computed with several recent literature mechanisms. The utility of the solution mapping technique in situations where the optimum is not unique is also demonstrated.
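    The core of solution mapping, replacing an expensive kinetic simulation with cheap algebraic response expressions fitted at factorial design points and then optimizing against data using only the surrogates, can be sketched as follows. The one-parameter toy model, the quadratic response form, and all numbers are illustrative assumptions, not the mechanism or data from the paper:

```python
# Stand-in for an expensive kinetic simulation: returns one model
# response (e.g. an ignition delay) for a given rate parameter x.
# The functional form is an illustrative assumption only.
def expensive_model(x):
    return 1.0 + 0.8 * x - 0.3 * x * x

# 1) Computer experiments at factorial design points (3-level design in x).
design = [-1.0, 0.0, 1.0]
responses = [expensive_model(x) for x in design]

# 2) Fit a quadratic surrogate r(x) = a + b*x + c*x^2 through the
#    three design points (exact interpolation for a 3-point design).
r_m, r_0, r_p = responses
a = r_0
b = (r_p - r_m) / 2.0
c = (r_p + r_m - 2.0 * r_0) / 2.0
surrogate = lambda x: a + b * x + c * x * x

# 3) Joint optimization against an experimental target datum using only
#    the cheap surrogate: minimize (r(x) - target)^2 by grid search.
target = 1.2
grid = [i / 1000.0 - 1.0 for i in range(2001)]
x_best = min(grid, key=lambda x: (surrogate(x) - target) ** 2)
print(round(surrogate(x_best), 3))
```

    In the paper's setting there are many parameters and several responses, so the surrogates are multivariate quadratics fitted by least squares and optimized jointly, but the two-stage structure (parameterize, then optimize) is the same.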

  16. Forensic archaeology and anthropology : An Australian perspective.

    PubMed

    Oakley, Kate

    2005-09-01

    Forensic archaeology is an extremely powerful investigative discipline and, in combination with forensic anthropology, can provide a wealth of evidentiary information to police investigators and the forensic community. The re-emergence of forensic archaeology and anthropology within Australia relies on its diversification and cooperation with established forensic medical organizations, law enforcement forensic service divisions, and national forensic boards. This presents a unique opportunity to develop a new multidisciplinary approach to forensic archaeology/anthropology within Australia, which has a unique set of environmental, social, and cultural conditions that diverge from overseas models and require different methodological approaches. In the current world political climate, more forensic techniques are being applied at scenes of mass disasters, genocide, and terrorism. This provides Australian forensic archaeology/anthropology with a unique opportunity to develop multidisciplinary models with contributions from psychological profiling, ballistics, sociopolitics, cultural anthropology, mortuary technicians, post-blast analysis, fire analysis, and other disciplines from the world of forensic science.

  17. Perinatal Psychoneuroimmunology: Protocols for the Study of Prenatal Stress and Its Effects on Fetal and Postnatal Brain Development.

    PubMed

    Frasch, Martin G; Baier, Carlos J; Antonelli, Marta C; Metz, Gerlinde A S

    2018-01-01

    Prenatal stress (PS) impacts early behavioral, neuroimmune, and cognitive development. Pregnant rat models have been very valuable in examining the mechanisms of such fetal programming. A newer pregnant sheep model of maternal stress offers the unique advantages of chronic in utero monitoring and manipulation. This chapter presents the techniques used to model single and multigenerational stress exposures and their pleiotropic effects on the offspring.

  18. Issues concerning the updating of finite-element models from experimental data

    NASA Technical Reports Server (NTRS)

    Dunn, Shane A.

    1994-01-01

    Some issues concerning the updating of dynamic finite-element models by incorporation of experimental data are examined here. It is demonstrated how the number of unknowns can be greatly reduced if the physical nature of the model is maintained. The issue of uniqueness is also examined and it is shown that a number of previous workers have been mistaken in their attempts to define both sufficient and necessary measurement requirements for the updating problem to be solved uniquely. The relative merits of modal and frequency response function (frf) data are discussed and it is shown that for measurements at fewer degrees of freedom than are present in the model, frf data will be unlikely to converge easily to a solution. It is then examined how such problems may become more tractable by using new experimental techniques which would allow measurements at all degrees of freedom present in the mathematical model.

  19. The Use of Podcasts to Enhance Narrative Writing Skills

    ERIC Educational Resources Information Center

    Qaddour, Kinana

    2017-01-01

    This activity uses podcasts to model narrative writing techniques. The challenges students face when exercising narrative writing skills are unique when compared to those of persuasive and expository writing; my students have repeatedly expressed their qualms with articulating experiences that engage their audience. Although students have…

  20. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    NASA Astrophysics Data System (ADS)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits, including accurate prediction of strain levels in a part, the presence of wrinkles, and pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform has applied, and continues to apply, to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  1. Using Java to generate globally unique identifiers for DICOM objects.

    PubMed

    Kamauu, Aaron W C; Duvall, Scott L; Avrin, David E

    2009-03-01

    Digital imaging and communication in medicine (DICOM) specifies that all DICOM objects have globally unique identifiers (UIDs). Creating these UIDs can be a difficult task due to the variety of techniques in use and the requirement to ensure global uniqueness. We present a simple technique of combining a root organization identifier, assigned descriptive identifiers, and Java-generated unique identifiers to construct DICOM-compliant UIDs.
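    The composition scheme described above (root organization identifier, assigned descriptive identifiers, generated unique component) can be sketched in Python; the paper's implementation is in Java, and the root OID, the helper name, and the UUID-based unique component here are hypothetical illustrations, not the authors' code:

```python
import uuid

# Hypothetical root OID: a real deployment must use an identifier
# registered to its own organization.
ROOT = "1.2.840.99999"

def make_uid(device_id: int, study_type: int) -> str:
    """Compose a DICOM UID from a root, assigned descriptive parts,
    and a generated unique component (a UUID folded to an integer
    here, standing in for the paper's Java-generated identifier)."""
    unique = uuid.uuid4().int % 10**12
    uid = f"{ROOT}.{device_id}.{study_type}.{unique}"
    # DICOM UIDs are dot-separated numeric components, at most 64 chars.
    assert len(uid) <= 64 and all(p.isdigit() for p in uid.split("."))
    return uid

print(make_uid(7, 2))
```

    Keeping the descriptive parts short matters in practice, since the generated component must still fit within the 64-character DICOM limit.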

  2. Modeling Multi-wavelength Stellar Astrometry. III. Determination of the Absolute Masses of Exoplanets and Their Host Stars

    NASA Astrophysics Data System (ADS)

    Coughlin, J. L.; López-Morales, Mercedes

    2012-05-01

    Astrometric measurements of stellar systems are becoming significantly more precise and common, with many ground- and space-based instruments and missions approaching 1 μas precision. We examine the multi-wavelength astrometric orbits of exoplanetary systems via both analytical formulae and numerical modeling. Exoplanets have a combination of reflected and thermally emitted light that causes the photocenter of the system to shift increasingly farther away from the host star with increasing wavelength. We find that, if observed at long enough wavelengths, the planet can dominate the astrometric motion of the system, making it possible to directly measure the orbits of both the planet and the star, and thus to determine their physical masses, using multi-wavelength astrometry. In general, this technique works best for, though is certainly not limited to, systems that have large, high-mass stars and large, low-mass planets, which is a unique parameter space not covered by other exoplanet characterization techniques. Exoplanets that happen to transit their host star present unique cases where the physical radii of the planet and star can be directly determined via astrometry alone. Planetary albedos and day-night contrast ratios may also be probed via this technique due to the unique signature they impart on the observed astrometric orbits. We develop a tool to examine the prospects for near-term detection of this effect, and give examples of some exoplanets that appear to be good targets for detection in the K to N infrared observing bands, if the required precision can be achieved.
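    The wavelength-dependent photocenter shift at the heart of this technique is a flux-weighted mean of the two source positions; a minimal sketch, with purely illustrative geometry and planet flux fractions (not values from the paper):

```python
# Photocenter of an unresolved star+planet system: the flux-weighted
# mean position of the two sources.  As the planet's flux fraction f
# grows with wavelength, the photocenter moves away from the star.
def photocenter(x_star, x_planet, f_planet):
    return (1.0 - f_planet) * x_star + f_planet * x_planet

# Star and planet positions relative to the barycenter, in AU
# (illustrative geometry, not values from the paper).
x_s, x_p = -0.01, 1.0
for f in (1e-6, 1e-4, 1e-2):  # planet flux fraction, rising with wavelength
    print(f, photocenter(x_s, x_p, f))
```

    At short wavelengths the photocenter tracks the star's reflex orbit almost exactly; at longer wavelengths the planet's growing flux fraction drags the photocenter toward the planet, which is the signature the paper exploits.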

  3. A review of experimental and modeling techniques to determine properties of biopolymer-based nanocomposites

    USDA-ARS?s Scientific Manuscript database

    The nonbiodegradable and nonrenewable nature of plastic packaging has led to a renewed interest in packaging materials based on bio-nanocomposites (biopolymer matrix reinforced with nanoparticles such as layered silicates). One of the reasons for unique properties of bio-nanocomposites is the differ...

  4. RAISED between Cultures: New Resources for Working with Children of Immigrant or Refugee Background

    ERIC Educational Resources Information Center

    Brosinsky, Larissa; Georgis, Rebecca; Gokiert, Rebecca; Mejia, Teresa; Kirova, Anna

    2018-01-01

    The pressing needs of populations with unique challenges, such as immigrants or refugees, often stimulate important innovation in development of educational techniques and resources. This article highlights the RAISED between Cultures model, a conceptual framework for understanding children's experiences holistically and promoting intercultural…

  5. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity-based, label-free biophysical technique for investigating biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations, and no clear procedure for identifying the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
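    A common form of the coupled non-linear differential equations behind the bivalent analyte mechanism (A + B <-> AB, then AB + B <-> AB2) can be integrated with a simple fixed-step scheme. The rate constants, the omission of statistical factors, and the constant-analyte assumption below are illustrative simplifications, not the authors' identification procedure:

```python
# Simplified bivalent analyte kinetics (statistical factors omitted):
#   A + B  <-> AB    (association ka1, dissociation kd1)
#   AB + B <-> AB2   (association ka2, dissociation kd2)
# The analyte concentration A is held constant, as during an SPR
# injection; B is the free ligand fraction.  All values illustrative.
def simulate(ka1, kd1, ka2, kd2, A, Bmax=1.0, dt=0.01, t_end=3000.0):
    AB, AB2 = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        B = Bmax - AB - 2.0 * AB2              # free ligand sites
        dAB = ka1 * A * B - kd1 * AB - ka2 * AB * B + kd2 * AB2
        dAB2 = ka2 * AB * B - kd2 * AB2
        AB += dAB * dt
        AB2 += dAB2 * dt
    return AB, AB2

# The SPR response is proportional to bound analyte, AB + AB2.
AB, AB2 = simulate(ka1=2e5, kd1=1e-2, ka2=5e-2, kd2=5e-3, A=1e-7)
print(round(AB + AB2, 3))
```

    The coupling between the two binding steps is what makes sensorgrams from this mechanism biphasic, and why a dedicated signature is needed to tell it apart from other biphasic models.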

  6. Transportation Systems Evaluation

    NASA Technical Reports Server (NTRS)

    Fanning, M. L.; Michelson, R. A.

    1972-01-01

    A methodology for the analysis of transportation systems consisting of five major interacting elements is reported. The analysis begins with the causes of travel demand: geographic, economic, and demographic characteristics as well as attitudes toward travel. Through the analysis, the interaction of these factors with the physical and economic characteristics of the transportation system is determined. The result is an evaluation of the system from the point of view of both passenger and operator. The methodology is applicable to the intraurban transit systems as well as major airlines. Applications of the technique to analysis of a PRT system and a study of intraurban air travel are given. In the discussion several unique models or techniques are mentioned: i.e., passenger preference modeling, an integrated intraurban transit model, and a series of models to perform airline analysis.

  7. Local numerical modelling of ultrasonic guided waves in linear and nonlinear media

    NASA Astrophysics Data System (ADS)

    Packo, Pawel; Radecki, Rafal; Kijanka, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz; Leamy, Michael J.

    2017-04-01

    Nonlinear ultrasonic techniques provide improved damage sensitivity compared to linear approaches. The combination of attractive properties of guided waves, such as Lamb waves, with unique features of higher harmonic generation provides great potential for characterization of incipient damage, particularly in plate-like structures. Nonlinear ultrasonic structural health monitoring techniques use interrogation signals at frequencies other than the excitation frequency to detect changes in structural integrity. Signal processing techniques used in non-destructive evaluation are frequently supported by modeling and numerical simulations in order to facilitate problem solution. This paper discusses known and newly-developed local computational strategies for simulating elastic waves, and attempts characterization of their numerical properties in the context of linear and nonlinear media. A hybrid numerical approach combining advantages of the Local Interaction Simulation Approach (LISA) and Cellular Automata for Elastodynamics (CAFE) is proposed for unique treatment of arbitrary strain-stress relations. The iteration equations of the method are derived directly from physical principles employing stress and displacement continuity, leading to an accurate description of the propagation in arbitrarily complex media. Numerical analysis of guided wave propagation, based on the newly developed hybrid approach, is presented and discussed in the paper for linear and nonlinear media. Comparisons to Finite Elements (FE) are also discussed.

  8. Investigation of finite element: ABC methods for electromagnetic field simulation. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Volakis, John L.; Nguyen, J.

    1994-01-01

    The mechanics of wave propagation in the presence of obstacles is of great interest in many branches of engineering and applied mathematics like electromagnetics, fluid dynamics, geophysics, seismology, etc. Such problems can be broadly classified into two categories: the bounded domain or the closed problem and the unbounded domain or the open problem. Analytical techniques have been derived for the simpler problems; however, the need to model complicated geometrical features, complex material coatings and fillings, and to adapt the model to changing design parameters have inevitably tilted the balance in favor of numerical techniques. The modeling of closed problems presents difficulties primarily in proper meshing of the interior region. However, problems in unbounded domains pose a unique challenge to computation, since the exterior region is inappropriate for direct implementation of numerical techniques. A large number of solutions have been proposed but only a few have stood the test of time and experiment. The goal of this thesis is to develop an efficient and reliable partial differential equation technique to model large three dimensional scattering problems in electromagnetics.

  9. Conditional clustering of temporal expression profiles

    PubMed Central

    Wang, Ling; Montano, Monty; Rarick, Matt; Sebastiani, Paola

    2008-01-01

    Background Many microarray experiments produce temporal profiles in different biological conditions but common cluster techniques are not able to analyze the data conditional on the biological conditions. Results This article presents a novel technique to cluster data from time course microarray experiments performed across several experimental conditions. Our algorithm uses polynomial models to describe the gene expression patterns over time, a full Bayesian approach with proper conjugate priors to make the algorithm invariant to linear transformations, and an iterative procedure to identify genes that have a common temporal expression profile across two or more experimental conditions, and genes that have a unique temporal profile in a specific condition. Conclusion We use simulated data to evaluate the effectiveness of this new algorithm in finding the correct number of clusters and in identifying genes with common and unique profiles. We also use the algorithm to characterize the response of human T cells to stimulations of antigen-receptor signaling gene expression temporal profiles measured in six different biological conditions and we identify common and unique genes. These studies suggest that the methodology proposed here is useful in identifying and distinguishing uniquely stimulated genes from commonly stimulated genes in response to variable stimuli. Software for using this clustering method is available from the project home page. PMID:18334028

  10. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. In computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.

  11. Humanistic Wellness Services for Community Mental Health Providers

    ERIC Educational Resources Information Center

    Carney, Jolynn V.

    2007-01-01

    The author examines the unique ability of mental health providers to offer humanistic services in a highly competitive atmosphere by using a wellness approach. J. E. Myers and T. J. Sweeney's (2005) 5 second-order factors are offered as a conceptual model. Therapeutic techniques and humanizing benefits for individuals, families, and communities…

  12. Blending Arts and Academics with a Community Spirit

    ERIC Educational Resources Information Center

    Crawford, Jennifer; Roberts, Patricia

    2013-01-01

    AIM Academy of Conshohocken, Pennsylvania is a K-12 school modeled after the Lab School of Washington. AIM promises its 200 students and their parents a unique and innovative learning environment. Applying the latest techniques proven to help children who learn differently, AIM Academy taps into students' talents in ways that are both fully…

  13. The Rayleigh curve as a model for effort distribution over the life of medium scale software systems. M.S. Thesis - Maryland Univ.

    NASA Technical Reports Server (NTRS)

    Picasso, G. O.; Basili, V. R.

    1982-01-01

    It is noted that previous investigations into the applicability of the Rayleigh curve model to medium-scale software development efforts have met with mixed results. The results of these investigations are confirmed by analyses of runs and smoothing. The reasons for the model's failure are found in the subcycle effort data. There are four contributing factors: the uniqueness of the environment studied, the influence of holidays, varying management techniques, and differences in the data studied.
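    For reference, the Rayleigh curve models effort per unit time as E(t) = 2Kat exp(-at^2), where K is the total life-cycle effort and the peak falls at t = 1/sqrt(2a); a minimal sketch with illustrative parameters, not data from the thesis:

```python
import math

# Rayleigh effort curve: E(t) = 2*K*a*t*exp(-a*t^2), where K is total
# life-cycle effort and a sets the peak-effort time t_peak = 1/sqrt(2a).
def rayleigh_effort(t, K, a):
    return 2.0 * K * a * t * math.exp(-a * t * t)

K = 100.0        # total effort, person-months (illustrative)
t_peak = 10.0    # desired peak-effort time, months (illustrative)
a = 1.0 / (2.0 * t_peak ** 2)

# Cumulative effort by time T is K*(1 - exp(-a*T^2)); nearly all of K
# is expended within a few multiples of t_peak.
cumulative = lambda T: K * (1.0 - math.exp(-a * T * T))
print(round(cumulative(40.0), 1))
```

    Fitting such a curve to weekly effort data gives K and a, which is what the investigations above attempt; the subcycle effects they identify (holidays, management changes) show up as deviations from this smooth shape.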

  14. A stabilized MFE reduced-order extrapolation model based on POD for the 2D unsteady conduction-convection problem.

    PubMed

    Xia, Hong; Luo, Zhendong

    2017-01-01

    In this study, we establish a stabilized mixed finite element (MFE) reduced-order extrapolation (SMFEROE) model with very few unknowns for the two-dimensional (2D) unsteady conduction-convection problem via the proper orthogonal decomposition (POD) technique. We analyze the existence, uniqueness, stability, and convergence of the SMFEROE solutions and validate the correctness and dependability of the SMFEROE model by means of numerical simulations.

  15. Predicting cyanobacterial abundance, microcystin, and geosmin in a eutrophic drinking-water reservoir using a 14-year dataset

    USGS Publications Warehouse

    Harris, Ted D.; Graham, Jennifer L.

    2017-01-01

    Cyanobacterial blooms degrade water quality in drinking water supply reservoirs by producing toxic and taste-and-odor causing secondary metabolites, which ultimately cause public health concerns and lead to increased treatment costs for water utilities. There have been numerous attempts to create models that predict cyanobacteria and their secondary metabolites, most using linear models; however, linear models are limited by assumptions about the data and have had limited success as predictive tools. Thus, lake and reservoir managers need improved modeling techniques that can accurately predict large bloom events that have the highest impact on recreational activities and drinking-water treatment processes. In this study, we compared 12 unique linear and nonlinear regression modeling techniques to predict cyanobacterial abundance and the cyanobacterial secondary metabolites microcystin and geosmin using 14 years of physiochemical water quality data collected from Cheney Reservoir, Kansas. Support vector machine (SVM), random forest (RF), boosted tree (BT), and Cubist modeling techniques were the most predictive of the compared modeling approaches. SVM, RF, and BT modeling techniques were able to successfully predict cyanobacterial abundance, microcystin, and geosmin concentrations <60,000 cells/mL, 2.5 µg/L, and 20 ng/L, respectively. Only Cubist modeling predicted maxima concentrations of cyanobacteria and geosmin; no modeling technique was able to predict maxima microcystin concentrations. Because maxima concentrations are a primary concern for lake and reservoir managers, Cubist modeling may help predict the largest and most noxious concentrations of cyanobacteria and their secondary metabolites.

  16. Implementation of a finite element analysis procedure for structural analysis of shape memory behaviour of fibre reinforced shape memory polymer composites

    NASA Astrophysics Data System (ADS)

    Azzawi, Wessam Al; Epaarachchi, J. A.; Islam, Mainul; Leng, Jinsong

    2017-12-01

    Shape memory polymers (SMPs) offer a unique ability to undergo a substantial shape deformation and subsequently recover the original shape when exposed to a particular external stimulus. Comparatively low mechanical properties are the major drawback for extended use of SMPs in engineering applications; however, the inclusion of reinforcing fibres into SMPs improves mechanical properties significantly while retaining intrinsic shape memory effects. The implementation of shape memory polymer composites (SMPCs) in any engineering application is a unique task which requires profound materials and design optimization. However, currently available analytical tools have critical limitations that prevent accurate analysis/simulation of SMPC structures and slow the transformation of breakthrough research outcomes into real-life applications. Many finite element (FE) models have been presented, but the majority of them require complicated user subroutines to integrate with standard FE software packages. Furthermore, those subroutines are problem-specific and difficult to use for a wider range of SMPC materials and related structures. This paper presents a FE simulation technique to model the thermomechanical behaviour of SMPCs using the commercial FE software ABAQUS. The proposed technique incorporates the material's time-dependent viscoelastic behaviour. The ability of the proposed technique to predict shape fixity and shape recovery was evaluated against experimental data acquired by bending a SMPC cantilever beam. The excellent correlation between the experimental and FE simulation results confirms the robustness of the proposed technique.

  17. Duality based direct resolution of unique profiles using zero concentration region information.

    PubMed

    Tavakkoli, Elnaz; Rajkó, Róbert; Abdollahi, Hamid

    2018-07-01

    Self Modeling Curve Resolution (SMCR) is a class of techniques concerned with estimating the pure profiles underlying a set of measurements on chemical systems. In general, the estimated profiles are ambiguous (non-unique) unless some special conditions are fulfilled. Incorporating adequate information can reduce the so-called rotational ambiguity effectively and, in the most desirable cases, lead to a unique solution. Therefore, studies on the circumstances resulting in a unique solution are of particular importance. The conditions for a unique solution can be studied based on the duality principle: in a bilinear chemical (e.g., spectroscopic) data matrix, there is a natural duality between its row and column vector spaces under minimal constraints (non-negativity of concentrations and absorbances). In this article, the conditions for a unique solution are demonstrated using the duality concept and zero-concentration-region information. A simulated dataset of three components and an experimental system of synthetic mixtures containing the three amino acids tyrosine, phenylalanine, and tryptophan are analyzed. It is shown that, in the presence of sufficient information, a reliable unique solution is obtained, which is valuable for analytical qualification and quantitative verification analysis.

  18. The relevance of Newton's laws and selected principles of physics to dance techniques: Theory and application

    NASA Astrophysics Data System (ADS)

    Lei, Li

    1999-07-01

    In this study the researcher develops and presents a new model, founded on the laws of physics, for analyzing dance technique. Based on a pilot study of four advanced dance techniques, she creates a new model for diagnosing, analyzing and describing basic, intermediate and advanced dance techniques. The name for this model is ``PED,'' which stands for Physics of Expressive Dance. The research design consists of five phases: (1) Conduct a pilot study to analyze several advanced dance techniques chosen from Chinese dance, modern dance, and ballet; (2) Based on learning obtained from the pilot study, create the PED Model for analyzing dance technique; (3) Apply this model to eight categories of dance technique; (4) Select two advanced dance techniques from each category and analyze these sample techniques to demonstrate how the model works; (5) Develop an evaluation framework and use it to evaluate the effectiveness of the model, taking into account both scientific and artistic aspects of dance training. In this study the researcher presents new solutions to three problems highly relevant to dance education: (1) Dancers attempting to learn difficult movements often fail because they are unaware of physics laws; (2) Even those who do master difficult movements can suffer injury due to incorrect training methods; (3) Even the best dancers can waste time learning by trial and error, without scientific instruction. In addition, the researcher discusses how the application of the PED model can benefit dancers, allowing them to avoid inefficient and ineffective movements and freeing them to focus on the artistic expression of dance performance. This study is unique, presenting the first comprehensive system for analyzing dance techniques in terms of physics laws. The results of this study are useful, allowing a new level of awareness about dance techniques that dance professionals can utilize for more effective and efficient teaching and learning. The approach utilized in this study is universal, and can be applied to any dance movement and to any dance style.

  19. Linear Water Waves

    NASA Astrophysics Data System (ADS)

    Kuznetsov, N.; Maz'ya, V.; Vainberg, B.

    2002-08-01

    This book gives a self-contained and up-to-date account of mathematical results in the linear theory of water waves. The study of waves has many applications, including the prediction of the behavior of floating bodies (ships, submarines, tension-leg platforms, etc.), the calculation of wave-making resistance in naval architecture, and the description of wave patterns over bottom topography in geophysical hydrodynamics. The first section deals with time-harmonic waves. Three linear boundary value problems serve as the approximate mathematical models for these types of water waves. The next section applies a plethora of mathematical techniques to the investigation of these three problems. The techniques used in the book include integral equations based on Green's functions, various inequalities between the kinetic and potential energy, and integral identities which are indispensable for proving the uniqueness theorems. The so-called inverse procedure is applied to constructing examples of non-uniqueness, usually referred to as 'trapped modes.'

  20. Using non-invasive molecular spectroscopic techniques to detect unique aspects of protein Amide functional groups and chemical properties of modeled forage from different sourced-origins

    NASA Astrophysics Data System (ADS)

    Ji, Cuiying; Zhang, Xuewei; Yu, Peiqiang

    2016-03-01

    The non-invasive molecular spectroscopic technique FT/IR is capable of detecting the molecular structure spectral features that are associated with biological, nutritional and biodegradation functions. However, to date, little research has been conducted using these non-invasive molecular spectroscopic techniques to study forage internal protein structures associated with biodegradation and biological functions. The objectives of this study were to detect unique aspects and associations of protein Amide functional groups, in terms of protein Amide I and II spectral profiles and chemical properties, in alfalfa forage (Medicago sativa L.) from different sourced-origins. In this study, alfalfa hay of two different origins was used as the modeled forage for molecular structure and chemical property study. For each forage origin, five to seven sources were analyzed. The molecular spectral profiles were determined using non-invasive FT/IR molecular spectroscopy. The parameters of the protein spectral profiles included the functional groups of Amide I and Amide II and the Amide I to II ratio. The results show that the modeled forage Amide I and Amide II were centered at 1653 cm-1 and 1545 cm-1, respectively. The Amide I spectral height and area intensities ranged from 0.02 to 0.03 and 2.67 to 3.36 AI, respectively. The Amide II spectral height and area intensities ranged from 0.01 to 0.02 and 0.71 to 0.93 AI, respectively. The Amide I to II spectral peak height and area ratios ranged from 1.86 to 1.88 and 3.68 to 3.79, respectively. Our results show that non-invasive molecular spectroscopic techniques are capable of detecting forage internal protein structure features which are associated with forage chemical properties.

  1. Global stability of a multiple infected compartments model for waterborne diseases

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Cao, Jinde

    2014-10-01

    In this paper, mathematical analysis is carried out for a multiple infected compartments model for waterborne diseases, such as cholera, giardia, and rotavirus. The model accounts for both person-to-person and water-to-person transmission routes. Global stability of the equilibria is studied. In terms of the basic reproduction number R0, we prove that, if R0⩽1, then the disease-free equilibrium is globally asymptotically stable and the infection always disappears; whereas if R0>1, there exists a unique endemic equilibrium which is globally asymptotically stable for the corresponding fast-slow system. Numerical simulations verify our theoretical results and show that the decay rate of waterborne pathogens has a significant impact on the epidemic growth rate. We also observe numerically that the unique endemic equilibrium is globally asymptotically stable for the whole system; this observation indicates that the present method needs to be supplemented by other techniques.
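    The R0 threshold behavior described above can be illustrated with a minimal susceptible-infected-water model; the parameters, the simple Euler scheme, and the peak-infection summary below are illustrative assumptions, not the paper's fast-slow analysis:

```python
# Minimal waterborne-disease model: S susceptible, I infected, W
# pathogen concentration in water.  b_i = person-to-person and
# b_w = water-to-person transmission, xi = shedding rate, delta =
# pathogen decay in water, gamma = recovery.  All values illustrative.
def peak_infected(b_i, b_w, xi, delta, gamma, dt=0.01, t_end=400.0):
    S, I, W = 0.99, 0.01, 0.0
    peak = I
    for _ in range(int(t_end / dt)):
        infection = b_i * S * I + b_w * S * W
        S += -infection * dt
        I += (infection - gamma * I) * dt
        W += (xi * I - delta * W) * dt
        peak = max(peak, I)
    return peak

def r0(b_i, b_w, xi, delta, gamma):
    # Basic reproduction number near the disease-free state (S ~ 1):
    # direct transmission plus water-mediated transmission (b_w*xi/delta),
    # both scaled by the infectious period 1/gamma.
    return (b_i + b_w * xi / delta) / gamma

low = peak_infected(b_i=0.1, b_w=0.1, xi=0.1, delta=0.5, gamma=0.25)
high = peak_infected(b_i=0.3, b_w=0.5, xi=0.5, delta=0.5, gamma=0.25)
print(round(low, 3), round(high, 3))
```

    With R0 below one the infected fraction only decays from its initial value; above one the epidemic takes off, and a slower pathogen decay rate delta raises R0 through the water-mediated term, consistent with the paper's numerical observation.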

  2. Technology for pressure-instrumented thin airfoil models

    NASA Technical Reports Server (NTRS)

    Wigley, David A.

    1988-01-01

    A novel method of airfoil model construction was developed. This Laminated Sheet technique uses 0.8 mm thick sheets of A286 containing a network of pre-formed channels which are vacuum brazed together to form the airfoil. A 6.25 percent model of the X29A canard, which has a 5 percent thick section, was built using this technique. The model contained a total of 96 pressure orifices: 56 in three chordwise rows on the upper surface and 37 in three similar rows on the lower surface. It was tested in the NASA Langley 0.3 m Transonic Cryogenic Tunnel. Unique aerodynamic data were obtained over the full range of temperature and pressure, part of them at transonic Mach numbers and flight Reynolds numbers. A larger two-dimensional model of the NACA 64a-105 airfoil section was also fabricated; scale-up presented some problems, but a testable airfoil was produced.

  3. A comparative study of scramjet injection strategies for high Mach numbers flows

    NASA Technical Reports Server (NTRS)

    Riggins, D. W.; Mcclinton, C. R.; Rogers, R. C.; Bittner, R. D.

    1992-01-01

    A simple method for predicting the axial distribution of supersonic combustor thrust potential is described. A complementary technique for illustrating the spatial evolution and distribution of thrust potential and loss mechanisms in reacting flows is developed. Wall jet cases and swept ramp injector cases for Mach 17 and Mach 13.5 flight enthalpy inflow conditions are numerically modeled and analyzed using these techniques. The visualization of thrust potential in the combustor for the various cases examined provides a unique tool for increasing understanding of supersonic combustor performance potential.

  4. The recovery and utilization of space suit range-of-motion data

    NASA Technical Reports Server (NTRS)

    Reinhardt, AL; Walton, James S.

    1988-01-01

    A technique for recovering data for the range of motion of a subject wearing a space suit is described along with the validation of this technique on an EVA space suit. Digitized data are automatically acquired from video images of the subject; three-dimensional trajectories are recovered from these data, and can be displayed using three-dimensional computer graphics. Target locations are recovered using a unique video processor and close-range photogrammetry. It is concluded that such data can be used in such applications as the animation of anthropometric computer models.

  5. A SINDA thermal model using CAD/CAE technologies

    NASA Technical Reports Server (NTRS)

    Rodriguez, Jose A.; Spencer, Steve

    1992-01-01

    The approach to thermal analysis described by this paper is a technique that incorporates Computer Aided Design (CAD) and Computer Aided Engineering (CAE) to develop a thermal model that has the advantages of Finite Element Methods (FEM) without abandoning the unique advantages of Finite Difference Methods (FDM) in the analysis of thermal systems. The incorporation of existing CAD geometry, the powerful use of a pre and post processor and the ability to do interdisciplinary analysis, will be described.

  6. High-energy synchrotron x-ray techniques for studying irradiated materials

    DOE PAGES

    Park, Jun-Sang; Zhang, Xuan; Sharma, Hemant; ...

    2015-03-20

    High performance materials that can withstand radiation, heat, multiaxial stresses, and corrosive environments are necessary for the deployment of advanced nuclear energy systems. Nondestructive in situ experimental techniques utilizing high energy x-rays from synchrotron sources can be an attractive set of tools for engineers and scientists to investigate the structure–processing–property relationship systematically at smaller length scales and help build better material models. In this paper, two unique and interconnected experimental techniques, namely, simultaneous small-angle/wide-angle x-ray scattering (SAXS/WAXS) and far-field high-energy diffraction microscopy (FF-HEDM), are presented. Finally, the changes in material state as Fe-based alloys are heated to high temperatures or subjected to irradiation are examined using these techniques.

  7. MODEST: A Tool for Geodesy and Astronomy

    NASA Technical Reports Server (NTRS)

    Sovers, Ojars J.; Jacobs, Christopher S.; Lanyi, Gabor E.

    2004-01-01

    Features of the JPL VLBI modeling and estimation software "MODEST" are reviewed. Its main advantages include thoroughly documented model physics, portability, and detailed error modeling. Two unique models are included: modeling of source structure and modeling of both spatial and temporal correlations in tropospheric delay noise. History of the code parallels the development of the astrometric and geodetic VLBI technique and the software retains many of the models implemented during its advancement. The code has been traceably maintained since the early 1980s, and will continue to be updated with recent IERS standards. Scripts are being developed to facilitate user-friendly data processing in the era of e-VLBI.

  8. Periodicity computation of generalized mathematical biology problems involving delay differential equations.

    PubMed

    Jasim Mohammed, M; Ibrahim, Rabha W; Ahmad, M Z

    2017-03-01

    In this paper, we consider a low initial population model. Our aim is to study the periodicity computation of this model by using neutral differential equations, which are recognized in various studies including biology. We generalize the neutral Rayleigh equation to the third order by exploiting fractional calculus, in particular the Riemann-Liouville differential operator. We establish the existence and uniqueness of a periodic computational outcome. The technique depends on the continuation theorem of coincidence degree theory. Finally, an example is presented to demonstrate the findings.
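
    For reference, the Riemann-Liouville differential operator invoked above has the standard textbook definition (a well-known formula, not reproduced from the paper itself):

```latex
D^{\alpha} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)}\,\frac{d^{n}}{dt^{n}}
\int_{0}^{t} (t-s)^{\,n-\alpha-1} f(s)\, ds,
\qquad n-1 < \alpha \le n,\; n \in \mathbb{N}.
```

    For the third-order generalization discussed above, the relevant range is 2 < α ≤ 3.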

  9. Identification of propulsion systems

    NASA Technical Reports Server (NTRS)

    Merrill, Walter; Guo, Ten-Huei; Duyar, Ahmet

    1991-01-01

    This paper presents a tutorial on the use of model identification techniques for the identification of propulsion system models. These models are important for control design, simulation, parameter estimation, and fault detection. Propulsion system identification is defined in the context of the classical description of identification as a four step process that is unique because of special considerations of data and error sources. Propulsion system models are described along with the dependence of system operation on the environment. Propulsion system simulation approaches are discussed as well as approaches to propulsion system identification with examples for both air breathing and rocket systems.

  10. Modelling a single phase voltage controlled rectifier using Laplace transforms

    NASA Technical Reports Server (NTRS)

    Kraft, L. Alan; Kankam, M. David

    1992-01-01

    The development of a 20 kHz AC power system by NASA for large space projects has spurred a need to develop models for the equipment which will be used on these single phase systems. To date, models for the AC source (i.e., inverters) have been developed. It is the intent of this paper to develop a method to model the single phase voltage controlled rectifiers which will be attached to the AC power grid as an interface for connected loads. A modified version of EPRI's HARMFLO program is used as the shell for these models. The results obtained from the model developed in this paper are quite adequate for the analysis of problems such as voltage resonance. The unique technique presented in this paper uses Laplace transforms, rather than a curve fitting technique, to determine the harmonic content of the load current of the rectifier. The Laplace transforms directly yield the coefficients of the differential equations which model the line current to the rectifier.
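
    The harmonic-content question can be checked numerically with a crude stand-in for the paper's Laplace-transform derivation (an illustrative assumption, not the authors' method): an ideal full-wave-rectified line current is analyzed with a discrete Fourier series, recovering the known 2/pi DC level and even-only harmonic structure.

```python
# Illustrative stand-in for the paper's Laplace-transform derivation (an
# assumption, not the authors' method): the harmonic content of an ideal
# full-wave-rectified line current |sin| is checked with a discrete Fourier
# series, recovering the known 2/pi DC level and even-only harmonics.
import math

N = 4096  # samples over one period of the underlying sine
samples = [abs(math.sin(2 * math.pi * k / N)) for k in range(N)]

def harmonic_magnitude(n):
    """|c_n|: magnitude of the n-th complex Fourier coefficient."""
    re = sum(s * math.cos(2 * math.pi * n * k / N) for k, s in enumerate(samples)) / N
    im = sum(s * math.sin(2 * math.pi * n * k / N) for k, s in enumerate(samples)) / N
    return math.hypot(re, im)

dc = harmonic_magnitude(0)           # analytically 2/pi
fundamental = harmonic_magnitude(1)  # |sin| has no odd harmonics
second = harmonic_magnitude(2)       # analytically 2/(3*pi)
```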

  11. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
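
    The end-to-end delay idea can be illustrated with a much-simplified sketch (this is not the paper's CS-class construction: it assumes an acyclic graph of transitions with static firing intervals and simply accumulates interval bounds along paths):

```python
# Toy sketch, not the paper's CS-class algorithm: an acyclic graph of timed
# transitions with static firing intervals [lo, hi]; clock stamps are
# accumulated along every path to bound the end-to-end delay.
edges = {  # hypothetical task graph: state -> [(next_state, lo, hi), ...]
    "start": [("a", 1, 2), ("b", 2, 3)],
    "a": [("done", 3, 5)],
    "b": [("done", 1, 2)],
}

def delay_bounds(src="start", dst="done"):
    """Return (earliest, latest) completion time over all src -> dst paths."""
    bounds = []
    stack = [(src, 0, 0)]  # (state, accumulated lo, accumulated hi)
    while stack:
        state, lo, hi = stack.pop()
        if state == dst:
            bounds.append((lo, hi))
            continue
        for nxt, t_lo, t_hi in edges.get(state, []):
            stack.append((nxt, lo + t_lo, hi + t_hi))
    return min(b[0] for b in bounds), max(b[1] for b in bounds)

earliest, latest = delay_bounds()  # via "b": 3..5; via "a": 4..7
```

    A real TPN analysis must also handle concurrency and cycles, which is exactly what the state-class machinery is for; this sketch only shows why carrying clock stamps makes end-to-end delay directly readable.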

  12. Applications of Tol2 Transposon-Mediated Gene Transfer for Stable Integration and Conditional Expression of Electroporated Genes in Chicken Embryos

    NASA Astrophysics Data System (ADS)

    Sato, Yuki; Takahashi, Yoshiko

    Because of the high accessibility of developing embryos, avian embryos (chicken and quail) have long been used as a good model animal to study embryogenesis in vertebrates, especially amniotes (reviewed in Wolpert, 2004). The techniques used for “classical” avian embryology included tissue transplantations, tissue ablations, and cell-labeling by vital dye. At the end of the last century, the in ovo electroporation technique was developed by Nakamura and his colleagues, and this modern method opened a way to study the roles of developmental genes directly in living embryos (Funahashi et al., 1999; reviewed in Nakamura et al., 2004; Yasuda et al., 2000; Yasugi and Nakamura, 2000). This powerful technique allows us to introduce genes and other molecules (DNA, RNA, morpholinos) into embryos in a tissue-specific way by targeting a restricted area of embryonic tissues. Thus, the electroporation technique using chickens has provided numerous novel insights into the understanding of early development in vertebrates, making the chicken a unique model animal.

  13. Real-time physiological monitoring with distributed networks of sensors and object-oriented programming techniques

    NASA Astrophysics Data System (ADS)

    Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.

    1998-05-01

    Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge, often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model, and relay processed data, trends and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first is a process that accounts for the variability in each individual's normal vital parameter values by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system orders the information for each user, including the subject, local company officers, medical personnel and the incident commanders; information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links, incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.
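
    The per-individual adaptive baselining idea can be sketched minimally (parameter names, the heart-rate signal, and the threshold are illustrative assumptions, not the report's adaptive-network design):

```python
# Minimal sketch of per-subject adaptive baselining (parameter names and
# thresholds are illustrative assumptions, not the report's design): an
# exponentially weighted moving average of one subject's heart rate flags
# readings that deviate far from that subject's own baseline.
class AdaptiveBaseline:
    def __init__(self, alpha=0.1, threshold=25.0):
        self.alpha = alpha          # EWMA smoothing factor
        self.threshold = threshold  # allowed deviation, beats/min
        self.baseline = None

    def update(self, reading):
        """Return True if `reading` deviates from this subject's baseline."""
        if self.baseline is None:   # first reading seeds the baseline
            self.baseline = float(reading)
            return False
        anomalous = abs(reading - self.baseline) > self.threshold
        self.baseline += self.alpha * (reading - self.baseline)  # adapt
        return anomalous

monitor = AdaptiveBaseline()
flags = [monitor.update(hr) for hr in (72, 75, 74, 73, 76, 140)]
```

    Because the baseline adapts per subject, a reading that is normal for one worker can be flagged for another, which is the point of the report's individualized approach.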

  14. Integrating Sand Tray and Solution Focused Brief Counseling as a Model for Working with Middle School Students

    ERIC Educational Resources Information Center

    McBrayer, Rachel H.; Chibbaro, Julia S.

    2012-01-01

    School counselors are master jugglers and must assume a variety of roles and tasks in order to be successful. Despite common misconceptions, Play Therapy is not for exclusive use with younger children. In fact, adolescents can also benefit from its unique properties. One integrated technique that could prove to be especially helpful with middle…

  15. Theoretical investigation of metal magnetic memory testing technique for detection of magnetic flux leakage signals from buried defect

    NASA Astrophysics Data System (ADS)

    Xu, Kunshan; Qiu, Xingqi; Tian, Xiaoshuai

    2018-01-01

    The metal magnetic memory testing (MMMT) technique has been extensively applied in various fields because of its unique advantages of easy operation, low cost and high efficiency. However, very limited theoretical research has been conducted on application of MMMT to buried defects. To promote study in this area, the equivalent magnetic charge method is employed to establish a self-magnetic flux leakage (SMFL) model of a buried defect. Theoretical results based on the established model successfully capture basic characteristics of the SMFL signals of buried defects, as confirmed via experiment. In particular, the newly developed model can calculate the buried depth of a defect based on the SMFL signals obtained via testing. The results show that the new model can successfully assess the characteristics of buried defects, which is valuable in the application of MMMT in non-destructive testing.
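
    The equivalent-magnetic-charge idea can be sketched in a drastically simplified 2D form (an illustrative assumption, not the paper's buried-defect model): opposite line charges sit at the two defect edges, and the leakage field on a scan line above is the superposition of their radial 1/r fields.

```python
# Simplified 2D sketch of the equivalent-magnetic-charge idea (an illustrative
# assumption, not the paper's buried-defect model): opposite line charges sit
# at the two defect edges, and the leakage field on a scan line above is the
# superposition of their radial 1/r fields.
import math

def leakage(x, y, half_width=1.0, depth=2.0, q=1.0):
    """Return (Hx, Hy) at scan point (x, y); the defect is buried at `depth`."""
    hx = hy = 0.0
    for charge, xc in ((+q, -half_width), (-q, +half_width)):
        dx, dy = x - xc, y + depth  # vector from charge to scan point
        r2 = dx * dx + dy * dy
        hx += charge * dx / (2 * math.pi * r2)
        hy += charge * dy / (2 * math.pi * r2)
    return hx, hy

hx0, hy0 = leakage(0.0, 0.5)  # field directly over the defect centre
```

    Even this toy reproduces the classic SMFL signature: the tangential component peaks over the defect centre while the normal component is antisymmetric about it.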

  16. The influence of and the identification of nonlinearity in flexible structures

    NASA Technical Reports Server (NTRS)

    Zavodney, Lawrence D.

    1988-01-01

    Several models were built at NASA Langley and used to demonstrate the following nonlinear behavior: internal resonance in a free response, principal parametric resonance and subcritical instability in a cantilever beam-lumped mass structure, combination resonance in a parametrically excited flexible beam, autoparametric interaction in a two-degree-of-freedom system, instability of the linear solution, saturation of the excited mode, subharmonic bifurcation, and chaotic responses. A video tape documenting these phenomena was made. An attempt to identify a simple structure consisting of two light-weight beams and two lumped masses using the Eigensystem Realization Algorithm showed the inherent difficulty of using a linear-based theory to identify a particular nonlinearity. Preliminary results show the technique requires novel interpretation, and hence may not be useful for structural modes that are coupled by a quadratic nonlinearity. A literature survey was also completed on recent work in parametrically excited nonlinear systems. In summary, nonlinear systems may possess unique behaviors that require nonlinear identification techniques based on an understanding of how nonlinearity affects the dynamic response of structures. In this way, the unique behaviors of nonlinear systems may be properly identified. Moreover, more accurate quantifiable estimates can be made once the qualitative model has been determined.

  17. Finite element model correlation of a composite UAV wing using modal frequencies

    NASA Astrophysics Data System (ADS)

    Oliver, Joseph A.; Kosmatka, John B.; Hemez, François M.; Farrar, Charles R.

    2007-04-01

    The current work details the implementation of a meta-model based correlation technique on a composite UAV wing test piece and associated finite element (FE) model. This method involves training polynomial models to emulate the FE input-output behavior and then using numerical optimization to produce a set of correlated parameters which can be returned to the FE model. After a discussion of the practical implementation, the technique is validated on a composite plate structure and then applied to the UAV wing structure, where it is furthermore compared to a more traditional Newton-Raphson technique which iteratively uses first-order Taylor-series sensitivity. The experimental testpiece wing comprises two graphite/epoxy prepreg and Nomex honeycomb co-cured skins and two prepreg spars bonded together in a secondary process. MSC.Nastran FE models of the four structural components are correlated independently, using modal frequencies as correlation features, before being joined together into the assembled structure and compared to experimentally measured frequencies from the assembled wing in a cantilever configuration. Results show that significant improvements can be made to the assembled model fidelity, with the meta-model procedure producing slightly superior results to Newton-Raphson iteration. Final evaluation of component correlation using the assembled wing comparison showed worse results for each correlation technique, with the meta-model technique worse overall. This can most likely be attributed to difficulty in correlating the open-section spars; however, there is also some question about non-unique update-variable combinations in the current configuration, which led correlation away from physically probable values.
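
    The meta-model workflow (sample the FE model, train a polynomial surrogate, optimize on the surrogate) can be sketched with a made-up one-parameter example; the "FE model", its quadratic stiffness-to-frequency relation, and the target frequency are all illustrative assumptions, not the paper's UAV wing model.

```python
# Sketch of the meta-model workflow on a made-up one-parameter example: the
# "FE model", its quadratic stiffness-to-frequency relation, and the target
# frequency are illustrative assumptions, not the paper's UAV wing model.
def fe_model(k):
    """Stand-in for an expensive finite-element modal solve (frequency, Hz)."""
    return 10.0 + 2.0 * k - 0.1 * k * k

# 1) Design of experiments: run the "expensive" model at a few samples.
samples = [(k, fe_model(k)) for k in (0.0, 2.0, 4.0)]

# 2) Train a quadratic surrogate through the samples (Lagrange form).
def make_surrogate(pts):
    (x0, y0), (x1, y1), (x2, y2) = pts
    def surrogate(k):
        return (y0 * (k - x1) * (k - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (k - x0) * (k - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (k - x0) * (k - x1) / ((x2 - x0) * (x2 - x1)))
    return surrogate

surrogate = make_surrogate(samples)

# 3) Correlate cheaply on the surrogate: match a "measured" frequency.
target = 13.5  # hypothetical measured modal frequency (Hz)
best_k = min((i * 0.001 for i in range(4001)),
             key=lambda k: abs(surrogate(k) - target))
```

    The optimization loop never touches the expensive model after the initial samples, which is the economic argument for surrogates when each FE solve is costly.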

  18. Dual nozzle aerodynamic and cooling analysis study

    NASA Technical Reports Server (NTRS)

    Meagher, G. M.

    1981-01-01

    Analytical models to predict performance and operating characteristics of dual nozzle concepts were developed and improved. Aerodynamic models are available to define flow characteristics and bleed requirements for both the dual throat and dual expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow, boundary layer, and shock effects within dual nozzle engines. Thermal analyses were performed to define cooling requirements for baseline configurations, and special studies of unique dual nozzle cooling problems defined feasible means of achieving adequate cooling.

  19. Constructing Cross-Linked Polymer Networks Using Monte Carlo Simulated Annealing Technique for Atomistic Molecular Simulations

    DTIC Science & Technology

    Elder, Robert M; Sirk, Timothy W; ...

    2014-10-01

    The angles and dihedrals that are truly unique will be indicated by the user by editing NewAngleTypesDump and NewDihedralTypesDump. The program uses the Antechamber program in Assisted Model Building with Energy Refinement (AMBER) Tools to assign partial charges (using the Austin Model 1 [AM1] bond charge method).

  20. Measurement of Aqueous Foam Rheology by Acoustic Levitation

    NASA Technical Reports Server (NTRS)

    McDaniel, J. Gregory; Holt, R. Glynn; Rogers, Rich (Technical Monitor)

    2000-01-01

    An experimental technique is demonstrated for acoustically levitating aqueous foam drops and exciting their spheroidal modes. This allows fundamental studies of foam-drop dynamics that provide an alternative means of estimating the viscoelastic properties of the foam. One unique advantage of the technique is the lack of interactions between the foam and container surfaces, which must be accounted for in other techniques. Results are presented in which a foam drop with gas volume fraction phi = 0.77 is levitated at 30 kHz and excited into its first quadrupole resonance at 63 +/- 3 Hz. By modeling the drop as an elastic sphere, the shear modulus of the foam was estimated at 75 +/- 3 Pa.

  1. Multi-shot PROPELLER for high-field preclinical MRI

    PubMed Central

    Pandit, Prachi; Qi, Yi; Story, Jennifer; King, Kevin F.; Johnson, G. Allan

    2012-01-01

    With the development of numerous mouse models of cancer, there is a tremendous need for an appropriate imaging technique to study the disease evolution. High-field T2-weighted imaging using PROPELLER MRI meets this need. The 2-shot PROPELLER technique presented here, provides (a) high spatial resolution, (b) high contrast resolution, and (c) rapid and non-invasive imaging, which enables high-throughput, longitudinal studies in free-breathing mice. Unique data collection and reconstruction makes this method robust against motion artifacts. The 2-shot modification introduced here, retains more high-frequency information and provides higher SNR than conventional single-shot PROPELLER, making this sequence feasible at high-fields, where signal loss is rapid. Results are shown in a liver metastases model to demonstrate the utility of this technique in one of the more challenging regions of the mouse, which is the abdomen. PMID:20572138

  2. Multishot PROPELLER for high-field preclinical MRI.

    PubMed

    Pandit, Prachi; Qi, Yi; Story, Jennifer; King, Kevin F; Johnson, G Allan

    2010-07-01

    With the development of numerous mouse models of cancer, there is a tremendous need for an appropriate imaging technique to study the disease evolution. High-field T(2)-weighted imaging using PROPELLER (Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction) MRI meets this need. The two-shot PROPELLER technique presented here provides (a) high spatial resolution, (b) high contrast resolution, and (c) rapid and noninvasive imaging, which enables high-throughput, longitudinal studies in free-breathing mice. Unique data collection and reconstruction makes this method robust against motion artifacts. The two-shot modification introduced here retains more high-frequency information and provides higher signal-to-noise ratio than conventional single-shot PROPELLER, making this sequence feasible at high fields, where signal loss is rapid. Results are shown in a liver metastases model to demonstrate the utility of this technique in one of the more challenging regions of the mouse, which is the abdomen. (c) 2010 Wiley-Liss, Inc.

  3. System Identification for the Clipper Liberty C96 Wind Turbine

    NASA Astrophysics Data System (ADS)

    Showers, Daniel

    System identification techniques are powerful tools that help improve the modeling of real-world dynamic systems. These techniques are well established and have been used successfully on countless systems in many areas. However, wind turbines present a unique challenge for system identification because of the difficulty of measuring their primary input: wind. This thesis first motivates the problem by demonstrating the challenges of wind turbine system identification using both simulations and real data. It then suggests techniques for successfully identifying a dynamic wind turbine model, including the notion of an effective wind speed and how it might be measured. Various levels of simulation complexity are explored for insights into calculating an effective wind speed. In addition, measurements taken from the University of Minnesota's Clipper Liberty C96 research wind turbine are used for a preliminary investigation into the effective wind speed calculation and system identification of a real-world wind turbine.
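
    One standard identification step alluded to above, a least-squares fit of a discrete-time model, can be sketched on a synthetic first-order system (all names and values are illustrative; a real turbine is harder precisely because the wind input is poorly measured):

```python
# Hedged sketch of one standard identification step: a least-squares ARX fit
# on a synthetic, noise-free first-order system y[k+1] = a*y[k] + b*u[k].
# All names and values are illustrative assumptions; a real turbine is harder
# because the wind input u is itself poorly measured.
import random

random.seed(0)
a_true, b_true = 0.9, 0.5
u = [random.uniform(-1.0, 1.0) for _ in range(500)]  # known excitation
y = [0.0]
for k in range(499):
    y.append(a_true * y[k] + b_true * u[k])

# Normal equations for (a, b) minimizing sum of (y[k+1] - a*y[k] - b*u[k])^2.
syy = sum(y[k] * y[k] for k in range(499))
suu = sum(u[k] * u[k] for k in range(499))
syu = sum(y[k] * u[k] for k in range(499))
s1y = sum(y[k + 1] * y[k] for k in range(499))
s1u = sum(y[k + 1] * u[k] for k in range(499))
det = syy * suu - syu * syu
a_est = (s1y * suu - s1u * syu) / det
b_est = (s1u * syy - s1y * syu) / det
```

    With a clean, measured input the fit is exact; replacing u with an unmeasured wind field is what breaks this textbook recipe and motivates the effective-wind-speed idea.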

  4. IEEE 1988 International Symposium on Electromagnetic Compatibility, Seattle, WA, Aug. 2-4, 1988, Record

    NASA Astrophysics Data System (ADS)

    Various papers on electromagnetic compatibility are presented. Some of the topics considered include: field-to-wire coupling from 1 to 18 GHz, an SHF/EHF field-to-wire coupling model, a numerical method for the analysis of coupling to thin wire structures, a spread-spectrum system with an adaptive array for combating interference, a technique to select the optimum modulation indices for suppression of undesired signals for simultaneous range and data operations, development of a MHz RF leak detector technique for aircraft harness surveillance, and performance of standard aperture shielding techniques at microwave frequencies. Also discussed are: spectrum efficiency of spread-spectrum systems, control of power supply ripple produced sidebands in microwave transistor amplifiers, an intership SATCOM versus radar electromagnetic interference prediction model, considerations in the design of a broadband E-field sensing system, unique bonding methods for spacecraft, and a review of EMC practice for launch vehicle systems.

  5. A technique for measuring the quality of an elliptically bent pentaerythritol [PET(002)] crystal

    DOE PAGES

    Haugh, M. J.; Jacoby, K. D.; Barrios, M. A.; ...

    2016-08-23

    Here, we present a technique for determining the X-ray spectral quality from each region of an elliptically curved PET(002) crystal. The investigative technique utilizes the shape of the crystal rocking curve, which changes significantly as the radius of curvature changes. This unique quality information enables the spectroscopist to verify where in the spectral range the spectrometer performance is satisfactory and where there are regions that would show spectral distortion. A collection of rocking curve measurements for elliptically curved PET(002) has been built up in our X-ray laboratory. The multi-lamellar model from the XOP software has been used as a guide and corrections were applied to the model based upon measurements. However, the measurement of RI at small radius of curvature shows an anomalous behavior which the multi-lamellar model fails to capture. The effect of this anomalous RI behavior on an X-ray spectrometer calibration is calculated and compared to the multi-lamellar model calculation, which is completely inadequate for predicting RI for this range of curvature and spectral energies.

  6. A technique for measuring the quality of an elliptically bent pentaerythritol [PET(002)] crystal

    NASA Astrophysics Data System (ADS)

    Haugh, M. J.; Jacoby, K. D.; Barrios, M. A.; Thorn, D.; Emig, J. A.; Schneider, M. B.

    2016-11-01

    We present a technique for determining the X-ray spectral quality from each region of an elliptically curved PET(002) crystal. The investigative technique utilizes the shape of the crystal rocking curve, which changes significantly as the radius of curvature changes. This unique quality information enables the spectroscopist to verify where in the spectral range the spectrometer performance is satisfactory and where there are regions that would show spectral distortion. A collection of rocking curve measurements for elliptically curved PET(002) has been built up in our X-ray laboratory. The multi-lamellar model from the XOP software has been used as a guide and corrections were applied to the model based upon measurements. However, the measurement of RI at small radius of curvature shows an anomalous behavior which the multi-lamellar model fails to capture. The effect of this anomalous RI behavior on an X-ray spectrometer calibration is calculated and compared to the multi-lamellar model calculation, which is completely inadequate for predicting RI for this range of curvature and spectral energies.

  7. Thermo-physical performance prediction of the KSC Ground Operation Demonstration Unit for liquid hydrogen

    NASA Astrophysics Data System (ADS)

    Baik, J. H.; Notardonato, W. U.; Karng, S. W.; Oh, I.

    2015-12-01

    NASA Kennedy Space Center (KSC) researchers have been working on enhanced and modernized cryogenic liquid propellant handling techniques to reduce the life cycle costs of the propellant management system for the unique KSC application. The KSC Ground Operation Demonstration Unit (GODU) for liquid hydrogen (LH2) plans to demonstrate integrated refrigeration, zero-loss flexible-term storage of LH2, and densified hydrogen handling techniques. The Florida Solar Energy Center (FSEC) has partnered with the KSC researchers to develop a thermal performance prediction model of the GODU for LH2. The model includes integrated refrigeration cooling performance, thermal losses in the tank and distribution lines, transient system characteristics during chilling and loading, and long-term steady-state propellant storage. This paper discusses recent experimental data of the GODU for LH2 system and modeling results.

  8. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation, in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation, together with standard statistical estimation and analyses techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle, through a simulation model.
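
    The chance-constraint formulation can be sketched with a toy terminating simulation (the workload model, deadline, and resource sweep are all illustrative assumptions, not the paper's launch-vehicle model):

```python
# Sketch of the chance-constraint idea with a made-up terminating simulation:
# choose the smallest resource level whose estimated probability of meeting a
# turnaround deadline is at least 95%. The workload model, deadline, and all
# names are illustrative assumptions, not the paper's launch-vehicle model.
import random

random.seed(42)

def simulate_turnaround(servers):
    """One toy replication: a random workload divided among servers."""
    workload = sum(random.expovariate(1.0) for _ in range(20))
    return workload / servers

def meets_chance_constraint(servers, deadline=5.0, level=0.95, reps=400):
    """Estimate P(turnaround <= deadline) by replication; compare to level."""
    hits = sum(simulate_turnaround(servers) <= deadline for _ in range(reps))
    return hits / reps >= level

smallest = next(s for s in range(1, 20) if meets_chance_constraint(s))
```

    Replication is what makes the randomness explicit: the constraint is placed on an estimated probability, not on a single noisy simulation output.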

  9. Personalized models of bones based on radiographic photogrammetry.

    PubMed

    Berthonnaud, E; Hilmi, R; Dimnet, J

    2009-07-01

    Radiographic photogrammetry is applied to locate anatomical landmarks in space from their two projected images. The goal of this paper is to define a personalized geometric model of bones based solely on photogrammetric reconstructions. The personalized models of bones are obtained in two successive steps: their functional frameworks are first determined experimentally; then the 3D bone representation results from modeling techniques. Each bone functional framework is obtained from direct measurements upon two radiographic images. These images may be obtained using either perpendicular (spine and sacrum) or oblique incidences (pelvis and lower limb). Frameworks link together their functional axes and punctual landmarks. Each global bone volume is decomposed into several elementary components, and each volumic component is represented by simple geometric shapes. Volumic shapes are articulated to the patient's bone structure. The volumic personalization is obtained by best fitting the geometric model projections to their real images, using adjustable articulations. Examples are presented to illustrate the technique of personalization of bone volumes, directly issued from the treatment of only two radiographic images. The chosen techniques for treating data are then discussed. The 3D representation of bones completes, for clinical users, the information brought by radiographic images.
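
    Under the idealized assumption of perpendicular parallel-projection views (the paper also treats oblique incidences and true perspective, which this toy ignores), reconstructing a landmark from its two projections reduces to coordinate fusion:

```python
# Minimal sketch under an idealized assumption: two perpendicular
# parallel-projection radiographs, where the frontal view gives (x, z) and
# the lateral view gives (y, z). The paper's oblique-incidence, perspective
# case is deliberately ignored here.
def reconstruct(frontal_xz, lateral_yz):
    """Fuse one landmark's two projections into a 3D point (x, y, z)."""
    x, z_frontal = frontal_xz
    y, z_lateral = lateral_yz
    return (x, y, (z_frontal + z_lateral) / 2.0)  # average the shared axis

landmark = reconstruct(frontal_xz=(12.0, 40.25), lateral_yz=(-3.5, 39.75))
```

    The disagreement between the two z measurements, averaged away here, is the residual that a full photogrammetric formulation would minimize over all landmarks simultaneously.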

  10. Alpha-canonical form representation of the open loop dynamics of the Space Shuttle main engine

    NASA Technical Reports Server (NTRS)

    Duyar, Almet; Eldem, Vasfi; Merrill, Walter C.; Guo, Ten-Huei

    1991-01-01

    A parameter and structure estimation technique for multivariable systems is used to obtain a state space representation of open loop dynamics of the space shuttle main engine in alpha-canonical form. The parameterization being used is both minimal and unique. The simplified linear model may be used for fault detection studies and control system design and development.

  11. Three-Hand Endoscopic Endonasal Transsphenoidal Surgery: Experience With an Anatomy-Preserving Mononostril Approach Technique.

    PubMed

    Eseonu, Chikezie I; ReFaey, Karim; Pamias-Portalatin, Eva; Asensio, Javier; Garcia, Oscar; Boahene, Kofi D; Quiñones-Hinojosa, Alfredo

    2018-02-01

    Variations on the endoscopic transsphenoidal approach present unique surgical techniques that have unique effects on surgical outcomes, extent of resection (EOR), and anatomical complications. To analyze the learning curve and perioperative outcomes of the 3-hand endoscopic endonasal mononostril transsphenoidal technique. Prospective case series and retrospective data analysis of patients who were treated with the 3-hand transsphenoidal technique between January 2007 and May 2015 by a single neurosurgeon. Patient characteristics, preoperative presentation, tumor characteristics, operative times, learning curve, and postoperative outcomes were analyzed. Volumetric EOR was evaluated, and a logistic regression analysis was used to assess predictors of EOR. Two hundred seventy-five patients underwent an endoscopic transsphenoidal surgery using the 3-hand technique. One hundred eighteen patients in the early group had surgery between 2007 and 2010, while 157 patients in the late group had surgery between 2011 and 2015. Operative time was significantly shorter in the late group (161.6 min) compared to the early group (211.3 min, P = .001). Both cohorts had similar EOR (early group 84.6% vs late group 85.5%, P = .846) and postoperative outcomes. The learning curve showed that it took 54 cases to achieve operative proficiency with the 3-handed technique. Multivariate modeling suggested that prior resections and preoperative tumor size are important predictors for EOR. We describe a 3-hand, mononostril endoscopic transsphenoidal technique performed by a single neurosurgeon that has minimal anatomic distortion and postoperative complications. During the learning curve of this technique, operative time can significantly decrease, while EOR, postoperative outcomes, and complications are not jeopardized. Copyright © 2017 by the Congress of Neurological Surgeons

  12. A case management agency and bank create a service innovation.

    PubMed

    Katz, K S; Stowe, A W

    1992-01-01

    Connecticut Community Care, Inc. (CCCI), a statewide, nonprofit case management agency, in collaboration with Connecticut National Bank (CNB), developed a unique model of delivering case management services to bank trust clients. No reports of such a collaborative model have been found in the published literature in the United States. The article presents a historical overview of this innovative initiative; the identification of the target population; the delivery of the assessment, coordination, and monitoring services; and the marketing techniques. Utilization statistics, a synopsis of the model outcomes as viewed by the trust officers, and suggestions for replication are also presented.

  13. Radiation-induced gene expression in the nematode Caenorhabditis elegans

    NASA Technical Reports Server (NTRS)

    Nelson, Gregory A.; Jones, Tamako A.; Chesnut, Aaron; Smith, Anna L.

    2002-01-01

    We used the nematode C. elegans to characterize the genotoxic and cytotoxic effects of ionizing radiation in a simple animal model emphasizing the unique effects of charged particle radiation. Here we demonstrate by RT-PCR differential display and whole genome microarray hybridization experiments that gamma rays, accelerated protons and iron ions at the same physical dose lead to unique transcription profiles. 599 of 17871 genes analyzed (3.4%) showed differential expression 3 hrs after exposure to 3 Gy of radiation. 193 were up-regulated, 406 were down-regulated and 90% were affected only by a single species of radiation. A novel statistical clustering technique identified the regulatory relationships between the radiation-modulated genes and showed that genes affected by each radiation species were associated with unique regulatory clusters. This suggests that independent homeostatic mechanisms are activated in response to radiation exposure as a function of track structure or ionization density.

  14. In vivo porcine training model for cranial neurosurgery.

    PubMed

    Regelsberger, Jan; Eicker, Sven; Siasios, Ioannis; Hänggi, Daniel; Kirsch, Matthias; Horn, Peter; Winkler, Peter; Signoretti, Stefano; Fountas, Kostas; Dufour, Henry; Barcia, Juan A; Sakowitz, Oliver; Westermaier, Thomas; Sabel, Michael; Heese, Oliver

    2015-01-01

    Supplemental education is desirable for neurosurgical training, and the use of human cadaver specimens and virtual reality models is routine. An in vivo porcine training model for cranial neurosurgery was introduced in 2005, and our recent experience with this unique model is outlined here. For the first time, porcine anatomy is illustrated with particular respect to neurosurgical procedures, and the pros and cons of this model are described. The aim of the course was to set up a laboratory scenario imitating a realistic operating room in which the anatomy of the brain and neurosurgical techniques could be trained in a mentored environment free from time constraints. The learning objectives of the course were microsurgical techniques in cranial neurosurgery and the management of complications. Participants were asked to evaluate the quality and utility of the programme via standardized questionnaires on a grading scale from A (best) to E (worst). In total, 154 residents have been trained on the porcine model to date. None of the participants regarded their own residency programme as structured. The bleeding and complication management (97%), the realistic laboratory set-up (89%) and the working environment (94%) were favoured by the vast majority of trainees and confirmed our previous findings. After finishing the course, the participants graded their skills in bone drilling, dissecting the brain and preserving cerebral vessels under microscopic magnification as improved to levels A and B. In vivo hands-on courses, fully equipped with microsurgical instruments, offer an outstanding training opportunity in which bleeding management on a pulsating, vital brain represents a unique training approach. Our results show that education programmes still lack practical training facilities, in which in vivo models may act as a complementary approach to surgical training.

  15. Using non-invasive molecular spectroscopic techniques to detect unique aspects of protein Amide functional groups and chemical properties of modeled forage from different sourced-origins.

    PubMed

    Ji, Cuiying; Zhang, Xuewei; Yu, Peiqiang

    2016-03-05

    The non-invasive molecular spectroscopic technique FT/IR is capable of detecting the molecular structure spectral features that are associated with biological, nutritional and biodegradation functions. However, to date, little research has been conducted using these non-invasive molecular spectroscopic techniques to study forage internal protein structures associated with biodegradation and biological functions. The objectives of this study were to detect unique aspects and associations of protein Amide functional groups, in terms of protein Amide I and II spectral profiles and chemical properties, in alfalfa forage (Medicago sativa L.) from different sourced-origins. In this study, alfalfa hay from two different origins was used as a modeled forage for molecular structure and chemical property study. For each forage origin, five to seven sources were analyzed. The molecular spectral profiles were determined using FT/IR non-invasive molecular spectroscopy. The parameters of the protein spectral profiles included the functional groups of Amide I and Amide II and the Amide I to II ratio. The results show that the modeled forage Amide I and Amide II bands were centered at 1653 cm(-1) and 1545 cm(-1), respectively. The Amide I spectral height and area intensities ranged from 0.02 to 0.03 and 2.67 to 3.36 AI, respectively. The Amide II spectral height and area intensities ranged from 0.01 to 0.02 and 0.71 to 0.93 AI, respectively. The Amide I to II spectral peak height and area ratios ranged from 1.86 to 1.88 and 3.68 to 3.79, respectively. Our results show that non-invasive molecular spectroscopic techniques are capable of detecting forage internal protein structure features which are associated with forage chemical properties. Copyright © 2015 Elsevier B.V. All rights reserved.
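The Amide I to II area ratio reported above comes from integrating the spectrum over the two band windows. A hedged sketch with a synthetic two-band spectrum (the band centers match the record; the widths, heights, and window bounds are illustrative, not the study's data):

```python
import numpy as np

# Synthetic spectrum with Gaussian bands at the reported Amide I and
# Amide II centers (1653 and 1545 cm^-1); shapes are illustrative.
wn = np.linspace(1480, 1720, 481)                    # wavenumber, cm^-1
spec = (np.exp(-((wn - 1653) / 15.0) ** 2)           # Amide I band
        + 0.3 * np.exp(-((wn - 1545) / 12.0) ** 2))  # Amide II band

def band_area(wn, spec, lo, hi):
    """Trapezoidal area of the spectrum over [lo, hi] cm^-1."""
    m = (wn >= lo) & (wn <= hi)
    w, s = wn[m], spec[m]
    return float(np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(w)))

ratio = band_area(wn, spec, 1600, 1700) / band_area(wn, spec, 1500, 1600)
```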

  16. Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.

    PubMed

    Fukurai, H

    1991-01-01

    This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to illustrate the following substantive issues of interregional migration: (1) to examine whether economic segmentation significantly influences Japanese regional migration and (2) to explain socioeconomic characteristics of prefectures for both in- and out-migration. Analytic techniques include a latent structural equation (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and regional-specific cultures and traditions need to be incorporated in the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.
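The residual-mapping step can be sketched as a regression followed by inspection of the unexplained residuals by region. LISREL fits a latent-variable structural model rather than plain least squares, so this is only an illustration of the residual idea, with synthetic data and hypothetical predictor names:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 47                                    # prefectures
X = np.column_stack([np.ones(n),
                     rng.normal(size=n),  # e.g. wage level (synthetic)
                     rng.normal(size=n)]) # e.g. land value (synthetic)
beta_true = np.array([0.5, 2.0, -1.0])
y = X @ beta_true + rng.normal(0, 0.3, size=n)   # net migration rate

# Fit, then look at where the model over- or under-predicts; the
# flagged regions are the ones a residual map would highlight.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
flagged = np.where(np.abs(resid) > 2 * resid.std())[0]
```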

  17. Time-lapse cinematography in living Drosophila tissues: preparation of material.

    PubMed

    Davis, Ilan; Parton, Richard M

    2006-11-01

    The fruit fly, Drosophila melanogaster, has been an extraordinarily successful model organism for studying the genetic basis of development and evolution. It is arguably the best-understood complex multicellular model system, owing its success to many factors. Recent developments in imaging techniques, in particular sophisticated fluorescence microscopy methods and equipment, now allow cellular events to be studied at high resolution in living material. This ability has enabled the study of features that tend to be lost or damaged by fixation, such as transient or dynamic events. Although many of the techniques of live cell imaging in Drosophila are shared with the greater community of cell biologists working on other model systems, studying living fly tissues presents unique difficulties in keeping the cells alive, introducing fluorescent probes, and imaging through thick hazy cytoplasm. This protocol outlines the preparation of major tissue types amenable to study by time-lapse cinematography and different methods for keeping them alive.

  18. Separation of crack extension modes in orthotropic delamination models

    NASA Technical Reports Server (NTRS)

    Beuth, Jack L.

    1995-01-01

    In the analysis of an interface crack between dissimilar elastic materials, the mode of crack extension is typically not unique, due to oscillatory behavior of near-tip stresses and displacements. This behavior currently limits the applicability of interfacial fracture mechanics as a means to predict composite delamination. The Virtual Crack Closure Technique (VCCT) is a method used to extract mode 1 and mode 2 energy release rates from numerical fracture solutions. The mode of crack extension extracted from an oscillatory solution using the VCCT is not unique due to the dependence of mode on the virtual crack extension length, Delta. In this work, a method is presented for using the VCCT to extract Delta-independent crack extension modes for the case of an interface crack between two in-plane orthotropic materials. The method does not involve altering the analysis to eliminate its oscillatory behavior. Instead, it is argued that physically reasonable, Delta-independent modes of crack extension can be extracted from oscillatory solutions. Knowledge of near-tip fields is used to determine the explicit Delta dependence of energy release rate parameters. Energy release rates are then defined that are separated from the oscillatory dependence on Delta. A modified VCCT using these energy release rate definitions is applied to results from finite element analyses, showing that Delta-independent modes of crack extension result. The modified technique has potential as a consistent method for extracting crack extension modes from numerical solutions. The Delta-independent modes extracted using this technique can also serve as guides for testing the convergence of finite element models. Direct applications of this work include the analysis of planar composite delamination problems, where plies or debonded laminates are modeled as in-plane orthotropic materials.
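The baseline 2D VCCT computation that the paper builds on can be sketched directly: mode I and mode II energy release rates from the crack-tip nodal forces and the relative displacements one node pair behind the tip. The numbers are illustrative; for an interface crack these values depend on the virtual extension length Delta, which is exactly the non-uniqueness the paper addresses:

```python
def vcct_2d(Fx, Fy, du, dv, delta, thickness=1.0):
    """Basic 2D VCCT: return (G_I, G_II) for virtual extension `delta`.

    Fx, Fy  -- shear/opening nodal forces at the crack tip
    du, dv  -- relative sliding/opening displacements behind the tip
    """
    GI = Fy * dv / (2.0 * delta * thickness)
    GII = Fx * du / (2.0 * delta * thickness)
    return GI, GII

# Illustrative values (consistent units assumed)
GI, GII = vcct_2d(Fx=120.0, Fy=300.0, du=1e-4, dv=4e-4, delta=0.5e-3)
mode_mix = GII / (GI + GII)
```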

  19. New spectral imaging techniques for blood oximetry in the retina

    NASA Astrophysics Data System (ADS)

    Alabboud, Ied; Muyo, Gonzalo; Gorman, Alistair; Mordant, David; McNaught, Andrew; Petres, Clement; Petillot, Yvan R.; Harvey, Andrew R.

    2007-07-01

    Hyperspectral imaging of the retina presents a unique opportunity for direct and quantitative mapping of retinal biochemistry - particularly of the vasculature, where blood oximetry is enabled by the strong variation of absorption spectra with oxygenation. This is particularly pertinent both to research and to clinical investigation and diagnosis of retinal diseases such as diabetes, glaucoma and age-related macular degeneration. The optimal exploitation of hyperspectral imaging, however, presents a set of challenging problems, including: the poorly characterised and controlled optical environment of structures within the retina to be imaged; the erratic motion of the eyeball; and the compounding effects of the optical sensitivity of the retina and the low numerical aperture of the eye. We have developed two spectral imaging techniques to address these issues. We first describe a system in which a liquid crystal tuneable filter is integrated into the illumination system of a conventional fundus camera to enable time-sequential, random-access recording of narrow-band spectral images. Image processing techniques are described to eradicate the artefacts that may be introduced by time-sequential imaging. In addition, we describe a unique snapshot spectral imaging technique dubbed IRIS that employs polarising interferometry and Wollaston prism beam splitters to simultaneously replicate and spectrally filter images of the retina into multiple spectral bands onto a single detector array. Results of early clinical trials acquired with these two techniques, together with a physical model which enables oximetry mapping, are reported.
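The oximetry step itself rests on a two-wavelength ratio principle: one band near an isosbestic point, where oxy- and deoxyhaemoglobin absorb equally, and one saturation-sensitive band. A hedged sketch of that principle (the wavelengths and extinction values below are illustrative placeholders, not calibrated data):

```python
# Illustrative extinction coefficients (arbitrary units) at an
# isosbestic band (586 nm) and a saturation-sensitive band (600 nm).
EPS_OXY = {586: 10.0, 600: 3.2}
EPS_DEOXY = {586: 10.0, 600: 14.7}

def saturation(od_iso, od_sens):
    """Estimate oxygen saturation from optical densities at the
    isosbestic and sensitive bands; the ratio cancels path length."""
    r = od_sens / od_iso
    eo, ed = EPS_OXY[600], EPS_DEOXY[600]
    iso = EPS_OXY[586]
    # r = (s*eo + (1-s)*ed) / iso  ->  solve for saturation s
    return (r * iso - ed) / (eo - ed)

s = saturation(od_iso=1.0, od_sens=0.9)
```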

  20. EXPERIMENTAL MODELLING OF AORTIC ANEURYSMS

    PubMed Central

    Doyle, Barry J; Corbett, Timothy J; Cloonan, Aidan J; O’Donnell, Michael R; Walsh, Michael T; Vorp, David A; McGloughlin, Timothy M

    2009-01-01

    A range of silicone rubbers was created based on existing commercially available materials. These silicones were designed to be visually different from one another and to have distinct material properties, in particular ultimate tensile strengths and tear strengths. In total, eleven silicone rubbers were manufactured, with the materials designed to have increasing tensile strengths from approximately 2-4 MPa and increasing tear strengths from approximately 0.45-0.7 N/mm. The variations in the silicones were detected using a standard colour analysis technique. Calibration curves were then created relating colour intensity to individual material properties. All eleven materials were characterised and a 1st-order Ogden strain energy function was applied. Material coefficients were determined and examined for effectiveness. Six idealised abdominal aortic aneurysm models were also created using the two base materials of the study, with a further model created using a new mixing technique to produce a rubber model with randomly assigned material properties. These models were then examined using videoextensometry and compared to numerical results. Colour analysis revealed a statistically significant linear relationship (p<0.0009) with both tensile strength and tear strength, allowing material strength to be determined using a non-destructive experimental technique. The effectiveness of this technique was assessed by comparing predicted material properties to experimentally measured values, with good agreement in the results. Videoextensometry and numerical modelling revealed minor percentage differences, with all results achieving significance (p<0.0009). This study has successfully designed and developed a range of silicone rubbers that have unique colour intensities and material strengths. Strengths can be readily determined using a non-destructive analysis technique with proven effectiveness. These silicones may further aid an improved understanding of the biomechanical behaviour of aneurysms using experimental techniques. PMID:19595622
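The calibration-curve idea reduces to fitting a line relating colour intensity to strength and then using it for non-destructive prediction. A sketch with illustrative, perfectly linear data, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration samples: measured colour intensity vs
# destructively measured tensile strength (illustrative values).
intensity = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])
tensile_mpa = np.array([2.0, 2.4, 2.8, 3.2, 3.6, 4.0])

# Least-squares calibration line: strength = slope * intensity + b
slope, intercept = np.polyfit(intensity, tensile_mpa, 1)

def predict_strength(i):
    """Non-destructive strength estimate from colour intensity."""
    return slope * i + intercept

s = predict_strength(0.50)
```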

  1. A heterogenous Cournot duopoly with delay dynamics: Hopf bifurcations and stability switching curves

    NASA Astrophysics Data System (ADS)

    Pecora, Nicolò; Sodini, Mauro

    2018-05-01

    This article considers a Cournot duopoly model in a continuous-time framework and analyzes its dynamic behavior when the competitors are heterogeneous in determining their output decisions. Specifically, the model is expressed in the form of differential equations with discrete delays. The stability conditions of the unique Nash equilibrium of the system are determined and the emergence of Hopf bifurcations is shown. Applying some recent mathematical techniques (stability switching curves) and performing numerical simulations, the paper confirms how different time delays affect the stability of the economy.
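Numerically exploring a model with discrete delays requires carrying a history buffer. A toy Euler scheme for a scalar delay differential equation is sketched below, using the Hutchinson delayed-logistic equation as a classic Hopf-bifurcation example; this is not the paper's duopoly model:

```python
def euler_dde(f, x0, tau, dt, steps):
    """Euler integration of x'(t) = f(x(t), x(t - tau)) with a
    constant history x(t) = x0 on [-tau, 0]."""
    lag = int(round(tau / dt))
    xs = [x0] * (lag + 1)              # history buffer
    for _ in range(steps):
        x, x_delayed = xs[-1], xs[-1 - lag]
        xs.append(x + dt * f(x, x_delayed))
    return xs

# Delayed logistic equation x' = x (1 - x(t - tau)); for tau > pi/2 the
# equilibrium x = 1 loses stability through a Hopf bifurcation.
xs = euler_dde(lambda x, xd: x * (1.0 - xd), x0=0.5, tau=2.0,
               dt=0.01, steps=2000)
```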

  2. Constraining the Mechanism of D" Anisotropy: Diversity of Observation Types Required

    NASA Astrophysics Data System (ADS)

    Creasy, N.; Pisconti, A.; Long, M. D.; Thomas, C.

    2017-12-01

    A variety of different mechanisms have been proposed as explanations for seismic anisotropy at the base of the mantle, including crystallographic preferred orientation of various minerals (bridgmanite, post-perovskite, and ferropericlase) and shape preferred orientation of elastically distinct materials such as partial melt. Investigations of the mechanism for D" anisotropy are usually ambiguous, as seismic observations rarely (if ever) uniquely constrain a mechanism. Observations of shear wave splitting and polarities of SdS and PdP reflections off the D" discontinuity are among our best tools for probing D" anisotropy; however, typical data sets cannot constrain a unique scenario suggested by the mineral physics literature. In this work, we determine what types of body wave observations are required to uniquely constrain a mechanism for D" anisotropy. We test multiple possible models based on both single-crystal and poly-phase elastic tensors provided by mineral physics studies. We predict shear wave splitting parameters for SKS, SKKS, and ScS phases and reflection polarities off the D" interface for a range of possible propagation directions. We run a series of tests that create synthetic data sets by random selection over multiple iterations, controlling the total number of measurements, the azimuthal distribution, and the type of phases. We treat each randomly drawn synthetic dataset with the same methodology as in Ford et al. (2015) to determine the possible mechanism(s), carrying out a grid search over all possible elastic tensors and orientations to determine which are consistent with the synthetic data. We find it is difficult to uniquely constrain the starting model with a realistic number of seismic anisotropy measurements using only one measurement technique or phase type. However, having a mix of SKS, SKKS, and ScS measurements, or a mix of shear wave splitting and reflection polarity measurements, dramatically increases the probability of uniquely constraining the starting model. We also explore what types of datasets are needed to uniquely constrain the orientation(s) of anisotropic symmetry if the mechanism is assumed.
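The grid-search-plus-synthetic-data logic can be caricatured in one dimension: keep every candidate model whose predictions match all synthetic measurements within tolerance, then check whether the survivors are unique. Everything here (the one-parameter "model", the noise level, the tolerance) is illustrative, not the study's 21-component elastic tensors:

```python
import numpy as np

rng = np.random.default_rng(0)
true_angle = 40.0     # hypothetical fast-axis orientation, degrees

def predict(angle, azimuth):
    """Toy forward model: predicted observable at a given azimuth."""
    return angle - azimuth

# Synthetic dataset: a handful of noisy measurements at random azimuths.
azimuths = rng.uniform(0.0, 180.0, size=8)
obs = predict(true_angle, azimuths) + rng.normal(0.0, 1.0, size=8)

# Grid search: keep candidates consistent with every measurement.
candidates = np.arange(0.0, 180.0, 1.0)
consistent = [a for a in candidates
              if np.all(np.abs(predict(a, azimuths) - obs) < 6.0)]
# With few or poorly distributed measurements, `consistent` is a band
# of models rather than a unique answer -- the ambiguity the paper
# quantifies for real measurement mixes.
```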

  3. A qualitative study of vortex trapping capability for lift enhancement on unconventional wing

    NASA Astrophysics Data System (ADS)

    Salleh, M. B.; Kamaruddin, N. M.; Mohamed-Kassim, Z.

    2018-05-01

    Lift enhancement by using a passive vortex trapping technique offers great advantages in small aircraft design, as it can improve aerodynamic performance and reduce the weight of the wing. To achieve this aim, a qualitative study of the flow structures across wing models with cavities has been performed using a smoke wire visualisation technique. An experiment was conducted at a low Reynolds number of 26,000 with angles of attack (α) = 0°, 5°, 10° and 15° to investigate the vortex trapping capability of semi-circular leading edge (SCLE) and elliptical leading edge (ELE) flat-plate wing models with cavities. Results from the qualitative study indicated unique characteristics in the flow structures between the tested wing models. The SCLE wing models were able to trap stable rotating vortices for α ≤ 10°, whereas the ability of the ELE wing models to suppress flow separation allowed stable clockwise vortices to be trapped inside the cavities even at α > 10°. The trapped vortices were found to have the potential to increase lift on the unconventional wing models.

  4. Updates on measurements and modeling techniques for expendable countermeasures

    NASA Astrophysics Data System (ADS)

    Gignilliat, Robert; Tepfer, Kathleen; Wilson, Rebekah F.; Taczak, Thomas M.

    2016-10-01

    The potential threat of recently-advertised anti-ship missiles has instigated research at the United States (US) Naval Research Laboratory (NRL) into the improvement of measurement techniques for visual band countermeasures. The goal of measurements is the collection of radiometric imagery for use in the building and validation of digital models of expendable countermeasures. This paper will present an overview of measurement requirements unique to the visual band and differences between visual band and infrared (IR) band measurements. A review of the metrics used to characterize signatures in the visible band will be presented and contrasted to those commonly used in IR band measurements. For example, the visual band measurements require higher fidelity characterization of the background, including improved high-transmittance measurements and better characterization of solar conditions to correlate results more closely with changes in the environment. The range of relevant engagement angles has also been expanded to include higher altitude measurements of targets and countermeasures. In addition to the discussion of measurement techniques, a top-level qualitative summary of modeling approaches will be presented. No quantitative results or data will be presented.

  5. Unique and shared techniques in cognitive-behavioural and short-term psychodynamic psychotherapy: a content analysis of randomised trials on depression

    PubMed Central

    Barth, Jürgen; Michlig, Nadja; Munder, Thomas

    2014-01-01

    Randomised controlled trials (RCTs) of psychotherapeutic interventions assume that specific techniques are used in treatments, which are responsible for changes in the client's symptoms. This assumption also holds true for meta-analyses, where evidence for specific interventions and techniques is compiled. However, it has also been argued that different treatments share important techniques and that an emerging consensus about useful treatment strategies is leading to a greater integration of treatments. This makes assumptions about the effectiveness of specific intervention ingredients questionable if the shared (common) techniques are used more often in interventions than the unique techniques. This study investigated the unique and shared techniques in RCTs of cognitive-behavioural therapy (CBT) and short-term psychodynamic psychotherapy (STPP). Psychotherapeutic techniques were coded from 42 masked treatment descriptions of RCTs in the field of depression (1979–2010). CBT techniques were often used in studies identified as either CBT or STPP. However, STPP techniques were only used in STPP-identified studies. Empirical clustering of treatment descriptions did not confirm the original distinction of CBT versus STPP, but instead showed substantial heterogeneity within both approaches. Extraction of psychotherapeutic techniques from treatment descriptions is feasible and could be used as a content-based approach to classify treatments in systematic reviews and meta-analyses. PMID:25750827

  6. Antigravity ESD - double-balloon-assisted underwater with traction hybrid technique.

    PubMed

    Sharma, Sam K; Hiratsuka, Takahiro; Hara, Hisashi; Milsom, Jeffrey W

    2018-06-01

    Complex colorectal polyps, or those positioned in difficult anatomic locations, are an endoscopic therapeutic challenge. Underwater endoscopic submucosal dissection (UESD) is a potential technical solution to facilitate efficient polyp removal. In addition, endoscopic tissue retraction has been confined to limited methods of varying efficacy and complexity. The aim of this study was to evaluate the efficiency of a unique UESD technique for removing complex polyps using double-balloon-assisted retraction (R). Using fresh ex-vivo porcine rectum, 4-cm polyps were created using electrosurgery and positioned at "6 o'clock" within an established ESD model. Six resections were performed in each group. Underwater techniques were facilitated using a novel double-balloon platform (Dilumen, Lumendi, Westport, Connecticut, United States). UESD-R had a significantly shorter total procedural time than cap-assisted ESD and UESD alone (24 vs. 58 vs. 56 mins). UESD-R produced a dissection time of 5 minutes on average, attributed to the retraction provided. There was also a subjectively significant reduction in electrosurgical smoke with the underwater techniques, contributing to improved visualization. Here we report the first ex-vivo experience of a unique double-balloon endoscopic platform optimized for UESD with tissue traction capability. UESD-R removed complex lesions in significantly shorter time than conventional means. The combined benefits of UESD and retraction appeared to be additive when tackling complex polyps and should be studied further.

  7. A 13-week research-based biochemistry laboratory curriculum.

    PubMed

    Lefurgy, Scott T; Mundorff, Emily C

    2017-09-01

    Here, we present a 13-week research-based biochemistry laboratory curriculum designed to provide the students with the experience of engaging in original research while introducing foundational biochemistry laboratory techniques. The laboratory experience has been developed around the directed evolution of an enzyme chosen by the instructor, with mutations designed by the students. Ideal enzymes for this curriculum are able to be structurally modeled, solubly expressed, and monitored for activity by UV/Vis spectroscopy, and an example curriculum for haloalkane dehalogenase is given. Unique to this curriculum is a successful implementation of saturation mutagenesis and high-throughput screening of enzyme function, along with bioinformatics analysis, homology modeling, structural analysis, protein expression and purification, polyacrylamide gel electrophoresis, UV/Vis spectroscopy, and enzyme kinetics. Each of these techniques is carried out using a novel student-designed mutant library or enzyme variant unique to the lab team and, importantly, not described previously in the literature. Use of a well-established set of protocols promotes student data quality. Publication may result from the original student-generated hypotheses and data, either from the class as a whole or individual students that continue their independent projects upon course completion. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(5):437-448, 2017. © 2017 The International Union of Biochemistry and Molecular Biology.

  8. Asphalt pavement aging and temperature dependent properties using functionally graded viscoelastic model

    NASA Astrophysics Data System (ADS)

    Dave, Eshan V.

    Asphalt concrete pavements are inherently graded viscoelastic structures, with oxidative aging of the asphalt binder and temperature cycling due to climatic conditions being the major causes of non-homogeneity. Current pavement analysis and simulation procedures rely on a layered approach to account for these non-homogeneities. The conventional finite-element modeling (FEM) technique discretizes the problem domain into smaller elements, each with a unique constitutive property. However, the assignment of a single material property description to each element makes the conventional FEM approach an unattractive choice for simulating problems with material non-homogeneities. Specialized elements known as "graded elements" allow for non-homogeneous material property definitions within an element. This dissertation describes the development of a graded viscoelastic finite element analysis method and its application to the analysis of asphalt concrete pavements. Results show that the present research improves the efficiency and accuracy of simulations for asphalt pavement systems. Practical implications of this work include the new technique's capability for accurate analysis and design of asphalt pavements and overlay systems and for the determination of pavement performance with varying climatic conditions and in-service age. Other application areas include simulation of functionally graded fiber-reinforced concrete, geotechnical materials, metals and metal composites at high temperatures, polymers, and several other naturally existing and engineered materials.
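The contrast between conventional and graded elements is that a graded element interpolates the material property inside the element, typically with the same shape functions used for the geometry, instead of holding one value per element. A minimal 1D sketch with illustrative modulus values:

```python
def shape_1d(xi):
    """Linear 1D shape functions on the parent element xi in [-1, 1]."""
    return [(1.0 - xi) / 2.0, (1.0 + xi) / 2.0]

def graded_modulus(E_nodes, xi):
    """Modulus inside a graded element: shape-function interpolation of
    nodal property values, so E varies smoothly within the element."""
    return sum(Ni * Ei for Ni, Ei in zip(shape_1d(xi), E_nodes))

# Aging/temperature gradients enter through the nodal values; e.g. a
# stiffer (more aged) value at one end of the element.
E_mid = graded_modulus([2000.0, 3000.0], xi=0.0)   # element midpoint
```

A conventional element would instead use one of the nodal values (or their average) everywhere, forcing a fine mesh to resolve the same gradient.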

  9. Nonlinear regression method for estimating neutral wind and temperature from Fabry-Perot interferometer data.

    PubMed

    Harding, Brian J; Gehrels, Thomas W; Makela, Jonathan J

    2014-02-01

    The Earth's thermosphere plays a critical role in driving electrodynamic processes in the ionosphere and in transferring solar energy to the atmosphere, yet measurements of thermospheric state parameters, such as wind and temperature, are sparse. One of the most popular techniques for measuring these parameters is to use a Fabry-Perot interferometer to monitor the Doppler shift and broadening of naturally occurring airglow emissions in the thermosphere. In this work, we present a technique for estimating upper-atmospheric winds and temperatures from images of Fabry-Perot fringes captured by a CCD detector. We estimate instrument parameters from fringe patterns of a frequency-stabilized laser, and we use these parameters to estimate winds and temperatures from airglow fringe patterns. A unique feature of this technique is the model used for the laser and airglow fringe patterns, which fits all fringes simultaneously and attempts to model the effects of optical defects. This technique yields accurate estimates for winds, temperatures, and the associated uncertainties in these parameters, as we show with a Monte Carlo simulation.
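Once fringe positions and widths have been estimated, winds follow from the Doppler shift and temperatures from the Doppler broadening. A hedged sketch of that final conversion for the 630.0 nm atomic-oxygen airglow line (idealized closed-form conversions; the paper's estimator instead fits the full 2D fringe pattern, including instrument defects):

```python
C = 2.998e8            # speed of light, m/s
KB = 1.381e-23         # Boltzmann constant, J/K
M_O = 16 * 1.661e-27   # mass of atomic oxygen (630.0 nm emitter), kg
LAM0 = 630.0e-9        # rest wavelength of the airglow line, m

def wind_from_shift(dlam):
    """Line-of-sight wind from a fitted Doppler wavelength shift."""
    return C * dlam / LAM0

def temperature_from_width(sigma_lam):
    """Temperature from the fitted Gaussian Doppler width, using
    sigma_lam = (LAM0 / C) * sqrt(kB T / m)."""
    return M_O * (C * sigma_lam / LAM0) ** 2 / KB

w = wind_from_shift(2.1e-13)          # ~100 m/s scale shift
t = temperature_from_width(1.8e-12)   # thermospheric-scale width
```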

  10. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    PubMed

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. 
The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and the interpersonal style in which that content is delivered.

  11. Efficient Privacy-Enhancing Techniques for Medical Databases

    NASA Astrophysics Data System (ADS)

    Schartner, Peter; Schaffer, Martin

    In this paper, we introduce an alternative to linkable unique health identifiers: locally generated, system-wide unique digital pseudonyms. The presented techniques are based on a novel method called collision-free number generation, which is discussed in the introductory part of the article. Attention is then paid to two specific variants of collision-free number generation: one based on the RSA problem and the other on the Elliptic Curve Discrete Logarithm Problem. Finally, two applications are sketched: centralized medical records and anonymous medical databases.

  12. Multidomain proteins under force

    NASA Astrophysics Data System (ADS)

    Valle-Orero, Jessica; Andrés Rivas-Pardo, Jaime; Popa, Ionel

    2017-04-01

    Advancements in single-molecule force spectroscopy techniques such as atomic force microscopy and magnetic tweezers allow investigation of how domain folding under force can play a physiological role. Combining these techniques with protein engineering and HaloTag covalent attachment, we investigate similarities and differences between four model proteins: I10 and I91—two immunoglobulin-like domains from the muscle protein titin, and two α + β fold proteins—ubiquitin and protein L. These proteins show a different mechanical response and have unique extensions under force. Remarkably, when normalized to their contour length, the size of the unfolding and refolding steps as a function of force reduces to a single master curve. This curve can be described using standard models of polymer elasticity, explaining the entropic nature of the measured steps. We further validate our measurements with a simple energy landscape model, which combines protein folding with polymer physics and accounts for the complex nature of tandem domains under force. This model can become a useful tool to help in deciphering the complexity of multidomain proteins operating under force.

  13. Minimally invasive screening for colitis using attenuated total internal reflectance Fourier transform infrared spectroscopy

    PubMed Central

    Titus, Jitto; Viennois, Emilie; Merlin, Didier; Perera, A. G. Unil

    2016-01-01

    This article describes a rapid, simple and cost-effective technique that could lead to a screening method for colitis without the need for biopsies or in vivo measurements. This screening technique includes the testing of serum using Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) spectroscopy for the colitis-induced increased presence of mannose. Chronic (Interleukin 10 knockout) and acute (Dextran Sodium Sulphate-induced) models for colitis are tested using the ATR-FTIR technique. Arthritis (Collagen Antibody Induced Arthritis) and metabolic syndrome (Toll like receptor 5 knockout) models are also tested as controls. The marker identified as mannose uniquely screens and distinguishes the colitic from the non-colitic samples and the controls. The reference or the baseline spectrum could be the pooled and averaged spectra of non-colitic samples or the subject's previous sample spectrum. This shows the potential of having individualized route maps of disease status, leading to personalized diagnosis and drug management. PMID:27094092

  14. Bayesian Model Averaging of Artificial Intelligence Models for Hydraulic Conductivity Estimation

    NASA Astrophysics Data System (ADS)

    Nadiri, A.; Chitsazan, N.; Tsai, F. T.; Asghari Moghaddam, A.

    2012-12-01

    This research presents a Bayesian artificial intelligence model averaging (BAIMA) method that incorporates multiple artificial intelligence (AI) models to estimate hydraulic conductivity and evaluate estimation uncertainties. Uncertainty in the AI model outputs stems from error in the model input as well as from non-uniqueness in selecting different AI methods. Using a single AI model tends to bias the estimation and underestimate uncertainty. BAIMA employs the Bayesian model averaging (BMA) technique to address the issue of relying on a single AI model for estimation. BAIMA estimates hydraulic conductivity by averaging the outputs of AI models according to their model weights. In this study, the model weights were determined using the Bayesian information criterion (BIC), which follows the parsimony principle. BAIMA calculates the within-model variances to account for uncertainty propagation from input data to AI model output. Between-model variances are evaluated to account for uncertainty due to model non-uniqueness. We employed Takagi-Sugeno fuzzy logic (TS-FL), an artificial neural network (ANN) and neurofuzzy (NF) techniques to estimate hydraulic conductivity for the Tasuj plain aquifer, Iran. BAIMA combined the three AI models and produced a better fit than the individual models. While NF was expected to be the best AI model owing to its utilization of both TS-FL and ANN models, the NF model was nearly discarded by the parsimony principle. The TS-FL model and the ANN model showed equal importance although their hydraulic conductivity estimates were quite different. This resulted in significant between-model variances that are normally ignored when a single AI model is used.
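The BIC-based weighting and within/between-model variance decomposition described above can be sketched in a few lines. This is a generic BIC-weighted BMA illustration consistent with the abstract, not the exact BAIMA implementation; the BIC values, model means, and variances in the usage below are hypothetical:

```python
import math

def bic_weights(bics):
    # Model weights from BIC differences: w_k is proportional to
    # exp(-delta_BIC_k / 2), normalized to sum to one. Subtracting the
    # minimum BIC first avoids underflow in the exponentials.
    dmin = min(bics)
    raw = [math.exp(-(b - dmin) / 2.0) for b in bics]
    s = sum(raw)
    return [r / s for r in raw]

def bma_estimate(means, variances, weights):
    # BMA point estimate plus the two variance terms the abstract names:
    # within-model variance (weighted average of each model's variance)
    # and between-model variance (weighted spread of the model means).
    mean = sum(w * m for w, m in zip(weights, means))
    within = sum(w * v for w, v in zip(weights, variances))
    between = sum(w * (m - mean) ** 2 for w, m in zip(weights, means))
    return mean, within, between
```

With two models that disagree strongly but carry equal weight, the between-model term dominates, which is exactly the uncertainty a single-model analysis would miss.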

  15. Unique effects of setting goals on behavior change: Systematic review and meta-analysis.

    PubMed

    Epton, Tracy; Currie, Sinead; Armitage, Christopher J

    2017-12-01

    Goal setting is a common feature of behavior change interventions, but it is unclear when goal setting is optimally effective. The aims of this systematic review and meta-analysis were to evaluate: (a) the unique effects of goal setting on behavior change, and (b) under what circumstances and for whom goal setting works best. Four databases were searched for articles that assessed the unique effects of goal setting on behavior change using randomized controlled trials. One hundred and forty-one papers were identified, from which 384 effect sizes (N = 16,523) were extracted and analyzed. A moderator analysis of sample characteristics, intervention characteristics, inclusion of other behavior change techniques, study design and delivery, quality of study, outcome measures, and behavior targeted was conducted. A random effects model indicated a small positive unique effect of goal setting across a range of behaviors, d = .34 (CI [.28, .41]). Moderator analyses indicated that goal setting was particularly effective if the goal was: (a) difficult, (b) set publicly, and (c) a group goal. There was weaker evidence that goal setting was more effective when paired with external monitoring of the behavior/outcome by others without feedback and when delivered face-to-face. Goal setting is an effective behavior change technique that has the potential to be considered a fundamental component of successful interventions. The present review adds novel insights into the means by which goal setting might be augmented to maximize behavior change and sets the agenda for future programs of research. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
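A random effects pooling of the kind referred to above can be illustrated with the standard DerSimonian-Laird estimator. The review does not state which estimator it used, and the effect sizes and variances below are invented for the sketch:

```python
def random_effects(d, v):
    # DerSimonian-Laird random-effects pooling of effect sizes d with
    # within-study variances v. Returns the pooled effect, its standard
    # error, and the between-study variance tau^2.
    w = [1.0 / vi for vi in v]                       # fixed-effect weights
    sw = sum(w)
    d_fixed = sum(wi * di for wi, di in zip(w, d)) / sw
    q = sum(wi * (di - d_fixed) ** 2 for wi, di in zip(w, d))   # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(d) - 1)) / c)          # truncated at zero
    w_re = [1.0 / (vi + tau2) for vi in v]           # random-effects weights
    d_re = sum(wi * di for wi, di in zip(w_re, d)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return d_re, se, tau2
```

For perfectly homogeneous studies Q falls below its degrees of freedom, tau^2 truncates to zero, and the pooled estimate reduces to the fixed-effect answer.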

  16. Modelling and extraction technique for micro-doppler signature of aircraft rotor blades

    NASA Astrophysics Data System (ADS)

    Praveen, N.; Valarmathi, J.

    2017-11-01

    The process of detecting and distinguishing between different aircraft has been a major point of interest in Defence applications. The micro-Doppler effect is one such phenomenon, unique to aircraft with different rotor dynamics and designs. In this paper, we focus on deriving a mathematical model for the micro-Doppler signature that aircraft rotor blades, assumed to be rotating in a plane perpendicular to the flying direction, induce on the incident radar signal. We also use the Wigner-Ville Distribution (WVD) to extract this signature from the radar return. The mathematical model is compared with simulation results obtained from MATLAB to validate the results and show the accuracy of the developed model.
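As a minimal illustration of such a model: a scatterer on a blade tip at radius r rotating at angular rate omega imposes a sinusoidal Doppler modulation on a radar of wavelength lambda. This sketch simplifies the geometry so the radar line of sight lies in the rotation plane, which is not the paper's exact configuration, and it omits the WVD extraction step; the numbers in the usage are arbitrary:

```python
import math

def micro_doppler(t, r, omega, wavelength):
    # Instantaneous micro-Doppler shift of a blade-tip scatterer at radius r
    # rotating at angular rate omega, seen by a radar of the given wavelength.
    # Radial velocity v(t) = r * omega * cos(omega * t), and f_mD = 2 v / lambda,
    # so the signature oscillates at the rotation rate with peak 2*r*omega/lambda.
    return 2.0 * r * omega * math.cos(omega * t) / wavelength
```

The peak excursion 2*r*omega/lambda is what makes the signature discriminative: different rotor radii and rotation rates trace visibly different sinusoids in the time-frequency plane.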

  17. Mitigating Issues Related to the Modeling of Insurgent Recruitment

    NASA Astrophysics Data System (ADS)

    Briscoe, Erica; Trewhitt, Ethan; Weiss, Lora; Whitaker, Elizabeth

    Modeling the specific motivations and influences related to an individual's decision to become involved in insurgent warfare presents its own collection of unique challenges. The difficulty of the problem often necessitates simplifications that, while making the task more manageable, may inadvertently 'smooth away' critical aspects of the problem. Augmenting the challenge is that research into the motivations of terrorism has found there is not a definitive set of variables that serve as reliable indicators of an individual's involvement. This paper addresses techniques aimed toward mitigating issues that manifest in the modeling of insurgent recruitment so that these complications do not lessen the viability of models that are used in the prediction and evaluation of terrorist activity.

  18. Analysis of blind identification methods for estimation of kinetic parameters in dynamic medical imaging

    NASA Astrophysics Data System (ADS)

    Riabkov, Dmitri

    Compartment modeling of dynamic medical image data implies that the concentration of the tracer over time in a particular region of the organ of interest is well-modeled as a convolution of the tissue response with the tracer concentration in the blood stream. The tissue response is different for different tissues while the blood input is assumed to be the same for different tissues. The kinetic parameters characterizing the tissue responses can be estimated by blind identification methods. These algorithms use the simultaneous measurements of concentration in separate regions of the organ; if the regions have different responses, the measurement of the blood input function may not be required. In this work it is shown that the blind identification problem has a unique solution for two-compartment model tissue response. For two-compartment model tissue responses in dynamic cardiac MRI imaging conditions with gadolinium-DTPA contrast agent, three blind identification algorithms are analyzed here to assess their utility: Eigenvector-based Algorithm for Multichannel Blind Deconvolution (EVAM), Cross Relations (CR), and Iterative Quadratic Maximum Likelihood (IQML). Comparisons of accuracy with conventional (not blind) identification techniques where the blood input is known are made as well. The statistical accuracies of estimation for the three methods are evaluated and compared for multiple parameter sets. The results show that the IQML method gives more accurate estimates than the other two blind identification methods. A proof is presented here that three-compartment model blind identification is not unique in the case of only two regions. It is shown that it is likely unique for the case of more than two regions, but this has not been proved analytically. For the three-compartment model the tissue responses in dynamic FDG PET imaging conditions are analyzed with the blind identification algorithms EVAM and Separable variables Least Squares (SLS). 
A method of identification that assumes that FDG blood input in the brain can be modeled as a function of time and several parameters (IFM) is analyzed also. Nonuniform sampling SLS (NSLS) is developed due to the rapid change of the FDG concentration in the blood during the early postinjection stage. Comparisons of accuracy of EVAM, SLS, NSLS and IFM identification techniques are made.
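The compartment-model premise above, that each region's measured curve is the shared blood input convolved with that region's tissue response, can be sketched directly. The mono-exponential kernel h(t) = K1 * exp(-k2 * t) is the standard two-compartment (one-tissue) response; the rate constants and sampling interval here are illustrative only:

```python
import math

def tissue_curve(blood, k1, k2, dt):
    # Discrete convolution of a sampled blood input with the two-compartment
    # tissue response h(t) = k1 * exp(-k2 * t). Blind identification exploits
    # that several regions share `blood` but have different (k1, k2).
    h = [k1 * math.exp(-k2 * i * dt) for i in range(len(blood))]
    out = []
    for n in range(len(blood)):
        out.append(dt * sum(blood[m] * h[n - m] for m in range(n + 1)))
    return out
```

Feeding a unit impulse through the convolution recovers the kernel itself, which is a convenient sanity check before attempting any blind estimation on real curves.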

  19. Time-Dependent Reversible-Irreversible Deformation Threshold Determined Explicitly by Experimental Technique

    NASA Technical Reports Server (NTRS)

    Castelli, Michael G.; Arnold, Steven M.

    2000-01-01

    Structural materials for the design of advanced aeropropulsion components are usually subject to loading under elevated temperatures, where a material's viscosity (resistance to flow) is greatly reduced in comparison to its viscosity under low-temperature conditions. As a result, the propensity for the material to exhibit time-dependent deformation is significantly enhanced, even when loading is limited to a quasi-linear stress-strain regime in an effort to avoid permanent (irreversible) nonlinear deformation. An understanding and assessment of such time-dependent effects in the context of combined reversible and irreversible deformation is critical to the development of constitutive models that can accurately predict the general hereditary behavior of material deformation. To this end, researchers at the NASA Glenn Research Center at Lewis Field developed a unique experimental technique that identifies the existence of and explicitly determines a threshold stress k, below which the time-dependent material deformation is wholly reversible, and above which irreversible deformation is incurred. This technique is unique in the sense that it allows, for the first time, an objective, explicit, experimental measurement of k. The underlying concept for the experiment is based on the assumption that the material's time-dependent reversible response is invariable, even in the presence of irreversible deformation.

  20. Advancing solar energy forecasting through the underlying physics

    NASA Astrophysics Data System (ADS)

    Yang, H.; Ghonima, M. S.; Zhong, X.; Ozge, B.; Kurtz, B.; Wu, E.; Mejia, F. A.; Zamora, M.; Wang, G.; Clemesha, R.; Norris, J. R.; Heus, T.; Kleissl, J. P.

    2017-12-01

    As solar power comprises an increasingly large portion of the energy generation mix, the ability to accurately forecast solar photovoltaic generation becomes increasingly important. Because cloud cover makes solar power variable, advance knowledge of both the magnitude and timing of expected solar power production facilitates the integration of solar power onto the electric grid: it reduces electricity generation from traditional ancillary generators such as gas and oil power plants, decreases the ramping of all generators, reduces start and shutdown costs, and minimizes solar power curtailment, thereby providing annual economic value. The time scales involved in both the energy markets and solar variability range from intra-hour to several days ahead. This wide range of time horizons has led to the development of a multitude of techniques, each offering unique advantages in specific applications. For example, sky imagery provides site-specific forecasts on the minute scale. Statistical techniques, including machine learning algorithms, are commonly used in the intra-day forecast horizon for regional applications, while numerical weather prediction models can provide mesoscale forecasts on both the intra-day and days-ahead time scales. This talk will provide an overview of the challenges unique to each technique and highlight advances in their ongoing development, which come alongside advances in the underlying fundamental physics.

  1. Synchrotron-based X-ray microscopic studies for bioeffects of nanomaterials.

    PubMed

    Zhu, Ying; Cai, Xiaoqing; Li, Jiang; Zhong, Zengtao; Huang, Qing; Fan, Chunhai

    2014-04-01

    There has been increasing interest in studying the biological effects of nanomaterials, a task that nevertheless faces many challenges due to the nanoscale dimensions and unique chemical properties of nanomaterials. Synchrotron-based X-ray microscopy, an advanced imaging technology with high spatial resolution and excellent elemental specificity, provides a new platform for studying interactions between nanomaterials and living systems. In this article, we review the recent progress of X-ray microscopic studies on the bioeffects of nanomaterials in several living systems, including cells, model organisms, animals and plants. We aim to provide an overview of the state of the art and the advantages of using synchrotron-based X-ray microscopy for characterizing the in vitro and in vivo behaviors and biodistribution of nanomaterials. We also expect that the use of a combination of new synchrotron techniques should offer unprecedented opportunities for better understanding complex interactions at the nano-biological interface and accounting for the unique bioeffects of nanomaterials. Synchrotron-based X-ray microscopy is a non-destructive imaging technique that enables high-resolution spatial mapping of metals with elemental-level detection. This review summarizes the current use and perspectives of this novel technique in studying the biology and tissue interactions of nanomaterials. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Hierarchical modeling for reliability analysis using Markov models. B.S./M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Fagundo, Arturo

    1994-01-01

    Markov models represent an extremely attractive tool for the reliability analysis of many systems. However, Markov model state space grows exponentially with the number of components in a given system. Thus, for very large systems Markov modeling techniques alone become intractable in both memory and CPU time. Often a particular subsystem can be found within some larger system where the dependence of the larger system on the subsystem is of a particularly simple form. This simple dependence can be used to decompose such a system into one or more subsystems. A hierarchical technique is presented which can be used to evaluate these subsystems in such a way that their reliabilities can be combined to obtain the reliability for the full system. This hierarchical approach is unique in that it allows the subsystem model to pass multiple aggregate state information to the higher level model, allowing more general systems to be evaluated. Guidelines are developed to assist in the system decomposition. An appropriate method for determining subsystem reliability is also developed. This method gives rise to some interesting numerical issues. Numerical error due to roundoff and integration are discussed at length. Once a decomposition is chosen, the remaining analysis is straightforward but tedious. However, an approach is developed for simplifying the recombination of subsystem reliabilities. Finally, a real world system is used to illustrate the use of this technique in a more practical context.
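As a toy version of the hierarchical idea, solve a small Markov model for a subsystem, then recombine subsystem reliabilities at the higher level. Here the subsystem is two identical unrepairable components in parallel, whose three-state chain (both up, one up, failed) has a closed-form reliability; a real application would replace the closed form with a numerical Markov solution and pass richer aggregate state information upward, as the thesis describes. The failure rates in the usage are made up:

```python
import math

def parallel_pair_reliability(t, lam):
    # Reliability of two identical components in parallel (no repair), from
    # the 3-state Markov chain {2 up, 1 up, failed} with per-component
    # failure rate lam: R(t) = 2*exp(-lam*t) - exp(-2*lam*t).
    return 2.0 * math.exp(-lam * t) - math.exp(-2.0 * lam * t)

def series_of_subsystems(rels):
    # Hierarchical recombination for independent subsystems in series:
    # the system survives only if every subsystem survives.
    out = 1.0
    for r in rels:
        out *= r
    return out
```

The parallel pair always outlasts a single component with the same rate, which is the redundancy benefit the subsystem-level Markov model captures before its result is folded into the system-level product.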

  3. Analytical and Numerical Solutions of Generalized Fokker-Planck Equations - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prinja, Anil K.

    The overall goal of this project was to develop advanced theoretical and numerical techniques to quantitatively describe the spreading of a collimated beam of charged particles in space, in angle, and in energy, as a result of small deflection, small energy transfer Coulomb collisions with the target nuclei and electrons. Such beams arise in several applications of great interest in nuclear engineering, and include electron and ion radiotherapy, ion beam modification of materials, accelerator transmutation of waste, and accelerator production of tritium, to name some important candidates. These applications present unique and difficult modeling challenges, but from the outset are amenable to the language of "transport theory", which is very familiar to nuclear engineers and considerably less so to physicists and material scientists. Thus, our approach has been to adopt a fundamental description based on transport equations, but the forward peakedness associated with charged particle interactions precludes a direct application of solution methods developed for neutral particle transport. Unique problem formulations and solution techniques are necessary to describe the transport and interaction of charged particles. In particular, we have developed the Generalized Fokker-Planck (GFP) approach to describe the angular and radial spreading of a collimated beam and a renormalized transport model to describe the energy-loss straggling of an initially monoenergetic distribution. Both analytic and numerical solutions have been investigated and in particular novel finite element numerical methods have been developed. In the first phase of the project, asymptotic methods were used to develop closed form solutions to the GFP equation for different orders of expansion, as described in a previous progress report.
In this final report we present a detailed description of (i) a novel energy straggling model based on a Fokker-Planck approximation but adapted for a multigroup transport setting, and (ii) two unique families of discontinuous finite element schemes, one linear and the other nonlinear.

  4. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of the polyethylene biodegradation posed in modeling are indeed appropriate.
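The linear system described above can be sketched with a schematic explicit-Euler step. The paper's actual formulation is a continuous molecular-weight-parameterized differential equation; the discrete weight classes, the two-rate structure (direct consumption plus beta-oxidation shifting mass to the next-lighter class), and the rate values in the usage are simplifying assumptions for illustration:

```python
def step(w, consume, oxidize, dt):
    # One explicit-Euler step of a schematic linear biodegradation system.
    # Class i loses mass to direct consumption (rate consume[i]) and to
    # beta-oxidation (rate oxidize[i]); the oxidized mass is transferred to
    # the next-lighter class i-1 (class 0's oxidized mass leaves the system
    # as small molecules).
    n = len(w)
    dw = [0.0] * n
    for i in range(n):
        dw[i] -= (consume[i] + oxidize[i]) * w[i]
        if i > 0:
            dw[i - 1] += oxidize[i] * w[i]
    return [wi + dt * dwi for wi, dwi in zip(w, dw)]
```

With consumption switched off and no oxidation out of the lightest class, total mass is conserved while the distribution shifts toward lighter classes, mimicking the GPC-pattern drift the paper analyzes.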

  5. Finite Dimensional Approximations for Continuum Multiscale Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berlyand, Leonid

    2017-01-24

    The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculating the effective properties of highly heterogeneous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and the calculus of variations, the subject of the PI's research. The focus of the completed research was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse-grain computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatment of such complex materials by developing a synergistic approach that combines innovative multiscale modeling/analysis techniques with high performance computing.

  6. An Information-Based Machine Learning Approach to Elasticity Imaging

    PubMed Central

    Hoerig, Cameron; Ghaboussi, Jamshid; Insana, Michael. F.

    2016-01-01

    An information-based technique is described for applications in mechanical-property imaging of soft biological media under quasi-static loads. We adapted the Autoprogressive method that was originally developed for civil engineering applications for this purpose. The Autoprogressive method is a computational technique that combines knowledge of object shape and a sparse distribution of force and displacement measurements with finite-element analyses and artificial neural networks to estimate a complete set of stress and strain vectors. Elasticity imaging parameters are then computed from estimated stresses and strains. We introduce the technique using ultrasonic pulse-echo measurements in simple gelatin imaging phantoms having linear-elastic properties so that conventional finite-element modeling can be used to validate results. The Autoprogressive algorithm does not require any assumptions about the material properties and can, in principle, be used to image media with arbitrary properties. We show that by selecting a few well-chosen force-displacement measurements that are appropriately applied during training and establish convergence, we can estimate all nontrivial stress and strain vectors throughout an object and accurately estimate an elastic modulus at high spatial resolution. This new method of modeling the mechanical properties of tissue-like materials introduces a unique method of solving the inverse problem and is the first technique for imaging stress without assuming the underlying constitutive model. PMID:27858175

  7. A unique control system simulator for the evaluation of pulsed plasma thrusters

    NASA Technical Reports Server (NTRS)

    Dahlgren, J. B.

    1973-01-01

    Because of the low thrust characteristics of solid-propellant pulsed plasma thrusters and their requirement to operate in a vacuum environment, unique and sensitive test techniques are required. A technique evolved for testing and evaluating pulsed plasma thrusters in an open- or closed-loop system mode employs a unique air-bearing platform as a single-axis simulator on which the thruster is mounted. The simulator described was developed to evaluate pulsed plasma thrusters in the low micropound range; however, it can be extended to cover the operational range of currently developed millipound thrusters.

  8. Apparatus, Method, and Computer Program for a Resolution-Enhanced Pseudo-Noise Code Technique

    NASA Technical Reports Server (NTRS)

    Li, Steven X. (Inventor)

    2015-01-01

    An apparatus, method, and computer program for a resolution-enhanced pseudo-noise coding technique for 3D imaging is provided. In one embodiment, a pattern generator may generate a plurality of unique patterns for a return-to-zero signal. A plurality of laser diodes may be configured such that each laser diode transmits the return-to-zero signal to an object. Each transmitted return-to-zero signal includes one unique pattern from the plurality of unique patterns to distinguish the transmitted signals from one another.
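The abstract does not specify how the unique patterns are produced; one conventional way to generate a family of distinguishable pseudo-noise patterns is a maximal-length linear-feedback shift register (LFSR), sketched here. The register size and polynomial are chosen arbitrarily for illustration, not taken from the patent:

```python
def lfsr_bits(taps, n, seed, length):
    # Fibonacci LFSR over an n-bit register. `taps` lists the nonzero powers
    # of the feedback polynomial (tap t reads register bit n - t); the output
    # bit is the register LSB each step. Returns the output bits and the
    # final register state.
    state = seed
    out = []
    for _ in range(length):
        out.append(state & 1)
        fb = 0
        for t in taps:
            fb ^= (state >> (n - t)) & 1
        state = ((state >> 1) | (fb << (n - 1))) & ((1 << n) - 1)
    return out, state
```

With a primitive degree-4 polynomial the 4-bit register cycles through all 15 nonzero states, so the period-15 m-sequence balances eight ones against seven zeros. Different seeds yield cyclic shifts of the same sequence, so in practice each diode would be assigned a distinct polynomial or a distinct member of a code family.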

  9. Comparison of closed loop model with flight test results

    NASA Technical Reports Server (NTRS)

    George, F. L.

    1981-01-01

    An analytic technique capable of predicting the landing characteristics of proposed aircraft configurations in the early stages of design was developed. In this analysis, a linear pilot-aircraft closed loop model was evaluated using experimental data generated with the NT-33 variable stability in-flight simulator. The pilot dynamics are modeled as inner and outer servo loop closures around aircraft pitch attitude, and altitude rate-of-change respectively. The landing flare maneuver is of particular interest as recent experience with military and other highly augmented vehicles shows this task to be relatively demanding, and potentially a critical design point. A unique feature of the pilot model is the incorporation of an internal model of the pilot's desired flight path for the flare maneuver.

  10. Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; McCrea, Andrew C.

    2009-01-01

    The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.

  12. Identifying viscoelastic parameters of tissue specimens using Hertz contact mechanics

    NASA Astrophysics Data System (ADS)

    Namiri, Nikan K.; Maccabi, Ashkan; Bajwa, Neha; Badran, Karam W.; St. John, Maie A.; Taylor, Zachary D.; Grundfest, Warren S.; Saddik, George N.

    2018-02-01

    The unique viscoelastic properties of tissues throughout the human body can be utilized in a variety of clinical applications. Palpation techniques, for instance, enable surgeons to distinguish malignancies in tissue composition during surgical procedures. Additionally, imaging devices have begun utilizing the viscoelastic properties of tissue to delineate tumor margins. Vibroacoustography (VA), a non-invasive, high-resolution imaging modality, has the ability to detect sub-millimeter differences in tissue composition. VA images tissue using a low-frequency acoustic radiation force, which perturbs the target and causes an acoustic response that is dependent on the target's viscoelastic properties. Given the unique properties specific to human and animal tissues, there are far-reaching clinical applications of VA. To date, however, a comprehensive model that relates viscoelasticity to VA tissue response has yet to be developed. Utilizing tissue-mimicking phantoms (TMPs) and fresh ex vivo tissues, a mechanical stress relaxation model was developed to compare the viscoelastic properties of known and unknown specimens. This approach was conducted using the Hertz theory of contact mechanics. Fresh hepatic tissue was obtained from porcine subjects (n=10), while gelatin and agar TMPs (n=12) were fabricated from organic extracts. Each specimen's elastic modulus (E), long-term shear modulus (η), and time constant (τ) were found to be unique. Additionally, each specimen's stress relaxation profiles were analyzed using Wiechert-Maxwell viscoelastic modeling, and the fits retained high precision (R^2 > 0.9) among all samples.
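    The stress-relaxation fitting described above can be illustrated with a minimal sketch (not the authors' code; the moduli, time constant, and sampling below are arbitrary illustrative values). A standard linear solid, the simplest one-arm generalized Maxwell (Wiechert) model, responds to a step strain with an exponentially decaying stress, and the time constant can be recovered from noise-free data by a log-linear least-squares fit:

    ```python
    import math

    def sls_relaxation(t, e_inf, e1, tau, strain=1.0):
        """Stress response of a standard linear solid (one Maxwell arm
        in parallel with a spring) to a step strain applied at t = 0."""
        return strain * (e_inf + e1 * math.exp(-t / tau))

    def fit_tau(times, stresses, e_inf, strain=1.0):
        """Recover the relaxation time constant by a least-squares fit
        of the log of the decaying part of the stress signal."""
        ys = [math.log((s / strain) - e_inf) for s in stresses]
        n = len(times)
        mx, my = sum(times) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(times, ys)) / \
                sum((x - mx) ** 2 for x in times)
        return -1.0 / slope  # ln(stress) decays as -t/tau

    # Synthetic "specimen" with E_inf = 2.0, E1 = 5.0, tau = 0.8 (arbitrary units)
    ts = [0.1 * k for k in range(1, 40)]
    sig = [sls_relaxation(t, 2.0, 5.0, 0.8) for t in ts]
    tau_hat = fit_tau(ts, sig, 2.0)
    print(round(tau_hat, 3))  # recovers tau = 0.8 on noise-free data
    ```

    Real relaxation data would use several Maxwell arms (a Prony series) and a nonlinear fit, but the principle, extracting (E, η, τ) triples that are unique per specimen, is the same.
    
    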

  13. Numerical model for learning concepts of streamflow simulation

    USGS Publications Warehouse

    DeLong, L.L.; ,

    1993-01-01

    Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional streamflow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.

  14. Global Mittag-Leffler stability and synchronization analysis of fractional-order quaternion-valued neural networks with linear threshold neurons.

    PubMed

    Yang, Xujun; Li, Chuandong; Song, Qiankun; Chen, Jiyang; Huang, Junjian

    2018-05-04

    This paper addresses the stability and synchronization problems of fractional-order quaternion-valued neural networks (FQVNNs) with linear threshold neurons. On account of the non-commutativity of quaternion multiplication resulting from the Hamilton rules, the FQVNN models are separated into four real-valued neural network (RVNN) models. Consequently, the dynamic analysis of FQVNNs can be realized by investigating the real-valued ones. Based on the M-matrix method, the existence and uniqueness of the equilibrium point of the FQVNNs are obtained without detailed proof. Afterwards, several sufficient criteria ensuring the global Mittag-Leffler stability of the unique equilibrium point of the FQVNNs are derived by applying the Lyapunov direct method, the theory of fractional differential equations, matrix eigenvalue theory, and some inequality techniques. Meanwhile, global Mittag-Leffler synchronization for the drive-response models of the addressed FQVNNs is investigated explicitly. Finally, simulation examples are designed to verify the feasibility and availability of the theoretical results. Copyright © 2018 Elsevier Ltd. All rights reserved.
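    The non-commutativity that motivates splitting an FQVNN into four real-valued networks can be checked directly from the Hamilton rules; representing a quaternion as a 4-tuple of reals (w, x, y, z) is exactly that four-component real decomposition. A quick sketch:

    ```python
    def qmul(p, q):
        """Hamilton product of quaternions stored as (w, x, y, z) tuples
        of real components, following i^2 = j^2 = k^2 = ijk = -1."""
        w1, x1, y1, z1 = p
        w2, x2, y2, z2 = q
        return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2)

    i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
    print(qmul(i, j))  # (0, 0, 0, 1)  -> ij = k
    print(qmul(j, i))  # (0, 0, 0, -1) -> ji = -k, so ij != ji
    ```

    Because ij != ji, quaternion-valued state equations cannot be manipulated like real or complex ones, which is why the paper analyzes the four real component networks instead.
    
    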

  15. Going beyond the unitary curve: incorporating richer cognition into agent-based water resources models

    NASA Astrophysics Data System (ADS)

    Kock, B. E.

    2008-12-01

    The increased availability and understanding of agent-based modeling technology and techniques provides a unique opportunity for water resources modelers, allowing them to go beyond traditional behavioral approaches from neoclassical economics and add rich cognition to social-hydrological models. Agent-based models provide for an individual focus and allow easier and more realistic incorporation of learning, memory, and other mechanisms for increased cognitive sophistication. We are in an age of global change impacting complex water resources systems, and social responses are increasingly recognized as fundamentally adaptive and emergent. In consideration of this, water resources models and modelers need to address social dynamics in a manner beyond the capabilities of neoclassical economic theory and practice. However, going beyond the unitary curve requires unique levels of engagement with stakeholders, both to elicit the richer knowledge necessary for structuring and parameterizing agent-based models and to ensure such models are appropriately used. With the aim of encouraging epistemological and methodological convergence in the agent-based modeling of water resources, we have developed a water resources-specific cognitive model and an associated collaborative modeling process. Our cognitive model emphasizes efficiency in architecture and operation, and capacity to adapt to different application contexts. We describe a current application of this cognitive model and modeling process in the Arkansas Basin of Colorado. In particular, we highlight the potential benefits of, and challenges to, using more sophisticated cognitive models in agent-based water resources models.

  16. Geologic CO2 Sequestration: Predicting and Confirming Performance in Oil Reservoirs and Saline Aquifers

    NASA Astrophysics Data System (ADS)

    Johnson, J. W.; Nitao, J. J.; Newmark, R. L.; Kirkendall, B. A.; Nimz, G. J.; Knauss, K. G.; Ziagos, J. P.

    2002-05-01

    Reducing anthropogenic CO2 emissions ranks high among the grand scientific challenges of this century. In the near-term, significant reductions can only be achieved through innovative sequestration strategies that prevent atmospheric release of large-scale CO2 waste streams. Among such strategies, injection into confined geologic formations represents arguably the most promising alternative; and among potential geologic storage sites, oil reservoirs and saline aquifers represent the most attractive targets. Oil reservoirs offer a unique "win-win" approach because CO2 flooding is an effective technique of enhanced oil recovery (EOR), while saline aquifers offer immense storage capacity and widespread distribution. Although CO2-flood EOR has been widely used in the Permian Basin and elsewhere since the 1980s, the oil industry has just recently become concerned with the significant fraction of injected CO2 that eludes recycling and is therefore sequestered. This "lost" CO2 now has potential economic value in the growing emissions credit market; hence, the industry's emerging interest in recasting CO2 floods as co-optimized EOR/sequestration projects. The world's first saline aquifer storage project was also catalyzed in part by economics: Norway's newly imposed atmospheric emissions tax, which spurred development of Statoil's unique North Sea Sleipner facility in 1996. Successful implementation of geologic sequestration projects hinges on development of advanced predictive models and a diverse set of remote sensing, in situ sampling, and experimental techniques. The models are needed to design and forecast long-term sequestration performance; the monitoring techniques are required to confirm and refine model predictions and to ensure compliance with environmental regulations. 
We have developed a unique reactive transport modeling capability for predicting sequestration performance in saline aquifers, and used it to simulate CO2 injection at Sleipner; we are now extending this capability to address CO2-flood EOR/sequestration in oil reservoirs. We have also developed a suite of innovative geophysical and geochemical techniques for monitoring sequestration performance in both settings. These include electromagnetic induction imaging and electrical resistance tomography for tracking migration of immiscible CO2, noble gas isotopes for assessing trace CO2 leakage through the cap rock, and integrated geochemical sampling, analytical, and experimental methods for determining sequestration partitioning among solubility and mineral trapping mechanisms. We have proposed to demonstrate feasibility of the co-optimized EOR/sequestration concept and utility of our modeling and monitoring technologies to design and evaluate its implementation by conducting a demonstration project in the Livermore Oil Field. This small, mature, shallow field, located less than a mile east of Lawrence Livermore National Laboratory, is representative of many potential EOR/sequestration sites in California. In approach, this proposed demonstration is analogous to the Weyburn EOR/CO2 monitoring project, to which it will provide an important complement by virtue of its contrasting depth (immiscible versus Weyburn's miscible CO2 flood) and geologic setting (clay-capped sand versus Weyburn's anhydrite-capped carbonate reservoir).

  17. Thermoacoustic tomography for an integro-differential wave equation modeling attenuation

    NASA Astrophysics Data System (ADS)

    Acosta, Sebastián; Palacios, Benjamín

    2018-02-01

    In this article we study the inverse problem of thermoacoustic tomography (TAT) on a medium with attenuation represented by a time-convolution (or memory) term, and whose consideration is motivated by the modeling of ultrasound waves in heterogeneous tissue via fractional derivatives with spatially dependent parameters. Under the assumption of being able to measure data on the whole boundary, we prove uniqueness and stability, and propose a convergent reconstruction method for a class of smooth variable sound speeds. By a suitable modification of the time reversal technique, we obtain a Neumann series reconstruction formula.

  18. Templated biomimetic multifunctional coatings

    NASA Astrophysics Data System (ADS)

    Sun, Chih-Hung; Gonzalez, Adriel; Linn, Nicholas C.; Jiang, Peng; Jiang, Bin

    2008-02-01

    We report a bioinspired templating technique for fabricating multifunctional optical coatings that mimic both unique functionalities of antireflective moth eyes and superhydrophobic cicada wings. Subwavelength-structured fluoropolymer nipple arrays are created by a soft-lithography-like process. The utilization of fluoropolymers simultaneously enhances the antireflective performance and the hydrophobicity of the replicated films. The specular reflectivity matches the optical simulation using a thin-film multilayer model. The dependence of the size and the crystalline ordering of the replicated nipples on the resulting antireflective properties have also been investigated by experiment and modeling. These biomimetic materials may find important technological application in self-cleaning antireflection coatings.
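    The thin-film multilayer model mentioned above can be sketched for its simplest case, a single homogeneous layer at normal incidence, using the standard characteristic-matrix method of thin-film optics. The indices and wavelength below are illustrative assumptions, not values from the paper:

    ```python
    import cmath
    import math

    def reflectance(n0, n1, ns, d, lam):
        """Normal-incidence reflectance of one homogeneous layer (index n1,
        thickness d) on a substrate of index ns under ambient index n0,
        via the characteristic (transfer) matrix of thin-film optics."""
        delta = 2 * math.pi * n1 * d / lam      # phase thickness of the layer
        m00 = cmath.cos(delta)
        m01 = 1j * cmath.sin(delta) / n1
        m10 = 1j * n1 * cmath.sin(delta)
        m11 = cmath.cos(delta)
        num = n0 * (m00 + m01 * ns) - (m10 + m11 * ns)
        den = n0 * (m00 + m01 * ns) + (m10 + m11 * ns)
        return abs(num / den) ** 2

    n0, ns = 1.0, 1.45          # air over a low-index polymer-like substrate (illustrative)
    lam = 550e-9                # design wavelength
    n1 = math.sqrt(n0 * ns)     # ideal single-layer antireflection index
    d = lam / (4 * n1)          # quarter-wave optical thickness
    print(reflectance(n0, n1, ns, d, lam))  # ~0: antireflection condition met
    ```

    With n1 = sqrt(n0*ns) at quarter-wave thickness the reflectance vanishes at the design wavelength; subwavelength nipple arrays achieve a broadband version of this by grading the effective index instead of using one discrete layer.
    
    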

  19. Outreach and Astronomy-Education Activities of the University of Arizona Astronomy Club

    NASA Astrophysics Data System (ADS)

    McGraw, Allison M.; Hardegree-Ullman, K.; Walker-LaFollette, A.; Towner, A. P.

    2014-01-01

    The University of Arizona Astronomy Club provides unique outreach experiences for all ages. Our undergraduates work together to volunteer their time for various types of outreach events. This club uses several techniques to execute astronomy education such as hands-on 3D models, exciting demonstrations of scientific phenomena, and multiple small telescopes for both solar and night-time viewing. The students bring the models and telescopes to locations both on and off campus; from dark sky locations in the desert southwest to elementary schools, our undergraduates are willing to teach astronomy just about anywhere.

  20. Pilot-optimal augmentation synthesis

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.

    1978-01-01

    An augmentation synthesis method usable in the absence of quantitative handling qualities specifications, yet explicitly including design objectives based on pilot-rating concepts, is presented. The algorithm involves the unique approach of simultaneously solving for the stability augmentation system (SAS) gains, pilot equalization, and pilot rating prediction via optimal control techniques. Simultaneous solution is required in this case since the pilot model (gains, etc.) depends upon the augmented plant dynamics, and the augmentation is obviously not known a priori. Another special feature is the use of the pilot's objective function (from which the pilot model evolves) to design the SAS.

  1. The art of spacecraft design: A multidisciplinary challenge

    NASA Technical Reports Server (NTRS)

    Abdi, F.; Ide, H.; Levine, M.; Austel, L.

    1989-01-01

    Actual design turn-around time has become shorter due to the use of optimization techniques which have been introduced into the design process. Deciding what, how, and when to use these optimization techniques may be a key factor in future aircraft engineering operations. Another important aspect of this technique is that complex physical phenomena can be modeled by a simple mathematical equation. The new powerful multilevel methodology reduces time-consuming analysis significantly while maintaining the coupling effects. This simultaneous analysis method stems from the implicit function theorem and system sensitivity derivatives of input variables. Use of Taylor series expansion and finite-differencing techniques for sensitivity derivatives in each discipline makes this approach unique for screening dominant variables from nondominant variables. In this study, current Computational Fluid Dynamics (CFD) aerodynamic and sensitivity derivative/optimization techniques are applied to a simple cone-type forebody of a high-speed vehicle configuration to understand basic aerodynamic/structure interaction in a hypersonic flight condition.

  2. The Measurement of Sulfur Oxidation Products and Their Role in Homogeneous Nucleation

    NASA Technical Reports Server (NTRS)

    Eisele, F. L.

    1999-01-01

    An improved version of a transverse ion source was developed which uses selected-ion chemical ionization mass spectrometry techniques inside a particle nucleation flow tube. These techniques are unique in that the chemical ionization is performed inside the flow tube rather than by extracting the compounds and clusters of interest, which are lost on first contact with any surface. The transverse source is also unique because it allows the ion reaction time to be varied over more than an order of magnitude, which in turn makes possible the separation of ion-induced cluster growth from the charging of preexisting molecular clusters. By combining these unique capabilities, the first-ever measurements of prenucleation molecular clusters were performed. These clusters are the intermediate stage of growth in the gas-to-particle conversion process. This new technique provides a means of observing clusters containing 2, 3, 4, ... and up to about 8 sulfuric acid molecules, where the critical cluster size under these measurement conditions was about 4 or 5. Thus, the nucleation process can now be directly observed, and even growth beyond the critical cluster size can be investigated. The details of this investigation are discussed in a recently submitted paper, which is included as Appendix A. Measurements of the diffusion coefficient of sulfuric acid and of sulfuric acid clustered with a water molecule have also been performed. These measurements are discussed in more detail in another recently submitted paper, which is included as Appendix B. The empirical results discussed in both of these papers provide a critical test of present nucleation theories. They also provide new hope for resolving many of the large discrepancies between field observations and model predictions of particle nucleation.
The second part of the research conducted under this project was directed towards the development of new chemical ionization techniques for measuring sulfur oxidation products.

  3. Using the SWAT model to improve process descriptions and define hydrologic partitioning in South Korea

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.

    2014-02-01

    Watershed-scale modeling can be a valuable tool to aid in quantification of water quality and yield; however, several challenges remain. In many watersheds, it is difficult to adequately quantify hydrologic partitioning. Data scarcity is prevalent, the accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control, gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated to a unique combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. While this study shows the challenges of applying the SWAT model to complex terrain and extreme environments, it also demonstrates that incorporating anthropogenic features into modeling scenarios can enhance our understanding of the hydroecological impact.
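    The quality-control, gap-filling idea can be sketched generically. This is not the authors' algorithm (the abstract gives no details); it is a minimal illustration of the common pattern of interpolating only short gaps in a regularly sampled meteorological series and leaving long gaps unfilled for review:

    ```python
    def fill_gaps(series, max_gap=3):
        """Linearly interpolate short runs of missing values (None) in a
        regularly sampled series; gaps longer than max_gap, and gaps at
        either end of the record, are left unfilled so suspect stretches
        are flagged rather than invented."""
        out = list(series)
        n = len(out)
        i = 0
        while i < n:
            if out[i] is None:
                j = i
                while j < n and out[j] is None:   # find the end of this gap
                    j += 1
                gap = j - i
                if 0 < i and j < n and gap <= max_gap:
                    a, b = out[i - 1], out[j]     # anchors on both sides
                    for k in range(gap):
                        out[i + k] = a + (b - a) * (k + 1) / (gap + 1)
                i = j
            else:
                i += 1
        return out

    print(fill_gaps([10.0, None, None, 16.0]))  # [10.0, 12.0, 14.0, 16.0]
    ```

    A production gap-filler for sub-hourly data would add range checks and cross-station regression, but the short-gap/long-gap distinction above is the essential quality-control step.
    
    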

  4. Proceedings of the Numerical Modeling for Underground Nuclear Test Monitoring Symposium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, S.R.; Kamm, J.R.

    1993-11-01

    The purpose of the meeting was to discuss the state-of-the-art in numerical simulations of nuclear explosion phenomenology with applications to test ban monitoring. We focused on the uniqueness of model fits to data, the measurement and characterization of material response models, advanced modeling techniques, and applications of modeling to monitoring problems. The second goal of the symposium was to establish a dialogue between seismologists and explosion-source code calculators. The meeting was divided into five main sessions: explosion source phenomenology, material response modeling, numerical simulations, the seismic source, and phenomenology from near source to far field. We feel the symposium reached many of its goals. Individual papers submitted at the conference are indexed separately in the database.

  5. 2D Flood Modelling Using Advanced Terrain Analysis Techniques And A Fully Continuous DEM-Based Rainfall-Runoff Algorithm

    NASA Astrophysics Data System (ADS)

    Nardi, F.; Grimaldi, S.; Petroselli, A.

    2012-12-01

    Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic modelling approach optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series, which are routed along the channel using a two-dimensional hydraulic model for detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model removes the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard methods for flood mapping. Selected case studies show results and performances of the proposed procedure with respect to standard event-based approaches.

  6. Single-molecule imaging in live bacteria cells.

    PubMed

    Ritchie, Ken; Lill, Yoriko; Sood, Chetan; Lee, Hochan; Zhang, Shunyuan

    2013-02-05

    Bacteria, such as Escherichia coli and Caulobacter crescentus, are the most studied and perhaps best-understood organisms in biology. The advances in understanding of living systems gained from these organisms are immense. Application of single-molecule techniques in bacteria has presented unique difficulties owing to their small size and highly curved form. The aim of this review is to show advances made in single-molecule imaging in bacteria over the past 10 years, and to look to the future, where the combination of implementing such high-precision techniques in well-characterized and controllable model systems such as E. coli could lead to a greater understanding of fundamental biological questions inaccessible through classic ensemble methods.

  7. Emergent 1d Ising Behavior in AN Elementary Cellular Automaton Model

    NASA Astrophysics Data System (ADS)

    Kassebaum, Paul G.; Iannacchione, Germano S.

    The fundamental nature of an evolving one-dimensional (1D) Ising model is investigated with an elementary cellular automaton (CA) simulation. The CA simulation employs an ensemble of cells in one spatial dimension, each cell capable of two microstates, interacting through simple nearest-neighbor rules and incorporating an external field. The behavior of the CA model provides insight into the dynamics of coupled two-state systems not expressible by exact analytical solutions. For instance, state progression graphs show the causal dynamics of a system through time in relation to the system's entropy. Unique graphical analysis techniques are introduced: difference patterns, diffusion patterns, and state progression graphs of the 1D ensemble that visualize its evolution. All analyses are consistent with the known behavior of the 1D Ising system. The CA simulation and new pattern recognition techniques are scalable in dimension, complexity, and size, and have many potential applications such as complex design of materials, control of agent systems, and evolutionary mechanism design.
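    A two-state nearest-neighbor CA of the kind described is easy to sketch. The specific update rule (align with the neighbor sum plus field, keep state on a tie) and the difference-pattern rendering below are illustrative choices, not necessarily the authors' rule set:

    ```python
    def step(cells, field=0):
        """One synchronous update of a 1D two-state (+1/-1) ensemble with
        periodic boundaries: each cell aligns with the sign of its two
        nearest neighbors plus an external field, keeping its state on a tie."""
        n = len(cells)
        new = []
        for i in range(n):
            s = cells[(i - 1) % n] + cells[(i + 1) % n] + field
            new.append(1 if s > 0 else -1 if s < 0 else cells[i])
        return new

    def difference_pattern(a, b):
        """Render which cells changed between two states ('.' same, 'x' flipped),
        one row of the difference-pattern graphic."""
        return ''.join('x' if x != y else '.' for x, y in zip(a, b))

    state = [-1, -1, 1, -1, -1, -1, 1, 1]
    nxt = step(state)
    print(difference_pattern(state, nxt))  # '..x.....': only the isolated up spin flips
    ```

    Stacking successive difference-pattern rows over time gives the kind of space-time graphic the abstract describes, and counting flips per row tracks how activity (and hence entropy production) decays as domains coarsen.
    
    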

  8. The Sixth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Sixth Annual Thermal and Fluids Analysis Workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances in and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  9. Ground-truthing electrical resistivity methods in support of submarine groundwater discharge studies: Examples from Hawaii, Washington, and California

    USGS Publications Warehouse

    Johnson, Cordell; Swarzenski, Peter W.; Richardson, Christina M.; Smith, Christopher G.; Kroeger, Kevin D.; Ganguli, Priya M.

    2015-01-01

    Rigorous ground-truthing at each field site showed that multi-channel electrical resistivity techniques can reproduce the scales and dynamics of a seepage field when such data are correctly collected, and when the model inversions are tuned to field site characteristics. Such information can provide a unique perspective on the scales and dynamics of exchange processes within a coastal aquifer—information essential to scientists and resource managers alike.

  10. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  11. Consensus models to predict endocrine disruption for all ...

    EPA Pesticide Factsheets

    Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been tested for their ability to disrupt the endocrine system, in particular, their ability to interact with the estrogen receptor. EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) was intended to be a demonstration of the use of predictive computational models on HTS data including ToxCast and Tox21 assays to prioritize a large chemical universe of 32464 unique structures for one specific molecular target – the estrogen receptor. CERAPP combined multiple computational models for prediction of estrogen receptor activity, and used the predicted results to build a unique consensus model. Models were developed in collaboration between 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking and several QSAR modeling approaches were employed, mostly using a common training set of 1677 compounds provided by U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist and antagonist activity. All predictions were evaluated on ToxCast data and on an exte
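    The consensus step can be sketched in miniature. This is a hypothetical weighted majority vote over binary activity calls; CERAPP's actual combination of 42 classification and 8 regression models is more elaborate, and the weights below are invented for illustration:

    ```python
    def consensus(predictions, threshold=0.5):
        """Combine binary activity calls from several models into one
        consensus call by weighted majority. Each entry is (call, weight);
        models that made no call (None) are ignored."""
        calls = [(p, w) for p, w in predictions if p is not None]
        if not calls:
            return None  # no model could score this chemical
        score = sum(p * w for p, w in calls) / sum(w for _, w in calls)
        return 1 if score >= threshold else 0

    # Three hypothetical QSAR models; weights stand in for cross-validated accuracy
    models = [(1, 0.9), (0, 0.6), (1, 0.7)]
    print(consensus(models))  # 1: the weighted majority calls the chemical active
    ```

    Weighting each model by its evaluated accuracy is one common design choice; it lets a well-validated model outvote two weaker ones while still letting unanimous weak models carry a call.
    
    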

  12. Category representations in the brain are both discretely localized and widely distributed.

    PubMed

    Shehzad, Zarrar; McCarthy, Gregory

    2018-06-01

    Whether category information is discretely localized or represented widely in the brain remains a contentious issue. Initial functional MRI studies supported the localizationist perspective that category information is represented in discrete brain regions. More recent fMRI studies using machine learning pattern classification techniques provide evidence for widespread distributed representations. However, these latter studies have not typically accounted for shared information. Here, we find strong support for distributed representations when brain regions are considered separately. However, localized representations are revealed by using analytical methods that separate unique from shared information among brain regions. The distributed nature of shared information and the localized nature of unique information suggest that brain connectivity may encourage spreading of information but category-specific computations are carried out in distinct domain-specific regions. NEW & NOTEWORTHY Whether visual category information is localized in unique domain-specific brain regions or distributed in many domain-general brain regions is hotly contested. We resolve this debate by using multivariate analyses to parse functional MRI signals from different brain regions into unique and shared variance. Our findings support elements of both models and show information is initially localized and then shared among other regions leading to distributed representations being observed.

  13. Study of photon correlation techniques for processing of laser velocimeter signals

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1977-01-01

    The objective was to provide the theory and a system design for a new type of photon counting processor for low-level dual-scatter laser velocimeter (LV) signals that would be capable of both the first-order measurements of mean flow and turbulence intensity and also the second-order time statistics: cross-correlation, autocorrelation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.
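    The photon-correlation idea underlying such a processor can be sketched as follows. This is a generic binned-count autocorrelation of a modulated Poisson photon stream, not the dual correlate-and-subtract discriminator itself; the rate, modulation frequency, and bin width are illustrative:

    ```python
    import math
    import random

    def photon_counts(rate_fn, t_max, dt, seed=1):
        """Photon counts per time bin for an inhomogeneous Poisson process,
        drawn as one Poisson variate per bin (valid for slowly varying rate)."""
        rng = random.Random(seed)
        counts, t = [], 0.0
        while t < t_max:
            lam = rate_fn(t) * dt
            # Poisson draw by CDF inversion; fine for the small means used here
            u, k, p = rng.random(), 0, math.exp(-lam)
            c = p
            while u > c:
                k += 1
                p *= lam / k
                c += p
            counts.append(k)
            t += dt
        return counts

    def autocorr(counts, max_lag):
        """Mean-removed photon-count autocorrelation estimate per lag."""
        n = len(counts)
        mean = sum(counts) / n
        return [sum((counts[i] - mean) * (counts[i + lag] - mean)
                    for i in range(n - lag)) / (n - lag)
                for lag in range(max_lag)]

    # Doppler-burst-like rate: mean level plus sinusoidal modulation (illustrative)
    rate = lambda t: 2000.0 * (1.0 + 0.8 * math.sin(2 * math.pi * 100.0 * t))
    acf = autocorr(photon_counts(rate, 0.2, 1e-4), 60)
    print(acf.index(max(acf)))  # 0: the shot-noise self term dominates at zero lag
    ```

    A hardware correlator works on the same binned counts; the periodic modulation of the rate reappears as a periodic component of the correlogram, from which the Doppler frequency, and hence velocity, is extracted.
    
    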

  14. 3D Ultrasonic Wave Simulations for Structural Health Monitoring

    NASA Technical Reports Server (NTRS)

    Campbell, Leckey Cara A.; Miler, Corey A.; Hinders, Mark K.

    2011-01-01

    Structural health monitoring (SHM) for the detection of damage in aerospace materials is an important area of research at NASA. Ultrasonic guided Lamb waves are a promising SHM damage detection technique since the waves can propagate long distances. For complicated flaw geometries experimental signals can be difficult to interpret. High performance computing can now handle full 3-dimensional (3D) simulations of elastic wave propagation in materials. We have developed and implemented parallel 3D elastodynamic finite integration technique (3D EFIT) code to investigate ultrasound scattering from flaws in materials. EFIT results have been compared to experimental data and the simulations provide unique insight into details of the wave behavior. This type of insight is useful for developing optimized experimental SHM techniques. 3D EFIT can also be expanded to model wave propagation and scattering in anisotropic composite materials.

  15. Modern Geodetic Measurement Techniques in Gravimetric Studies on the Example of Gypsum Karst in the Siesławice Region

    NASA Astrophysics Data System (ADS)

    Porzucek, Sławomir; Łój, Monika; Matwij, Karolina; Matwij, Wojciech

    2018-03-01

    In the region of Siesławice (near Busko-Zdrój, Poland) there are unique gypsum karst phenomena. Atmospheric factors have produced numerous gypsum outcrops, channels, and underground voids. The article presents the possibility of using non-invasive gravimetric surveys, supplemented with geodetic measurements, to illustrate karst changes occurring around a void. The use of modern geodetic measurement techniques, including terrestrial and airborne laser scanning, makes it possible to generate a digital terrain model and a three-dimensional model of the voids. Gravimetric field studies made it possible to map the anomalies of the gravitational field in the near-surface zone. The geodetic measurement results made it possible to accurately determine the terrain correction that supplemented the gravimetric anomaly information. Geophysical interpretation indicates the presence of weathered rocks in the near-surface zone, along with fractures and loosened zones surrounding the karst cave.

  16. A computational visual saliency model based on statistics and machine learning.

    PubMed

    Lin, Ru-Je; Lin, Wei-Song

    2014-08-01

    Identifying the type of stimuli that attracts human visual attention has been an appealing topic for scientists for many years. In particular, marking the salient regions in images is useful for both psychologists and many computer vision applications. In this paper, we propose a computational approach for producing saliency maps using statistics and machine learning methods. Based on four assumptions, three properties (Feature-Prior, Position-Prior, and Feature-Distribution) can be derived and combined by a simple intersection operation to obtain a saliency map. These properties are implemented by a similarity computation, support vector regression (SVR) technique, statistical analysis of training samples, and information theory using low-level features. This technique is able to learn the preferences of human visual behavior while simultaneously considering feature uniqueness. Experimental results show that our approach performs better in predicting human visual attention regions than 12 other models in two test databases. © 2014 ARVO.
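    The abstract's "simple intersection operation" can be illustrated with an elementwise-minimum combination of normalized property maps. The tiny 2x2 maps and the choice of minimum as the intersection are assumptions for illustration, not the paper's exact operation:

    ```python
    import numpy as np

    def normalize(m):
        """Scale a map to [0, 1]."""
        m = m.astype(float)
        rng = m.max() - m.min()
        return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

    def intersect_maps(*maps):
        """Combine property maps by an elementwise-minimum 'intersection':
        a pixel is salient only if every property marks it salient."""
        stacked = np.stack([normalize(m) for m in maps])
        return stacked.min(axis=0)

    # Hypothetical stand-ins for the three properties named in the abstract.
    feature_prior = np.array([[0.9, 0.1], [0.8, 0.2]])
    position_prior = np.array([[1.0, 0.5], [0.9, 0.1]])
    feature_dist = np.array([[0.7, 0.3], [0.6, 0.4]])
    saliency = intersect_maps(feature_prior, position_prior, feature_dist)
    ```

    The minimum acts like a logical AND over the soft maps; an elementwise product is the other common choice for this kind of combination.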

  17. Time Domain Estimation of Arterial Parameters using the Windkessel Model and the Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Gostuski, Vladimir; Pastore, Ignacio; Rodriguez Palacios, Gaspar; Vaca Diez, Gustavo; Moscoso-Vasquez, H. Marcela; Risk, Marcelo

    2016-04-01

    Numerous parameter estimation techniques exist for characterizing the arterial system using electrical circuit analogs. However, they are often limited by their requirements and usually high computational burden. Therefore, a new method for estimating arterial parameters based on Monte Carlo simulation is proposed. A three element Windkessel model was used to represent the arterial system. The approach was to reduce the error between the calculated and physiological aortic pressure by randomly generating arterial parameter values, while keeping the arterial resistance constant. This last value was obtained for each subject using the arterial flow, and was a necessary consideration in order to obtain a unique set of values for the arterial compliance and peripheral resistance. The estimation technique was applied to in vivo data containing steady beats in mongrel dogs, and it reliably estimated Windkessel arterial parameters. Further, this method appears to be computationally efficient for on-line time-domain estimation of these parameters.
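    A minimal sketch of the approach, assuming a textbook three-element Windkessel (characteristic impedance Zc, peripheral resistance R, compliance C) with made-up flow input and search ranges; only the total resistance Zc + R is held fixed, echoing the paper's constraint:

    ```python
    import numpy as np

    def windkessel_pressure(q, dt, zc, r, c, p0=80.0):
        """Aortic pressure from a 3-element Windkessel (Euler integration):
        p = Zc*q + pc,  dpc/dt = q/C - pc/(R*C)."""
        pc = p0
        p = np.empty_like(q)
        for i, qi in enumerate(q):
            p[i] = zc * qi + pc
            pc += dt * (qi / c - pc / (r * c))
        return p

    def monte_carlo_fit(q, p_meas, dt, total_r, n_draws=1000, seed=1):
        """Randomly draw (Zc, C) with Zc + R = total_r held fixed and keep the
        draw with the smallest RMS pressure error (search ranges are assumed)."""
        rng = np.random.default_rng(seed)
        best = (np.inf, None)
        for _ in range(n_draws):
            zc_i = rng.uniform(0.01, 0.2)
            c_i = rng.uniform(0.5, 3.0)
            r_i = total_r - zc_i
            p_sim = windkessel_pressure(q, dt, zc_i, r_i, c_i)
            err_i = np.sqrt(np.mean((p_sim - p_meas) ** 2))
            if err_i < best[0]:
                best = (err_i, (zc_i, r_i, c_i))
        return best

    dt = 0.002
    t = np.arange(0.0, 1.6, dt)
    q = np.maximum(0.0, 400 * np.sin(2 * np.pi * t / 0.8))   # pulsatile flow, ml/s
    p_true = windkessel_pressure(q, dt, zc=0.05, r=1.0, c=1.5)
    err, (zc, r, c) = monte_carlo_fit(q, p_true, dt, total_r=1.05)
    ```

    With noise-free "measured" pressure, the best random draw lands close to the generating parameters, which is the core of the random-search estimation idea.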

  18. Diagnostic tools for nearest neighbors techniques when used with satellite imagery

    Treesearch

    Ronald E. McRoberts

    2009-01-01

    Nearest neighbors techniques are non-parametric approaches to multivariate prediction that are useful for predicting both continuous and categorical forest attribute variables. Although some assumptions underlying nearest neighbor techniques are common to other prediction techniques such as regression, other assumptions are unique to nearest neighbor techniques....

  19. Creation of a novel simulator for minimally invasive neurosurgery: fusion of 3D printing and special effects.

    PubMed

    Weinstock, Peter; Rehder, Roberta; Prabhu, Sanjay P; Forbes, Peter W; Roussin, Christopher J; Cohen, Alan R

    2017-07-01

    OBJECTIVE Recent advances in optics and miniaturization have enabled the development of a growing number of minimally invasive procedures, yet innovative training methods for the use of these techniques remain lacking. Conventional teaching models, including cadavers and physical trainers as well as virtual reality platforms, are often expensive and ineffective. Newly developed 3D printing technologies can recreate patient-specific anatomy, but the stiffness of the materials limits fidelity to real-life surgical situations. Hollywood special effects techniques can create ultrarealistic features, including lifelike tactile properties, to enhance accuracy and effectiveness of the surgical models. The authors created a highly realistic model of a pediatric patient with hydrocephalus via a unique combination of 3D printing and special effects techniques and validated the use of this model in training neurosurgery fellows and residents to perform endoscopic third ventriculostomy (ETV), an effective minimally invasive method increasingly used in treating hydrocephalus. METHODS A full-scale reproduction of the head of a 14-year-old adolescent patient with hydrocephalus, including external physical details and internal neuroanatomy, was developed via a unique collaboration of neurosurgeons, simulation engineers, and a group of special effects experts. The model contains "plug-and-play" replaceable components for repetitive practice. The appearance of the training model (face validity) and the reproducibility of the ETV training procedure (content validity) were assessed by neurosurgery fellows and residents of different experience levels based on a 14-item Likert-like questionnaire. The usefulness of the training model for evaluating the performance of the trainees at different levels of experience (construct validity) was measured by blinded observers using the Objective Structured Assessment of Technical Skills (OSATS) scale for the performance of ETV. 
RESULTS A combination of 3D printing technology and casting processes led to the creation of realistic surgical models that include high-fidelity reproductions of the anatomical features of hydrocephalus and allow for the performance of ETV for training purposes. The models reproduced the pulsations of the basilar artery, ventricles, and cerebrospinal fluid (CSF), thus simulating the experience of performing ETV on an actual patient. The results of the 14-item questionnaire showed limited variability among participants' scores, and the neurosurgery fellows and residents gave the models consistently high ratings for face and content validity. The mean score for the content validity questions (4.88) was higher than the mean score for face validity (4.69) (p = 0.03). On construct validity scores, the blinded observers rated performance of fellows significantly higher than that of residents, indicating that the model provided a means to distinguish between novice and expert surgical skills. CONCLUSIONS A plug-and-play lifelike ETV training model was developed through a combination of 3D printing and special effects techniques, providing both anatomical and haptic accuracy. Such simulators offer opportunities to accelerate the development of expertise with respect to new and novel procedures as well as iterate new surgical approaches and innovations, thus allowing novice neurosurgeons to gain valuable experience in surgical techniques without exposing patients to risk of harm.

  20. A field technique for estimating aquifer parameters using flow log data

    USGS Publications Warehouse

    Paillet, Frederick L.

    2000-01-01

    A numerical model is used to predict flow along intervals between producing zones in open boreholes for comparison with measurements of borehole flow. The model gives flow under quasi-steady conditions as a function of the transmissivity and hydraulic head in an arbitrary number of zones communicating with each other along open boreholes. The theory shows that the amount of inflow to or outflow from the borehole under any one flow condition may not indicate relative zone transmissivity. A unique inversion for both hydraulic-head and transmissivity values is possible if flow is measured under two different conditions such as ambient and quasi-steady pumping, and if the difference in open-borehole water level between the two flow conditions is measured. The technique is shown to give useful estimates of water levels and transmissivities of two or more water-producing zones intersecting a single interval of open borehole under typical field conditions. Although the modeling technique involves some approximation, the principal limit on the accuracy of the method under field conditions is the measurement error in the flow log data. Flow measurements and pumping conditions are usually adjusted so that transmissivity estimates are most accurate for the most transmissive zones, and relative measurement error is proportionately larger for less transmissive zones. The most effective general application of the borehole-flow model results when the data are fit to models that systematically include more production zones of progressively smaller transmissivity values until model results show that all accuracy in the data set is exhausted.
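    Under the simplest linear inflow model q = a(h - h_w), a stand-in for the paper's numerical model with a proportional to zone transmissivity, the two-condition inversion described above reduces to solving two equations per zone:

    ```python
    def invert_zone(q_ambient, q_pumped, hw_ambient, hw_pumped):
        """Solve for a zone's transmissivity factor a (proportional to T) and
        far-field head h from inflow measured under two borehole water levels,
        assuming the linear inflow model q = a * (h - h_w)."""
        a = (q_pumped - q_ambient) / (hw_ambient - hw_pumped)
        h = hw_ambient + q_ambient / a
        return a, h

    # Synthetic check: two zones with known a and h; the borehole water level
    # drops 3 m when pumping starts.
    hw1, hw2 = 10.0, 7.0
    zones = [(2.0, 11.5), (0.5, 9.0)]          # (a, far-field head) per zone
    for a_true, h_true in zones:
        q1 = a_true * (h_true - hw1)           # ambient inflow (negative = outflow)
        q2 = a_true * (h_true - hw2)           # pumped inflow
        a_est, h_est = invert_zone(q1, q2, hw1, hw2)
    ```

    Note the second zone has negative ambient flow (water leaves the borehole into the zone), yet the two-condition inversion still recovers both unknowns, which is exactly why a single flow condition is not enough.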

  1. Surface tension models for a multi-material ALE code with AMR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Wangyi; Koniges, Alice; Gott, Kevin

    A number of surface tension models have been implemented in a 3D multi-physics multi-material code, ALE–AMR, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR). ALE–AMR is unique in its ability to model hot radiating plasmas, cold fragmenting solids, and most recently, the deformation of molten material. The surface tension models implemented include a diffuse interface approach with special numerical techniques to remove parasitic flow and a height function approach in conjunction with a volume-fraction interface reconstruction package. These surface tension models are benchmarked with a variety of test problems. Based on the results, the height function approach using volume fractions was chosen to simulate droplet dynamics associated with extreme ultraviolet (EUV) lithography.

  2. Surface tension models for a multi-material ALE code with AMR

    DOE PAGES

    Liu, Wangyi; Koniges, Alice; Gott, Kevin; ...

    2017-06-01

    A number of surface tension models have been implemented in a 3D multi-physics multi-material code, ALE–AMR, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR). ALE–AMR is unique in its ability to model hot radiating plasmas, cold fragmenting solids, and most recently, the deformation of molten material. The surface tension models implemented include a diffuse interface approach with special numerical techniques to remove parasitic flow and a height function approach in conjunction with a volume-fraction interface reconstruction package. These surface tension models are benchmarked with a variety of test problems. Based on the results, the height function approach using volume fractions was chosen to simulate droplet dynamics associated with extreme ultraviolet (EUV) lithography.

  3. A Method for Calculating the Probability of Successfully Completing a Rocket Propulsion Ground Test

    NASA Technical Reports Server (NTRS)

    Messer, Bradley

    2007-01-01

    Propulsion ground test facilities face the daily challenge of scheduling multiple customers into limited facility space and successfully completing their propulsion test projects. Over the last decade NASA's propulsion test facilities have performed hundreds of tests, collected thousands of seconds of test data, and exceeded the capabilities of numerous test facility and test article components. A logistic regression mathematical modeling technique has been developed to predict the probability of successfully completing a rocket propulsion test. A logistic regression model is a mathematical modeling approach that can be used to describe the relationship of several independent predictor variables X1, X2, ..., Xk to a binary or dichotomous dependent variable Y, where Y can only be one of two possible outcomes, in this case success or failure of accomplishing a full-duration test. The use of logistic regression modeling is not new; however, modeling propulsion ground test facilities using logistic regression is both a new and unique application of the statistical technique. Results from this type of model provide project managers with insight into, and confidence in, the effectiveness of rocket propulsion ground testing.
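    A hedged sketch of logistic regression for a binary success/failure outcome, fit by plain gradient ascent on synthetic data. The predictor and its effect size are invented for illustration, not NASA's actual model:

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, n_iter=5000):
        """Fit P(success) = 1/(1+exp(-(b0 + X.b))) by gradient ascent on the
        log-likelihood (plain batch updates, no regularization)."""
        Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend intercept column
        w = np.zeros(Xb.shape[1])
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-Xb @ w))
            w += lr * Xb.T @ (y - p) / len(y)
        return w

    def predict_proba(X, w):
        Xb = np.hstack([np.ones((len(X), 1)), X])
        return 1.0 / (1.0 + np.exp(-Xb @ w))

    # Hypothetical data: one predictor (say, normalized test duration); longer
    # tests fail more often, so the true slope is negative.
    rng = np.random.default_rng(0)
    x = rng.uniform(-2, 2, size=(200, 1))
    p_true = 1.0 / (1.0 + np.exp(-(0.5 - 1.5 * x[:, 0])))
    y = (rng.uniform(size=200) < p_true).astype(float)
    w = fit_logistic(x, y)
    ```

    The fitted coefficients recover the sign and rough magnitude of the generating model, and `predict_proba` gives the success probability a project manager would read off for a planned test.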

  4. Genetic programming assisted stochastic optimization strategies for optimization of glucose to gluconic acid fermentation.

    PubMed

    Cheema, Jitender Jit Singh; Sankpal, Narendra V; Tambe, Sanjeev S; Kulkarni, Bhaskar D

    2002-01-01

    This article presents two hybrid strategies for the modeling and optimization of the glucose to gluconic acid batch bioprocess. In the hybrid approaches, first a novel artificial intelligence formalism, namely, genetic programming (GP), is used to develop a process model solely from the historic process input-output data. In the next step, the input space of the GP-based model, representing process operating conditions, is optimized using two stochastic optimization (SO) formalisms, viz., genetic algorithms (GAs) and simultaneous perturbation stochastic approximation (SPSA). These SO formalisms possess certain unique advantages over the commonly used gradient-based optimization techniques. The principal advantage of the GP-GA and GP-SPSA hybrid techniques is that process modeling and optimization can be performed exclusively from the process input-output data without invoking the detailed knowledge of the process phenomenology. The GP-GA and GP-SPSA techniques have been employed for modeling and optimization of the glucose to gluconic acid bioprocess, and the optimized process operating conditions obtained thereby have been compared with those obtained using two other hybrid modeling-optimization paradigms integrating artificial neural networks (ANNs) and GA/SPSA formalisms. Finally, the overall optimized operating conditions given by the GP-GA method, when verified experimentally, resulted in a significant improvement in the gluconic acid yield. The hybrid strategies presented here are generic in nature and can be employed for modeling and optimization of a wide variety of batch and continuous bioprocesses.

  5. Advanced statistics: linear regression, part II: multiple linear regression.

    PubMed

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
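    A minimal multiple-regression example: with a noise-free linear outcome, ordinary least squares recovers the unique coefficient vector exactly, illustrating the "unique solutions" property the abstract mentions. The data are synthetic:

    ```python
    import numpy as np

    # Model y = b0 + b1*x1 + b2*x2 fitted by ordinary least squares; with
    # noise-free data the unique solution recovers the coefficients exactly.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(50, 2))                       # two predictor variables
    y = 2.0 + 1.5 * X[:, 0] - 0.7 * X[:, 1]

    A = np.column_stack([np.ones(len(X)), X])          # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    residuals = y - A @ coef
    ```

    With real, noisy data the residuals are nonzero and near-collinear predictors make `A` ill-conditioned, which is the numerical face of the multicollinearity problem discussed in the article.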

  6. Four-Dimensional Data Assimilation Using the Adjoint Method

    NASA Astrophysics Data System (ADS)

    Bao, Jian-Wen

    The calculus of variations is used to confirm that variational four-dimensional data assimilation (FDDA) using the adjoint method can be implemented when the numerical model equations have a finite number of first-order discontinuous points. These points represent the on/off switches associated with physical processes, for which the Jacobian matrix of the model equation does not exist. Numerical evidence suggests that, in some situations when the adjoint method is used for FDDA, the temperature field retrieved using horizontal wind data is not numerically unique. A physical interpretation of this type of non-uniqueness of the retrieval is proposed in terms of energetics. The adjoint equations of a numerical model can also be used for model-parameter estimation. A general computational procedure is developed to determine the size and distribution of any internal model parameter. The procedure is then applied to a one-dimensional shallow-fluid model in the context of analysis-nudging FDDA: the weighting coefficients used by the Newtonian nudging technique are determined. The sensitivity of these nudging coefficients to the optimal objectives and constraints is investigated. Experiments of FDDA using the adjoint method are conducted using the dry version of the hydrostatic Penn State/NCAR mesoscale model (MM4) and its adjoint. The minimization procedure converges and the initialization experiment is successful. Temperature-retrieval experiments involving an assimilation of the horizontal wind are also carried out using the adjoint of MM4.

  7. unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance

    USGS Publications Warehouse

    Fiske, Ian J.; Chandler, Richard B.

    2011-01-01

    Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
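    The hierarchical occupancy/detection idea can be sketched for the simplest single-season occupancy model. unmarked fits this same likelihood by numerical optimization in R; the toy below (in Python, on simulated data) uses a crude grid search instead:

    ```python
    import numpy as np
    from itertools import product

    def occupancy_nll(psi, p, counts, n_visits):
        """Negative log-likelihood of the basic single-season occupancy model:
        each site is occupied with probability psi and, if occupied, detected on
        each of n_visits independent surveys with probability p.
        counts[d] = number of sites with exactly d detections.
        (Binomial coefficients are omitted; they do not affect the MLE.)"""
        ll = counts[0] * np.log(psi * (1 - p) ** n_visits + (1 - psi))
        for d in range(1, n_visits + 1):
            ll += counts[d] * np.log(psi * p ** d * (1 - p) ** (n_visits - d))
        return -ll

    # Simulated data: 500 sites, 4 visits, true psi = 0.6, p = 0.4.
    rng = np.random.default_rng(3)
    occupied = rng.uniform(size=500) < 0.6
    detections = np.where(occupied, rng.binomial(4, 0.4, size=500), 0)
    counts = np.bincount(detections, minlength=5)

    # Crude grid-search MLE over (psi, p).
    grid = np.linspace(0.05, 0.95, 91)
    psi_hat, p_hat = min(product(grid, grid),
                         key=lambda t: occupancy_nll(t[0], t[1], counts, 4))
    ```

    The zero-detection term mixes "occupied but never detected" with "not occupied", which is exactly the latent-state structure that separates the occurrence and detection processes.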

  8. Application of 3D models of palatal rugae to personal identification: hints at identification from 3D-3D superimposition techniques.

    PubMed

    Gibelli, Daniele; De Angelis, Danilo; Pucciarelli, Valentina; Riboli, Francesco; Ferrario, Virgilio F; Dolci, Claudia; Sforza, Chiarella; Cattaneo, Cristina

    2017-11-20

    Palatal rugae are known in the literature as individualizing anatomical structures with strong potential for personal identification. However, a 3D assessment of their uniqueness has not yet been performed. The present study aims at verifying the uniqueness of 3D models of the palate. Twenty-six subjects were recruited among the orthodontic patients of a private dental office; from every patient, at least two dental casts were taken in different time periods, for a total of 62 casts. Dental casts were digitized by a 3D laser scanner (iSeries, Dental Wings©, Montreal, Canada). The palatal area was identified, and a series of 250 superimpositions was then performed automatically through VAM© software in order to reach the minimum point-to-point distance between two models. In 36 matches the models belonged to the same individual, whereas in 214 mismatches they came from different subjects. The RMS (root mean square) of point-to-point distances was then calculated by the 3D software. Possible statistically significant differences were assessed through the Mann-Whitney test (p < 0.05). Results showed a statistically significant difference in mean RMS point-to-point distance between matches (mean 0.26 mm; SD 0.12) and mismatches (mean 1.30 mm; SD 0.44) (p < 0.0001). All matches reached an RMS value below 0.50 mm. This study provides the first assessment of the uniqueness of palatal rugae, based on their 3D anatomical conformation, with consequent applications to personal identification.
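    The RMS point-to-point comparison can be sketched as a brute-force nearest-neighbour computation between already-superimposed point clouds. Synthetic clouds stand in for the digitized casts; VAM's actual alignment algorithm is not reproduced here:

    ```python
    import numpy as np

    def rms_point_to_point(model_a, model_b):
        """RMS of each point in model_a to its nearest neighbour in model_b
        (brute force; assumes the models are already superimposed)."""
        d2 = ((model_a[:, None, :] - model_b[None, :, :]) ** 2).sum(axis=2)
        return float(np.sqrt(d2.min(axis=1).mean()))

    rng = np.random.default_rng(7)
    palate = rng.normal(size=(200, 3))                                  # one "cast"
    same_subject = palate + rng.normal(scale=0.01, size=palate.shape)   # repeat cast
    other_subject = rng.normal(size=(200, 3))                           # mismatch

    match_rms = rms_point_to_point(palate, same_subject)
    mismatch_rms = rms_point_to_point(palate, other_subject)
    ```

    As in the study, the match RMS sits far below the mismatch RMS, which is what makes a fixed threshold (0.50 mm in the paper) usable for identification.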

  9. Infrared Heterodyne Spectroscopy and its Unique Application to Planetary Studies

    NASA Technical Reports Server (NTRS)

    Kostiuk, Theodore

    2009-01-01

    Since the early 1970's the infrared heterodyne technique has evolved into a powerful tool for the study of molecular constituents, temperatures, and dynamics in planetary atmospheres. Its extremely high spectral resolution (Lambda/(Delta)Lambda > 10(exp 6)) and highly accurate frequency measurement (to 1 part in 10(exp 8)) enabled the detection of nonthermal/natural lasing phenomena on Mars and Venus; direct measurements of winds on Venus, Mars, and Titan; study of mid-infrared aurorae on Jupiter; direct measurement of species abundances on Mars (ozone, isotopic CO2), hydrocarbons on Jupiter, Saturn, Neptune, and Titan, and stratospheric composition in the Earth's stratosphere (O3, ClO, N2O, CO2, ...). Fully resolved emission and absorption line shapes measured by this method enabled the unambiguous retrieval of molecular abundances and local temperatures and thermal structure in regions not probed by other techniques. The mesosphere of Mars and thermosphere of Venus are uniquely probed by infrared heterodyne spectroscopy. Results of these studies tested and constrained photochemical and dynamical theoretical models describing the phenomena measured. The infrared heterodyne technique will be described. Highlights in its evolution to today's instrumentation and resultant discoveries will be presented, including work at Goddard Space Flight Center and the University of Koln. The results presented will include studies supporting NASA and ESA space missions and collaborations between instrumental and theoretical groups.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, Michael A.; Lyubinetsky, Igor

    The field of heterogeneous photocatalysis has grown considerably in the decades since Fujishima and Honda's ground-breaking publications on photoelectrochemistry on TiO2. Numerous review articles continue to point to both progress made in the use of heterogeneous materials (such as TiO2) to perform photoconversion processes, and the many opportunities and challenges in heterogeneous photocatalysis research such as solar energy conversion and environmental remediation. The past decade has also seen an increase in the use of molecular-level approaches applied to model single crystal surfaces in an effort to obtain new insights into photocatalytic phenomena. In particular, scanning probe microscopy (SPM) techniques have enabled researchers to take a ‘nanoscale’ approach to photocatalysis that includes interrogation of the reactivities of specific sites and adsorbates on a model photocatalyst surface. The rutile TiO2(110) surface has become the prototypical oxide single crystal surface for fundamental studies of many interfacial phenomena. In particular, TiO2(110) has become an excellent model surface for probing photochemical and photocatalytic reactions at the molecular level. A variety of experimental approaches have emerged as being ideally suited for studying photochemical reactions on TiO2(110), including desorption-oriented approaches and electronic spectroscopies, but perhaps the most promising techniques for evaluating site-specific properties are those of SPM. In this review, we highlight the growing use of SPM techniques in providing molecular-level insights into surface photochemistry on the model photocatalyst surface of rutile TiO2(110). Our objective is to both illustrate the unique knowledge that scanning probe techniques have already provided the field of photocatalysis, and also to motivate a new generation of effort into the use of such approaches to obtain new insights into the molecular level details of photochemical events occurring at interfaces. 
    Discussion will start with an examination of how scanning probe techniques are being used to characterize the TiO2(110) surface in ways that are relevant to photocatalysis. We will then discuss specific classes of photochemical reaction on TiO2(110) for which SPM has proven indispensable in providing unique molecular-level insights, and conclude with a discussion of future areas in which SPM studies may prove valuable to photocatalysis on TiO2. This work was supported by the US Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. I.L. was partially supported by a Pacific Northwest National Laboratory (PNNL) Chemical Imaging Initiative project. PNNL is a multiprogram national laboratory operated for DOE by Battelle.

  11. Feature selection through validation and un-censoring of endovascular repair survival data for predicting the risk of re-intervention.

    PubMed

    Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter J E; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong

    2017-08-03

    The feature selection (FS) process is essential in the medical area as it reduces the effort and time needed for physicians to measure unnecessary features. Choosing useful variables is a difficult task in the presence of censoring, which is the unique characteristic of survival analysis. Most survival FS methods depend on Cox's proportional hazard model; however, machine learning techniques (MLT) are preferred but not commonly used due to censoring. Techniques that have been proposed to adopt MLT to perform FS with survival data cannot be used with a high level of censoring. The researchers' previous publications proposed a technique to deal with the high level of censoring. It also used existing FS techniques to reduce dataset dimension. However, in this paper a new FS technique is proposed and combined with feature transformation and the proposed uncensoring approaches to select a reduced set of features and produce a stable predictive model. In this paper, a FS technique based on artificial neural network (ANN) MLT is proposed to deal with highly censored Endovascular Aortic Repair (EVAR) survival data. EVAR datasets were collected from 2004 to 2010 from two vascular centers in order to produce a final stable model. They contain almost 91% censored patients. The proposed approach used a wrapper FS method with ANN to select a reduced subset of features that predict the risk of EVAR re-intervention after 5 years for patients from two different centers located in the United Kingdom, to allow it to be potentially applied to cross-center predictions. The proposed model is compared with two popular FS techniques, the Akaike and Bayesian information criteria (AIC, BIC), that are used with Cox's model. The final model outperforms other methods in distinguishing the high and low risk groups, as both its concordance index and estimated AUC are better than those of the Cox's model based on AIC, BIC, Lasso, and SCAD approaches. 
    These models have p-values lower than 0.05, meaning that patients in different risk groups can be separated significantly and those who would need re-intervention can be correctly predicted. The proposed approach will save the time and effort physicians spend collecting unnecessary variables. The final reduced model was able to predict the long-term risk of aortic complications after EVAR. This predictive model can help clinicians decide patients' future observation plans.
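    A wrapper FS loop can be sketched with greedy forward selection; a nearest-centroid classifier stands in for the paper's ANN, and the data are synthetic (features 0 and 1 carry the class signal, the rest are noise):

    ```python
    import numpy as np

    def centroid_accuracy(Xtr, ytr, Xte, yte):
        """Held-out accuracy of a nearest-centroid classifier (a stand-in
        model; the paper wraps an ANN instead)."""
        c0 = Xtr[ytr == 0].mean(axis=0)
        c1 = Xtr[ytr == 1].mean(axis=0)
        pred = ((Xte - c1) ** 2).sum(axis=1) < ((Xte - c0) ** 2).sum(axis=1)
        return float((pred == yte.astype(bool)).mean())

    def forward_select(Xtr, ytr, Xte, yte, n_keep):
        """Greedy wrapper FS: repeatedly add the feature whose inclusion most
        improves held-out accuracy of the wrapped model."""
        chosen = []
        while len(chosen) < n_keep:
            scores = {j: centroid_accuracy(Xtr[:, chosen + [j]], ytr,
                                           Xte[:, chosen + [j]], yte)
                      for j in range(Xtr.shape[1]) if j not in chosen}
            chosen.append(max(scores, key=scores.get))
        return chosen

    rng = np.random.default_rng(5)
    y = rng.integers(0, 2, size=600)
    X = rng.normal(size=(600, 6))
    X[:, 0] += 2.5 * y                 # informative feature
    X[:, 1] -= 2.0 * y                 # informative feature
    selected = forward_select(X[:400], y[:400], X[400:], y[400:], n_keep=2)
    ```

    Because selection is scored by the wrapped model's own predictive performance, the loop discards the noise features, which is the behaviour that distinguishes wrapper methods from filter criteria such as AIC or BIC.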

  12. Dynamics of the stochastic low concentration trimolecular oscillatory chemical system with jumps

    NASA Astrophysics Data System (ADS)

    Wei, Yongchang; Yang, Qigui

    2018-06-01

    This paper is devoted to discerning the long-time dynamics of the stochastic low concentration trimolecular oscillatory chemical system with jumps. Using a Lyapunov technique, this system is proved to have a unique global positive solution, and the asymptotic stability in mean square of this model is further established. Moreover, the existence of a random attractor and Lyapunov exponents are obtained for the stochastic homeomorphism flow generated by the corresponding global positive solution. Some numerical simulations are given to illustrate the presented results.
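    The flavor of such simulations can be illustrated with a generic one-dimensional jump-diffusion integrated by Euler-Maruyama. The drift, noise, and jump parameters are invented, and the positivity clip is a crude numerical device (the paper proves positivity analytically for its system):

    ```python
    import numpy as np

    def simulate_jump_diffusion(x0, drift, sigma, jump_rate, jump_scale,
                                dt, n_steps, seed=0):
        """Euler-Maruyama path of dX = drift(X) dt + sigma X dW + jumps,
        where jumps arrive as a Poisson process (a generic 1D stand-in for
        the paper's trimolecular system with jumps)."""
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps + 1)
        x[0] = x0
        for i in range(n_steps):
            dw = rng.normal(scale=np.sqrt(dt))
            n_jumps = rng.poisson(jump_rate * dt)
            jump = rng.normal(scale=jump_scale, size=n_jumps).sum()
            x[i + 1] = x[i] + drift(x[i]) * dt + sigma * x[i] * dw + jump
            x[i + 1] = max(x[i + 1], 1e-9)        # crude positivity clip
        return x

    # Mean-reverting drift toward 1.0: the path stays positive and hovers
    # near the deterministic equilibrium despite diffusion and jumps.
    path = simulate_jump_diffusion(x0=1.0, drift=lambda x: 2.0 * (1.0 - x),
                                   sigma=0.1, jump_rate=1.0, jump_scale=0.05,
                                   dt=0.001, n_steps=20_000)
    ```

    The long-run behaviour of such a path (settling into a bounded, positive stationary regime) is the numerical counterpart of the stability and random-attractor results the paper proves.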

  13. System capacity and economic modeling computer tool for satellite mobile communications systems

    NASA Technical Reports Server (NTRS)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain whether a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 1-2-3 are used for the model in order to provide as universal an application as possible, such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  14. Kaizen: a process improvement model for the business of health care and perioperative nursing professionals.

    PubMed

    Tetteh, Hassan A

    2012-01-01

    Kaizen is a proven management technique that has a practical application for health care in the context of health care reform and the 2010 Institute of Medicine landmark report on the future of nursing. Compounded productivity is the unique benefit of kaizen, and its principles are change, efficiency, performance of key essential steps, and the elimination of waste through small and continuous process improvements. The kaizen model offers specific instruction for perioperative nurses to achieve process improvement in a five-step framework that includes teamwork, personal discipline, improved morale, quality circles, and suggestions for improvement. Published by Elsevier Inc.

  15. Four-dimensional world-wide atmospheric models (surface to 25 km altitude)

    NASA Technical Reports Server (NTRS)

    Spiegler, D. B.; Fowler, M. G.

    1972-01-01

    Four-dimensional atmospheric models previously developed for use as input to atmospheric attenuation models are evaluated to determine where refinements are warranted. The models are refined where appropriate. A computerized technique is developed that has the unique capability of extracting mean monthly and daily variance profiles of moisture, temperature, density and pressure at 1 km intervals to the height of 25 km for any location on the globe. This capability could be very useful to planners of remote sensing of earth resources missions in that the profiles may be used as input to the attenuation models that predict the expected degradation of the sensor data. Recommendations are given for procedures to use the four-dimensional models in computer mission simulations and for the approach to combining the information provided by the 4-D models with that given by the global models.
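The core extraction capability, producing profiles at 1 km intervals from the surface to 25 km, amounts to interpolating irregular-level data onto a regular height grid. This is a minimal sketch with an assumed toy lapse-rate profile, not actual 4-D model data.

```python
import numpy as np

# Irregular model levels (km) and a toy temperature profile: 6.5 K/km lapse
# rate up to an 11 km tropopause, isothermal above (illustrative assumption).
levels_km = np.array([0.0, 1.5, 3.1, 5.6, 7.2, 10.4, 13.6, 16.2, 20.5, 26.0])
temp_k = 288.0 - 6.5 * np.minimum(levels_km, 11.0)

# extract the profile on a regular 1 km grid, surface to 25 km
grid_km = np.arange(0, 26)
profile = np.interp(grid_km, levels_km, temp_k)
print(profile[:5])
```

The same interpolation would be applied per location and per month to the moisture, density, and pressure fields before feeding them to an attenuation model.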

  16. Machine Learning Predictions of a Multiresolution Climate Model Ensemble

    NASA Astrophysics Data System (ADS)

    Anderson, Gemma J.; Lucas, Donald D.

    2018-05-01

    Statistical models of high-resolution climate models are useful for many purposes, including sensitivity and uncertainty analyses, but building them can be computationally prohibitive. We generated a unique multiresolution perturbed parameter ensemble of a global climate model. We use a novel application of a machine learning technique known as random forests to train a statistical model on the ensemble to make high-resolution model predictions of two important quantities: global mean top-of-atmosphere energy flux and precipitation. The random forests leverage cheaper low-resolution simulations, greatly reducing the number of high-resolution simulations required to train the statistical model. We demonstrate that high-resolution predictions of these quantities can be obtained by training on an ensemble that includes only a small number of high-resolution simulations. We also find that global annually averaged precipitation is more sensitive to resolution changes than to any of the model parameters considered.
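The multiresolution training idea can be sketched by treating resolution as an additional input feature of the random forest, so many cheap low-resolution runs and a few expensive high-resolution runs jointly train one statistical model. The synthetic stand-in for the climate model below, and the parameter counts, are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

def run_model(params, resolution):
    # stand-in for a climate simulation: the response depends on the perturbed
    # parameters and weakly on grid resolution (invented for illustration)
    return params @ np.array([1.5, -0.8, 0.3]) + 0.2 / resolution

n_low, n_high = 300, 20     # many cheap low-res runs, few expensive high-res runs
params = rng.uniform(0, 1, size=(n_low + n_high, 3))
resolution = np.r_[np.full(n_low, 1.0), np.full(n_high, 4.0)]
flux = np.array([run_model(p, r) for p, r in zip(params, resolution)])
flux += rng.normal(0, 0.01, flux.shape)   # internal variability

# resolution enters the forest as just another feature
X = np.column_stack([params, resolution])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, flux)

# predict the high-resolution response at new parameter settings
new_params = rng.uniform(0, 1, size=(5, 3))
X_new = np.column_stack([new_params, np.full(5, 4.0)])
preds = rf.predict(X_new)
print(preds)
```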

  17. Rediscovering the chick embryo as a model to study retinal development

    PubMed Central

    2012-01-01

    The embryonic chick occupies a privileged place among animal models used in developmental studies. Its rapid development and accessibility for visualization and experimental manipulation are just some of the characteristics that have made it a vertebrate model of choice for more than two millennia. Until a few years ago, the inability to perform genetic manipulations constituted a major drawback of this system. However, the completion of the chicken genome project and the development of techniques to manipulate gene expression have allowed this classic animal model to enter the molecular age. Such techniques, combined with the embryological manipulations that this system is well known for, provide a unique toolkit to study the genetic basis of neural development. A major advantage of these approaches is that they permit targeted gene misexpression with extremely high spatiotemporal resolution and over a large range of developmental stages, allowing functional analysis at a level, speed and ease that is difficult to achieve in other systems. This article provides a general overview of the chick as a developmental model focusing more specifically on its application to the study of eye development. Special emphasis is given to the state of the art of the techniques that have made gene gain- and loss-of-function studies in this model a reality. In addition, we discuss some methodological considerations derived from our own experience that we believe will be beneficial to researchers working with this system. PMID:22738172

  18. [Cornea transplant].

    PubMed

    Garralda, A; Epelde, A; Iturralde, O; Compains, E; Maison, C; Altarriba, M; Goldaracena, M B; Maraví-Poma, E

    2006-01-01

    Keratoplasty, or cornea transplant, is one of the oldest surgical techniques in ophthalmology. Its indications are: 1) tectonic, in order to preserve corneal anatomy and integrity; 2) clinical, in order to eliminate inflamed corneal tissue in cases refractory to medical treatment; 3) optical, in order to improve visual acuity; and 4) cosmetic, in order to improve the appearance of the eye. Improvements in technique and instruments, as well as in post-operative treatment and the means of preserving donated tissue, have improved survival of the grafts. The Pamplona Model of transplant coordination of the Virgen del Camino Hospital is considered to be original and unique in Spain. The logistics of this program include the protocol for the detection and extraction of corneas as well as for keratoplasties.

  19. The potential of 3D printing in urological research and patient care.

    PubMed

    Colaco, Marc; Igel, Daniel A; Atala, Anthony

    2018-04-01

    3D printing is an evolving technology that enables the creation of unique organic and inorganic structures with high precision. In urology, the technology has demonstrated potential uses in both patient and clinician education as well as in clinical practice. The four major techniques used for 3D printing are inkjet printing, extrusion printing, laser sintering, and stereolithography. Each of these techniques can be applied to the production of models for education and surgical planning, prosthetic construction, and tissue bioengineering. Bioengineering is potentially the most important application of 3D printing, as the ability to produce functional organic constructs might, in the future, enable urologists to replicate and replace abnormal tissues with neo-organs, improving patient survival and quality of life.

  20. Requirements for facilities and measurement techniques to support CFD development for hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Sellers, William L., III; Dwoyer, Douglas L.

    1992-01-01

    The design of a hypersonic aircraft poses unique challenges to the engineering community. Problems with duplicating flight conditions in ground based facilities have made performance predictions risky. Computational fluid dynamics (CFD) has been proposed as an additional means of providing design data. At the present time, CFD codes are being validated based on sparse experimental data and then used to predict performance at flight conditions with generally unknown levels of uncertainty. This paper will discuss the facility and measurement techniques that are required to support CFD development for the design of hypersonic aircraft. Illustrations are given of recent success in combining experimental and direct numerical simulation in CFD model development and validation for hypersonic perfect gas flows.

  1. Gingival Mesenchymal Stem/Progenitor Cells: A Unique Tissue Engineering Gem

    PubMed Central

    Fawzy El-Sayed, Karim M.; Dörfer, Christof E.

    2016-01-01

    The human gingiva, characterized by its outstanding scarless wound healing properties, is a unique tissue and a pivotal component of the periodontal apparatus, investing and surrounding the teeth in their sockets in the alveolar bone. In recent years gingival mesenchymal stem/progenitor cells (G-MSCs), with promising regenerative and immunomodulatory properties, have been isolated and characterized from the gingival lamina propria. These cells, in contrast to other mesenchymal stem/progenitor cell sources, are abundant, readily accessible, and easily obtainable via minimally invasive cell isolation techniques. The present review summarizes the current scientific evidence on G-MSCs' isolation, their characterization, the investigated subpopulations, the generated induced pluripotent stem cell- (iPSC-) like G-MSCs, their regenerative properties, and current approaches for G-MSCs' delivery. The review further demonstrates their immunomodulatory properties, the transplantation preconditioning attempts via multiple biomolecules to enhance their attributes, and the experimental therapeutic applications conducted to treat multiple diseases in experimental animal models in vivo. G-MSCs show remarkable tissue reparative/regenerative potential and noteworthy immunomodulatory properties, and their early experimental therapeutic applications are very promising, pointing toward future biologically based therapeutic techniques that may prove superior to conventional clinical treatment modalities. PMID:27313628

  2. Molecular dynamics reveal BCR-ABL1 polymutants as a unique mechanism of resistance to PAN-BCR-ABL1 kinase inhibitor therapy

    PubMed Central

    Gibbons, Don L.; Pricl, Sabrina; Posocco, Paola; Laurini, Erik; Fermeglia, Maurizio; Sun, Hanshi; Talpaz, Moshe; Donato, Nicholas; Quintás-Cardama, Alfonso

    2014-01-01

    The acquisition of mutations within the BCR-ABL1 kinase domain is frequently associated with tyrosine kinase inhibitor (TKI) failure in chronic myeloid leukemia. Sensitive sequencing techniques have revealed a high prevalence of compound BCR-ABL1 mutations (polymutants) in patients failing TKI therapy. To investigate the molecular consequences of such complex mutant proteins with regards to TKI resistance, we determined by cloning techniques the presence of polymutants in a cohort of chronic-phase patients receiving imatinib followed by dasatinib therapy. The analysis revealed a high frequency of polymutant BCR-ABL1 alleles even after failure of frontline imatinib, and also the progressive exhaustion of the pool of unmutated BCR-ABL1 alleles over the course of sequential TKI therapy. Molecular dynamics analyses of the most frequent polymutants in complex with TKIs revealed the basis of TKI resistance. Modeling of BCR-ABL1 in complex with the potent pan-BCR-ABL1 TKI ponatinib highlighted potentially effective therapeutic strategies for patients carrying these recalcitrant and complex BCR-ABL1 mutant proteins while unveiling unique mechanisms of escape to ponatinib therapy. PMID:24550512

  3. Microscale patterning of thermoplastic polymer surfaces by selective solvent swelling.

    PubMed

    Rahmanian, Omid; Chen, Chien-Fu; DeVoe, Don L

    2012-09-04

    A new method for the fabrication of microscale features in thermoplastic substrates is presented. Unlike traditional thermoplastic microfabrication techniques, in which bulk polymer is displaced from the substrate by machining or embossing, a unique process termed orogenic microfabrication has been developed in which selected regions of a thermoplastic surface are raised from the substrate by an irreversible solvent swelling mechanism. The orogenic technique allows thermoplastic surfaces to be patterned using a variety of masking methods, resulting in three-dimensional features that would be difficult to achieve through traditional microfabrication methods. Using cyclic olefin copolymer as a model thermoplastic material, several variations of this process are described to realize growth heights ranging from several nanometers to tens of micrometers, with patterning techniques including direct photoresist masking, patterned UV/ozone surface passivation, elastomeric stamping, and noncontact spotting. Orogenic microfabrication is also demonstrated by direct inkjet printing as a facile, photolithography-free masking method for rapid desktop thermoplastic microfabrication.

  4. Freely Suspended Two-Dimensional Electron Gases.

    NASA Astrophysics Data System (ADS)

    Blick, Robert; Monzon, Franklin; Roukes, Michael; Wegscheider, Werner; Stern, Frank

    1998-03-01

    We present a new technique that has allowed us to build the first freely suspended two-dimensional electron gas devices from AlGaAs/GaAs/AlAs heterostructures. This technique is based upon specially MBE grown structures that include a sacrificial layer. In order to design the MBE layer sequence, the conduction band lineup for these samples was modelled numerically. The overall focus of this work is to provide a new approach for studies of the quantum mechanical properties of nanomachined structures. Our current experiments are directed toward use of these techniques for research on very high frequency nanomechanical resonators. The high mobility 2DEG system provides a unique approach to realizing wideband, extremely sensitive displacement detection, using the piezoelectric properties of GaAs to modulate a suspended nanometer-scale HEMT. This approach offers promise for sensitive displacement detectors with sub-nanometer resolution and bandwidths into the microwave range.

  5. Fusion of 3D models derived from TLS and image-based techniques for CH enhanced documentation

    NASA Astrophysics Data System (ADS)

    Bastonero, P.; Donadio, E.; Chiabrando, F.; Spanò, A.

    2014-05-01

    Recognizing the various advantages offered by new 3D metric survey technologies in the cultural heritage documentation phase, this paper presents some tests of 3D model generation using different methods, and of their possible fusion. With the aim of defining the potentialities and problems deriving from the integration or fusion of metric data acquired with different survey techniques, the chosen test case is an outstanding cultural heritage item, presenting both widespread and specific complexities connected to the conservation of historical buildings. The site is the Staffarda Abbey, the most relevant evidence of medieval architecture in Piedmont. This application faced one of the most topical architectural issues: the opportunity to study and analyze an object as a whole from two sensor locations, terrestrial and aerial. In particular, the work consists in evaluating the possibilities deriving from a simple union, or from the fusion, of different 3D cloud models of the abbey achieved by multi-sensor techniques. The aerial survey is based on a photogrammetric RPAS (remotely piloted aircraft system) flight, while the terrestrial acquisition was fulfilled by a laser scanning survey. Both techniques allowed different point clouds to be extracted and processed, and consequent continuous 3D models to be generated, which are characterized by different scales, that is to say different resolutions and diverse levels of detail and precision. Starting from these models, the proposed process, applied to a sample area of the building, aimed to test the generation of a unique 3D model through the fusion of the different sensors' point clouds. The descriptive potential and the metric and thematic gains achievable with the final model exceed those offered by the two detached models.
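One simple form of the fusion step, once both clouds sit in a common reference frame, is to pool the points and resample them on a voxel grid at an agreed resolution. This is a minimal numpy sketch with random stand-in clouds; a real workflow would start from registered TLS and photogrammetric data.

```python
import numpy as np

rng = np.random.default_rng(5)
tls = rng.uniform(0, 10, (5000, 3))             # dense terrestrial cloud (toy)
aerial = rng.uniform(0, 10, (1000, 3)) + 0.01   # sparser aerial cloud (toy)

def voxel_downsample(points, voxel=0.5):
    # assign each point to a voxel and return the centroid of each occupied voxel
    keys = np.floor(points / voxel).astype(int)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()
    sums = np.zeros((inv.max() + 1, 3))
    counts = np.zeros(inv.max() + 1)
    np.add.at(sums, inv, points)
    np.add.at(counts, inv, 1)
    return sums / counts[:, None]

# pool both sensors' points, then resample to one common resolution
fused = voxel_downsample(np.vstack([tls, aerial]), voxel=0.5)
print(fused.shape)
```

The voxel size plays the role of the agreed output scale; choosing it coarser than the noisier cloud's precision keeps the fused model consistent.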

  6. Helmet and shoulder pad removal in football players with unstable cervical spine injuries.

    PubMed

    Dahl, Michael C; Ananthakrishnan, Dheera; Nicandri, Gregg; Chapman, Jens R; Ching, Randal P

    2009-05-01

    Football, one of the country's most popular team sports, is associated with the largest overall number of sports-related catastrophic cervical spine injuries in the United States (Mueller, 2007). Patient handling can be hindered by the protective sports equipment worn by the athlete, and improper stabilization of these patients can exacerbate neurologic injury. Because of the lack of consensus on the best method for equipment removal, a study was performed comparing three techniques: full body levitation, upper torso tilt, and log roll. These techniques were performed on an intact and a lesioned cervical spine cadaveric model simulating conditions in the emergency department. The levitation technique was found to produce motion in the anterior and right lateral directions, the tilt technique resulted in motions in the posterior and left lateral directions, and the log roll technique generated motions in the right lateral direction and showed the largest increase in instability when comparing the intact and lesioned specimens. These findings suggest that each method of equipment removal displays unique weaknesses that the practitioner should take into account, possibly on a patient-by-patient basis.

  7. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could help micro-level assessments (e.g., improving individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes net for micro-level analysis and continue to focus on Bayes nets, system dynamics, and economic input/output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
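A micro-level Bayes net of the kind proposed can be illustrated with inference by enumeration over a three-node toy network. The structure and probabilities below are invented for illustration and do not come from the report.

```python
# Toy network: Intent (proliferation intent), DualUse (technology is dual-use),
# and Flags (the application raises flags); Flags depends on both parents.
P_intent = {True: 0.05, False: 0.95}
P_dualuse = {True: 0.30, False: 0.70}
P_flags = {  # P(Flags = True | Intent, DualUse), assumed values
    (True, True): 0.90, (True, False): 0.60,
    (False, True): 0.20, (False, False): 0.05,
}

def posterior_intent(flags_observed=True):
    # inference by enumeration: P(Intent = True | Flags = flags_observed)
    joint = {}
    for intent in (True, False):
        total = 0.0
        for dual in (True, False):
            p_f = P_flags[(intent, dual)]
            if not flags_observed:
                p_f = 1.0 - p_f
            total += P_intent[intent] * P_dualuse[dual] * p_f
        joint[intent] = total
    return joint[True] / (joint[True] + joint[False])

print(round(posterior_intent(True), 3))
```

Observing flags raises the posterior on intent well above its 5% prior; an analyst-facing tool would expose exactly this kind of update for each license.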

  8. Planarian brain regeneration as a model system for developmental neurotoxicology

    PubMed Central

    Hagstrom, Danielle; Cochet‐Escartin, Olivier

    2016-01-01

    Abstract Freshwater planarians, famous for their regenerative prowess, have long been recognized as a valuable in vivo animal model to study the effects of chemical exposure. In this review, we summarize the current techniques and tools used in the literature to assess toxicity in the planarian system. We focus on the planarian's particular amenability for neurotoxicology and neuroregeneration studies, owing to the planarian's unique ability to regenerate a centralized nervous system. Zooming in from the organismal to the molecular level, we show that planarians offer a repertoire of morphological and behavioral readouts while also being amenable to mechanistic studies of compound toxicity. Finally, we discuss the open challenges and opportunities for planarian brain regeneration to become an important model system for modern toxicology. PMID:27499880

  9. Unique spectral signatures of the nucleic acid dye acridine orange can distinguish cell death by apoptosis and necroptosis

    PubMed Central

    Caprariello, Andrew V.; Henry, Tyler J.; Tsutsui, Shigeki; Chu, Tak H.; Schenk, Geert J.; Yong, V. Wee

    2017-01-01

    Cellular injury and death are ubiquitous features of disease, yet tools to detect them are limited and insensitive to subtle pathological changes. Acridine orange (AO), a nucleic acid dye with unique spectral properties, enables real-time measurement of RNA and DNA as proxies for cell viability during exposure to various noxious stimuli. This tool illuminates spectral signatures unique to various modes of cell death, such as cells undergoing apoptosis versus necrosis/necroptosis. This new approach also shows that cellular RNA decreases during necrotic, necroptotic, and apoptotic cell death caused by demyelinating, ischemic, and traumatic injuries, implying its involvement in a wide spectrum of tissue pathologies. Furthermore, cells with pathologically low levels of cytoplasmic RNA are detected earlier and in higher numbers than with standard markers including TdT-mediated dUTP biotin nick-end labeling and cleaved caspase 3 immunofluorescence. Our technique highlights AO-labeled cytoplasmic RNA as an important early marker of cellular injury and a sensitive indicator of various modes of cell death in a range of experimental models. PMID:28264914

  10. Extending existing structural identifiability analysis methods to mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2018-01-01

    The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.
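The Taylor series (exhaustive summary) approach can be illustrated on a classic non-mixed-effects toy problem: a one-compartment model dx/dt = -k*x with x(0) = D and observation y = x/V. The sketch below, using sympy, shows that the Taylor coefficients of y determine k and the ratio D/V, but not D and V individually; this toy model is for illustration and is not one of the paper's examples.

```python
import sympy as sp

t, k, D, V = sp.symbols('t k D V', positive=True)
x = D * sp.exp(-k * t)          # closed-form state of dx/dt = -k*x, x(0) = D
y = x / V                       # observation function

# exhaustive summary: successive Taylor coefficients of y at t = 0
coeffs = [sp.simplify(sp.diff(y, t, n).subs(t, 0)) for n in range(3)]
print(coeffs)                   # D/V, -D*k/V, D*k**2/V

# An alternative parameter set (k, 2D, 2V) yields the identical summary,
# so D and V are not individually structurally identifiable.
alt = [c.subs({D: 2 * D, V: 2 * V}) for c in coeffs]
assert all(sp.simplify(a - c) == 0 for a, c in zip(alt, coeffs))
```

In the mixed-effects extension, the same summary is built for the statistical moments of the output distribution across subjects, and uniqueness of those moments decides identifiability.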

  11. Engineering 3D Models of Tumors and Bone to Understand Tumor-Induced Bone Disease and Improve Treatments.

    PubMed

    Kwakwa, Kristin A; Vanderburgh, Joseph P; Guelcher, Scott A; Sterling, Julie A

    2017-08-01

    Bone is a structurally unique microenvironment that presents many challenges for the development of 3D models for studying bone physiology and diseases, including cancer. As researchers continue to investigate the interactions within the bone microenvironment, the development of 3D models of bone has become critical. 3D models have been developed that replicate some properties of bone, but have not fully reproduced the complex structural and cellular composition of the bone microenvironment. This review will discuss 3D models including polyurethane, silk, and collagen scaffolds that have been developed to study tumor-induced bone disease. In addition, we discuss 3D printing techniques used to better replicate the structure of bone. 3D models that better replicate the bone microenvironment will help researchers better understand the dynamic interactions between tumors and the bone microenvironment, ultimately leading to better models for testing therapeutics and predicting patient outcomes.

  12. Sprouting Buds of Zebrafish Research in Malaysia: First Malaysia Zebrafish Disease Model Workshop.

    PubMed

    Okuda, Kazuhide Shaun; Tan, Pei Jean; Patel, Vyomesh

    2016-04-01

    Zebrafish is gaining prominence as an important vertebrate model for investigating various human diseases. Zebrafish provides unique advantages such as optical clarity of embryos, high fecundity rate, and low cost of maintenance, making it a perfect complement to the murine model equivalent in biomedical research. Due to these advantages, researchers in Malaysia are starting to take notice and incorporate the zebrafish model into their research activities. However, zebrafish research in Malaysia is still in its infancy stage and many researchers still remain unaware of the full potential of the zebrafish model or have limited access to related tools and techniques that are widely utilized in many zebrafish laboratories worldwide. To overcome this, we organized the First Malaysia Zebrafish Disease Model Workshop in Malaysia that took place on 11th and 12th of November 2015. In this workshop, we showcased how the zebrafish model is being utilized in the biomedical field in international settings as well as in Malaysia. For this, notable international speakers and those from local universities known to be carrying out impactful research using zebrafish were invited to share some of the cutting edge techniques that are used in their laboratories that may one day be incorporated in the Malaysian scientific community.

  13. Three-dimensional light-tissue interaction models for bioluminescence tomography

    NASA Astrophysics Data System (ADS)

    Côté, D.; Allard, M.; Henkelman, R. M.; Vitkin, I. A.

    2005-09-01

    Many diagnostic and therapeutic approaches in medical physics today take advantage of the unique properties of light and its interaction with tissues. Because light scatters in tissue, our ability to develop these techniques depends critically on our knowledge of the distribution of light in tissue. Solutions to the diffusion equation can provide such information, but often lack the flexibility required for more general problems that involve, for instance, inhomogeneous optical properties, light polarization, arbitrary three-dimensional geometries, or arbitrary scattering. Monte Carlo techniques, which statistically sample the light distribution in tissue, offer a better alternative to analytical models. First, we discuss our implementation of a validated three-dimensional polarization-sensitive Monte Carlo algorithm and demonstrate its generality with respect to the geometry and scattering models it can treat. Second, we apply our model to bioluminescence tomography. After appropriate genetic modifications to cell lines, bioluminescence can be used as an indicator of cell activity, and is often used to study tumour growth and treatment in animal models. However, the amount of light escaping the animal is strongly dependent on the position and size of the tumour. Using forward models and structural data from magnetic resonance imaging, we show how the models can help to determine the location and size of a tumour made of bioluminescent cancer cells in the brain of a mouse.
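The Monte Carlo idea, statistically sampling photon paths, can be sketched with a minimal packet random walk in a homogeneous, isotropically scattering slab. The optical coefficients below are assumed for illustration; the validated code described here additionally treats polarization, anisotropic phase functions, and arbitrary 3-D geometry.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_a, mu_s = 0.1, 10.0            # absorption / scattering coefficients (1/mm), assumed
mu_t = mu_a + mu_s
albedo = mu_s / mu_t
thickness = 1.0                   # slab thickness (mm)

def propagate_packet():
    z, uz, weight = 0.0, 1.0, 1.0               # launch at the surface, heading down
    while weight > 1e-4:
        step = -np.log(1.0 - rng.random()) / mu_t   # sample a free path length
        z += uz * step
        if z < 0.0:
            return 'reflected', weight
        if z > thickness:
            return 'transmitted', weight
        weight *= albedo                         # deposit the absorbed fraction
        uz = 2.0 * rng.random() - 1.0            # isotropic scatter: new direction cosine
    return 'absorbed', 0.0

n_packets = 2000
results = [propagate_packet() for _ in range(n_packets)]
refl = sum(w for tag, w in results if tag == 'reflected') / n_packets
trans = sum(w for tag, w in results if tag == 'transmitted') / n_packets
print(refl, trans)
```

Scoring where packets deposit weight, rather than only where they exit, yields the internal fluence map needed for bioluminescence forward models.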

  14. The Silicon Trypanosome: a test case of iterative model extension in systems biology

    PubMed Central

    Achcar, Fiona; Fadda, Abeer; Haanstra, Jurgen R.; Kerkhoven, Eduard J.; Kim, Dong-Hyun; Leroux, Alejandro E.; Papamarkou, Theodore; Rojas, Federico; Bakker, Barbara M.; Barrett, Michael P.; Clayton, Christine; Girolami, Mark; Luise Krauth-Siegel, R.; Matthews, Keith R.; Breitling, Rainer

    2016-01-01

    The African trypanosome, Trypanosoma brucei, is a unicellular parasite causing African Trypanosomiasis (sleeping sickness in humans and nagana in animals). Due to some of its unique properties, it has emerged as a popular model organism in systems biology. A predictive quantitative model of glycolysis in the bloodstream form of the parasite has been constructed and updated several times. The Silicon Trypanosome (SilicoTryp) is a project that brings together modellers and experimentalists to improve and extend this core model with new pathways and additional levels of regulation. These new extensions and analyses use computational methods that explicitly take different levels of uncertainty into account. During this project, numerous tools and techniques have been developed for this purpose, which can now be used for a wide range of different studies in systems biology. PMID:24797926

  15. The practice of agent-based model visualization.

    PubMed

    Dorin, Alan; Geard, Nicholas

    2014-01-01

    We discuss approaches to agent-based model visualization. Agent-based modeling has its own requirements for visualization, some shared with other forms of simulation software, and some unique to this approach. In particular, agent-based models are typified by complexity, dynamism, nonequilibrium and transient behavior, heterogeneity, and a researcher's interest in both individual- and aggregate-level behavior. These are all traits requiring careful consideration in the design, experimentation, and communication of results. In the case of all but final communication for dissemination, researchers may not make their visualizations public. Hence, the knowledge of how to visualize during these earlier stages is unavailable to the research community in a readily accessible form. Here we explore means by which all phases of agent-based modeling can benefit from visualization, and we provide examples from the available literature and online sources to illustrate key stages and techniques.
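The dual interest in individual- and aggregate-level behaviour can be made concrete with a minimal agent-based model: mean-reverting random walkers whose population mean is tracked alongside individual trajectories. The model is invented here purely for illustration.

```python
import random

random.seed(0)

class Agent:
    # one walker with a mean-reverting random step
    def __init__(self):
        self.x = random.uniform(-1, 1)
    def step(self):
        self.x += random.gauss(0, 0.1) - 0.05 * self.x  # drift toward 0

agents = [Agent() for _ in range(100)]
mean_trace = []                    # aggregate-level readout, one value per tick
for t in range(200):
    for a in agents:
        a.step()
    mean_trace.append(sum(a.x for a in agents) / len(agents))

# individual-level readout: positions of a few agents at the end
print(mean_trace[-1], [round(a.x, 2) for a in agents[:3]])
```

A visualization of this run would typically pair a time-series plot of `mean_trace` (aggregate) with spatial snapshots of agent positions (individual), the two views the review argues must be designed together.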

  16. Tailoring Modified Moore Method Techniques to Liberal Arts Mathematics Courses

    ERIC Educational Resources Information Center

    Hitchman, Theron J.; Shaw, Douglas

    2015-01-01

    Inquiry-based learning (IBL) techniques can be used in mathematics courses for non-majors, such as courses required for liberal arts majors to fulfill graduation requirements. Unique challenges are discussed, followed by adaptations of IBL techniques to overcome those challenges.

  17. Long-term retrospective analysis of mackerel spawning in the North Sea: a new time series and modeling approach to CPR data.

    PubMed

    Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie

    2012-01-01

    We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock.
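Index standardization of the kind described can be sketched with a Poisson log-linear model fitted by iteratively reweighted least squares: year effects give the larval index while a tow-speed covariate stands in for catchability. The data and covariate below are synthetic; the paper's actual model additionally accounts for spatial and temporal autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(3)
n_years, per_year = 10, 30
years = np.repeat(np.arange(n_years), per_year)
towspeed = rng.uniform(0.5, 1.5, years.size)     # catchability proxy (invented)
true_index = np.linspace(2.0, 0.5, n_years)      # declining spawning abundance
counts = rng.poisson(true_index[years] * towspeed)

# design: year dummies, with log(towspeed) as a known offset; Poisson GLM via IRLS
X = np.zeros((years.size, n_years))
X[np.arange(years.size), years] = 1.0
offset = np.log(towspeed)
beta = np.zeros(n_years)
for _ in range(30):
    eta = X @ beta + offset
    mu = np.exp(eta)
    target = eta + (counts - mu) / mu - offset   # working response, offset removed
    W = mu                                       # Poisson IRLS weights
    beta = np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (W * target))

index = np.exp(beta)
print(index)   # roughly recovers the declining true_index
```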

  19. Confidence set inference with a prior quadratic bound

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1989-01-01

    In the uniqueness part of a geophysical inverse problem, the observer wants to predict all likely values of P unknown numerical properties z = (z_1, ..., z_P) of the earth from measurement of D other numerical properties y^0 = (y_1^0, ..., y_D^0), using full or partial knowledge of the statistical distribution of the random errors in y^0. The data space Y containing y^0 is D-dimensional, so when the model space X is infinite-dimensional the linear uniqueness problem usually is insoluble without prior information about the correct earth model x. If that information is a quadratic bound on x, Bayesian inference (BI) and stochastic inversion (SI) inject spurious structure into x, implied by neither the data nor the quadratic bound. Confidence set inference (CSI) provides an alternative inversion technique free of this objection. Confidence set inference is illustrated in the problem of estimating the geomagnetic field B at the core-mantle boundary (CMB) from components of B measured on or above the earth's surface.

  20. Uniqueness of complete maximal hypersurfaces in spatially parabolic generalized Robertson-Walker spacetimes

    NASA Astrophysics Data System (ADS)

    Romero, Alfonso; Rubio, Rafael M.; Salamanca, Juan J.

    2013-06-01

    A new technique for the study of noncompact complete spacelike hypersurfaces in generalized Robertson-Walker (GRW) spacetimes whose fiber is a parabolic Riemannian manifold is introduced. This class of spacetimes allows us to model open universes which extend to spacelike closed GRW spacetimes from the viewpoint of the geometric analysis of the fiber, and which, unlike those spacetimes, could be compatible with the holographic principle. First, under reasonable assumptions on the restriction of the warping function to the spacelike hypersurface and on the hyperbolic angle between the unit normal vector field and a certain timelike vector field, a complete spacelike hypersurface in a spatially parabolic GRW spacetime is shown to be parabolic, and the existence of a simply connected parabolic spacelike hypersurface in a GRW spacetime also leads to the parabolicity of its fiber. Then, all the complete maximal hypersurfaces in spatially parabolic GRW spacetimes are determined in several cases, extending, in particular, to this family of open cosmological models several well-known uniqueness results for the case of spatially closed GRW spacetimes. Moreover, new Calabi-Bernstein problems are solved.

  1. Unique considerations in the design and experimental evaluation of tailored wings with elastically produced chordwise camber

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.; Zischka, Peter J.; Fentress, Michael L.; Chang, Stephen

    1992-01-01

    Some of the unique considerations that are associated with the design and experimental evaluation of chordwise deformable wing structures are addressed. Since chordwise elastic camber deformations are desired and must be free to develop, traditional rib concepts and experimental methodology cannot be used. New rib design concepts are presented and discussed. An experimental methodology based upon the use of a flexible sling support and load application system has been created and utilized to evaluate a model box beam experimentally. Experimental data correlate extremely well with design analysis predictions based upon a beam model for the global properties of camber compliance and spanwise bending compliance. Local strain measurements exhibit trends in agreement with intuition and theory but depart slightly from theoretical perfection based upon beam-like behavior alone. It is conjectured that some additional refinement of experimental technique is needed to explain or eliminate these (minor) departures from asymmetric behavior of upper and lower box cover strains. Overall, a solid basis for the design of box structures based upon the bending method of elastic camber production has been confirmed by the experiments.

  2. Applications of Computational Methods for Dynamic Stability and Control Derivatives

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Spence, Angela M.

    2004-01-01

    Initial steps in the application of a low-order panel method computational fluid dynamic (CFD) code to the calculation of aircraft dynamic stability and control (S&C) derivatives are documented. Several capabilities, unique to CFD but not unique to this particular demonstration, are identified and demonstrated in this paper. These unique capabilities complement conventional S&C techniques and they include the ability to: 1) perform maneuvers without the flow-kinematic restrictions and support interference commonly associated with experimental S&C facilities, 2) easily simulate advanced S&C testing techniques, 3) compute exact S&C derivatives with uncertainty propagation bounds, and 4) alter the flow physics associated with a particular testing technique from those observed in a wind or water tunnel test in order to isolate effects. Also presented are discussions about some computational issues associated with the simulation of S&C tests and selected results from numerous surface grid resolution studies performed during the course of the study.

  3. Orion Exploration Flight Test Post-Flight Inspection and Analysis

    NASA Technical Reports Server (NTRS)

    Miller, J. E.; Berger, E. L.; Bohl, W. E.; Christiansen, E. L.; Davis, B. A.; Deighton, K. D.; Enriquez, P. A.; Garcia, M. A.; Hyde, J. L.; Oliveras, O. M.

    2017-01-01

    The principal mechanism for developing orbital debris environment models is to make observations of larger pieces of debris in the range of several centimeters and greater using radar and optical techniques. For particles that are smaller than this threshold, breakup and migration models of particles to returned surfaces in lower orbit are relied upon to quantify the flux. This reliance on models to derive spatial densities of particles of critical importance to spacecraft makes the unique nature of the EFT-1 return surface especially valuable. To this end, detailed post-flight inspections have been performed of the returned EFT-1 backshell, and the inspections identified six candidate impact sites that were not present during the pre-flight inspections. This paper describes the post-flight analysis efforts to characterize the EFT-1 mission craters. This effort included ground based testing to understand small particle impact craters in the thermal protection material, the pre- and post-flight inspection, the crater analysis using optical, X-ray computed tomography (CT) and scanning electron microscope (SEM) techniques, and numerical simulations.

  4. In situ strain and temperature measurement and modelling during arc welding

    DOE PAGES

    Chen, Jian; Yu, Xinghua; Miller, Roger G.; ...

    2014-12-26

    In this study, experiments and numerical models were applied to investigate the thermal and mechanical behaviours of materials adjacent to the weld pool during arc welding. In the experiment, a new high temperature strain measurement technique based on digital image correlation (DIC) was developed and applied to measure the in situ strain evolution. In contrast to the conventional DIC method that is vulnerable to the high temperature and intense arc light involved in fusion welding processes, the new technique utilised a special surface preparation method to produce high temperature sustaining speckle patterns required by the DIC algorithm as well as a unique optical illumination and filtering system to suppress the influence of the intense arc light. These efforts made it possible for the first time to measure in situ the strain field 1 mm away from the fusion line. The temperature evolution in the weld and the adjacent regions was simultaneously monitored by an infrared camera. Finally, a thermal–mechanical finite element model was applied to substantiate the experimental measurements.

  5. Comparison on three classification techniques for sex estimation from the bone length of Asian children below 19 years old: an analysis using different group of ages.

    PubMed

    Darmawan, M F; Yusuf, Suhaila M; Kadir, M R Abdul; Haron, H

    2015-02-01

    Sex estimation is used in forensic anthropology to assist the identification of individual remains. However, the estimation techniques tend to be unique and applicable only to a certain population. This paper analyzed sex estimation in living individuals below 19 years old using the lengths of the 19 bones of the left hand, applying three classification techniques: Discriminant Function Analysis (DFA), Support Vector Machine (SVM) and Artificial Neural Network (ANN) multilayer perceptron. These techniques were carried out on X-ray images of the left hand taken from an Asian population data set. All 19 bones of the left hand were measured using Free Image software, and all the techniques were performed using MATLAB. The "16-19" and "7-9" year age groups could be used for sex estimation, as their average accuracy was above 80%. The ANN model was the best classification technique, with the highest average accuracy in these two age groups. The results show that a different classification technique achieves the best accuracy in each age group. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Order parameter re-mapping algorithm for 3D phase field model of grain growth using FEM

    DOE PAGES

    Permann, Cody J.; Tonks, Michael R.; Fromm, Bradley; ...

    2016-01-14

    Phase field modeling (PFM) is a well-known technique for simulating microstructural evolution. To model grain growth using PFM, typically each grain is assigned a unique non-conserved order parameter and each order parameter field is evolved in time. Traditional approaches using a one-to-one mapping of grains to order parameters present a challenge when modeling large numbers of grains due to the computational expense of using many order parameters. This problem is exacerbated when using an implicit finite element method (FEM), as the global matrix size is proportional to the number of order parameters. While previous work has developed methods to reduce the number of required variables and thus computational complexity and run time, none of the existing approaches can be applied for an implicit FEM implementation of PFM. Here, we present a modular, dynamic, scalable reassignment algorithm suitable for use in such a system. Polycrystal modeling with grain growth and stress require careful tracking of each grain’s position and orientation which is lost when using a reduced order parameter set. The method presented in this paper maintains a unique ID for each grain even after reassignment, to allow the PFM to be tightly coupled to calculations of the stress throughout the polycrystal. Implementation details and comparative results of our approach are presented.
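    The core of such a reassignment can be viewed as graph coloring over the grain-adjacency graph: many grains share a small pool of order parameters, provided no two neighboring grains share one, while each grain keeps its own persistent ID. The sketch below is a hypothetical greedy illustration of that idea, not the paper's FEM implementation:

```python
def assign_order_parameters(adjacency, max_ops):
    """Greedily map each grain ID to one of `max_ops` order parameters so
    that no two adjacent grains share a parameter (greedy graph coloring).
    Grain IDs are never changed, so each grain keeps its identity across
    reassignments. adjacency: {grain_id: set of neighboring grain_ids}."""
    assignment = {}
    for grain in sorted(adjacency):
        used = {assignment[n] for n in adjacency[grain] if n in assignment}
        for op in range(max_ops):
            if op not in used:
                assignment[grain] = op
                break
        else:
            raise ValueError(f"grain {grain} needs more than {max_ops} order parameters")
    return assignment
```

    For a 4-grain ring, two order parameters suffice even though there are four grains, which is the source of the computational savings.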

  7. Temporal information partitioning: Characterizing synergy, uniqueness, and redundancy in interacting environmental variables

    NASA Astrophysics Data System (ADS)

    Goodwell, Allison E.; Kumar, Praveen

    2017-07-01

    Information theoretic measures can be used to identify nonlinear interactions between source and target variables through reductions in uncertainty. In information partitioning, multivariate mutual information is decomposed into synergistic, unique, and redundant components. Synergy is information shared only when sources influence a target together, uniqueness is information only provided by one source, and redundancy is overlapping shared information from multiple sources. While this partitioning has been applied to provide insights into complex dependencies, several proposed partitioning methods overestimate redundant information and omit a component of unique information because they do not account for source dependencies. Additionally, information partitioning has only been applied to time-series data in a limited context, using basic pdf estimation techniques or a Gaussian assumption. We develop a Rescaled Redundancy measure (Rs) to solve the source dependency issue, and present Gaussian, autoregressive, and chaotic test cases to demonstrate its advantages over existing techniques in the presence of noise, various source correlations, and different types of interactions. This study constitutes the first rigorous application of information partitioning to environmental time-series data, and addresses how noise, pdf estimation technique, or source dependencies can influence detected measures. We illustrate how our techniques can unravel the complex nature of forcing and feedback within an ecohydrologic system with an application to 1 min environmental signals of air temperature, relative humidity, and windspeed. The methods presented here are applicable to the study of a broad range of complex systems composed of interacting variables.
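    The decomposition described above can be made concrete with a toy sketch. The code below estimates mutual information from histograms and partitions I(X1,X2;Y) using the simple minimum-MI redundancy R = min(I1, I2); this is a common baseline proxy, not the Rescaled Redundancy Rs developed in the record, and the binary coding of the sources is an illustrative assumption:

```python
import numpy as np

def mutual_info(x, y, bins):
    """Histogram-based estimate of I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def partition_info(x1, x2, y):
    """Partition I(X1,X2;Y) for binary-coded sources into redundant (R),
    unique (U1, U2), and synergistic (S) components, using the simple
    minimum-MI redundancy R = min(I1, I2)."""
    i1 = mutual_info(x1, y, 2)
    i2 = mutual_info(x2, y, 2)
    itot = mutual_info(2 * x1 + x2, y, 4)  # joint source as one variable
    r = min(i1, i2)
    u1, u2 = i1 - r, i2 - r
    return {"R": r, "U1": u1, "U2": u2, "S": itot - u1 - u2 - r}
```

    An XOR target is the classic pure-synergy case: neither source alone tells you anything about Y, yet together they determine it completely (S = 1 bit).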

  8. Utilising three-dimensional printing techniques when providing unique assistive devices: A case report.

    PubMed

    Day, Sarah Jane; Riley, Shaun Patrick

    2018-02-01

    The evolution of three-dimensional printing into prosthetics has opened conversations about the availability and cost of prostheses. This report will discuss how a prosthetic team incorporated additive manufacture techniques into the treatment of a patient with a partial hand amputation to create and test a unique assistive device which he could use to hold his French horn. Case description and methods: Using a process of shape capture, photogrammetry, computer-aided design and finite element analysis, a suitable assistive device was designed and tested. The design was fabricated using three-dimensional printing. Patient satisfaction was measured using a Pugh's Matrix™, and a cost comparison was made between the process used and traditional manufacturing. Findings and outcomes: Patient satisfaction was high. The three-dimensional printed devices were 56% cheaper to fabricate than a similar laminated device. Computer-aided design and three-dimensional printing proved to be an effective method for designing, testing and fabricating a unique assistive device. Clinical relevance: CAD and 3D printing techniques can enable devices to be designed, tested and fabricated more cheaply than with traditional techniques. This may lead to improvements in quality and accessibility.

  9. Statistically Qualified Neuro-Analytic system and Method for Process Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    1998-11-04

    An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.

  10. Communication and cooperation in underwater acoustic networks

    NASA Astrophysics Data System (ADS)

    Yerramalli, Srinivas

    In this thesis, we present a study of several problems related to underwater point to point communications and network formation. We explore techniques to improve the achievable data rate on a point to point link using better physical layer techniques and then study sensor cooperation which improves the throughput and reliability in an underwater network. Robust point-to-point communications in underwater networks has become increasingly critical in several military and civilian applications related to underwater communications. We present several physical layer signaling and detection techniques tailored to the underwater channel model to improve the reliability of data detection. First, we consider a simplified underwater channel model in which the time scale distortion on each path is assumed to be the same (a single-scale channel model, in contrast to the more general multi-scale model). A novel technique called Partial FFT Demodulation, which exploits the nature of OFDM signaling and the time scale distortion, is derived. It is observed that this new technique has some unique interference suppression properties and performs better than traditional equalizers in several scenarios of interest. Next, we consider the multi-scale model for the underwater channel and assume that single-scale processing is performed at the receiver. We then derive optimized front end pre-processing techniques to reduce the interference caused during single-scale processing of signals transmitted on a multi-scale channel. We then propose an improved channel estimation technique using dictionary optimization methods for compressive sensing and show that significant performance gains can be obtained using this technique. In the next part of this thesis, we consider the problem of sensor node cooperation among rational nodes whose objective is to improve their individual data rates.
We first consider the problem of transmitter cooperation in a multiple access channel and investigate the stability of the grand coalition of transmitters using tools from cooperative game theory, showing that the grand coalition is stable in both the asymptotic regimes of high and low SNR. Towards studying the problem of receiver cooperation for a broadcast channel, we propose a game theoretic model for the broadcast channel, derive a game theoretic duality between the multiple access and the broadcast channel, and show how the equilibria of the broadcast channel relate to those of the multiple access channel and vice versa.

  11. Unified model of brain tissue microstructure dynamically binds diffusion and osmosis with extracellular space geometry

    NASA Astrophysics Data System (ADS)

    Yousefnezhad, Mohsen; Fotouhi, Morteza; Vejdani, Kaveh; Kamali-Zare, Padideh

    2016-09-01

    We present a universal model of brain tissue microstructure that dynamically links osmosis and diffusion with geometrical parameters of brain extracellular space (ECS). Our model robustly describes and predicts the nonlinear time dependency of tortuosity (λ = √(D/D*)) changes with very high precision in various media with uniform and nonuniform osmolarity distribution, as demonstrated by previously published experimental data (D = free diffusion coefficient, D* = effective diffusion coefficient). To construct this model, we first developed a multiscale technique for computationally effective modeling of osmolarity in the brain tissue. Osmolarity differences across cell membranes lead to changes in the ECS dynamics. The evolution of the underlying dynamics is then captured by a level set method. Subsequently, using a homogenization technique, we derived a coarse-grained model with parameters that are explicitly related to the geometry of cells and their associated ECS. Our modeling results in very accurate analytical approximation of tortuosity based on time, space, osmolarity differences across cell membranes, and water permeability of cell membranes. Our model provides a unique platform for studying ECS dynamics not only in physiologic conditions such as sleep-wake cycles and aging but also in pathologic conditions such as stroke, seizure, and neoplasia, as well as in predictive pharmacokinetic modeling such as predicting medication biodistribution and efficacy and novel biomolecule development and testing.
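    The tortuosity λ = √(D/D*) defined above can be illustrated with a toy estimate: a seeded 2D lattice random walk among randomly blocked sites (a crude stand-in for cells obstructing the ECS), comparing obstructed to free mean-squared displacement. The lattice size, blocking fraction, and walk parameters are illustrative assumptions, unrelated to the paper's homogenization model:

```python
import numpy as np

def tortuosity_estimate(block_frac=0.3, walkers=2000, steps=400, seed=0):
    """Estimate λ = sqrt(D/D*) from a 2D lattice walk with blocked sites.
    For the free walk MSD = steps, so λ ≈ sqrt(steps / MSD_obstructed)."""
    rng = np.random.default_rng(seed)
    size = 101
    blocked = rng.random((size, size)) < block_frac
    center = size // 2
    blocked[center, center] = False          # walkers start on an open site
    moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    pos = np.full((walkers, 2), center)
    for _ in range(steps):
        trial = pos + moves[rng.integers(0, 4, walkers)]
        open_site = ~blocked[trial[:, 0] % size, trial[:, 1] % size]
        pos[open_site] = trial[open_site]    # moves into blocked sites are rejected
    msd = float(((pos - center) ** 2).sum(axis=1).mean())
    return float(np.sqrt(steps / msd))
```

    With roughly 30% of sites blocked, the estimate comes out above 1, i.e. obstacles hinder diffusion (D* < D), which is the effect the ECS geometry parameters in the model capture.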

  12. Combined Monte Carlo and path-integral method for simulated library of time-resolved reflectance curves from layered tissue models

    NASA Astrophysics Data System (ADS)

    Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann

    2009-02-01

    Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
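    The weighted Beer-Lambert scaling described above can be sketched as follows. The layer fractions `frac` would come from the closed-form average classical path; the function name, units, and the tissue light speed are assumptions for illustration:

```python
import numpy as np

def scale_reflectance(r0, t, mu_a, frac, v=214.0):
    """Scale a zero-absorption time-resolved reflectance curve r0(t) by a
    path-weighted Beer-Lambert factor.
    r0   : reflectance simulated with mu_a = 0 in every layer
    t    : photon arrival times (ns)
    mu_a : absorption coefficient per layer (1/mm)
    frac : fraction of the path spent in each layer (sums to 1)
    v    : light speed in tissue, ~c/n with n ≈ 1.4 (mm/ns)
    """
    mu_eff = float(np.asarray(mu_a) @ np.asarray(frac))   # path-weighted mu_a
    return np.asarray(r0) * np.exp(-mu_eff * v * np.asarray(t))
```

    Because only the per-layer path fractions enter, the zero-absorption curve can be reused for any combination of layer absorption coefficients, which is what makes a large simulation library cheap to generate.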

  13. Texture-based characterization of subskin features by specified laser speckle effects at λ = 650 nm region for more accurate parametric 'skin age' modelling.

    PubMed

    Orun, A B; Seker, H; Uslan, V; Goodyer, E; Smith, G

    2017-06-01

    The textural structure of 'skin age'-related subskin components enables us to identify and analyse their unique characteristics, thus making substantial progress towards establishing an accurate skin age model. This is achieved by a two-stage process. First, textural analysis is applied using laser speckle imaging, which is sensitive to textural effects within the λ = 650 nm spectral band region. In the second stage, a Bayesian inference method is used to select attributes from which a predictive model is built. This technique enables us to contrast different skin age models, such as the laser speckle effect against the more widely used normal light (LED) imaging method, whereby it is shown that our laser speckle-based technique yields better results. The method introduced here is non-invasive, low cost and capable of operating in real time; having the potential to compete against high-cost instrumentation such as confocal microscopy or similar imaging devices used for skin age identification purposes. © 2016 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  14. Generation of the first structure-based pharmacophore model containing a selective "zinc binding group" feature to identify potential glyoxalase-1 inhibitors.

    PubMed

    Al-Balas, Qosay; Hassan, Mohammad; Al-Oudat, Buthina; Alzoubi, Hassan; Mhaidat, Nizar; Almaaytah, Ammar

    2012-11-22

    Within this study, a unique 3D structure-based pharmacophore model of the enzyme glyoxalase-1 (Glo-1) has been revealed. Glo-1 is considered a zinc metalloenzyme in which the inhibitor binding with zinc atom at the active site is crucial. To our knowledge, this is the first pharmacophore model that has a selective feature for a "zinc binding group" which has been customized within the structure-based pharmacophore model of Glo-1 to extract ligands that possess functional groups able to bind zinc atom solely from database screening. In addition, an extensive 2D similarity search using three diverse similarity techniques (Tanimoto, Dice, Cosine) has been performed over the commercially available "Zinc Clean Drug-Like Database" that contains around 10 million compounds to help find suitable inhibitors for this enzyme based on known inhibitors from the literature. The resultant hits were mapped over the structure based pharmacophore and the successful hits were further docked using three docking programs with different pose fitting and scoring techniques (GOLD, LibDock, CDOCKER). Nine candidates were suggested to be novel Glo-1 inhibitors containing the "zinc binding group" with the highest consensus scoring from docking.
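    The three similarity measures named above (Tanimoto, Dice, Cosine) have standard set-based forms. A minimal sketch over fingerprints represented as sets of on-bit indices follows; the representation is an assumption for illustration (cheminformatics toolkits normally compute these on bit vectors directly):

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard): |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b) if a | b else 1.0

def dice(a, b):
    """Dice: 2|A ∩ B| / (|A| + |B|)."""
    return 2 * len(a & b) / (len(a) + len(b)) if a or b else 1.0

def cosine(a, b):
    """Cosine: |A ∩ B| / sqrt(|A| * |B|)."""
    return len(a & b) / (len(a) * len(b)) ** 0.5 if a and b else 0.0
```

    Running all three over a database and keeping hits above a threshold for any measure is one way such a diverse 2D similarity search can combine them.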

  15. What Can Be Learned from Nuclear Resonance Vibrational Spectroscopy: Vibrational Dynamics and Hemes

    PubMed Central

    2017-01-01

    Nuclear resonance vibrational spectroscopy (NRVS; also known as nuclear inelastic scattering, NIS) is a synchrotron-based method that reveals the full spectrum of vibrational dynamics for Mössbauer nuclei. Another major advantage, in addition to its completeness (no arbitrary optical selection rules), is the unique selectivity of NRVS. The basics of this recently developed technique are first introduced with descriptions of the experimental requirements and data analysis including the details of mode assignments. We discuss the use of NRVS to probe 57Fe at the center of heme and heme protein derivatives yielding the vibrational density of states for the iron. The application to derivatives with diatomic ligands (O2, NO, CO, CN–) shows the strong capabilities of identifying mode character. The availability of the complete vibrational spectrum of iron allows the identification of modes not available by other techniques. This permits the correlation of frequency with other physical properties. A significant example is the correlation we find between the Fe–Im stretch in six-coordinate Fe(XO) hemes and the trans Fe–N(Im) bond distance, not possible previously. NRVS also provides uniquely quantitative insight into the dynamics of the iron. For example, it provides a model-independent means of characterizing the strength of iron coordination. Prediction of the temperature-dependent mean-squared displacement from NRVS measurements yields a vibrational “baseline” for Fe dynamics that can be compared with results from techniques that probe longer time scales to yield quantitative insights into additional dynamical processes. PMID:28921972

  16. Making the Nanoworld Accessible: Nanoscience Education Using Scanning Probe Methods

    NASA Astrophysics Data System (ADS)

    Knorr, Daniel; Killgore, Jason; Gray, Tomoko; Ginger, David; Wei, Joseph; Chen, Yeechi; Sarikaya, Mehmet; Fong, Hanson; Griffith, Tom; Overney, Rene

    2008-03-01

    A partnership between researchers and educators at the University of Washington, North Seattle Community College and two companies, Nanosurf AG and nanoScience Instruments, has been forged to develop a nationally replicable model of a sustainable and up-to-date undergraduate teaching laboratory of scanning probe microscopy (SPM) methods applied to nanoscience and nanotechnology. Within this partnership a new paradigm of operating and maintaining a SPM laboratory has been developed that provides a truly hands-on experience in a classroom laboratory setting with a small student to instrument ratio involving a variety of SPM techniques and topics. To date, we have run a first successful undergraduate laboratory workshop, where students were able to have extensive hands-on experience on five SPM modes of operation including: electrostatic force microscopy involving photovoltaic polymeric materials, tunneling microscopy and the determination of the workfunction, and nanolithography using the dip-pen method. http://depts.washington.edu/nanolab/NUE/UNIQUE/NUE/UNIQUE.htm

  17. Overview of Engineering Design and Analysis at the NASA John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Ryan, Harry; Congiardo, Jared; Junell, Justin; Kirkpatrick, Richard

    2007-01-01

    A wide range of rocket propulsion test work occurs at the NASA John C. Stennis Space Center (SSC) including full-scale engine test activities at test facilities A-1, A-2, B-1 and B-2 as well as combustion device research and development activities at the E-Complex (E-1, E-2, E-3 and E-4) test facilities. The propulsion test engineer at NASA SSC faces many challenges associated with designing and operating a test facility due to the extreme operating conditions (e.g., cryogenic temperatures, high pressures) of the various system components and the uniqueness of many of the components and systems. The purpose of this paper is to briefly describe the NASA SSC Engineering Science Directorate's design and analysis processes, experience, and modeling techniques that are used to design and support the operation of unique rocket propulsion test facilities.

  18. An Overview of Unsteady Pressure Measurements in the Transonic Dynamics Tunnel

    NASA Technical Reports Server (NTRS)

    Schuster, David M.; Edwards, John W.; Bennett, Robert M.

    2000-01-01

    The NASA Langley Transonic Dynamics Tunnel has served as a unique national facility for aeroelastic testing for over forty years. A significant portion of this testing has been to measure unsteady pressures on models undergoing flutter, forced oscillations, or buffet. These tests have ranged from early launch vehicle buffet to flutter of a generic high-speed transport. This paper will highlight some of the test techniques, model design approaches, and the many unsteady pressure tests conducted in the TDT. The objectives and results of the data acquired during these tests will be summarized for each case and a brief discussion of ongoing research involving unsteady pressure measurements and new TDT capabilities will be presented.

  19. MT+, integrating magnetotellurics to determine earth structure, physical state, and processes

    USGS Publications Warehouse

    Bedrosian, P.A.

    2007-01-01

    As one of the few deep-earth imaging techniques, magnetotellurics provides information on both the structure and physical state of the crust and upper mantle. Magnetotellurics is sensitive to electrical conductivity, which varies within the earth by many orders of magnitude and is modified by a range of earth processes. As with all geophysical techniques, magnetotellurics has a non-unique inverse problem and has limitations in resolution and sensitivity. As such, an integrated approach, either via the joint interpretation of independent geophysical models, or through the simultaneous inversion of independent data sets is valuable, and at times essential to an accurate interpretation. Magnetotelluric data and models are increasingly integrated with geological, geophysical and geochemical information. This review considers recent studies that illustrate the ways in which such information is combined, from qualitative comparisons to statistical correlation studies to multi-property inversions. Also emphasized are the range of problems addressed by these integrated approaches, and their value in elucidating earth structure, physical state, and processes. © Springer Science+Business Media B.V. 2007.

  20. Technical innovation: Intragastric Single Port Sleeve Gastrectomy (IGSG). A feasibility survival study on porcine model.

    PubMed

    Estupinam, Oscar; Oliveira, André Lacerda de Abreu; Antunes, Fernanda; Galvão, Manoel; Phillips, Henrique; Scheffer, Jussara Peters; Rios, Marcelo; Zorron, Ricardo

    2018-01-01

    To perform laparoscopic sleeve gastrectomy (LSG) using a unique Intragastric Single Port (IGSG) in a swine model, demonstrating an effective and safe procedure that optimizes the conventional technique. IGSG was performed in 4 minipigs, using a percutaneous intragastric single port located in the pre-pyloric region. The gastric stapling of the greater curvature started from the pre-pyloric region towards the angle of His using the Endo GIA™ system, and the specimen was removed through the single port. On postoperative day 30, the animals were sacrificed and submitted to autopsy. All procedures were performed without conversion, and all animals survived 30 days. The mean operative time was 42 min. No complications were observed during invagination and stapling in the perioperative period, and no postoperative complications occurred. Post-mortem examination showed no leaks or infectious complications. Intragastric Single Port is a feasible procedure that may be a suitable alternative technique for sleeve gastrectomy in the treatment of morbid obesity.

  1. Predicting Structure-Function Relations and Survival following Surgical and Bronchoscopic Lung Volume Reduction Treatment of Emphysema.

    PubMed

    Mondoñedo, Jarred R; Suki, Béla

    2017-02-01

    Lung volume reduction surgery (LVRS) and bronchoscopic lung volume reduction (bLVR) are palliative treatments aimed at reducing hyperinflation in advanced emphysema. Previous work has evaluated functional improvements and survival advantage for these techniques, although their effects on the micromechanical environment in the lung have yet to be determined. Here, we introduce a computational model to simulate a force-based destruction of elastic networks representing emphysema progression, which we use to track the response to lung volume reduction via LVRS and bLVR. We find that (1) LVRS efficacy can be predicted based on pre-surgical network structure; (2) macroscopic functional improvements following bLVR are related to microscopic changes in mechanical force heterogeneity; and (3) both techniques improve aspects of survival and quality of life influenced by lung compliance, albeit while accelerating disease progression. Our model predictions yield unique insights into the microscopic origins underlying emphysema progression before and after lung volume reduction.

  2. Image reconstruction algorithms for electrical capacitance tomography based on ROF model using new numerical techniques

    NASA Astrophysics Data System (ADS)

    Chen, Jiaoxuan; Zhang, Maomao; Liu, Yinyan; Chen, Jiaoliao; Li, Yi

    2017-03-01

    Electrical capacitance tomography (ECT) is a promising technique applied in many fields. However, the solutions for ECT are not unique and are highly sensitive to measurement noise. To preserve the shape of the reconstructed object while tolerating noisy data, a Rudin-Osher-Fatemi (ROF) model with total variation regularization is applied to image reconstruction in ECT. Two numerical methods, simplified augmented Lagrangian (SAL) and accelerated alternating direction method of multipliers (AADMM), are introduced to address these problems. The effects of the parameters, the number of iterations for the different algorithms, and the noise level in the capacitance data are discussed. Both simulation and experimental tests were carried out to validate the feasibility of the proposed algorithms against the Landweber iteration (LI) algorithm. The results show that the SAL and AADMM algorithms can handle a high level of noise, and that AADMM outperforms the other algorithms in identifying the object from its background.
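    For orientation, the baseline Landweber iteration (LI) that the proposed SAL and AADMM algorithms are compared against can be sketched on a toy linearized ECT problem; the sensitivity matrix, dimensions, and noise level below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy linearized ECT problem: capacitance c = S @ g, with S a (hypothetical)
# sensitivity matrix and g the normalized permittivity image.
rng = np.random.default_rng(0)
n_meas, n_pix = 66, 256            # e.g. a 12-electrode ECT sensor yields 66 measurements
S = rng.random((n_meas, n_pix))
g_true = np.zeros(n_pix)
g_true[100:140] = 1.0              # a simple "object" against the background
c = S @ g_true + 0.01 * rng.standard_normal(n_meas)   # noisy capacitance data

def landweber(S, c, n_iter=200, alpha=None):
    """Baseline Landweber iteration: g <- clip(g + alpha * S^T (c - S g), 0, 1)."""
    if alpha is None:
        alpha = 1.0 / np.linalg.norm(S, 2) ** 2   # step size small enough to converge
    g = np.zeros(S.shape[1])
    for _ in range(n_iter):
        g = np.clip(g + alpha * S.T @ (c - S @ g), 0.0, 1.0)
    return g

g_rec = landweber(S, c)
```

    Each step moves the image down the gradient of the data-misfit term, with a clip to keep the normalized permittivity in [0, 1]; the TV-regularized SAL and AADMM schemes replace this plain gradient step with splitting updates.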

  3. Measurements of strain at plate boundaries using space based geodetic techniques

    NASA Technical Reports Server (NTRS)

    Robaudo, Stefano; Harrison, Christopher G. A.

    1993-01-01

    We have used the space-based geodetic techniques of Satellite Laser Ranging (SLR) and VLBI to study strain along subduction and transform plate boundaries, and have interpreted the results using a simple elastic dislocation model. Six stations located behind island arcs were analyzed as representative of subduction zones, while 13 sites located on either side of the San Andreas fault were used for the transcurrent zones. The deformation length scale was then calculated for both tectonic margins by fitting the relative strain to an exponentially decreasing function of distance from the plate boundary. Results show that space-based data for the transcurrent boundary along the San Andreas fault help to better define the deformation length scale in the area, while fitting the elastic half-space earth model nicely. For subduction-type boundaries, the analysis indicates that there is no single scale length which uniquely describes the deformation. This is mainly due to the difference in subduction characteristics for the different areas.
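    The exponential fit described above reduces to a log-linear least-squares problem; the distances, strain values, and length scale below are synthetic stand-ins, not the paper's data.

```python
import numpy as np

# Synthetic relative-strain values at distances d (km) from a plate boundary,
# following the exponentially decreasing form strain(d) = s0 * exp(-d / L),
# where L is the deformation length scale to be recovered.
d = np.array([10.0, 25.0, 50.0, 100.0, 200.0, 400.0])
L_true, s0_true = 120.0, 1.0
strain = s0_true * np.exp(-d / L_true)

# Taking logs linearizes the model: ln(strain) = ln(s0) - d / L,
# so an ordinary least-squares line fit gives both parameters.
slope, intercept = np.polyfit(d, np.log(strain), 1)
L_est, s0_est = -1.0 / slope, np.exp(intercept)
```

    With noisy field data the same fit applies; the scatter of the residuals then indicates whether a single length scale describes the boundary, which is exactly what fails for the subduction zones above.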

  4. Predicting Structure-Function Relations and Survival following Surgical and Bronchoscopic Lung Volume Reduction Treatment of Emphysema

    PubMed Central

    Mondoñedo, Jarred R.

    2017-01-01

    Lung volume reduction surgery (LVRS) and bronchoscopic lung volume reduction (bLVR) are palliative treatments aimed at reducing hyperinflation in advanced emphysema. Previous work has evaluated functional improvements and survival advantage for these techniques, although their effects on the micromechanical environment in the lung have yet to be determined. Here, we introduce a computational model to simulate a force-based destruction of elastic networks representing emphysema progression, which we use to track the response to lung volume reduction via LVRS and bLVR. We find that (1) LVRS efficacy can be predicted based on pre-surgical network structure; (2) macroscopic functional improvements following bLVR are related to microscopic changes in mechanical force heterogeneity; and (3) both techniques improve aspects of survival and quality of life influenced by lung compliance, albeit while accelerating disease progression. Our model predictions yield unique insights into the microscopic origins underlying emphysema progression before and after lung volume reduction. PMID:28182686

  5. The NASA Langley Isolator Dynamics Research Lab

    NASA Technical Reports Server (NTRS)

    Middleton, Troy F.; Balla, Robert J.; Baurle, Robert A.; Humphreys, William M.; Wilson, Lloyd G.

    2010-01-01

    The Isolator Dynamics Research Lab (IDRL) is under construction at the NASA Langley Research Center in Hampton, Virginia. A unique test apparatus is being fabricated to support both wall and in-stream measurements for investigating the internal flow of a dual-mode scramjet isolator model. The test section is 24 inches long with a 1-inch by 2-inch cross sectional area and is supplied with unheated, dry air through a Mach 2.5 converging-diverging nozzle. The test section is being fabricated with two sets (glass and metallic) of interchangeable sidewalls to support flow visualization and laser-based measurement techniques as well as static pressure, wall temperature, and high frequency pressure measurements. During 2010, a CFD code validation experiment will be conducted in the lab in support of NASA's Fundamental Aerodynamics Program. This paper describes the mechanical design of the Isolator Dynamics Research Lab test apparatus and presents a summary of the measurement techniques planned for investigating the internal flow field of a scramjet isolator model.

  6. Commercial Capaciflector

    NASA Technical Reports Server (NTRS)

    Vranish, John M.

    1991-01-01

    A capacitive proximity/tactile sensor with unique performance capabilities ('capaciflector' or capacitive reflector) is being developed by NASA/Goddard Space Flight Center (GSFC) for use on robots and payloads in space in the interests of safety, efficiency, and ease of operation. Specifically, this sensor will permit robots and their attached payloads to avoid collisions in space with humans and other objects and to dock these payloads in a cluttered environment. The sensor is simple, robust, and inexpensive to manufacture with obvious and recognized commercial possibilities. Accordingly, NASA/GSFC, in conjunction with industry, is embarking on an effort to 'spin' this technology off into the private sector. This effort includes prototypes aimed at commercial applications. The principles of operation of these prototypes are described along with hardware, software, modelling, and test results. The hardware description includes both the physical sensor in terms of a flexible printed circuit board and the electronic circuitry. The software description will include filtering and detection techniques. The modelling will involve finite element electric field analysis and will underline techniques used for design optimization.

  7. The Experimental Measurement of Aerodynamic Heating About Complex Shapes at Supersonic Mach Numbers

    NASA Technical Reports Server (NTRS)

    Neumann, Richard D.; Freeman, Delma C.

    2011-01-01

    In 2008 a wind tunnel test program was implemented to update the experimental data available for predicting protuberance heating at supersonic Mach numbers. For this test the Langley Unitary Wind Tunnel was again used. The significant differences for this current test were the advances in the state of the art in model design, fabrication techniques, instrumentation, and data acquisition capabilities. This current paper provides a focused discussion of the results of an in-depth analysis of unique measurements of recovery temperature obtained during the test.

  8. The Fourth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Fourth Annual Thermal and Fluids Analysis Workshop was held from August 17-21, 1992, at NASA Lewis Research Center. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with the information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  9. Field Evaluation of Detection-Control System

    DOT National Transportation Integrated Search

    2014-10-01

    High-speed signalized intersections present unique challenges to improving highway safety. Techniques for achieving safety often have an adverse effect on efficiency, and techniques for achieving efficiency sometimes have an adverse effect on safety....

  10. Distribution of guidance models for cardiac resynchronization therapy in the setting of multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Rajchl, Martin; Abhari, Kamyar; Stirrat, John; Ukwatta, Eranga; Cantor, Diego; Li, Feng P.; Peters, Terry M.; White, James A.

    2014-03-01

    Multi-center trials provide the unique ability to investigate novel techniques across a range of geographical sites with sufficient statistical power; the inclusion of multiple operators establishes feasibility under a wider array of clinical environments and workflows. For this purpose, we introduce a new means of distributing pre-procedural cardiac models for image-guided interventions across a large-scale multi-center trial. In this method, a single core facility is responsible for image processing, employing a novel web-based interface for model visualization and distribution. The requirements for such an interface, being WebGL-based, are minimal and well within the realms of accessibility for participating centers. We then demonstrate the accuracy of our approach using a single-center pacemaker lead implantation trial with generic planning models.

  11. Computational Modeling of Liquid and Gaseous Control Valves

    NASA Technical Reports Server (NTRS)

    Daines, Russell; Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Moore, Arden; Sulyma, Peter

    2005-01-01

    In this paper, computational modeling efforts undertaken at NASA Stennis Space Center in support of rocket engine component testing are discussed. Such analyses include structurally complex cryogenic liquid valves and gas valves operating at high pressures and flow rates. Basic modeling and initial successes are documented, and other issues that make valve modeling at SSC somewhat unique are also addressed. These include transient behavior, valve stall, and the determination of flow patterns in LOX valves. Hexahedral structured grids are used for valves that can be simplified through the use of an axisymmetric approximation. Hybrid unstructured methodology is used for structurally complex valves that have disparate length scales and complex flow paths, including strong swirl, local recirculation zones, and secondary flow effects. Hexahedral (structured), unstructured, and hybrid meshes are compared for accuracy and computational efficiency. Accuracy is determined using verification and validation techniques.

  12. Newcomer adjustment during organizational socialization: a meta-analytic review of antecedents, outcomes, and methods.

    PubMed

    Bauer, Talya N; Bodner, Todd; Erdogan, Berrin; Truxillo, Donald M; Tucker, Jennifer S

    2007-05-01

    The authors tested a model of antecedents and outcomes of newcomer adjustment using 70 unique samples of newcomers with meta-analytic and path modeling techniques. Specifically, they proposed and tested a model in which adjustment (role clarity, self-efficacy, and social acceptance) mediated the effects of organizational socialization tactics and information seeking on socialization outcomes (job satisfaction, organizational commitment, job performance, intentions to remain, and turnover). The results generally supported this model. In addition, the authors examined the moderating effects of methodology on these relationships by coding for 3 methodological issues: data collection type (longitudinal vs. cross-sectional), sample characteristics (school-to-work vs. work-to-work transitions), and measurement of the antecedents (facet vs. composite measurement). Discussion focuses on the implications of the findings and suggestions for future research. 2007 APA, all rights reserved

  13. Unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance

    USGS Publications Warehouse

    Fiske, I.J.; Chandler, R.B.

    2011-01-01

    Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially-referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
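    The hierarchical occupancy/detection structure described above can be illustrated with a minimal single-season occupancy likelihood. unmarked itself is an R package; the Python sketch, synthetic data, and logit parameterization below are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

# Latent occupancy probability psi, conditional detection probability p.
# Y is a (sites x visits) binary detection/non-detection matrix (synthetic).
rng = np.random.default_rng(1)
n_sites, n_visits = 200, 5
psi_true, p_true = 0.6, 0.4
z = rng.random(n_sites) < psi_true                 # latent occupancy state
Y = (rng.random((n_sites, n_visits)) < p_true) & z[:, None]

def negloglik(theta, Y):
    psi, p = 1.0 / (1.0 + np.exp(-theta))          # logit -> probability
    det = Y.sum(axis=1)
    J = Y.shape[1]
    # Detected at least once: the site must be occupied.
    # Never detected: either occupied-but-missed, or genuinely unoccupied.
    lik = np.where(det > 0,
                   psi * p**det * (1 - p)**(J - det),
                   psi * (1 - p)**J + (1 - psi))
    return -np.log(lik).sum()

fit = minimize(negloglik, x0=[0.0, 0.0], args=(Y,), method="Nelder-Mead")
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-fit.x))
```

    The mixture term for all-zero histories is what separates the occurrence process from the detection process; naively using the fraction of sites with detections would underestimate occupancy.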

  14. Single-cell Genomics using Droplet-based Microfluidics

    NASA Astrophysics Data System (ADS)

    Basu, Anindita; Macosko, Evan; Shalek, Alex; McCarroll, Steven; Regev, Aviv; Weitz, Dave

    2014-03-01

    We develop a system to profile the transcriptome of mammalian cells in isolation using reverse emulsion droplet-based microfluidic techniques. This is accomplished by (a) encapsulating and lysing one cell per emulsion droplet, and (b) uniquely barcoding the RNA contents from each cell using unique DNA-barcoded microgel beads. This enables us to study the transcriptional behavior of a large number of cells at single-cell resolution. We then use these techniques to study transcriptional responses of isolated immune cells to precisely controlled chemical and pathological stimuli provided in the emulsion droplet.

  15. Unique Access to Learning

    ERIC Educational Resources Information Center

    Goble, Don

    2009-01-01

    This article describes the many learning opportunities that broadcast technology students at Ladue Horton Watkins High School in St. Louis, Missouri, experience because of their unique access to technology and methods of learning. Through scaffolding, stepladder techniques, and trial by fire, students learn to produce multiple television programs,…

  16. System identification and model reduction using modulating function techniques

    NASA Technical Reports Server (NTRS)

    Shen, Yan

    1993-01-01

    Weighted least squares (WLS) and adaptive weighted least squares (AWLS) algorithms are initiated for continuous-time system identification using Fourier-type modulating function techniques. Two stochastic signal models are examined using the mean square properties of the stochastic calculus: an equation error signal model with white noise residuals, and a more realistic white measurement noise signal model. The covariance matrices in each model are shown to be banded and sparse, and a joint likelihood cost function is developed which links the real and imaginary parts of the modulated quantities. The superior performance of the above algorithms is demonstrated by comparing them with the LS/MFT and the popular prediction error method (PEM) through 200 Monte Carlo simulations. A model reduction problem is formulated with the AWLS/MFT algorithm, and comparisons are made via six examples with a variety of model reduction techniques, including the well-known balanced realization method. Here the AWLS/MFT algorithm manifests higher accuracy in almost all cases, and exhibits its unique flexibility and versatility. Armed with this model reduction, the AWLS/MFT algorithm is extended to MIMO transfer function system identification problems. The impact due to the discrepancy in bandwidths and gains among subsystems is explored through five examples. Finally, as a comprehensive application, the stability derivatives of the longitudinal and lateral dynamics of an F-18 aircraft are identified using physical flight data provided by NASA. A pole-constrained SIMO and MIMO AWLS/MFT algorithm is devised and analyzed. Monte Carlo simulations illustrate its high noise-rejecting properties. Utilizing the flight data, comparisons among different MFT algorithms are tabulated and the AWLS is found to be strongly favored in almost all facets.
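    The weighted least squares step at the core of the WLS/MFT estimators can be shown generically; the modulating-function transformation itself is omitted here, and the regression data below are synthetic.

```python
import numpy as np

# Generic WLS estimate: theta_hat = (X^T W X)^{-1} X^T W y, with W the
# inverse noise covariance. In the MFT setting, X and y would hold the
# modulated signal quantities; here they are synthetic regression data.
rng = np.random.default_rng(3)
theta_true = np.array([2.0, -1.0])
X = rng.standard_normal((500, 2))
sigma = 0.5 + rng.random(500)                  # heteroscedastic noise levels
y = X @ theta_true + sigma * rng.standard_normal(500)

W = np.diag(1.0 / sigma**2)                    # down-weight noisy observations
theta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

    The adaptive (AWLS) variant re-estimates the weights from the residuals rather than assuming the noise levels are known, which is what drives its advantage in the Monte Carlo comparisons.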

  17. An information theory approach to the density of the earth

    NASA Technical Reports Server (NTRS)

    Graber, M. A.

    1977-01-01

    Information theory can be used to develop a technique which takes experimentally determined numbers and produces a uniquely specified best density model satisfying those numbers. A model was generated using five numerical parameters: the mass of the earth, its moment of inertia, and three zero-node torsional normal modes (L = 2, 8, 26). In order to determine the stability of the solution, six additional density models were generated, in each of which the period of one of the three normal modes was increased or decreased by one standard deviation. The superposition of the seven models is shown. It indicates that current knowledge of the torsional modes is sufficient to specify the density in the upper mantle, but that the lower mantle and core will require smaller standard deviations before they can be accurately specified.

  18. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).

  19. The Rat Model in Microsurgery Education: Classical Exercises and New Horizons

    PubMed Central

    Shurey, Sandra; Akelina, Yelena; Legagneux, Josette; Malzone, Gerardo; Jiga, Lucian

    2014-01-01

    Microsurgery is a precise surgical skill that requires an extensive training period and the supervision of expert instructors. The classical training schemes in microsurgery have started with multiday experimental courses on the rat model. These courses have offered a low threat supervised high fidelity laboratory setting in which students can steadily and rapidly progress. This simulated environment allows students to make and recognise mistakes in microsurgery techniques and thus shifts any related risks of the early training period from the operating room to the lab. To achieve a high level of skill acquisition before beginning clinical practice, students are trained on a comprehensive set of exercises the rat model can uniquely provide, with progressive complexity as competency improves. This paper presents the utility of the classical rat model in three of the earliest microsurgery training centres and the new prospects that this versatile and expansive training model offers. PMID:24883268

  20. Modeling the directional reflectance from complete homogeneous vegetation canopies with various leaf-orientation distributions

    NASA Technical Reports Server (NTRS)

    Kimes, D. S.

    1984-01-01

    The directional-reflectance distributions of radiant flux from homogeneous vegetation canopies with greater than 90 percent ground cover are analyzed with a radiative-transfer model. The model assumes that the leaves consist of small finite planes with Lambertian properties. Four theoretical canopies with different leaf-orientation distributions were studied: erectophile, spherical, planophile, and heliotropic canopies. The directional-reflectance distributions from the model closely resemble reflectance distributions measured in the field. The physical scattering mechanisms operating in the model explain the variations observed in the reflectance distributions as a function of leaf-orientation distribution, solar zenith angle, and leaf transmittance and reflectance. The simulated reflectance distributions show unique characteristics for each canopy. The basic understanding of the physical scattering properties of the different canopy geometries gained in this study provides a basis for developing techniques to infer leaf-orientation distributions of vegetation canopies from directional remote-sensing measurements.

  1. Fairbairn's Theory of Change.

    PubMed

    Celani, David P

    2016-06-01

    Fairbairn's unique structural theory, with its three pairs of selves and objects, has proven to be a highly usable and practical model of the human psyche, yet it has remained a minor player in the world of psychoanalysis. A number of factors account for its lack of popularity, foremost among them the timing of the model's introduction to the analytic community. Fairbairn's four successive papers that described his metapsychology (1940, 1941, 1943, and 1944) were published just after Freud's death, when Freud's theory was the dominant model of psychoanalysis. Additionally, Fairbairn's model was incomplete, used unfamiliar terminology, and, in its singularity, forced the analyst to abandon drive theory, the heart of Freud's metapsychology. This paper will examine and update Fairbairn's unique model of change: from the onset of pathology, which begins with attachment to bad objects, through their metamorphosis into internal structures, and finally to techniques of treatment that reduce their influence on the patient's internal world. The treatment section carefully follows Fairbairn's metapsychology, and focuses first on the analyst becoming a good object in the eyes of the patient, then unearthing bad object memories in a safe and compassionate interpersonal environment, engaging the patient's substructures in a manner that does not intensify preexisting internal templates, and finally aiding the patient in resuming his or her stalled emotional development. This exegesis of Fairbairn's original model, along with recent modifications that have been made to it, demonstrates the consistency, clear focus, and utility of this little-known metapsychology.

  2. Performance modeling for large database systems

    NASA Astrophysics Data System (ADS)

    Schaar, Stephen; Hum, Frank; Romano, Joe

    1997-02-01

    One of the unique approaches Science Applications International Corporation took to meet performance requirements was to start the modeling effort during the proposal phase of the Interstate Identification Index/Federal Bureau of Investigations (III/FBI) project. The III/FBI Performance Model uses analytical modeling techniques to represent the III/FBI system. Inputs to the model include workloads for each transaction type, record size for each record type, number of records for each file, hardware envelope characteristics, engineering margins and estimates for software instructions, memory, and I/O for each transaction type. The model uses queuing theory to calculate the average transaction queue length. The model calculates a response time and the resources needed for each transaction type. Outputs of the model include the total resources needed for the system, a hardware configuration, and projected inherent and operational availability. The III/FBI Performance Model is used to evaluate what-if scenarios and allows a rapid response to engineering change proposals and technical enhancements.
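    The abstract does not name the queueing model used, but the classic M/M/1 formulas often applied in such capacity estimates give a flavor of the response-time calculation; the arrival and service rates below are purely illustrative.

```python
# Hedged sketch, assuming an M/M/1 queue (Poisson arrivals, exponential
# service, one server) rather than the III/FBI model's actual formulation.
def mm1_metrics(arrival_rate, service_rate):
    """Return (utilization, mean queue length, mean response time) for M/M/1."""
    rho = arrival_rate / service_rate
    assert rho < 1, "system must be stable (arrival rate < service rate)"
    Lq = rho**2 / (1 - rho)                  # mean number waiting in queue
    R = 1.0 / (service_rate - arrival_rate)  # mean response time (wait + service)
    return rho, Lq, R

# e.g. 80 transactions/s arriving at a server that handles 100/s
rho, Lq, R = mm1_metrics(arrival_rate=80.0, service_rate=100.0)
```

    The steep growth of Lq and R as utilization approaches 1 is what makes this kind of model useful for sizing hardware and evaluating what-if scenarios early in a project.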

  3. Assimilation of GRACE Terrestrial Water Storage Data into a Land Surface Model: Results for the Mississippi River Basin

    NASA Technical Reports Server (NTRS)

    Zaitchik, Benjamin F.; Rodell, Matthew; Reichle, Rolf H.

    2007-01-01

    NASA's GRACE mission has the potential to be extremely valuable for water resources applications and global water cycle research. What makes GRACE unique among Earth Science satellite systems is that it is able to monitor variations in water stored in all forms, from snow and surface water to soil moisture to groundwater in the deepest aquifers. However, the space and time resolutions of GRACE observations are coarse. GRACE typically resolves water storage changes over regions the size of Nebraska on a monthly basis, while city-scale, daily observations would be more useful for water management, agriculture, and weather prediction. High resolution numerical (computer) hydrology models have been developed, which predict the fates of water and energy after they strike the land surface as precipitation and sunlight. These are similar to weather and climate forecast models, which simulate atmospheric processes. We integrated the GRACE observations into a hydrology model using an advanced technique called data assimilation. The results were new estimates of groundwater, soil moisture, and snow variations, which combined the veracity of GRACE with the high resolution of the model. We tested the technique over the Mississippi River basin, but it will be even more valuable in parts of the world which lack reliable data on water availability.

  4. The analysis of cable forces based on natural frequency

    NASA Astrophysics Data System (ADS)

    Suangga, Made; Hidayat, Irpan; Juliastuti; Bontan, Darwin Julius

    2017-12-01

    A cable is a flexible structural member that is effective at resisting tensile forces. Cables are used in a variety of structures that employ their unique characteristics to create efficient tension members. The condition of the cable forces in a cable-supported structure is an important indication of whether the structure is in good condition. Several methods have been developed to measure cable forces on site. The vibration technique, which uses the correlation between natural frequency and cable force, is a simple method to determine in situ cable forces; however, the method needs accurate information on the boundary conditions, cable mass, and cable length. The natural frequency of the cable is determined by applying the FFT (Fast Fourier Transform) technique to the acceleration record of the cable. Based on the natural frequency obtained, the cable forces can then be determined analytically or by a finite element program. This research focuses on the vibration technique for determining cable forces, on understanding the effect of the cable's physical parameters, and on modelling techniques relating the natural frequency to the cable forces.
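    A minimal sketch of the procedure, assuming the standard taut-string relation f_n = (n / 2L) * sqrt(T / m) and ignoring bending stiffness and boundary-condition effects (the very simplifications the abstract cautions about); all numerical values below are illustrative, not measurements.

```python
import numpy as np

# Taut-string estimate: for a cable of length L (m) with mass per unit
# length m (kg/m), f_n = (n / 2L) * sqrt(T / m), so T = 4 m L^2 (f_n / n)^2.
def tension_from_frequency(f_n, n, L, m):
    return 4.0 * m * L**2 * (f_n / n) ** 2

# Identify the fundamental frequency from a (synthetic) acceleration record
# via FFT, as in the paper's vibration technique.
fs, T_true, L, m = 200.0, 2.0e6, 100.0, 50.0        # Hz, N, m, kg/m
f1 = (1.0 / (2.0 * L)) * np.sqrt(T_true / m)        # fundamental frequency
t = np.arange(0.0, 60.0, 1.0 / fs)
accel = np.sin(2 * np.pi * f1 * t) \
        + 0.1 * np.random.default_rng(2).standard_normal(t.size)

spec = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
f_peak = freqs[np.argmax(spec[1:]) + 1]             # skip the DC bin
T_est = tension_from_frequency(f_peak, n=1, L=L, m=m)
```

    In practice the mode number n must be identified correctly and the effective vibrating length is uncertain, which is why the paper compares the analytical estimate against finite element modelling.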

  5. Unique equilibrium states for Bonatti–Viana diffeomorphisms

    NASA Astrophysics Data System (ADS)

    Climenhaga, Vaughn; Fisher, Todd; Thompson, Daniel J.

    2018-06-01

    We show that the robustly transitive diffeomorphisms constructed by Bonatti and Viana have unique equilibrium states for natural classes of potentials. In particular, we characterize the SRB measure as the unique equilibrium state for a suitable geometric potential. The techniques developed are applicable to a wide class of DA diffeomorphisms, and persist under C 1 perturbations of the map. These results are an application of general machinery developed by the first and last named authors.

  6. X-ray Pulsars Across the Parameter Space of Luminosity, Accretion Mode, and Spin

    NASA Astrophysics Data System (ADS)

    Laycock, Silas

    We propose to expand the scope of our successful project providing a multi-satellite library of X-ray Pulsar observations to the community. The library provides high-level products, activity monitoring, pulse-profiles, phased event files, spectra, and a unique pulse-profile modeling interface. The library's scientific footprint will expand in 4 key directions: (1) Update, by processing all new XMM-Newton and Chandra observations (2015-2017) of X-ray Binary Pulsars in the Magellanic Clouds. (2) Expand, by including all archival Suzaku, Swift and NuStar observations, and including Galactic pulsars. (3) Improve, by offering innovative data products that provide deeper insight. (4) Advance, by implementing a new generation of physically motivated emission and pulse-profile models. The library currently includes some 2000 individual RXTE-PCA, 200 Chandra ACIS-I, and 120 XMM-PN observations of the SMC spanning 15 years, creating an unrivaled record of pulsar temporal behavior. In Phase-2, additional observations of SMC pulsars will be added: 221 Chandra (ACIS-S and ACIS-I), 22 XMM-PN, 142 XMM-MOS, 92 Suzaku, 25 NuSTAR, and >10,000 Swift; leveraging our pipeline and analysis techniques already developed. With the addition of 7 Galactic pulsars each having many hundred multisatellite observations, these datasets cover the entire range of variability timescales and accretion regimes. We will model the pulse-profiles using state of the art techniques to parameterize their morphology and obtain the distribution of offsets between magnetic and spin axes, and create samples of profiles under specific accretion modes (whether pencil-beam or fan-beam dominated). These products are needed for the next generation of advances in neutron star theory and modeling. 
The long duration of the dataset and "whole-galaxy" nature of the SMC sample make possible a new statistical approach to uncover the duty-cycle distribution, and hence the population demographics, of transient High Mass X-ray Binary (HMXB) populations. Our unique library is already fueling progress on fundamental NS parameters and accretion physics.

  7. Summer Synthesis Institutes: A Novel Approach for Transformative Research and Student Career Development

    NASA Astrophysics Data System (ADS)

    Wilson, J.; Hermans, C. M.

    2010-12-01

    It is believed that breakthroughs tend to occur when small groups of highly motivated scientists are driven by challenges encountered in real problem-solving situations and given the freedom to experiment with new ideas. Summer synthesis institutes provide a mechanism to facilitate such breakthroughs and to let graduate students engage in interdisciplinary research in a way that is not often available in their normal course of study. In this presentation we examine two complementary models of summer synthesis institutes in hydrology, how these intensive programs facilitate scientific outcomes, and the impact of synthesis and the summer institute model on student perceptions of academic roles, collaboration opportunities, and team science. Five summer synthesis institutes were held over three years, sharing similar duration and structure but differing in the degree of participant interdisciplinarity and in their focus questions. Through informal assessments, this presentation will demonstrate how these programs offered a unique opportunity for the development of student-student and student-mentor relationships and facilitated deeper understanding of a student's own research as well as of new techniques, perspectives, and disciplines. Additionally, though the summer synthesis institute model offers a unique ability to leverage limited funding (on the order of a single graduate student) to advance the earth sciences, the model also presents specific challenges for research follow-through and may require specific content and interpersonal dynamics for optimum success.

  8. Boundary Layer Measurements in a Supersonic Wind Tunnel Using Doppler Global Velocimetry

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Lee, Joseph W.; Cavone, Angelo A.

    2010-01-01

    A modified Doppler Global Velocimeter (DGV) was developed to measure the velocity within the boundary layer above a flat plate in a supersonic flow. Classic laser velocimetry (LV) approaches could not be used since the model surface was composed of a glass-ceramic insulator in support of heat-transfer measurements. Since surface flare limited the use of external LV techniques and windows placed in the model would change the heat transfer characteristics of the flat plate, a novel approach was developed. The input laser beam was divided into nine equal power beams and each transmitted through optical fibers to a small cavity within the model. The beams were then directed through 1.6-mm diameter orifices to form a series of orthogonal beams emitted from the model and aligned with the tunnel centerline to approximate a laser light sheet. Scattered light from 0.1-micron diameter water condensation ice crystals was collected by four 5-mm diameter lenses and transmitted by their respective optical fiber bundles to terminate at the image plane of a standard two-camera DGV receiver. Flow measurements were made over a range from 0.5-mm above the surface to the freestream at Mach 3.51 in steady-state and heat-pulse-injected flows. This technique provides a unique option for measuring boundary layers in supersonic flows where seeding the flow is problematic or where the experimental apparatus does not provide the optical access required by other techniques.

  9. A theory for the retrieval of virtual temperature from winds, radiances and the equations of fluid dynamics

    NASA Technical Reports Server (NTRS)

    Tzvi, G. C.

    1986-01-01

    A technique to deduce the virtual temperature from the combined use of the equations of fluid dynamics, observed wind and observed radiances is described. The wind information could come from ground-based very-high-frequency (VHF) Doppler radars and/or from space-borne Doppler lidars. The radiometers are also assumed to be either space-borne and/or ground-based. From traditional radiometric techniques the vertical structure of the temperature can be estimated only crudely. While it has been known for quite some time that the virtual temperature could be deduced from wind information only, such techniques had to assume the infallibility of certain diagnostic relations. The proposed technique is an extension of the Gal-Chen technique. It is assumed that due to modeling uncertainties the equations of fluid dynamics are satisfied only in a least-squares sense. The retrieved temperature, however, is constrained to reproduce the observed radiances. It is shown that the combined use of the three sources of information (wind, radiances and fluid dynamical equations) can result in a unique determination of the vertical temperature structure with spatial and temporal resolution comparable to that of the observed wind.
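The combined retrieval described above can be caricatured as a weighted least-squares problem: the fluid-dynamics equations are satisfied only approximately, while the radiance integrals are enforced strongly. The sketch below uses a generic linear system and made-up weighting functions, not the actual Gal-Chen formulation; all names and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 20                                              # vertical levels
T_true = 250.0 + 30.0 * np.linspace(0.0, 1.0, n)    # virtual temperature (K)

# Stand-in "fluid-dynamics" equations linking wind to temperature,
# satisfied only approximately (model error added to the right-hand side).
A = rng.normal(size=(40, n))
b = A @ T_true + rng.normal(scale=1.0, size=40)

# Radiance constraints: broad weighting functions whose integrals the
# retrieval must reproduce (nearly) exactly.
W = np.stack([np.exp(-((np.arange(n) - c) / 4.0) ** 2) for c in (4, 10, 16)])
W /= W.sum(axis=1, keepdims=True)
m = W @ T_true

# Weighted least squares: dynamics in a least-squares sense, radiances
# enforced strongly via a large weight beta.
beta = 1e4
M = np.vstack([A, beta * W])
rhs = np.concatenate([b, beta * m])
T_ret, *_ = np.linalg.lstsq(M, rhs, rcond=None)
```

Increasing `beta` tightens the radiance constraint at the expense of the dynamical fit, mirroring the paper's idea of a retrieval that tolerates model error but reproduces the observed radiances.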

  10. Marketing the Uniqueness of Small Towns. Small Town Strategy.

    ERIC Educational Resources Information Center

    Hogg, David H.; Dunn, Douglas

    A small town can strengthen its local economy as a result of business people and concerned citizens collectively identifying that community's uniqueness and then capitalizing on it via advertising, personal selling, sales promotion, or publicity. This publication relates the science of marketing to communities. Seven simple techniques are provided…

  11. Adapting CBT for traumatized refugees and ethnic minority patients: examples from culturally adapted CBT (CA-CBT).

    PubMed

    Hinton, Devon E; Rivera, Edwin I; Hofmann, Stefan G; Barlow, David H; Otto, Michael W

    2012-04-01

    In this article, we illustrate how cognitive behavioral therapy (CBT) can be adapted for the treatment of PTSD among traumatized refugees and ethnic minority populations, providing examples from our treatment, culturally adapted CBT, or CA-CBT. CA-CBT has a unique approach to exposure (typical exposure is poorly tolerated in these groups), emphasizes the treatment of somatic sensations (a particularly salient part of the presentation of PTSD in these groups), and addresses comorbid anxiety disorders and anger. To accomplish these treatment goals, CA-CBT emphasizes emotion exposure and emotion regulation techniques such as meditation and aims to promote emotional and psychological flexibility. We describe 12 key aspects of adapting CA-CBT that make it a culturally sensitive treatment of traumatized refugee and ethnic minority populations. We discuss three models that guide our treatment and that can be used to design culturally sensitive treatments: (a) the panic attack-PTSD model to illustrate the many processes that generate PTSD in these populations, highlighting the role of arousal and somatic symptoms; (b) the arousal triad to demonstrate how somatic symptoms are produced and the importance of targeting comorbid anxiety conditions and psychopathological processes; and (c) the multisystem network (MSN) model of emotional state to reveal how some of our therapeutic techniques (e.g., body-focused techniques: bodily stretching paired with self-statements) bring about psychological flexibility and improvement.

  12. Leadership development in the age of the algorithm.

    PubMed

    Buckingham, Marcus

    2012-06-01

    By now we expect personalized content--it's routinely served up by online retailers and news services, for example. But the typical leadership development program still takes a formulaic, one-size-fits-all approach. And it rarely happens that an excellent technique can be effectively transferred from one leader to all others. Someone trying to adopt a practice from a leader with a different style usually seems stilted and off--a Franken-leader. Breakthrough work at Hilton Hotels and other organizations shows how companies can use an algorithmic model to deliver training tips uniquely suited to each individual's style. It's a five-step process: First, a company must choose a tool with which to identify each person's leadership type. Second, it should assess its best leaders, and third, it should interview them about their techniques. Fourth, it should use its algorithmic model to feed tips drawn from those techniques to developing leaders of the same type. And fifth, it should make the system dynamically intelligent, with user reactions sharpening the content and targeting of tips. The power of this kind of system--highly customized, based on peer-to-peer sharing, and continually evolving--will soon overturn the generic model of leadership development. And such systems will inevitably break through any one organization, until somewhere in the cloud the best leadership tips from all over are gathered, sorted, and distributed according to which ones suit which people best.

  13. Textured silicon nitride: processing and anisotropic properties

    PubMed Central

    Zhu, Xinwen; Sakka, Yoshio

    2008-01-01

    Textured silicon nitride (Si3N4) has been intensively studied over the past 15 years because of its superior thermal and mechanical properties. In this review we present the fundamental aspects of the processing and anisotropic properties of textured Si3N4, with emphasis on the anisotropic and abnormal grain growth of β-Si3N4, texture structure and texture analysis, processing methods and anisotropic properties. On the basis of the texturing mechanisms, the processing methods described in this article have been classified into two types: hot-working (HW) and templated grain growth (TGG). The HW method includes the hot-pressing, hot-forging and sinter-forging techniques, and the TGG method includes the cold-pressing, extrusion, tape-casting and strong magnetic field alignment techniques for β-Si3N4 seed crystals. Each processing technique is thoroughly discussed in terms of theoretical models and experimental data, including the texturing mechanisms and the factors affecting texture development. Also, methods of synthesizing the rodlike β-Si3N4 single crystals are presented. Various anisotropic properties of textured Si3N4 and their origins are thoroughly described and discussed, such as hardness, elastic modulus, bending strength, fracture toughness, fracture energy, creep behavior, tribological and wear behavior, erosion behavior, contact damage behavior and thermal conductivity. Models are analyzed to determine the thermal anisotropy by considering the intrinsic thermal anisotropy, degree of orientation and various microstructure factors. Textured porous Si3N4 with a unique microstructure composed of oriented elongated β-Si3N4 and anisotropic pores is also described for the first time, with emphasis on its unique mechanical and thermal-mechanical properties.
Moreover, as an important related material, textured α-Sialon is also reviewed, because the presence of elongated α-Sialon grains allows the production of textured α-Sialon using the same methods as those used for textured β-Si3N4 and β-Sialon. PMID:27877995

  14. Distinguishing tracheal and esophageal tissues with hyperspectral imaging and fiber-optic sensing

    NASA Astrophysics Data System (ADS)

    Nawn, Corinne D.; Souhan, Brian E.; Carter, Robert, III; Kneapler, Caitlin; Fell, Nicholas; Ye, Jing Yong

    2016-11-01

    During emergency medical situations, where the patient has an obstructed airway or requires respiratory support, endotracheal intubation (ETI) is the medical technique of placing a tube into the trachea in order to facilitate adequate ventilation of the lungs. Complications during ETI, such as repeated attempts, failed intubation, or accidental intubation of the esophagus, can lead to severe consequences or ultimately death. Consequently, a need exists for a feedback mechanism to aid providers in performing successful ETI. Our study examined the spectral reflectance properties of tracheal and esophageal tissue to determine whether a unique spectral profile exists for either tissue for the purpose of detection. The study began by using a hyperspectral camera to image excised pig tissue samples exposed to white and UV light in order to capture the spectral reflectance properties with high fidelity. After identifying a unique spectral characteristic of the trachea that significantly differed from esophageal tissue, a follow-up investigation used a fiber-optic probe to confirm the detectability and consistency of the different reflectance characteristics in a pig model. Our results characterize the unique and consistent spectral reflectance characteristic of tracheal tissue, thereby providing foundational support for exploiting spectral properties to detect the trachea during medical procedures.

  15. Imaging fast electrical activity in the brain with electrical impedance tomography

    PubMed Central

    Aristovich, Kirill Y.; Packham, Brett C.; Koo, Hwan; Santos, Gustavo Sato dos; McEvoy, Andy; Holder, David S.

    2016-01-01

    Imaging of neuronal depolarization in the brain is a major goal in neuroscience, but no technique currently exists that could image neural activity over milliseconds throughout the whole brain. Electrical impedance tomography (EIT) is an emerging medical imaging technique which can produce tomographic images of impedance changes with non-invasive surface electrodes. We report EIT imaging of impedance changes in rat somatosensory cerebral cortex with a resolution of 2 ms and < 200 μm during evoked potentials using epicortical arrays with 30 electrodes. Images were validated with local field potential recordings and current source-sink density analysis. Our results demonstrate that EIT can image neural activity in a volume 7 × 5 × 2 mm in somatosensory cerebral cortex with reduced invasiveness, greater resolution and imaging volume than other methods. Modeling indicates similar resolutions are feasible throughout the entire brain so this technique, uniquely, has the potential to image functional connectivity of cortical and subcortical structures. PMID:26348559

  16. Microscale Patterning of Thermoplastic Polymer Surfaces by Selective Solvent Swelling

    PubMed Central

    Rahmanian, Omid; Chen, Chien-Fu; DeVoe, Don L.

    2012-01-01

    A new method for the fabrication of microscale features in thermoplastic substrates is presented. Unlike traditional thermoplastic microfabrication techniques, in which bulk polymer is displaced from the substrate by machining or embossing, a unique process termed orogenic microfabrication has been developed in which selected regions of a thermoplastic surface are raised from the substrate by an irreversible solvent swelling mechanism. The orogenic technique allows thermoplastic surfaces to be patterned using a variety of masking methods, resulting in three-dimensional features that would be difficult to achieve through traditional microfabrication methods. Using cyclic olefin copolymer as a model thermoplastic material, several variations of this process are described to realize growth heights ranging from several nanometers to tens of microns, with patterning techniques including direct photoresist masking, patterned UV/ozone surface passivation, elastomeric stamping, and noncontact spotting. Orogenic microfabrication is also demonstrated by direct inkjet printing as a facile photolithography-free masking method for rapid desktop thermoplastic microfabrication. PMID:22900539

  17. Tomographic inversion techniques incorporating physical constraints for line integrated spectroscopy in stellarators and tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pablant, N. A.; Bell, R. E.; Bitter, M.

    2014-11-15

    Accurate tomographic inversion is important for diagnostic systems on stellarators and tokamaks which rely on measurements of line integrated emission spectra. A tomographic inversion technique based on spline optimization with enforcement of constraints is described that can produce unique and physically relevant inversions even in situations with noisy or incomplete input data. This inversion technique is routinely used in the analysis of data from the x-ray imaging crystal spectrometer (XICS) installed at the Large Helical Device. The XICS diagnostic records a 1D image of line integrated emission spectra from impurities in the plasma. Through the use of Doppler spectroscopy and tomographic inversion, XICS can provide profile measurements of the local emissivity, temperature, and plasma flow. Tomographic inversion requires the assumption that these measured quantities are flux surface functions, and that a known plasma equilibrium reconstruction is available. In the case of low signal levels or partial spatial coverage of the plasma cross-section, standard inversion techniques utilizing matrix inversion and linear regularization often cannot produce unique and physically relevant solutions. The addition of physical constraints, such as parameter ranges, derivative directions, and boundary conditions, allows unique solutions to be reliably found. The constrained inversion technique described here utilizes a modified Levenberg-Marquardt optimization scheme, which introduces a condition avoidance mechanism by selective reduction of search directions. The constrained inversion technique also allows for the addition of more complicated parameter dependencies, for example, geometrical dependence of the emissivity due to asymmetries in the plasma density arising from fast rotation. The accuracy of this constrained inversion technique is discussed, with an emphasis on its applicability to systems with limited plasma coverage.
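The constrained-inversion idea can be illustrated with a minimal sketch (not the XICS pipeline itself): line-integrated chord measurements are modeled as a geometry matrix acting on a radial emissivity profile, and the profile is recovered with a bounded least-squares solve (SciPy's trust-region reflective solver, a bounded relative of Levenberg-Marquardt), so that the positivity constraint rules out unphysical solutions. The chord geometry, profile shape, and smoothness weight below are all invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical chord geometry: path length of each line of sight through
# nested circular flux shells (a classic Abel-type matrix).
n_chords, n_shells = 12, 8
p = np.linspace(0.0, 0.9, n_chords)          # chord impact parameters
edges = np.linspace(0.0, 1.0, n_shells + 1)  # shell boundaries

def chord_length(pi, r_in, r_out):
    inner = np.sqrt(max(r_in**2 - pi**2, 0.0))
    outer = np.sqrt(max(r_out**2 - pi**2, 0.0))
    return 2.0 * (outer - inner)

A = np.array([[chord_length(pi, edges[j], edges[j + 1]) for j in range(n_shells)]
              for pi in p])

# "True" emissivity profile (flux-surface function) and noisy chord data.
r = 0.5 * (edges[:-1] + edges[1:])
e_true = np.exp(-((r - 0.3) / 0.3) ** 2)
rng = np.random.default_rng(0)
y = A @ e_true + rng.normal(scale=0.002, size=n_chords)

# Bounded least squares with a mild smoothness penalty on the profile;
# bounds=(0, inf) enforces the physical positivity constraint.
lam = 0.01
def residual(e):
    return np.concatenate([A @ e - y, lam * np.diff(e, 2)])

sol = least_squares(residual, x0=np.ones(n_shells), bounds=(0.0, np.inf))
```

Without the bounds and the smoothness term, noisy or partial chord coverage can make the matrix inversion non-unique, which is exactly the failure mode the constrained technique addresses.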

  18. Tomographic inversion techniques incorporating physical constraints for line integrated spectroscopy in stellarators and tokamaks

    DOE PAGES

    Pablant, N. A.; Bell, R. E.; Bitter, M.; ...

    2014-08-08

    Accurate tomographic inversion is important for diagnostic systems on stellarators and tokamaks which rely on measurements of line integrated emission spectra. A tomographic inversion technique based on spline optimization with enforcement of constraints is described that can produce unique and physically relevant inversions even in situations with noisy or incomplete input data. This inversion technique is routinely used in the analysis of data from the x-ray imaging crystal spectrometer (XICS) installed at LHD. The XICS diagnostic records a 1D image of line integrated emission spectra from impurities in the plasma. Through the use of Doppler spectroscopy and tomographic inversion, XICS can provide profile measurements of the local emissivity, temperature and plasma flow. Tomographic inversion requires the assumption that these measured quantities are flux surface functions, and that a known plasma equilibrium reconstruction is available. In the case of low signal levels or partial spatial coverage of the plasma cross-section, standard inversion techniques utilizing matrix inversion and linear regularization often cannot produce unique and physically relevant solutions. The addition of physical constraints, such as parameter ranges, derivative directions, and boundary conditions, allows unique solutions to be reliably found. The constrained inversion technique described here utilizes a modified Levenberg-Marquardt optimization scheme, which introduces a condition avoidance mechanism by selective reduction of search directions. The constrained inversion technique also allows for the addition of more complicated parameter dependencies, for example, geometrical dependence of the emissivity due to asymmetries in the plasma density arising from fast rotation. The accuracy of this constrained inversion technique is discussed, with an emphasis on its applicability to systems with limited plasma coverage.

  19. CAMD studies of coal structure and coal liquefaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faulon, J.L.; Carlson, G.A.

    Knowledge of the macromolecular structure of coal is essential for understanding the mechanisms at work during coal liquefaction. Many attempts to model coal structure can be found in the literature. For high-volatile bituminous coal, the subject of interest here, the most commonly cited models are those of Given, Wiser, Solomon, and Shinn. In past work, the authors have used computer-aided molecular design (CAMD) to develop three-dimensional representations for the above coal models. The three-dimensional structures were energy minimized using molecular mechanics and molecular dynamics. True density and micropore volume were evaluated for each model. With the exception of Given's model, the computed density values were found to be in agreement with the corresponding experimental results. The above coal models were constructed by a trial-and-error technique consisting of a manual fitting of the analytical data. It is obvious that for each model the amount of data is small compared to the actual complexity of coal, and for all of the models more than one structure can be built. Hence, the process by which one structure is chosen instead of another is not clear. In fact, all the authors agree that the structure they derived was intended only to represent an "average" coal model rather than a unique correct structure. The purpose of this program is to further develop CAMD techniques to increase the understanding of coal structure and its relationship to coal liquefaction.

  20. Optimization techniques for integrating spatial data

    USGS Publications Warehouse

    Herzfeld, U.C.; Merriam, D.F.

    1995-01-01

    Two optimization techniques to predict a spatial variable from any number of related spatial variables are presented. The applicability of the two different methods for petroleum-resource assessment is tested in a mature oil province of the Midcontinent (USA). The information on petroleum productivity, usually not directly accessible, is related indirectly to geological, geophysical, petrographical, and other observable data. This paper presents two approaches based on construction of a multivariate spatial model from the available data to determine a relationship for prediction. In the first approach, the variables are combined into a spatial model by an algebraic map-comparison/integration technique. Optimal weights for the map-comparison function are determined by the Nelder-Mead downhill simplex algorithm in multidimensions. Geologic knowledge is necessary to provide a first guess of weights to start the automation, because the solution is not unique. In the second approach, active-set optimization for linear prediction of the target under positivity constraints is applied. Here, the procedure seems to select one variable from each data type (structure, isopachous, and petrophysical), eliminating data redundancy. Automating the determination of optimum combinations of different variables by applying optimization techniques is a valuable extension of the algebraic map-comparison/integration approach to analyzing spatial data. Because of the capability of handling multivariate data sets and partial retention of geographical information, the approaches can be useful in mineral-resource exploration. © 1995 International Association for Mathematical Geology.
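The first approach, tuning the weights of an algebraic map combination with the Nelder-Mead downhill simplex, can be sketched as follows. The grids and the hidden weights are synthetic stand-ins, not the Midcontinent data; note how a first guess seeds the simplex, mirroring the role of geologic knowledge in the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

# Three synthetic predictor maps on a 20x20 grid (think: structure,
# isopachous, and petrophysical layers) and a target map built from
# hidden weights that the optimizer must recover.
maps = rng.normal(size=(3, 20, 20))
true_w = np.array([0.6, 0.3, 0.1])
target = np.tensordot(true_w, maps, axes=1)

def misfit(w):
    # Sum-of-squares mismatch between the weighted combination and the target.
    pred = np.tensordot(w, maps, axes=1)
    return np.sum((pred - target) ** 2)

# "Geologic knowledge" supplies the first guess; Nelder-Mead refines it.
w0 = np.array([0.4, 0.4, 0.2])
res = minimize(misfit, w0, method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-10})
```

With real, noisy data the misfit surface has no single sharp minimum, which is why the paper stresses that the solution is not unique and a sensible starting guess matters.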

  1. a Single-Exposure Dual-Energy Computed Radiography Technique for Improved Nodule Detection and Classification in Chest Imaging

    NASA Astrophysics Data System (ADS)

    Zink, Frank Edward

    The detection and classification of pulmonary nodules is of great interest in chest radiography. Nodules are often indicative of primary cancer, and their detection is particularly important in asymptomatic patients. The ability to classify nodules as calcified or non-calcified is important because calcification is a positive indicator that the nodule is benign. Dual-energy methods offer the potential to improve both the detection and classification of nodules by allowing the formation of material-selective images. Tissue-selective images can improve detection by virtue of the elimination of obscuring rib structure. Bone-selective images are essentially calcium images, allowing classification of the nodule. A dual-energy technique is introduced which uses a computed radiography system to acquire dual-energy chest radiographs in a single exposure. All aspects of the dual-energy technique are described, with particular emphasis on scatter-correction, beam-hardening correction, and noise-reduction algorithms. The adaptive noise-reduction algorithm employed improves material-selective signal-to-noise ratio by up to a factor of seven with minimal sacrifice in selectivity. A clinical comparison study is described, undertaken to compare the dual-energy technique to conventional chest radiography for the tasks of nodule detection and classification. Observer performance data were collected using the free-response receiver operating characteristic (FROC) method and the binormal alternative FROC (AFROC) performance model. Results of the comparison study, analyzed using two common multiple-observer statistical models, showed that the dual-energy technique was superior to conventional chest radiography for detection of nodules at a statistically significant level (p < .05).
Discussion of the comparison study emphasizes the unique combination of data collection and analysis techniques employed, as well as the limitations of comparison techniques in the larger context of technology assessment.
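The material-selective imaging behind this technique rests on a standard dual-energy decomposition: the log signal at each beam energy is linear in the basis-material thicknesses, so a weighted subtraction of the two log images cancels one material. A minimal numerical sketch with made-up attenuation coefficients and thickness maps (not the calibrated values of the actual system):

```python
import numpy as np

# Hypothetical linear attenuation coefficients (1/cm) at low/high kVp.
mu_tissue = {"lo": 0.25, "hi": 0.18}
mu_bone = {"lo": 0.60, "hi": 0.30}

# Thickness maps (cm) on a small grid: a "nodule" in tissue plus a rib.
t_tissue = np.full((8, 8), 10.0)
t_tissue[3:5, 3:5] += 1.0          # nodule adds tissue-equivalent thickness
t_bone = np.zeros((8, 8))
t_bone[2, :] = 1.5                 # a rib crossing the field

def log_signal(energy):
    # -ln(I/I0) = mu_tissue * t_tissue + mu_bone * t_bone at this energy.
    return mu_tissue[energy] * t_tissue + mu_bone[energy] * t_bone

s_lo, s_hi = log_signal("lo"), log_signal("hi")

# Weight chosen to cancel the bone contribution: w = mu_bone_hi / mu_bone_lo.
w = mu_bone["hi"] / mu_bone["lo"]
tissue_image = s_hi - w * s_lo     # proportional to tissue thickness only
```

In this toy image, the rib vanishes from `tissue_image` while the nodule remains, which is the mechanism by which tissue-selective images remove obscuring rib structure.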

  2. Exploring Biological Classification: The Unique Organism Project

    ERIC Educational Resources Information Center

    Haines, Sarah; Richman, Laila; Hartley, Renee; Schmid, Rachel

    2017-01-01

    The unique organism project was designed as a culminating assessment for a biological classification unit in a middle school setting. Students developed a model to represent their unique organism. Using the model, students were required to demonstrate how their unique organism interacts with its environment, and how its internal and external…

  3. Engineering 3D Models of Tumors and Bone to Understand Tumor-Induced Bone Disease and Improve Treatments

    PubMed Central

    Kwakwa, Kristin A.; Vanderburgh, Joseph P.; Guelcher, Scott A.

    2018-01-01

    Purpose of Review Bone is a structurally unique microenvironment that presents many challenges for the development of 3D models for studying bone physiology and diseases, including cancer. As researchers continue to investigate the interactions within the bone microenvironment, the development of 3D models of bone has become critical. Recent Findings 3D models have been developed that replicate some properties of bone, but have not fully reproduced the complex structural and cellular composition of the bone microenvironment. This review will discuss 3D models including polyurethane, silk, and collagen scaffolds that have been developed to study tumor-induced bone disease. In addition, we discuss 3D printing techniques used to better replicate the structure of bone. Summary 3D models that better replicate the bone microenvironment will help researchers better understand the dynamic interactions between tumors and the bone microenvironment, ultimately leading to better models for testing therapeutics and predicting patient outcomes. PMID:28646444

  4. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  5. Development of Tokamak Transport Solvers for Stiff Confinement Systems

    NASA Astrophysics Data System (ADS)

    St. John, H. E.; Lao, L. L.; Murakami, M.; Park, J. M.

    2006-10-01

    Leading transport models such as GLF23 [1] and MM95 [2] describe turbulent plasma energy, momentum and particle flows. In order to accommodate existing transport codes and associated solution methods, effective diffusivities have to be derived from these turbulent flow models. This can cause significant problems in predicting unique solutions. We have developed a parallel transport code solver, GCNMP, that can accommodate both flow-based and diffusivity-based confinement models by solving the discretized nonlinear equations using modern Newton, trust region, steepest descent and homotopy methods. We present our latest development efforts, including multiple dynamic grids, application of two-level parallel schemes, and operator splitting techniques that allow us to combine flow-based and diffusivity-based models in tokamak simulations. [1] R.E. Waltz, et al., Phys. Plasmas 4, 7 (1997). [2] G. Bateman, et al., Phys. Plasmas 5, 1793 (1998).

  6. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  7. Basin Scale Estimates of Evapotranspiration Using GRACE and other Observations

    NASA Technical Reports Server (NTRS)

    Rodell, M.; Famiglietti, J. S.; Chen, J.; Seneviratne, S. I.; Viterbo, P.; Holl, S.; Wilson, C. R.

    2004-01-01

    Evapotranspiration is integral to studies of the Earth system, yet it is difficult to measure on regional scales. One estimation technique is a terrestrial water budget, i.e., total precipitation minus the sum of evapotranspiration and net runoff equals the change in water storage. Gravity Recovery and Climate Experiment (GRACE) satellite gravity observations are now enabling closure of this equation by providing the terrestrial water storage change. Equations are presented here for estimating evapotranspiration using observation-based information, taking into account the unique nature of GRACE observations. GRACE water storage changes are first substantiated by comparing with results from a land surface model and a combined atmospheric-terrestrial water budget approach. Evapotranspiration is then estimated for 14 time periods over the Mississippi River basin and compared with output from three modeling systems. The GRACE estimates generally lie in the middle of the models and may provide skill in evaluating modeled evapotranspiration.
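The budget stated above rearranges directly to ET = P - R - dS, where GRACE supplies the storage change dS. The computation itself is trivial; the numbers below are illustrative, not the Mississippi basin values.

```python
def evapotranspiration(precip_mm, net_runoff_mm, storage_change_mm):
    """Basin-mean ET from the terrestrial water budget:
    P - (ET + R) = dS  =>  ET = P - R - dS
    (all quantities in mm of water over the same period)."""
    return precip_mm - net_runoff_mm - storage_change_mm

# Illustrative monthly values: 80 mm precipitation, 20 mm net runoff,
# and a GRACE-derived storage increase of 10 mm.
et = evapotranspiration(80.0, 20.0, 10.0)
print(et)  # 50.0
```

A storage drawdown (negative dS) raises the inferred ET, which is why closing the budget requires the GRACE storage-change term rather than assuming steady state.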

  8. Damascus Steel Revisited

    NASA Astrophysics Data System (ADS)

    Verhoeven, J. D.; Pendray, A. H.; Dauksch, W. E.; Wagstaff, S. R.

    2018-05-01

    A review is given of the work we presented in the 1990s that successfully developed a technique for reproducing the surface patterns and internal microstructure of genuine Damascus steel blades. That work showed that a key factor in making these blades was the addition of quite small levels of carbide-forming elements, notably V. Experiments are presented for blades made from slow- and fast-cooled ingots, and the results support our previous hypothesis that the internal banded microstructure results from microsegregation of V between dendrites during ingot solidification. A hypothetical model was presented for the mechanism causing the unique internal microstructure that gives rise to the surface pattern forming during the forging of the ingots from which the blades are made. This article attempts to explain the model more clearly and presents some literature data that offer support to the model. It also discusses an alternate model recently proposed by Foll.

  9. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  10. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures.

    PubMed

    Montgomery, L D; Montgomery, R W; Guisado, R

    1995-05-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  11. Single-Shot Measurement of Temporally-Dependent Polarization State of Femtosecond Pulses by Angle-Multiplexed Spectral-Spatial Interferometry

    NASA Astrophysics Data System (ADS)

    Lin, Ming-Wei; Jovanovic, Igor

    2016-09-01

    We demonstrate that temporally-dependent polarization states of ultrashort laser pulses can be reconstructed in a single shot by use of angle-multiplexed spatial-spectral interferometry. This is achieved by introducing two orthogonally polarized reference pulses and interfering them with an arbitrarily polarized ultrafast pulse under measurement. A unique calibration procedure is developed for this technique which facilitates the subsequent polarization state measurements. The accuracy of several reconstructed polarization states is verified by comparison with those obtained from an analytic model that predicts the polarization state on the basis of its method of production. Laser pulses with mJ-level energies were characterized via this technique, including a time-dependent polarization state that can be used for polarization-gating of high-harmonic generation for production of attosecond pulses.

  12. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures

    NASA Technical Reports Server (NTRS)

    Montgomery, L. D.; Montgomery, R. W.; Guisado, R.

    1995-01-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  13. Application of Hyperspectral Remote Sensing Techniques to Evaluate Water Quality in Turbid Coastal Waters of South Carolina.

    NASA Astrophysics Data System (ADS)

    Ali, K. A.; Ryan, K.

    2014-12-01

    Coastal and inland waters represent a diverse set of resources that support natural habitat and provide valuable ecosystem services to the human population. Conventional techniques to monitor water quality using in situ sensors and laboratory analysis of water samples can be very time- and cost-intensive. Alternatively, remote sensing techniques offer better spatial coverage and temporal resolution with which to accurately characterize dynamic and unique water quality parameters. Existing remote sensing ocean color products, such as the water quality proxy chlorophyll-a, are based on ocean-derived bio-optical models that are primarily calibrated in Case 1 type waters. These traditional models fail when applied in turbid (Case 2 type) coastal waters due to spectral interference from other associated color-producing agents such as colored dissolved organic matter and suspended sediments. In this work, we introduce a novel technique for the predictive modeling of chlorophyll-a using a multivariate approach applied to in situ hyperspectral radiometric data collected from the coastal waters of Long Bay, South Carolina. This method uses a partial least-squares regression model to identify prominent wavelengths that are more sensitive to chlorophyll-a relative to other associated color-producing agents. The new model was able to explain 80% of the observed chlorophyll-a variability in Long Bay with RMSE = 2.03 μg/L. This approach capitalizes on the spectral advantage gained from current and future hyperspectral sensors, thus providing a more robust predictive model. This enhanced mode of water quality monitoring in marine environments will provide insight into point sources and problem areas that may contribute to a decline in water quality. The utility of this tool lies in its applicability to a diverse set of coastal waters and its use by coastal and fisheries managers for recreation, regulation, economic, and public health purposes.
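    The paper's method fits a partial least-squares regression across the full spectrum; as a much-simplified stand-in for inspecting which wavelengths carry the chlorophyll-a signal, one can rank bands by their absolute Pearson correlation with the pigment concentration. All data below are synthetic, and this ranking is not the PLS procedure itself:

```python
from math import sqrt

def pearson(xs, ys):
    """Sample Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_wavelengths(reflectance, chl):
    """Rank spectral bands by |correlation| with chlorophyll-a.

    reflectance maps wavelength (nm) -> per-sample values; chl holds the
    matching per-sample chlorophyll-a concentrations.
    """
    scores = {wl: abs(pearson(vals, chl)) for wl, vals in reflectance.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Synthetic two-band example: the 675 nm band tracks chlorophyll-a.
chl = [1.0, 2.0, 3.0, 4.0]
bands = {675: [2.0, 4.0, 6.0, 8.0], 550: [5.0, 5.0, 6.0, 5.0]}
```

    PLS improves on this per-band view by building latent components that account for covariance among bands, which matters when other color-producing agents interfere.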

  14. Stardust to Planetesimals: A Chondrule Connection?

    NASA Technical Reports Server (NTRS)

    Paque, Julie; Bunch, Ted

    1997-01-01

    The unique nature of chondrules has been known for nearly two centuries. Modern techniques of analysis have shown that these millimeter sized silicate objects are among the oldest objects in our solar system. Researchers have devised textural and chemical classification systems for chondrules in an effort to determine their origins. It is agreed that most chondrules were molten at some point in their history, and experimental analogs suggest that the majority of chondrules formed from temperatures below 1600 C at cooling rates in the range of hundreds of degrees per hour. Although interstellar grains are present in chondrite matrices, their contribution as precursors to chondrule formation is unknown. Models for chondrule formation focus on the pre-planetary solar nebula conditions, although planetary impact models have had proponents.

  15. Scheduling Earth Observing Fleets Using Evolutionary Algorithms: Problem Description and Approach

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Morris, Robert; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We describe work in progress concerning multi-instrument, multi-satellite scheduling. Most, although not all, Earth observing instruments currently in orbit are unique. In the relatively near future, however, we expect to see fleets of Earth observing spacecraft, many carrying nearly identical instruments. This presents a substantially new scheduling challenge. Inspired by successful commercial applications of evolutionary algorithms in scheduling domains, this paper explores the use of evolutionary algorithms to solve a set of Earth observing related model problems. Both the model problems and the software are described. Since the larger problems will require substantial computation and evolutionary algorithms are embarrassingly parallel, we discuss our parallelization techniques using dedicated and cycle-scavenged workstations.
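    A model problem of this kind can be sketched as a toy genetic algorithm that assigns observation requests to a fleet of identical satellites. The fleet size, visibility sets, capacity, and GA settings below are all invented for illustration and are not from the paper:

```python
import random

# Toy Earth-observing scheduling model problem (all numbers invented):
# assign each observation request to one satellite in a small fleet.
random.seed(0)
REQUESTS = 12   # observation requests to schedule
SATS = 3        # satellites in the fleet
CAPACITY = 4    # requests one satellite can accommodate
VISIBLE = [set(random.sample(range(SATS), random.randint(1, SATS)))
           for _ in range(REQUESTS)]   # which satellites can view each target

def fitness(genome):
    """Reward requests assigned to a satellite that can view them; penalize
    exceeding any satellite's capacity."""
    load = [0] * SATS
    score = 0
    for req, sat in enumerate(genome):
        load[sat] += 1
        if sat in VISIBLE[req]:
            score += 1
    overload = sum(max(0, n - CAPACITY) for n in load)
    return score - 2 * overload

def evolve(generations=60, pop_size=30, mut_rate=0.1):
    pop = [[random.randrange(SATS) for _ in range(REQUESTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, REQUESTS)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(REQUESTS):             # point mutation
                if random.random() < mut_rate:
                    child[i] = random.randrange(SATS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    Because each fitness evaluation is independent, populations like this parallelize trivially across workstations, which is the "embarrassingly parallel" property the abstract exploits.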

  16. Comparison of the resulting error in data fusion techniques when used with remote sensing, earth observation, and in-situ data sets for water quality applications

    NASA Astrophysics Data System (ADS)

    Ziemba, Alexander; El Serafy, Ghada

    2016-04-01

    Ecological modeling and water quality investigations are complex processes which can require a high level of parameterization and a multitude of varying data sets in order to properly execute the model in question. Since models are generally complex, their calibration and validation can benefit from the application of data and information fusion techniques. The data applied to ecological models comes from a wide range of sources such as remote sensing, earth observation, and in-situ measurements, resulting in a high variability in the temporal and spatial resolution of the various data sets available to water quality investigators. It is proposed that effective fusion into a comprehensive singular set will provide a more complete and robust data resource with which models can be calibrated, validated, and driven. Each individual product contains a unique valuation of error resulting from the method of measurement and application of pre-processing techniques. The uncertainty and error is further compounded when the data being fused is of varying temporal and spatial resolution. In order to have a reliable fusion-based model and data set, the uncertainty of the results and confidence interval of the data being reported must be effectively communicated to those who would utilize the data product or model outputs in a decision making process [2]. Here we review an array of data fusion techniques applied to various remote sensing, earth observation, and in-situ data sets whose domains vary in spatial and temporal resolution. The data sets examined are combined in a manner so that the various classifications of data, complementary, redundant, and cooperative, are all assessed to determine each classification's impact on the propagation and compounding of error. In order to assess the error of the fused data products, a comparison is conducted with data sets containing a known confidence interval and quality rating.
We conclude with a quantification of the performance of the data fusion techniques and a recommendation on the feasibility of applying the fused products in operating forecast systems and modeling scenarios. The error bands and confidence intervals derived can be used to clarify the error and confidence of water quality variables produced by prediction and forecasting models. References [1] F. Castanedo, "A Review of Data Fusion Techniques", The Scientific World Journal, vol. 2013, pp. 1-19, 2013. [2] T. Keenan, M. Carbone, M. Reichstein and A. Richardson, "The model-data fusion pitfall: assuming certainty in an uncertain world", Oecologia, vol. 167, no. 3, pp. 587-597, 2011.

  17. Examining the Predictive Relations between Two Aspects of Self-Regulation and Growth in Preschool Children’s Early Literacy Skills

    PubMed Central

    Lonigan, Christopher J.; Allan, Darcey M.; Phillips, Beth M.

    2016-01-01

    There is strong evidence that self-regulatory processes are linked to early academic skills both concurrently and longitudinally. The majority of extant longitudinal studies, however, have been conducted using autoregressive techniques that may not accurately model change across time. The purpose of this study was to examine the unique associations between two components of self-regulation, attention and executive functioning (EF), and growth in early literacy skills over the preschool year using latent-growth-curve analysis. The sample included 1,082 preschool children (M-age = 55.0 months, SD = 3.73). Children completed measures of vocabulary, syntax, phonological awareness, print knowledge, cognitive ability, and self-regulation, and children’s classroom teachers completed a behavior rating measure. To examine the independent relations of the self-regulatory skills and cognitive ability with children’s initial early literacy skills and growth across the preschool year, growth models in which the intercept and slope were simultaneously regressed on each of the predictor variables were examined. Because of the significant relation between intercept and slope for most outcomes, slope was regressed on intercept in the models to allow a determination of direct and indirect effects of the predictors on growth in children’s language and literacy skills across the preschool year. In general, both teacher-rated inattention and directly measured EF were uniquely associated with initial skills level; however, only teacher-rated inattention uniquely predicted growth in early literacy skills. These findings suggest that teacher-ratings of inattention may measure an aspect of self-regulation that is particularly associated with the acquisition of academic skills in early childhood. PMID:27854463

  18. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation, followed by stochastic modification of the adapted model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  19. Primordial germ cell-mediated transgenesis and genome editing in birds.

    PubMed

    Han, Jae Yong; Park, Young Hyun

    2018-01-01

    Transgenesis and genome editing in birds are based on a unique germline transmission system using primordial germ cells (PGCs), which is quite different from the mammalian transgenic and genome editing system. PGCs are progenitor cells of gametes that can deliver genetic information to the next generation. Since avian PGCs were first discovered in the nineteenth century, there have been numerous efforts to reveal their origin, specification, and unique migration pattern, and to improve germline transmission efficiency. Recent advances in the isolation and in vitro culture of avian PGCs with genetic manipulation and genome editing tools enable the development of valuable avian models that were unavailable before. However, many challenges remain in the production of transgenic and genome-edited birds, including the precise control of germline transmission, introduction of exogenous genes, and genome editing in PGCs. Therefore, establishing reliable germline-competent PGCs and applying precise genome editing systems are critical current issues in the production of avian models. Here, we introduce a historical overview of avian PGCs and their application, including improved techniques and methodologies in the production of transgenic and genome-edited birds, and we discuss the future potential applications of transgenic and genome-edited birds to provide opportunities and benefits for humans.

  20. Real-time imaging of subarachnoid hemorrhage in piglets with electrical impedance tomography.

    PubMed

    Dai, Meng; Wang, Liang; Xu, Canhua; Li, Lianfeng; Gao, Guodong; Dong, Xiuzhen

    2010-09-01

    Subarachnoid hemorrhage (SAH) is one of the most severe medical emergencies in neurosurgery. Early detection or diagnosis would significantly reduce the rate of disability and mortality, and improve the prognosis of the patients. Although the present medical imaging techniques generally have high sensitivity to identify bleeding, the use of an additional, non-invasive imaging technique capable of continuously monitoring SAH is required to prevent contingent bleeding or re-bleeding. In this study, electrical impedance tomography (EIT) was applied to detect the onset of SAH modeled on eight piglets in real time, with the subsequent process being monitored continuously. The experimental SAH model was introduced by one-time injection of 5 ml fresh autologous arterial blood into the cisterna magna. Results showed that resistivity variations within the brain caused by the added blood could be detected using the EIT method and may be associated not only with the resistivity difference among brain tissues, but also with variations of cerebrospinal fluid dynamics. In conclusion, EIT has unique potential for use in clinical practice to provide invaluable real-time neuroimaging data for SAH after the improvement of electrode design, anisotropic realistic modeling and instrumentation.

  1. Ice detection and classification on an aircraft wing with ultrasonic shear horizontal guided waves.

    PubMed

    Gao, Huidong; Rose, Joseph L

    2009-02-01

    Ice accumulation on airfoils has been identified as a primary cause of many accidents in commercial and military aircraft. To improve aviation safety as well as reduce cost and environmental threats related to aircraft icing, sensitive, reliable, and aerodynamically compatible ice detection techniques are in great demand. Ultrasonic guided-wave-based techniques have proven reliable for "go" and "no go" types of ice detection in some systems, including the HALO system, to which the second author of this paper is a primary contributor. In this paper, we propose a new model that takes the ice layer into account in guided-wave modeling. Using this model, the thickness and type of ice formation can be determined from guided-wave signals. Five experimental schemes are also proposed in this paper based on some unique features identified from the guided-wave dispersion curves. A sample experiment is also presented in this paper, in which a 1 mm thick glaze ice layer on a 2 mm aluminum plate is clearly detected. Quantitative agreement of the experimental data with theoretical prediction serves as strong support for future implementation of the other testing schemes proposed in this paper.

  2. Analysis of genomic sequences by Chaos Game Representation.

    PubMed

    Almeida, J S; Carriço, J A; Maretzek, A; Noble, P A; Fletcher, M

    2001-05-01

    Chaos Game Representation (CGR) is an iterative mapping technique that processes sequences of units, such as nucleotides in a DNA sequence or amino acids in a protein, in order to find the coordinates for their position in a continuous space. This distribution of positions has two properties: it is unique, and the source sequence can be recovered from the coordinates, such that distance between positions measures similarity between the corresponding sequences. The possibility of using the latter property to identify succession schemes has been entirely overlooked in previous studies, which raises the possibility that CGR may be upgraded from a mere representation technique to a sequence modeling tool. The distribution of positions in the CGR plane was shown to be a generalization of Markov chain probability tables that accommodates non-integer orders. Therefore, Markov models are particular cases of CGR models rather than the reverse, as currently accepted. In addition, the CGR generalization has both practical (computational efficiency) and fundamental (scale independence) advantages. These results are illustrated by using Escherichia coli K-12 as a test data-set, in particular, the genes thrA, thrB and thrC of the threonine operon.
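    The iterative CGR mapping is simple to state: each successive unit pulls the current point halfway toward that unit's assigned corner of the unit square. A minimal sketch for DNA; the corner assignment below is one common convention, and the paper's may differ:

```python
# Chaos Game Representation for DNA: each successive nucleotide pulls the
# current point halfway toward that base's corner of the unit square.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_coords(seq, start=(0.5, 0.5)):
    """Return the CGR point visited after reading each base of seq."""
    x, y = start
    points = []
    for base in seq.upper():
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points.append((x, y))
    return points
```

    The final point encodes the sequence read so far: each additional base confines the point to an ever-smaller sub-square, which is the uniqueness property that links CGR position densities to Markov-chain probability tables.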

  3. Global dynamics of a network-based SIQRS epidemic model with demographics and vaccination

    NASA Astrophysics Data System (ADS)

    Huang, Shouying; Chen, Fengde; Chen, Lijuan

    2017-02-01

    This paper investigates a new SIQRS epidemic model with demographics and vaccination on complex heterogeneous networks. We analytically derive the basic reproduction number R0, which determines not only the existence of endemic equilibrium but also the global dynamics of the model. The permanence of the disease and the globally asymptotical stability of the disease-free equilibrium are proved in detail. By using a monotone iterative technique, we show that the unique endemic equilibrium is globally attractive under certain conditions. Our results improve and extend those of Li et al. (2014) [14]. Interestingly, the basic reproduction number R0 bears no relation to the degree-dependent birth, but our simulations indicate that the degree-dependent birth does affect the epidemic dynamics. Furthermore, we find that quarantine plays a more active role than vaccination in controlling the disease.
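    The paper's model is degree-structured on a network, but the compartment flows of an SIQRS system can be illustrated with a homogeneous (mean-field) simplification. All parameter values below are invented for illustration and are not those of the paper:

```python
# Mean-field (homogeneous) SIQRS sketch with births, deaths, vaccination
# and loss of immunity. This is a simplification of the network model:
# state = (S, I, Q, R) as population fractions; rates are illustrative.
def siqrs_step(state, p, dt):
    """One forward-Euler step of the mean-field SIQRS equations."""
    S, I, Q, R = state
    dS = p["b"] - p["beta"] * S * I - (p["d"] + p["v"]) * S + p["xi"] * R
    dI = p["beta"] * S * I - (p["d"] + p["delta"] + p["gamma"]) * I
    dQ = p["delta"] * I - (p["d"] + p["eps"]) * Q
    dR = p["v"] * S + p["gamma"] * I + p["eps"] * Q - (p["d"] + p["xi"]) * R
    return (S + dt * dS, I + dt * dI, Q + dt * dQ, R + dt * dR)

def simulate(p, state=(0.99, 0.01, 0.0, 0.0), dt=0.01, steps=20000):
    for _ in range(steps):
        state = siqrs_step(state, p, dt)
    return state

# Illustrative rates: birth b balances death d, so total population is conserved.
params = {"b": 0.02, "d": 0.02, "beta": 0.6, "delta": 0.2,
          "gamma": 0.1, "eps": 0.1, "xi": 0.05, "v": 0.05}
final = simulate(params)
```

    In this simplification, infection grows only while the effective transmission βS outweighs removal from the infectious class at rate d + δ + γ, echoing the threshold role that R0 plays in the full network model; note how the quarantine rate δ enters that removal term directly.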

  4. A Checklist for Successful Quantitative Live Cell Imaging in Systems Biology

    PubMed Central

    Sung, Myong-Hee

    2013-01-01

    Mathematical modeling of signaling and gene regulatory networks has provided unique insights about systems behaviors for many cell biological problems of medical importance. Quantitative single cell monitoring has a crucial role in advancing systems modeling of molecular networks. However, due to the multidisciplinary techniques that are necessary for adaptation of such systems biology approaches, dissemination to a wide research community has been relatively slow. In this essay, I focus on some technical aspects that are often under-appreciated, yet critical in harnessing live cell imaging methods to achieve single-cell-level understanding and quantitative modeling of molecular networks. The importance of these technical considerations will be elaborated with examples of successes and shortcomings. Future efforts will benefit by avoiding some pitfalls and by utilizing the lessons collectively learned from recent applications of imaging in systems biology. PMID:24709701

  5. A New Real - Time Fault Detection Methodology for Systems Under Test. Phase 1

    NASA Technical Reports Server (NTRS)

    Johnson, Roger W.; Jayaram, Sanjay; Hull, Richard A.

    1998-01-01

    The purpose of this research is focused on the identification and demonstration of critical technology innovations that will be applied to various applications, viz. automated machine health monitoring (HM) and real-time data analysis and control of Systems Under Test (SUT). This new innovation, using a High Fidelity Dynamic Model-based Simulation (HFDMS) approach, will be used to implement a real-time monitoring, Test and Evaluation (T&E) methodology that includes the transient behavior of the system under test. The unique element of this process control technique is the use of high fidelity, computer generated dynamic models to replicate the behavior of actual Systems Under Test (SUT). It will provide a dynamic simulation capability that becomes the reference truth model, from which comparisons are made with the actual raw/conditioned data from the test elements.

  6. The Unique School Environment of Rural Children.

    ERIC Educational Resources Information Center

    Dodendorf, Diane M.

    Recorded observations, camera work, conversations with 19 children in grades K-4 in a Nebraska 2-room school house, and interviews with the teacher were techniques used to assess the advantages and disadvantages of the small rural school environment and its impact on children. Five attributes were found to be significant and unique small school…

  7. Facilitating and securing offline e-medicine service through image steganography.

    PubMed

    Kamal, A H M; Islam, M Mahfuzul

    2014-06-01

    E-medicine is a process to provide health care services to people using the Internet or any networking technology. In this Letter, a new idea is proposed to model the physical structure of the e-medicine system to better provide offline health care services. Smart cards are used as the sole means of authenticating the user. A very unique technique is also suggested to verify the card owner's identity and to embed secret data in the card while providing patients' reports either at booths or at the e-medicine server system. The simulation results of the card authentication and embedding procedures justify the proposed implementation.

  8. Kernel canonical-correlation Granger causality for multiple time series

    NASA Astrophysics Data System (ADS)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.
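    The paper generalizes Granger causality beyond the linear case via kernel canonical correlation. As background, the classical linear, lag-1, bivariate version it extends can be sketched by comparing the residual variance of a restricted autoregression against a full one that also includes the other series; the test data below are synthetic:

```python
import math
import random

def granger_index(x, y):
    """Linear lag-1 Granger index for "y causes x": log of the ratio of
    residual sums of squares of the restricted model x_t ~ x_{t-1} to the
    full model x_t ~ x_{t-1} + y_{t-1}. Nonnegative; larger values mean
    stronger evidence of directed influence."""
    xt, xl, yl = x[1:], x[:-1], y[:-1]
    a_r = sum(l * t for l, t in zip(xl, xt)) / sum(l * l for l in xl)
    rss_r = sum((t - a_r * l) ** 2 for t, l in zip(xt, xl))
    # Full model: solve the 2x2 normal equations for coefficients (a, b).
    suu = sum(l * l for l in xl)
    svv = sum(m * m for m in yl)
    suv = sum(l * m for l, m in zip(xl, yl))
    suy = sum(l * t for l, t in zip(xl, xt))
    svy = sum(m * t for m, t in zip(yl, xt))
    det = suu * svv - suv * suv
    a_f = (suy * svv - suv * svy) / det
    b_f = (suu * svy - suv * suy) / det
    rss_f = sum((t - a_f * l - b_f * m) ** 2 for t, l, m in zip(xt, xl, yl))
    return math.log(rss_r / rss_f)

# Synthetic test pair: y drives x with a one-step delay, not vice versa.
random.seed(1)
x, y = [0.0], [0.0]
for _ in range(499):
    y.append(0.5 * y[-1] + random.gauss(0.0, 1.0))
    x.append(0.5 * x[-1] + 0.8 * y[-2] + random.gauss(0.0, 1.0))
```

    The canonical-correlation (and kernel) formulations replace these explicit vector autoregressions, which is what avoids the model-estimation problems the abstract mentions in the multivariate setting.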

  9. Fusion power: a challenge for materials science.

    PubMed

    Duffy, D M

    2010-07-28

    The selection and design of materials that will withstand the extreme conditions of a fusion power plant has been described as one of the greatest materials science challenges in history. The high particle flux, high thermal load, thermal mechanical stress and the production of transmutation elements combine to produce a uniquely hostile environment. In this paper, the materials favoured for the diverse roles in a fusion power plant are discussed, along with the experimental and modelling techniques that are used to advance the understanding of radiation damage in materials. Areas where further research is necessary are highlighted.

  10. A Digital Image-Based Discrete Fracture Network Model and Its Numerical Investigation of Direct Shear Tests

    NASA Astrophysics Data System (ADS)

    Wang, Peitao; Cai, Meifeng; Ren, Fenhua; Li, Changhong; Yang, Tianhong

    2017-07-01

    This paper develops a numerical approach to determine the mechanical behavior of discrete fracture network (DFN) models based on a digital image processing technique and the particle flow code (PFC2D). A series of direct shear tests of jointed rocks were numerically performed to study the effect of normal stress, friction coefficient and joint bond strength on the mechanical behavior of jointed rock, and to evaluate the influence of micro-parameters on the shear properties of jointed rocks using the proposed approach. The complete shear stress-displacement curve of the DFN model under direct shear tests was presented to evaluate the failure processes of jointed rock. The results show that the peak and residual strength are sensitive to normal stress. A higher normal stress has a greater effect on the initiation and propagation of cracks. Additionally, an increase in the bond strength ratio results in an increase in the number of both shear and normal cracks. The friction coefficient was also found to have a significant influence on the shear strength and shear cracks: an increase in the friction coefficient reduced the initiation of normal cracks. The unique contribution of this paper is the proposed modeling technique to simulate the mechanical behavior of jointed rock mass based on particle mechanics approaches.

  11. Global Existence and Uniqueness of Weak and Regular Solutions of Shallow Shells with Thermal Effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menzala, G. Perla, E-mail: perla@lncc.br; Cezaro, F. Travessini De, E-mail: fabianacezaro@furg.br

    2016-10-15

    We study a dynamical thin shallow shell whose elastic deformations are described by a nonlinear system of Marguerre–Vlasov’s type under the presence of thermal effects. Our main result is the proof of a global existence and uniqueness of a weak solution in the case of clamped boundary conditions. Standard techniques for uniqueness do not work directly in this case. We overcame this difficulty using recent work due to Lasiecka (Appl Anal 4:1376–1422, 1998).

  12. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    NASA Astrophysics Data System (ADS)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impacts on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. 
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing the chains, densities, and correlations obtained using DRAM, DREAM, and the direct evaluation of Bayes' formula. We also perform a similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by the different methods for the HIV model; the energy statistics test assesses the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. 
To accommodate the nonlinear input-to-output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of these techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affects the model response and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters, whereas subspace selection identifies linear combinations of parameters that significantly impact the model responses. We employ the active subspace methods discussed in [22] for the HIV model and verify that the active subspace successfully reduces the input dimension.
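
    The active-subspace idea summarized above can be illustrated with a toy computation: estimate the gradient covariance matrix C = E[∇f ∇fᵀ] from Monte Carlo samples and inspect its eigenvalue decay. The quadratic test function and all numerical values below are hypothetical stand-ins, not the HIV model from the dissertation.

```python
import numpy as np

# Sketch of active subspace identification: the dominant eigenvectors of
# the average outer product of model gradients give the linear parameter
# combinations that most strongly affect the output. The quadratic model
# f(x) = x^T A x is an illustrative stand-in for a real model.
rng = np.random.default_rng(0)
dim = 5
A = np.diag([10.0, 5.0, 0.1, 0.01, 0.001])  # built-in low-dimensional structure

def grad_f(x):
    return 2.0 * A @ x  # gradient of x^T A x

# Monte Carlo estimate of C = E[grad f grad f^T] over a uniform input box
samples = rng.uniform(-1, 1, size=(2000, dim))
C = np.mean([np.outer(g, g) for g in (grad_f(x) for x in samples)], axis=0)

eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# A large spectral gap after the first k eigenvalues suggests a
# k-dimensional active subspace spanned by the leading eigenvectors.
print(eigvals / eigvals[0])
```

    Here the steep drop after the second eigenvalue indicates that two linear combinations of the five inputs capture essentially all of the output variability.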

  13. Tabulated Equivalent SDR Flamelet (TESF) Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kundu, Prithwish; Ameen, Muhsin Mohammed; Unnikrishnan, Umesh

    The code consists of an implementation of a novel tabulated combustion model for non-premixed flames in CFD solvers. This technique implements an unsteady flamelet tabulation without using progress variables for non-premixed flames. It also has the capability to include history effects, which is unique among tabulated flamelet models. The flamelet table generation code can be run in parallel to generate tables with large chemistry mechanisms in relatively short wall-clock times, and the combustion model reads these tables. This framework can be coupled with any CFD solver with RANS as well as LES turbulence models, and it enables CFD solvers to run large chemistry mechanisms on large numbers of grid cells at relatively low computational cost. Currently it has been coupled with the Converge CFD code and validated against available experimental data. This model can be used to simulate non-premixed combustion in a variety of applications such as reciprocating engines, gas turbines, and industrial burners operating over a wide range of fuels.
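
    The runtime side of any tabulated combustion model reduces to multidimensional table interpolation: flamelet solutions are precomputed over table coordinates, and the CFD solver looks up thermochemical states instead of integrating chemistry. The sketch below illustrates only that lookup step; the mixture-fraction/scalar-dissipation coordinates and the toy temperature response are hypothetical, not TESF output.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Toy flamelet table: "temperature" over mixture fraction Z and scalar
# dissipation rate chi. The functional form is a placeholder chosen to
# mimic a peak near stoichiometry that weakens at high strain.
Z = np.linspace(0.0, 1.0, 51)         # mixture fraction axis
chi = np.logspace(-1, 2, 31)          # scalar dissipation rate axis, 1/s
ZZ, CC = np.meshgrid(Z, chi, indexing="ij")
T_table = 300.0 + 1700.0 * np.exp(-((ZZ - 0.3) / 0.15) ** 2) / (1.0 + 0.02 * CC)

# Runtime lookup object used in place of direct chemistry integration
lookup = RegularGridInterpolator((Z, chi), T_table)

# Query a batch of (Z, chi) states, as a CFD solver would per cell
cells = np.array([[0.3, 1.0], [0.1, 10.0], [0.7, 50.0]])
T_cells = lookup(cells)
print(T_cells)
```

    The design point is that table generation is embarrassingly parallel and done once, while each CFD time step pays only the cost of interpolation.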

  14. Reactive decontamination of absorbing thin film polymer coatings: model development and parameter determination

    NASA Astrophysics Data System (ADS)

    Varady, Mark; Mantooth, Brent; Pearl, Thomas; Willis, Matthew

    2014-03-01

    A continuum model of reactive decontamination in absorbing polymeric thin film substrates exposed to the chemical warfare agent O-ethyl S-[2-(diisopropylamino)ethyl] methylphosphonothioate (known as VX) was developed to assess the performance of various decontaminants. Experiments were performed in conjunction with an inverse analysis method to obtain the necessary model parameters. The experiments involved contaminating a substrate with a fixed VX exposure, applying a decontaminant, and then performing a time-resolved, liquid-phase extraction of the absorbing substrate to measure the residual contaminant by chromatography. Decontamination model parameters were uniquely determined using the Levenberg-Marquardt nonlinear least squares fitting technique to best fit the experimental time evolution of extracted mass. The model was implemented numerically in both a 2D axisymmetric finite element program and a 1D finite difference code, and it was found that the more computationally efficient 1D implementation was sufficiently accurate. The resulting decontamination model provides an accurate quantification of the contaminant concentration profile in the material, which is necessary to assess exposure hazards.
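
    The inverse-analysis step described above can be sketched with SciPy's Levenberg-Marquardt solver: propose parameters, simulate the extracted-mass history, and minimize the misfit to measurements. The first-order decay model and the synthetic data below are illustrative placeholders for the paper's diffusion-reaction model, not its actual equations or data.

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder forward model: residual contaminant mass versus time.
# A real application would replace this with the 1D finite difference
# solution of the reactive decontamination model.
def model(t, m0, k):
    return m0 * np.exp(-k * t)

# Synthetic "extraction" measurements with small noise
t_data = np.linspace(0.0, 10.0, 20)
true_params = (5.0, 0.4)
rng = np.random.default_rng(1)
m_data = model(t_data, *true_params) + rng.normal(0.0, 0.01, t_data.size)

def residuals(p):
    return model(t_data, *p) - m_data

# Levenberg-Marquardt nonlinear least squares fit
fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
m0_hat, k_hat = fit.x
print(m0_hat, k_hat)
```

    With well-resolved, low-noise extraction data the fitted parameters recover the generating values closely, which is the sense in which the paper's parameters are "uniquely determined" by the time-resolved measurements.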

  15. Knowledge-based probabilistic representations of branching ratios in chemical networks: The case of dissociative recombinations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plessis, Sylvain; Carrasco, Nathalie; Pernot, Pascal

    Experimental data about branching ratios for the products of dissociative recombination of polyatomic ions are presently the unique information source available to modelers of natural or laboratory chemical plasmas. Yet, because of limitations in the measurement techniques, data for many ions are incomplete. In particular, the repartition of hydrogen atoms among the fragments of hydrocarbon ions is often not available. A consequence is that proper implementation of dissociative recombination processes in chemical models is difficult, and many models ignore invaluable data. We propose a novel probabilistic approach based on Dirichlet-type distributions, enabling modelers to fully account for the available information. As an application, we consider the production rate of radicals through dissociative recombination in an ionospheric chemistry model of Titan, the largest moon of Saturn. We show how the complete scheme of dissociative recombination products derived with our method dramatically affects these rates in comparison with the simplistic H-loss mechanism implemented by default in all recent models.
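
    The core of the Dirichlet-type representation can be sketched in a few lines: branching ratios are constrained to sum to one, and a Dirichlet distribution over the product channels lets a model propagate partial knowledge rather than discard it. The channel names and pseudo-count values below are hypothetical, not taken from the paper's Titan chemistry network.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical product channels for a dissociative recombination reaction
channels = ["CH3 + H", "CH2 + H2", "CH + H2 + H"]

# Dirichlet pseudo-counts encode partial knowledge about the branching:
# larger total -> tighter belief; ratios of entries -> expected branching.
alpha = np.array([8.0, 3.0, 1.0])

# Each draw is a complete, normalized set of branching ratios, so every
# Monte Carlo realization of the chemical model is self-consistent.
draws = rng.dirichlet(alpha, size=10000)

mean_ratios = draws.mean(axis=0)  # converges to alpha / alpha.sum()
print(dict(zip(channels, np.round(mean_ratios, 3))))
```

    Propagating these draws through a chemical network yields distributions of radical production rates, rather than the single point values of an H-loss-only scheme.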

  16. Canine prostate models in preclinical studies of minimally invasive interventions: part II, benign prostatic hyperplasia models

    PubMed Central

    Báez-Díaz, Claudia; Sánchez-Margallo, Francisco Miguel

    2017-01-01

    Canine prostate is widely used as an animal model in the preclinical evaluation of emerging therapeutic interventions. Spontaneous benign prostatic hyperplasia (BPH) is common in adult intact male dogs, with two distinct pathological types: glandular and complex prostatic hyperplasia. The complex form of prostatic hyperplasia, usually occurring in older dogs, represents an ideal model because of its unique pathologic features, including not only glandular hyperplasia but also an increase in prostate stromal components. The limited commercial availability of adult dogs with spontaneous BPH motivates the experimental induction of BPH in young dogs. The hormone-induced canine BPH model has been well established with various hormonal treatment regimens and administration approaches. The goal of this review is to provide the veterinary background on spontaneous BPH in dogs, summarize the techniques for hormonal induction of canine BPH, and highlight the pathological and clinical limitations of the canine models that may lead to distinct therapeutic responses compared to clinical trials in humans. PMID:28725598

  17. Knowledge-based probabilistic representations of branching ratios in chemical networks: the case of dissociative recombinations.

    PubMed

    Plessis, Sylvain; Carrasco, Nathalie; Pernot, Pascal

    2010-10-07

    Experimental data about branching ratios for the products of dissociative recombination of polyatomic ions are presently the unique information source available to modelers of natural or laboratory chemical plasmas. Yet, because of limitations in the measurement techniques, data for many ions are incomplete. In particular, the repartition of hydrogen atoms among the fragments of hydrocarbon ions is often not available. A consequence is that proper implementation of dissociative recombination processes in chemical models is difficult, and many models ignore invaluable data. We propose a novel probabilistic approach based on Dirichlet-type distributions, enabling modelers to fully account for the available information. As an application, we consider the production rate of radicals through dissociative recombination in an ionospheric chemistry model of Titan, the largest moon of Saturn. We show how the complete scheme of dissociative recombination products derived with our method dramatically affects these rates in comparison with the simplistic H-loss mechanism implemented by default in all recent models.

  18. Reactive solute transport in streams: 1. Development of an equilibrium- based model

    USGS Publications Warehouse

    Runkel, Robert L.; Bencala, Kenneth E.; Broshears, Robert E.; Chapra, Steven C.

    1996-01-01

    An equilibrium-based solute transport model is developed for the simulation of trace metal fate and transport in streams. The model is formed by coupling a solute transport model with a chemical equilibrium submodel based on MINTEQ. The solute transport model considers the physical processes of advection, dispersion, lateral inflow, and transient storage, while the equilibrium submodel considers the speciation and complexation of aqueous species, precipitation/dissolution and sorption. Within the model, reactions in the water column may result in the formation of solid phases (precipitates and sorbed species) that are subject to downstream transport and settling processes. Solid phases on the streambed may also interact with the water column through dissolution and sorption/desorption reactions. Consideration of both mobile (water-borne) and immobile (streambed) solid phases requires a unique set of governing differential equations and solution techniques that are developed herein. The partial differential equations describing physical transport and the algebraic equations describing chemical equilibria are coupled using the sequential iteration approach.
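
    The coupling strategy described above, transport equations solved alongside algebraic equilibrium relations, can be sketched with a minimal operator-splitting loop: advect the aqueous phase, then re-equilibrate the phases in each cell. A single solute with a linear sorption coefficient (Kd) stands in here for the full MINTEQ-based equilibrium submodel, and all values are illustrative.

```python
import numpy as np

# Grid, time step, velocity, and a linear sorption partition coefficient
nx, dx, dt, v = 100, 1.0, 0.4, 1.0   # CFL = v*dt/dx = 0.4 < 1 (stable)
Kd = 0.5

c = np.zeros(nx); c[0] = 1.0         # aqueous concentration, inlet pulse
s = np.zeros(nx)                     # sorbed (immobile) concentration

for _ in range(50):
    # 1) transport step: explicit upwind advection of the aqueous phase only
    c[1:] = c[1:] - v * dt / dx * (c[1:] - c[:-1])
    # 2) equilibrium step: repartition total mass in each cell so s = Kd * c
    total = c + s
    c = total / (1.0 + Kd)
    s = Kd * c

print(c.max(), s.max())
```

    Because only the aqueous phase is advected while the total mass is repartitioned each step, the solute front is retarded relative to the water velocity, which is the qualitative behavior the coupled transport-equilibrium equations capture.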

  19. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables used to analyze data may determine which statistical methods are appropriate for data analysis. 
An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
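
    The report's point that runoff quantity and quality are often best modeled after logarithmic transformation can be illustrated with a short regression sketch: fit log(concentration) against log(flow) by least squares and back-transform. The power-law relationship and the synthetic data below are hypothetical, not taken from the USGS data sets.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic lognormal data: concentration follows a power law in flow
# with multiplicative noise, i.e. linear in log-log space.
flow = rng.uniform(1.0, 100.0, 200)
conc = 2.0 * flow ** 0.8 * rng.lognormal(0.0, 0.2, 200)

# Ordinary least squares on the log-transformed variables:
# log(conc) = b0 + b1 * log(flow)
X = np.column_stack([np.ones_like(flow), np.log(flow)])
coef, *_ = np.linalg.lstsq(X, np.log(conc), rcond=None)
b0, b1 = coef

print(np.exp(b0), b1)  # back-transformed coefficient and power-law exponent
```

    A scatterplot check, residual diagnostics, and an examination of the fitted equation, as the report prescribes, would follow the fit before the model is used for prediction.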

  20. Unique intermetallic compounds prepared by shock wave synthesis

    NASA Technical Reports Server (NTRS)

    Otto, G.; Reece, O. Y.; Roy, U.

    1971-01-01

    Technique compresses finely ground metallic powder mixtures beyond the crystal fusion point. Absence of vapor-pressure voids and elimination of incongruous effects permit application of the technique to large-scale fabrication of intermetallic compounds with specific characteristics, e.g., semiconducting, superconducting, or magnetic properties.

  1. Problems at the Leading Edge of Space Weathering as Revealed by TEM Combined with Surface Science Techniques

    NASA Astrophysics Data System (ADS)

    Christoffersen, R.; Dukes, C. A.; Keller, L. P.; Rahman, Z.; Baragiola, R. A.

    2015-11-01

    Analytical field-emission TEM techniques cross-correlated with surface analyses by X-ray photoelectron spectroscopy (XPS) provide a unique two-pronged approach for characterizing how solar wind ion processing contributes to space weathering.

  2. A Systematic Review of the Modifying Effect of Anaesthetic Drugs on Metastasis in Animal Models for Cancer

    PubMed Central

    Geessink, Florentine J.; Ritskes-Hoitinga, Merel; Scheffer, Gert Jan

    2016-01-01

    Background Distant metastasis or local recurrence after primary tumour resection remains a major clinical problem. The anaesthetic technique used during oncologic surgery is suggested to influence the metastatic process. While awaiting the results of ongoing randomised controlled trials (RCTs), we have analyzed the evidence regarding the influence of anaesthetic drugs on experimental tumour metastasis in animal studies. Methods PubMed and Embase were searched until April 21st, 2015. Studies were included in the systematic review when they 1) assessed the effect of an anaesthetic drug used in clinical practice on the number or incidence of metastases in animal models with experimental cancer, 2) included an appropriate control group, and 3) presented unique data. Results 20 studies met the inclusion criteria (published between 1958–2010). Data on number of metastases could be retrieved from 17 studies. These studies described 41 independent comparisons, 33 of which could be included in the meta-analysis (MA). The incidence of metastases was studied in 3 unique papers. From these 3 papers, data on 7 independent comparisons could be extracted and included in the MA. Locally administered local anaesthetics appear to decrease the number of metastases (SMD -6.15 [-8.42; -3.88]), whereas general anaesthetics (RD: 0.136 [0.045, 0.226]), and more specifically volatile anaesthetics (SMD 0.54 [0.24; 0.84]), appear to increase the number and risk of metastases in animal models for cancer. Conclusions Anaesthetics influence the number and incidence of metastases in experimental cancer models. Although more high quality experimental research is necessary, based on the currently available evidence from animal studies, there is no indication to suggest that locally administered local anaesthetics are harmful during surgery in cancer patients. 
Volatile anaesthetics, however, might increase metastasis in animal models and clinical trials investigating this possibly harmful effect should receive priority. The results of our systematic review in animal studies are broadly consistent with clinical reports that anaesthetic technique does seem to affect the tumour metastasis process. PMID:27227779

  3. Modelling, analysis and validation of microwave techniques for the characterisation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Sulaimalebbe, Aslam

    In the last decade, the study of nanoparticle (NP) systems has become a large and interesting research area due to their novel properties and functionalities, which are different from those of the bulk materials, and also their potential applications in different fields. It is vital to understand the behaviour and properties of nano-materials when aiming to implement nanotechnology, control their behaviour, and design new material systems with superior performance. Physical characterisation of NPs falls into two main categories, property and structure analysis, where the properties of the NPs cannot be studied without knowledge of their size and structure. The direct measurement of the electrical properties of metal NPs presents a key challenge and necessitates the use of innovative experimental techniques. There have been numerous reports of two/four-point resistance measurements of NP films and also of the electrical conductivity of NP films using the interdigitated microarray (IDA) electrode. However, using microwave techniques such as the open-ended coaxial probe (OCP) and the microwave dielectric resonator (DR) for the electrical characterisation of metallic NPs is much more accurate and effective than other traditional techniques, because they are inexpensive, convenient, non-destructive, contactless, hazardless (i.e. at low power), and require no special sample preparation. This research is the first attempt to determine the microwave properties of Pt and Au NP films, which are appealing materials for nano-scale electronics, using the aforementioned microwave techniques. The ease of synthesis, relatively low cost, unique catalytic activities, and control over size and shape were the main considerations in choosing Pt and Au NPs for the present study. 
The initial phase of this research was to implement and validate the aperture admittance model for the OCP measurement through experiments and 3D full wave simulation using the commercially available Ansoft High Frequency Structure Simulator (HFSS), followed by the electrical characterisation of synthesised Pt NP films using the novel miniature fabricated OCP technique. The results obtained from this technique provided the inspiration to synthesise and evaluate the microwave properties of Au NPs, and the findings in turn provided the motivation to characterise both the Pt and Au NP films using the DR technique. Unlike the OCP technique, the DR method is highly sensitive, but the achievable measurement accuracy is limited since this technique does not have the broadband frequency capability of the OCP method. The results obtained from the DR technique show good agreement with the theoretical prediction. In the last phase of this research, a further validation of the aperture admittance models for different types of OCP (i.e. RG-405 and RG-402 cables and an SMA connector) was carried out on the developed 3D full wave models using HFSS software, followed by the development of universal models for the aforementioned OCPs based on the same 3D full wave models.

  4. Nurse Practitioners' Use of Communication Techniques: Results of a Maryland Oral Health Literacy Survey

    PubMed Central

    Koo, Laura W.; Horowitz, Alice M.; Radice, Sarah D.; Wang, Min Q.; Kleinman, Dushanka V.

    2016-01-01

    Objectives We examined nurse practitioners’ use and opinions of recommended communication techniques for the promotion of oral health as part of a Maryland state-wide oral health literacy assessment. Use of recommended health-literate and patient-centered communication techniques has demonstrated improved health outcomes. Methods A 27-item self-report survey, containing 17 communication technique items across 5 domains, was mailed to 1,410 licensed nurse practitioners (NPs) in Maryland in 2010. Use of communication techniques and opinions about their effectiveness were analyzed using descriptive statistics. General linear models explored provider and practice characteristics to predict differences in the total number and the mean number of communication techniques routinely used in a week. Results More than 80% of NPs (N = 194) routinely used 3 of the 7 basic communication techniques: simple language, limiting teaching to 2–3 concepts, and speaking slowly. More than 75% of respondents believed that 6 of the 7 basic communication techniques are effective. Sociodemographic provider characteristics and practice characteristics were not significant predictors of the mean number or the total number of communication techniques routinely used by NPs in a week. Potential predictors for using more of the 7 basic communication techniques, demonstrating significance in one general linear model each, were: assessing the office for user-friendliness and ever taking a communication course in addition to nursing school. Conclusions NPs in Maryland self-reported routinely using some recommended health-literate communication techniques, with belief in their effectiveness. Our findings suggest that NPs who had assessed the office for patient-friendliness or who had taken a communication course beyond their initial education may be more likely to use more of the 7 basic communication techniques. These self-reported findings should be validated with observational studies. 
Graduate and continuing education for NPs should increase emphasis on health-literate and patient-centered communication techniques to increase patient understanding of dental caries prevention. Non-dental healthcare providers, such as NPs, are uniquely positioned to contribute to preventing early childhood dental caries through health-literate and patient-centered communication. PMID:26766557

  5. Nurse Practitioners' Use of Communication Techniques: Results of a Maryland Oral Health Literacy Survey.

    PubMed

    Koo, Laura W; Horowitz, Alice M; Radice, Sarah D; Wang, Min Q; Kleinman, Dushanka V

    2016-01-01

    We examined nurse practitioners' use and opinions of recommended communication techniques for the promotion of oral health as part of a Maryland state-wide oral health literacy assessment. Use of recommended health-literate and patient-centered communication techniques has demonstrated improved health outcomes. A 27-item self-report survey, containing 17 communication technique items across 5 domains, was mailed to 1,410 licensed nurse practitioners (NPs) in Maryland in 2010. Use of communication techniques and opinions about their effectiveness were analyzed using descriptive statistics. General linear models explored provider and practice characteristics to predict differences in the total number and the mean number of communication techniques routinely used in a week. More than 80% of NPs (N = 194) routinely used 3 of the 7 basic communication techniques: simple language, limiting teaching to 2-3 concepts, and speaking slowly. More than 75% of respondents believed that 6 of the 7 basic communication techniques are effective. Sociodemographic provider characteristics and practice characteristics were not significant predictors of the mean number or the total number of communication techniques routinely used by NPs in a week. Potential predictors for using more of the 7 basic communication techniques, demonstrating significance in one general linear model each, were: assessing the office for user-friendliness and ever taking a communication course in addition to nursing school. NPs in Maryland self-reported routinely using some recommended health-literate communication techniques, with belief in their effectiveness. Our findings suggest that NPs who had assessed the office for patient-friendliness or who had taken a communication course beyond their initial education may be more likely to use more of the 7 basic communication techniques. These self-reported findings should be validated with observational studies. 
Graduate and continuing education for NPs should increase emphasis on health-literate and patient-centered communication techniques to increase patient understanding of dental caries prevention. Non-dental healthcare providers, such as NPs, are uniquely positioned to contribute to preventing early childhood dental caries through health-literate and patient-centered communication.

  6. Superhydrophobic Natural and Artificial Surfaces—A Structural Approach

    PubMed Central

    Avrămescu, Roxana-Elena; Ghica, Mihaela Violeta; Dinu-Pîrvu, Cristina; Prisada, Răzvan; Popa, Lăcrămioara

    2018-01-01

    Since ancient times, humans have observed animal and plant features and tried to adapt them to their own needs. Biomimetics represents the foundation of many inventions from various fields: from transportation devices (helicopter, airplane, submarine) and flying techniques, to the sportswear industry (swimming suits, scuba diving gear, the Velcro closure system), bulletproof vests made from Kevlar, etc. It is true that nature provides numerous noteworthy models (shark skin, spider web, lotus leaves), from both the plant and animal kingdoms. This review paper summarizes a few of “nature’s interventions” in human evolution, regarding the understanding of surface wettability and the development of innovative special surfaces. Empirical models are described in order to reveal the science behind special wettable surfaces (superhydrophobic/superhydrophilic). Materials and methods used to artificially obtain special wettable surfaces are described in correlation with plants’ and animals’ unique features. Emphasis is placed on joining superhydrophobic and superhydrophilic surfaces, with important applications in cell culturing, microorganism isolation/separation, and molecule screening techniques. Bio-inspired wettability is presented as a constitutive part of traditional devices/systems, intended to improve their characteristics and extend their performance. PMID:29789488

  7. Superhydrophobic Natural and Artificial Surfaces-A Structural Approach.

    PubMed

    Avrămescu, Roxana-Elena; Ghica, Mihaela Violeta; Dinu-Pîrvu, Cristina; Prisada, Răzvan; Popa, Lăcrămioara

    2018-05-22

    Since ancient times, humans have observed animal and plant features and tried to adapt them to their own needs. Biomimetics represents the foundation of many inventions from various fields: from transportation devices (helicopter, airplane, submarine) and flying techniques, to the sportswear industry (swimming suits, scuba diving gear, the Velcro closure system), bulletproof vests made from Kevlar, etc. It is true that nature provides numerous noteworthy models (shark skin, spider web, lotus leaves), from both the plant and animal kingdoms. This review paper summarizes a few of "nature's interventions" in human evolution, regarding the understanding of surface wettability and the development of innovative special surfaces. Empirical models are described in order to reveal the science behind special wettable surfaces (superhydrophobic/superhydrophilic). Materials and methods used to artificially obtain special wettable surfaces are described in correlation with plants' and animals' unique features. Emphasis is placed on joining superhydrophobic and superhydrophilic surfaces, with important applications in cell culturing, microorganism isolation/separation, and molecule screening techniques. Bio-inspired wettability is presented as a constitutive part of traditional devices/systems, intended to improve their characteristics and extend their performance.

  8. Privacy-protected biometric templates: acoustic ear identification

    NASA Astrophysics Data System (ADS)

    Tuyls, Pim T.; Verbitskiy, Evgeny; Ignatenko, Tanya; Schobben, Daniel; Akkermans, Ton H.

    2004-08-01

    Unique biometric identifiers offer a very convenient way for human identification and authentication. In contrast to passwords, they have the advantage that they cannot be forgotten or lost. In order to set up a biometric identification/authentication system, reference data have to be stored in a central database. As biometric identifiers are unique to a human being, the derived templates comprise unique, sensitive, and therefore private information about a person. This is why many people are reluctant to accept a system based on biometric identification. Consequently, the stored templates have to be handled with care and protected against misuse [1, 2, 3, 4, 5, 6]. It is clear that techniques from cryptography can be used to achieve privacy. However, because biometric data are noisy and cryptographic functions are by construction very sensitive to small changes in their input, one cannot apply those cryptographic techniques straightforwardly. In this paper we show the feasibility of the techniques developed in [5], [6] by applying them to experimental biometric data. As the biometric identifier we have chosen the shape of the inner ear canal, which is obtained by measuring the headphone-to-ear-canal Transfer Functions (HpTFs), which are known to be person dependent [7].

  9. Large Field Photogrammetry Techniques in Aircraft and Spacecraft Impact Testing

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.

    2010-01-01

    The Landing and Impact Research Facility (LandIR) at NASA Langley Research Center is a 240 ft. high A-frame structure which is used for full-scale crash testing of aircraft and rotorcraft vehicles. Because the LandIR provides a unique capability to introduce impact velocities in the forward and vertical directions, it is also serving as the facility for landing tests on full-scale and sub-scale Orion spacecraft mass simulators. Recently, a three-dimensional photogrammetry system was acquired to assist with the gathering of vehicle flight data before, throughout and after the impact. This data provides the basis for the post-test analysis and data reduction. Experimental setups for pendulum swing tests on vehicles having both forward and vertical velocities can extend to 50 x 50 x 50 foot cubes, while weather, vehicle geometry, and other constraints make each experimental setup unique to each test. This paper will discuss the specific calibration techniques for large fields of views, camera and lens selection, data processing, as well as best practice techniques learned from using the large field of view photogrammetry on a multitude of crash and landing test scenarios unique to the LandIR.

  10. Phase change events of volatile liquid perfluorocarbon contrast agents produce unique acoustic signatures

    PubMed Central

    Sheeran, Paul S.; Matsunaga, Terry O.; Dayton, Paul A.

    2015-01-01

    Phase-change contrast agents (PCCAs) provide a dynamic platform to approach problems in medical ultrasound (US). Upon US-mediated activation, the liquid core vaporizes and expands to produce a gas bubble ideal for US imaging and therapy. In this study, we demonstrate through high-speed video microscopy and US interrogation that PCCAs composed of highly volatile perfluorocarbons (PFCs) exhibit unique acoustic behavior that can be detected and differentiated from standard microbubble contrast agents. Experimental results show that when activated with short pulses, PCCAs over-expand and undergo unforced radial oscillation while settling to a final bubble diameter. The size-dependent oscillation phenomenon generates a unique acoustic signal that can be passively detected in both the time and frequency domains using confocal piston transducers with an ‘activate high’ (8 MHz, 2 cycles), ‘listen low’ (1 MHz) scheme. Results show that the magnitude of the acoustic ‘signature’ increases as PFC boiling point decreases. By using a band-limited spectral processing technique, the droplet signals can be isolated from controls and used to build experimental relationships between concentration and vaporization pressure. The techniques shown here may be useful for physical studies as well as development of droplet-specific imaging techniques. PMID:24351961
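
    The ‘activate high, listen low’ scheme described above amounts to band-limited energy detection: interrogate with a short high-frequency pulse, then isolate the low-frequency content of the received signal. The sketch below uses a synthetic signal; the sample rate, band edges, and detection threshold are illustrative assumptions, not the study's parameters.

```python
import numpy as np

fs = 20e6                       # sample rate, Hz (illustrative)
t = np.arange(0, 50e-6, 1/fs)   # 50 microsecond record

# Synthetic received signal: an 8 MHz scattered echo plus a 1 MHz
# component standing in for the droplet vaporization "signature".
echo = 0.2 * np.sin(2*np.pi*8e6*t) * np.exp(-t/10e-6)
signature = 0.5 * np.sin(2*np.pi*1e6*t) * np.exp(-t/20e-6)
received = echo + signature

# Band-limited spectral processing: integrate power in a band around 1 MHz.
spectrum = np.fft.rfft(received)
freqs = np.fft.rfftfreq(len(received), 1/fs)
band = (freqs > 0.5e6) & (freqs < 1.5e6)
band_energy = np.sum(np.abs(spectrum[band])**2)

# A control without the low-frequency signature has far less band energy.
control_energy = np.sum(np.abs(np.fft.rfft(echo)[band])**2)
detected = band_energy > 10 * control_energy
```

    A microbubble control (the `echo` term alone) leaves almost no energy in the low band, which is what lets the droplet signal be isolated.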

  11. The English Teacher's Survival Guide: Ready-To-Use Techniques & Materials for Grades 7-12. 2nd Edition

    ERIC Educational Resources Information Center

    Brandvik, Mary Lou; McKnight, Katherine S.

    2011-01-01

    This unique time-saving book is packed with tested techniques and materials to assist new and experienced English teachers with virtually every phase of their job from lesson planning to effective discipline techniques. The book includes 175 easy-to-understand strategies, lessons, checklists, and forms for effective classroom management and over…

  12. Photo-Elicitation and Visual Semiotics: A Unique Methodology for Studying Inclusion for Children with Disabilities

    ERIC Educational Resources Information Center

    Stockall, Nancy

    2013-01-01

    This paper discusses the use of photographs as an elicitation strategy that can reveal the thinking processes of participants in a qualitatively rich manner. Photo-elicitation techniques combined with a Peircean semiotic perspective offer a unique method for creating a frame of action for later participant analysis. Illustrative…

  13. Surface Tension and Viscosity of SCN and SCN-acetone Alloys at Melting Points and Higher Temperatures Using Surface Light Scattering Spectrometer

    NASA Technical Reports Server (NTRS)

    Tin, Padetha; deGroh, Henry C., III.

    2003-01-01

    Succinonitrile has been and is being used extensively in NASA's Microgravity Materials Science and Fluid Physics programs, as well as in several ground-based and microgravity studies, including the Isothermal Dendritic Growth Experiment (IDGE). Succinonitrile (SCN) is useful as a model for the study of metal solidification: although it is an organic material, it has a BCC crystal structure and solidifies dendritically like a metal. It is also transparent and has a low melting point (58.08 C). Previous measurements of the surface tensions of succinonitrile (SCN) and succinonitrile-acetone alloys are extremely limited. Using the surface light scattering technique we have determined, non-invasively, the surface tension and viscosity of SCN and SCN-acetone alloys at different temperatures. This relatively new and unique technique has several advantages over the classical methods: it is non-invasive, has good accuracy, and measures the surface tension and viscosity simultaneously. The accuracy of the interfacial energy values obtained from this technique is better than 2%, and that of the viscosity about 10%. Succinonitrile and succinonitrile-acetone alloys are well-established model materials with several essential physical properties accurately known, except the liquid/vapor surface tension at different elevated temperatures. We will present the experimentally determined liquid/vapor surface energy and liquid viscosity of succinonitrile and succinonitrile-acetone alloys in the temperature range from their melting points to around 100 C using this non-invasive technique. We will also discuss the measurement technique and new developments of the Surface Light Scattering Spectrometer.
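
    Surface light scattering infers surface tension from the spectrum of thermally excited capillary waves. In the low-viscosity limit the standard dispersion relation is omega^2 = sigma * q^3 / rho, so a measured peak frequency at a known wavenumber yields sigma directly. A minimal sketch; the density, wavenumber, and peak frequency below are illustrative numbers, not measured SCN values.

```python
import numpy as np

def surface_tension(freq_hz, q, rho):
    """Surface tension from the low-viscosity capillary-wave dispersion
    relation omega**2 = sigma * q**3 / rho (first order, damping ignored)."""
    omega = 2.0 * np.pi * freq_hz
    return rho * omega**2 / q**3

# Illustrative inputs, NOT measured SCN values.
rho = 988.0        # liquid density, kg/m^3 (assumed)
q = 3.0e4          # capillary wavenumber, 1/m (set by the scattering angle)
f_peak = 5.7e3     # peak frequency of the scattered-light spectrum, Hz
sigma = surface_tension(f_peak, q, rho)    # ~0.047 N/m for these inputs
```

    The viscosity comes from the linewidth (damping) of the same spectral peak, which this first-order sketch ignores.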

  14. An aluminum - ionic liquid interface sustaining a durable Al-air battery

    NASA Astrophysics Data System (ADS)

    Gelman, Danny; Shvartsev, Boris; Wallwater, Itamar; Kozokaro, Shahaf; Fidelsky, Vicky; Sagy, Adi; Oz, Alon; Baltianski, Sioma; Tsur, Yoed; Ein-Eli, Yair

    2017-10-01

    A thorough study of a unique aluminum (Al)-air battery utilizing a pure Al anode, an air cathode, and the hydrophilic room-temperature ionic liquid electrolyte 1-ethyl-3-methylimidazolium oligofluorohydrogenate [EMIm(HF)2.3F] is reported. The effects of various operating conditions, both at open circuit potential and under discharge modes, on the battery components are discussed. A variety of techniques were utilized to investigate the interfaces and processes involved, including electrochemical studies, electron microscopy, spectroscopy and diffraction. As a result of this intensive study, the upon-operation voltage drop ("dip") obstacle, occurring in the initial stages of Al-air battery discharge, has been resolved. In addition, the interaction of the Al anode with the oligofluorohydrogenate electrolyte forms an Al-O-F layer on the Al surface, which allows both activation and low corrosion rates of the Al anode. The evolution of this layer has been studied via impedance spectroscopy genetic programming, enabling a unique model of the Al-air battery.

  15. Free-energy landscape of protein oligomerization from atomistic simulations

    PubMed Central

    Barducci, Alessandro; Bonomi, Massimiliano; Prakash, Meher K.; Parrinello, Michele

    2013-01-01

    In the realm of protein–protein interactions, the assembly process of homooligomers plays a fundamental role because the majority of proteins fall into this category. A comprehensive understanding of this multistep process requires the characterization of the driving molecular interactions and the transient intermediate species. The latter are often short-lived and thus remain elusive to most experimental investigations. Molecular simulations provide a unique tool to shed light onto these complex processes complementing experimental data. Here we combine advanced sampling techniques, such as metadynamics and parallel tempering, to characterize the oligomerization landscape of fibritin foldon domain. This system is an evolutionarily optimized trimerization motif that represents an ideal model for experimental and computational mechanistic studies. Our results are fully consistent with previous experimental nuclear magnetic resonance and kinetic data, but they provide a unique insight into fibritin foldon assembly. In particular, our simulations unveil the role of nonspecific interactions and suggest that an interplay between thermodynamic bias toward native structure and residual conformational disorder may provide a kinetic advantage. PMID:24248370
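
    Metadynamics, one of the sampling techniques combined here, discourages revisiting already-sampled states by depositing repulsive Gaussian "hills" along a collective variable, letting the system escape free-energy minima it would otherwise stay trapped in. The toy one-dimensional sketch below runs overdamped Langevin dynamics on a double well; every name and parameter is illustrative, standing in for (not reproducing) the study's atomistic simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D landscape: a double well standing in for a barrier between
# unassembled and assembled states (illustrative, not foldon data).
dU = lambda s: 4.0 * s * (s**2 - 1.0)   # gradient of U(s) = (s^2 - 1)^2

hills = []                               # centers of deposited Gaussians
w, sigma_h = 0.1, 0.2                    # hill height and width (assumed)

def bias_force(s):
    """Force from the accumulated metadynamics bias at position s."""
    if not hills:
        return 0.0
    c = np.asarray(hills)
    return np.sum(w * (s - c) / sigma_h**2
                  * np.exp(-(s - c)**2 / (2 * sigma_h**2)))

# Overdamped Langevin dynamics with periodic hill deposition.
s, dt, kT = -1.0, 1e-3, 0.1
for step in range(20000):
    s += (-dU(s) + bias_force(s)) * dt + np.sqrt(2 * kT * dt) * rng.normal()
    if step % 100 == 0:
        hills.append(s)

# The growing bias pushes the walker over the barrier (10 kT here),
# so hills end up deposited in both wells.
visited_both = any(c > 0.5 for c in hills) and any(c < -0.5 for c in hills)
```

    In the well-tempered variant the deposited bias also estimates the free-energy landscape itself; parallel tempering adds replica exchange across temperatures on top of this.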

  16. Overview of Propellant Delivery Systems at the NASA John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Haselmaier, L. Haynes; Field, Robert E.; Ryan, Harry M.; Dickey, Jonathan C.

    2006-01-01

    A wide range of rocket propulsion test work occurs at the NASA John C. Stennis Space Center (SSC), including full-scale engine test activities at test facilities A-1, A-2, B-1 and B-2 as well as combustion device research and development activities at the E-Complex (E-1, E-2, E-3 and E-4) test facilities. One of the greatest challenges associated with operating a test facility is maintaining the health of the primary propellant system and test-critical support systems. The challenge emerges because the operating conditions of the various system components are extreme (e.g., low temperatures, high pressures) and because many of the components and systems are unique. The purpose of this paper is to briefly describe the experience and modeling techniques that are used to operate the unique test facilities at NASA SSC that continue to support successful propulsion testing.

  17. Design and construction of a remote piloted flying wing. B.S. Thesis

    NASA Technical Reports Server (NTRS)

    Costa, Alfred J.; Koopman, Fritz; Soboleski, Craig; Trieu, Thai-Ba; Duquette, Jaime; Krause, Scott; Susko, David; Trieu, Thuyba

    1994-01-01

    Currently, there is a need for a high-speed, high-lift civilian transport. Although unconventional, a flying wing could fly at speeds in excess of Mach 2 and still retain the capacity of a 747. The flying wing design is inherently unstable since it lacks a fuselage and a horizontal tail. The project goal was to design, construct, fly, and test a remote-piloted scale model flying wing. The project was completed as part of the NASA/USRA Advanced Aeronautics Design Program. These unique restrictions required us to implement several fundamental design changes from last year's Elang configuration, including wing sweepback and wingtip endplates. Unique features such as a single ducted-fan engine, composite structural materials, and an electrostatic stability system were incorporated. The result is the Banshee '94. Our efforts will aid future projects in design and construction techniques so that a viable flying wing can become an integral part of the aviation industry.

  18. Free-energy landscape of protein oligomerization from atomistic simulations.

    PubMed

    Barducci, Alessandro; Bonomi, Massimiliano; Prakash, Meher K; Parrinello, Michele

    2013-12-03

    In the realm of protein-protein interactions, the assembly process of homooligomers plays a fundamental role because the majority of proteins fall into this category. A comprehensive understanding of this multistep process requires the characterization of the driving molecular interactions and the transient intermediate species. The latter are often short-lived and thus remain elusive to most experimental investigations. Molecular simulations provide a unique tool to shed light onto these complex processes complementing experimental data. Here we combine advanced sampling techniques, such as metadynamics and parallel tempering, to characterize the oligomerization landscape of fibritin foldon domain. This system is an evolutionarily optimized trimerization motif that represents an ideal model for experimental and computational mechanistic studies. Our results are fully consistent with previous experimental nuclear magnetic resonance and kinetic data, but they provide a unique insight into fibritin foldon assembly. In particular, our simulations unveil the role of nonspecific interactions and suggest that an interplay between thermodynamic bias toward native structure and residual conformational disorder may provide a kinetic advantage.

  19. The Generation of Novel MR Imaging Techniques to Visualize Inflammatory/Degenerative Mechanisms and the Correlation of MR Data with 3D Microscopic Changes

    DTIC Science & Technology

    2013-09-01

    existing MR scanning systems, providing the ability to visualize structures that are impossible with current methods. Using techniques to concurrently stain…and unique system for analysis of affected brain regions, coupled with other imaging techniques and molecular measurements, holds significant…

  20. Psychophysical Reverse Correlation with Multiple Response Alternatives

    ERIC Educational Resources Information Center

    Dai, Huanping; Micheyl, Christophe

    2010-01-01

    Psychophysical reverse-correlation methods such as the "classification image" technique provide a unique tool to uncover the internal representations and decision strategies of individual participants in perceptual tasks. Over the past 30 years, these techniques have gained increasing popularity among both visual and auditory psychophysicists.…
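
    In its simplest yes/no form, the classification image is the mean of the noise fields on "yes" trials minus the mean on "no" trials; the article extends this logic to multiple response alternatives. A minimal simulation sketch of the binary case, where the simulated observer's internal template is a hypothetical construct of this example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated yes/no detection-in-noise task. The observer applies an
# internal template to each noise stimulus and reports "yes" when the
# template match exceeds its criterion (here, zero).
n_trials, n_pix = 20000, 16
template = np.zeros(n_pix)
template[6:10] = 1.0                  # region the observer "listens" to

noise = rng.normal(0.0, 1.0, (n_trials, n_pix))
responses = noise @ template > 0      # True = "yes", False = "no"

# Classification image: mean noise on "yes" trials minus "no" trials.
# It recovers the shape of the observer's internal template.
class_image = noise[responses].mean(axis=0) - noise[~responses].mean(axis=0)
```

    The recovered image peaks exactly over the template region and stays near zero elsewhere, which is what makes the technique a window onto internal representations.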

  1. A fluorescent imaging technique for quantifying spray deposits on plant leaves

    USDA-ARS?s Scientific Manuscript database

    Because of the unique characteristics of electrostatically-charged sprays, use of traditional methods to quantify deposition from these sprays has been challenging. A new fluorescent imaging technique was developed to quantify spray deposits from electrostatically-charged sprays on natural plant lea...

  2. Behavior Analysis: Methodological Foundations.

    ERIC Educational Resources Information Center

    Owen, James L.

    Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline…

  3. Structure and Dynamics of Type III Secretion Effector Protein ExoU As determined by SDSL-EPR Spectroscopy in Conjunction with De Novo Protein Folding

    PubMed Central

    2017-01-01

    ExoU is a 74 kDa cytotoxin that undergoes substantial conformational changes as part of its function; that is, it has multiple thermodynamically stable conformations that interchange depending on its environment. Such flexible proteins pose unique challenges to structural biology: (1) it is often difficult to determine structures by X-ray crystallography for all biologically relevant conformations because of the flat energy landscape, and (2) experimental conditions can easily perturb the biologically relevant conformation. The first challenge can be overcome by applying orthogonal structural biology techniques that are capable of observing alternative, biologically relevant conformations. The second challenge can be addressed by determining the structure in the same biological state with two independent techniques under different experimental conditions. If both techniques converge to the same structural model, the confidence that an unperturbed biologically relevant conformation is observed increases. To this end, we determine the structure of the C-terminal domain of the effector protein ExoU from data obtained by electron paramagnetic resonance spectroscopy in conjunction with site-directed spin labeling and in silico de novo structure determination. Our protocol encompasses a multimodule approach, consisting of low-resolution topology sampling, clustering, and high-resolution refinement. The resulting model was compared with an ExoU model in complex with its chaperone SpcU obtained previously by X-ray crystallography. The two models converged to a minimal RMSD100 of 3.2 Å, providing evidence that the unbound structure of ExoU matches the fold observed in complex with SpcU. PMID:28691114

  4. Experimental Study and Modeling of Anaerobic Digestion of Residual Organic Matter under Hyperthermophilic Conditions

    NASA Astrophysics Data System (ADS)

    Altamirano, Felipe Ignacio Castro

    This dissertation focuses on the problem of designing rates in the utility sector. It is motivated by recent developments in the electricity industry, where renewable generation technologies and distributed energy resources are becoming increasingly relevant. Both technologies disrupt the sector in unique ways. While renewables make grid operations more complex, and potentially more expensive, distributed energy resources enable consumers to interact bidirectionally with the grid. Both developments present challenges and opportunities for regulators, who must adapt their techniques for evaluating policies to the emerging technological conditions. The first two chapters of this work make the case for updating existing techniques to evaluate tariff structures. They also propose new methods which are more appropriate given the prospective technological characteristics of the sector. The first chapter constructs an analytic tool based on a model that captures the interaction between pricing and investment. In contrast to previous approaches, this technique allows consistently comparing portfolios of rates while enabling researchers to model with a significantly greater level of detail the supply side of the sector. A key theoretical implication of the model that underlies this technique is that, by properly updating the portfolio of tariffs, a regulator could induce the welfare maximizing adoption of distributed energy resources and enrollment in rate structures. We develop an algorithm to find globally optimal solutions of this model, which is a nonlinear mathematical program. The results of a computational experiment show that the performance of the algorithm dominates that of commercial nonlinear solvers. In addition, to illustrate the practical relevance of the method, we conduct a cost benefit analysis of implementing time-variant tariffs in two electricity systems, California and Denmark.
Although portfolios with time-varying rates create value in both systems, these improvements differ enough to advise very different policies. While in Denmark time-varying tariffs appear unattractive, they at least deserve further revision in California. This conclusion is beyond the reach of previous techniques to analyze rates, as they do not capture the interplay between an intermittent supply and a price-responsive demand. While useful, the method we develop in the first chapter has two important limitations. One is the lack of transparency of the parameters that determine demand substitution patterns, and demand heterogeneity; the other is the narrow range of rate structures that could be studied with the technique. Both limitations stem from taking as a primitive a demand function. Following an alternative path, in the second chapter we develop a technique based on a pricing model that has as a fundamental building block the consumer utility maximization problem. Because researchers do not have to limit themselves to problems with unique solutions, this approach significantly increases the flexibility of the model and, in particular, addresses the limitations of the technique we develop in the first chapter. This gain in flexibility decreases the practicality of our method since the underlying model becomes a Bilevel Problem. To be able to handle realistic instances, we develop a decomposition method based on a non-linear variant of the Alternating Direction Method of Multipliers, which combines Conic and Mixed Integer Programming. A numerical experiment shows that the performance of the solution technique is robust to instance sizes and a wide combination of parameters. We illustrate the relevance of the new method with another applied analysis of rate structures. Our results highlight the value of being able to model in detail distributed energy resources. 
They also show that ignoring transmission constraints can have meaningful impacts on the analysis of rate structures. In addition, we conduct a distributional analysis, which portrays how our method permits regulators and policy makers to study impacts of a rate update on a heterogeneous population. While a switch in rates could have a positive impact on the aggregate of households, it could benefit some more than others, and even harm some customers. Our technique makes it possible to anticipate these impacts, letting regulators decide among rate structures with considerably more information than would be available with alternative approaches. In the third chapter, we conduct an empirical analysis of rate structures in California, which is currently undergoing a rate reform. To contribute to the ongoing regulatory debate about the future of rates, we analyze in depth a set of plausible tariff alternatives. In our analysis, we focus on a scenario in which advanced metering infrastructure and home energy management systems are widely adopted. Our modeling approach allows us to capture a wide variety of temporal and spatial demand substitution patterns without the need of estimating a large number of parameters. (Abstract shortened by ProQuest.).

  5. Surface-enhanced Raman spectroscopy for the detection of pathogenic DNA and protein in foods

    NASA Astrophysics Data System (ADS)

    Chowdhury, Mustafa H.; Atkinson, Brad; Good, Theresa; Cote, Gerard L.

    2003-07-01

    Traditional Raman spectroscopy, while extremely sensitive to structure and conformation, is an ineffective tool for the detection of bioanalytes at the sub-millimolar level. Surface-enhanced Raman spectroscopy (SERS) is a more recently developed technique that has been used with considerable success to enhance the Raman cross-section of a molecule by factors of 10^6 to 10^14. This technique can be exploited in a nanoscale biosensor for the detection of pathogenic proteins and DNA in foods by using a biorecognition molecule to bring a target analyte in close proximity to the metal surface. This is expected to produce a SERS signal of the target analyte, thus making it possible to easily discriminate between the target analyte and possible confounders. For the sensor to be effective, the Raman spectrum of the target analyte must be distinct from that of the biorecognition molecule, as both would be in close proximity to the metal surface and thus subject to the SERS effect. In our preliminary studies we have successfully used citrate-reduced silver colloidal particles to obtain unique SERS spectra of α-helical and β-sheet bovine serum albumin (BSA), which served as models of an α-helical antibody (biorecognition element) and a β-sheet target protein (pathogenic prion). In addition, the unique SERS spectra of double-stranded and single-stranded DNA were also obtained, where the single-stranded DNA served as the model for the biorecognition element and the double-stranded DNA served as the model for the DNA probe/target hybrid. This confirms the feasibility of the method, which opens opportunities for potentially widespread applications in the detection of food pathogens, biowarfare agents, and other bioanalytes.

  6. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  7. Porosity Estimation By Artificial Neural Networks Inversion . Application to Algerian South Field

    NASA Astrophysics Data System (ADS)

    Eladj, Said; Aliouane, Leila; Ouadfeul, Sid-Ali

    2017-04-01

    One of the main current challenges for geophysicists is the discovery and study of stratigraphic traps, a difficult task that requires very fine analysis of the seismic data. Seismic data inversion allows obtaining lithological and stratigraphic information for reservoir characterization. However, when solving the inverse problem we encounter difficult issues: non-existence and non-uniqueness of the solution, compounded by the instability of the processing algorithm. Therefore, uncertainties in the data and the non-linearity of the relationship between the data and the parameters must be taken seriously. In this case, artificial intelligence techniques such as artificial neural networks (ANN) are used to resolve this ambiguity; this can be done by integrating different physical property data, which requires supervised learning methods. In this work, we invert the acoustic impedance of a 3D seismic cube using the colored inversion method; then, introducing the acoustic impedance volume resulting from the first step as an input of the model-based inversion method allows us to calculate the porosity volume using a multilayer perceptron artificial neural network. Application to an Algerian South hydrocarbon field clearly demonstrates the power of the proposed processing technique to predict porosity from seismic data; the obtained results can be used for reserve estimation, permeability prediction, recovery factor, and reservoir monitoring. Keywords: artificial neural networks, inversion, non-uniqueness, nonlinear, 3D porosity volume, reservoir characterization.
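
    The porosity-prediction step can be sketched as supervised regression from impedance to porosity. The toy example below trains a tiny one-hidden-layer perceptron in plain NumPy on a synthetic impedance-porosity trend; the trend, scaling, and architecture are illustrative assumptions, not the field data or network used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic control data standing in for well-log calibration points:
# a smooth nonlinear impedance-to-porosity trend plus noise (illustrative).
imp = rng.uniform(4000.0, 12000.0, (512, 1))
phi = 0.4 * np.exp(-(imp - 4000.0) / 6000.0) + rng.normal(0, 0.01, (512, 1))

x = (imp - 8000.0) / 4000.0          # scale inputs to roughly [-1, 1]

# Minimal multilayer perceptron: 1 -> 8 (tanh) -> 1, trained by
# full-batch gradient descent on mean-squared error.
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)         # hidden activations
    err = (h @ W2 + b2) - phi        # prediction error
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)   # backpropagate through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def predict(impedance):
    """Map an inverted-impedance value (or grid) to predicted porosity."""
    xs = (impedance - 8000.0) / 4000.0
    return np.tanh(xs @ W1 + b1) @ W2 + b2

low = predict(np.array([[4500.0]]))    # low impedance -> higher porosity
high = predict(np.array([[11500.0]]))  # high impedance -> lower porosity
```

    In practice the network would be trained on impedance values extracted at well locations against log-derived porosity, then applied sample-by-sample to the inverted 3D impedance volume.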

  8. Creating wavelet-based models for real-time synthesis of perceptually convincing environmental sounds

    NASA Astrophysics Data System (ADS)

    Miner, Nadine Elizabeth

    1998-09-01

    This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies conducted provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually-realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for validation of any sound synthesis technique.
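
    The analysis/parameterization/synthesis loop can be illustrated with a one-level Haar transform: keep the deterministic approximation band, summarize the stochastic detail band by its energy alone, and resynthesize with fresh noise of matched energy. This is a drastically simplified sketch of the dissertation's wavelet models; the signal and the wavelet choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_dwt(x):
    """One-level Haar transform: approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar transform (perfect reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

# Analysis: decompose a noisy "recording" into coarse structure + detail.
n = 1024
t = np.linspace(0, 1, n, endpoint=False)
recording = np.sin(2*np.pi*5*t) + 0.3*rng.normal(size=n)
approx, detail = haar_dwt(recording)

# Parameterization: keep the deterministic approximation; model the
# detail band only by its energy (the stochastic component).
detail_rms = detail.std()

# Synthesis: a new realization with fresh noise of matched energy --
# perceptually similar texture, never a sample-exact copy.
new_detail = detail_rms * rng.normal(size=len(detail))
synthesized = haar_idwt(approx, new_detail)
```

    Multi-level decompositions extend this by fitting a noise-energy parameter per subband, which is one plausible reading of "parameterized sound models".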

  9. A confocal scanning laser ophthalmoscope for retinal vessel oximetry

    NASA Astrophysics Data System (ADS)

    Lompado, Arthur

    Measurement of a person's blood oxygen saturation has long been recognized as a useful metric for characterizing ailments ranging from chronic respiratory disorders to acute, potentially life-threatening traumas. The ubiquity of oxygen saturation monitors in the medical field, including portable pulse oximeters and laboratory-based CO-oximeters, is a testament to the importance of this technique. The work presented here documents the design, fabrication and development of a unique type of oxygen saturation monitor, a confocal scanning retinal vessel oximeter, with the potential to expand the usefulness of the present devices. A large part of the knowledge base required to construct the instrument comes from the consideration of light scattering by red blood cells in a blood vessel. Therefore, a substantial portion of this work is devoted to the process of light scattering by whole human blood and its effects on the development of a more accurate oximeter. This light scattering effect has been both measured and modeled stochastically to determine its contribution to the measured oximeter signal. It is shown that, although well accepted in the published literature, the model only correlates marginally to the measurements due to inherent limitations imposed by the model assumptions. Nonetheless, enough has been learned about the scattering to allow development of a mathematical model for the interaction of light with blood in a vessel, and this knowledge has been applied to the data reduction of the present oximeter. This data reduction technique has been tested in a controlled experiment employing a model eye with a blood-filled mock retinal vessel. It will be shown that the presently developed technique exhibited strong correlation between the known blood oxygen saturation and that calculated by the new system.
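
    Classical two-wavelength oximetry, the baseline such a retinal oximeter builds on, estimates saturation from the ratio of optical densities at two wavelengths via Beer-Lambert. A minimal sketch; the extinction coefficients are rounded literature values assumed for illustration, and scattering by red blood cells, the subject of the dissertation's stochastic model, is deliberately ignored.

```python
# Approximate molar extinction coefficients for deoxyhemoglobin (Hb) and
# oxyhemoglobin (HbO2) at 660 nm and 940 nm, in L/(mmol*cm). Rounded
# literature values, assumed here for illustration only.
E_HB_660, E_HBO2_660 = 3.2, 0.32
E_HB_940, E_HBO2_940 = 0.18, 0.29

def saturation(od_660, od_940):
    """Two-wavelength Beer-Lambert estimate of oxygen saturation SO2.

    Solves OD_lambda = (e_Hb*(1-S) + e_HbO2*S) * C * L for S using the
    ratio of the two optical densities, which cancels C and L. Ignores
    scattering, the very effect the dissertation's model corrects for."""
    r = od_660 / od_940
    return (E_HB_660 - r * E_HB_940) / (
        (E_HB_660 - E_HBO2_660) + r * (E_HBO2_940 - E_HB_940))
```

    Because the concentration-path-length product cancels in the ratio, the estimate needs no knowledge of vessel diameter, which is what makes the two-wavelength scheme attractive for retinal vessels.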

  10. Quadrupole magnetic field-flow fractionation: A novel technique for the characterization of magnetic particles

    NASA Astrophysics Data System (ADS)

    Carpino, Francesca

    In the last few decades, the development and use of nanotechnology has become of increasing importance. Magnetic nanoparticles, because of their unique properties, have been employed in many different areas of application. They are generally made of a core of magnetic material coated with some other material to stabilize them and to help disperse them in suspension. The unique feature of magnetic nanoparticles is their response to a magnetic field. They are generally superparamagnetic, in which case they become magnetized only in a magnetic field and lose their magnetization when the field is removed. It is this feature that makes them so useful for drug targeting, hyperthermia and bioseparation. For many of these applications, the synthesis of uniformly sized magnetic nanoparticles is of key importance because their magnetic properties depend strongly on their dimensions. Because of the difficulty of synthesizing monodisperse particulate materials, a technique capable of characterizing the magnetic properties of polydisperse samples is of great importance. Quadrupole magnetic field-flow fractionation (MgFFF) is a technique capable of fractionating magnetic particles based on their content of magnetite or other magnetic material. In MgFFF, the interplay of hydrodynamic and magnetic forces separates the particles as they are carried along a separation channel. Since the magnetic field and the gradient in magnetic field acting on the particles during their migration are known, it is possible to calculate the quantity of magnetic material in the particles according to their time of emergence at the channel outlet. Knowing the magnetic properties of the core material, MgFFF can be used to determine both the size distribution and the mean size of the magnetic cores of polydisperse samples. 
When magnetic material is distributed throughout the volume of the particles, the derived data corresponds to a distribution in equivalent spherical diameters of magnetic material in the particles. MgFFF is unique in its ability to characterize the distribution in magnetic properties of a particulate sample. This knowledge is not only of importance to the optimization and quality control of particle preparation. It is also of great importance in modeling magnetic cell separation, drug targeting, hyperthermia, and other areas of application.

  11. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques, as well as cluster deployment of the simulator, have enabled the tuning and robust testing of image processing algorithms and the production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
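
    The scene-construction step described above, draping tiled orthorectified imagery over a geo-registered DEM, can be sketched in a few lines. The grid size, ground-sample distance, and reflectance values below are illustrative assumptions, not PaySim's actual data model:

```python
import numpy as np

# Hypothetical small tile: a 4x4 DEM (elevation in metres) and a
# co-registered orthorectified reflectance tile of the same shape.
dem = np.array([[10, 12, 13, 11],
                [11, 14, 16, 12],
                [12, 15, 17, 13],
                [10, 12, 14, 11]], dtype=float)
reflectance = np.random.default_rng(0).uniform(0.05, 0.4, dem.shape)

# Ground-sample distance (m) of the geo-registered grid -- an assumption.
gsd = 30.0

# Drape the imagery over the DEM: one (x, y, z, reflectance) vertex per cell.
rows, cols = np.indices(dem.shape)
vertices = np.column_stack([
    cols.ravel() * gsd,      # easting
    rows.ravel() * gsd,      # northing
    dem.ravel(),             # elevation from the DEM
    reflectance.ravel(),     # draped surface reflectance
])
print(vertices.shape)
```

Tiling this operation over large orthoimage mosaics is what allows wide-scale reflectance surfaces to be built automatically.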

  12. High-Resolution Characterization of UMo Alloy Microstructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devaraj, Arun; Kovarik, Libor; Joshi, Vineet V.

    2016-11-30

    This report highlights the capabilities and procedures for high-resolution characterization of UMo fuels at PNNL. Uranium-molybdenum (UMo) fuel processing steps, from casting to forming the final fuel, directly affect the microstructure of the fuel, which in turn dictates the in-reactor performance of the fuel under irradiation. In order to understand the influence of processing on UMo microstructure, microstructure characterization techniques are necessary. Higher-resolution characterization techniques such as transmission electron microscopy (TEM) and atom probe tomography (APT) are needed to interrogate the details of the microstructure. The findings from TEM and APT are also directly beneficial for developing predictive multiscale modeling tools that can predict the microstructure as a function of process parameters. This report provides background on focused-ion-beam-based TEM and APT sample preparation, TEM and APT analysis procedures, and the unique information achievable through such advanced characterization capabilities for UMo fuels, from a fuel fabrication capability viewpoint.

  13. 3D Holographic Observatory for Long-term Monitoring of Complex Behaviors in Drosophila

    NASA Astrophysics Data System (ADS)

    Kumar, S. Santosh; Sun, Yaning; Zou, Sige; Hong, Jiarong

    2016-09-01

    Drosophila is an excellent model organism for understanding cognitive function, aging, and neurodegeneration in humans. The effects of aging and other long-term dynamics on behavior serve as important biomarkers for identifying such changes in the brain. Here we present a new imaging technique, based on holographic principles, for lifetime monitoring of Drosophila in 3D at spatial and temporal resolutions capable of resolving the motion of limbs and wings. The developed system can monitor and extract various behavioral parameters, such as ethograms and spatial distributions, from a group of flies simultaneously. It images the complicated leg and wing motions of flies at a resolution that allows specific landing responses to be captured from the same data set. Overall, this system provides a unique opportunity for high-throughput screening of long-term behavioral changes in 3D in Drosophila.

  14. Whole-Brain Microscopy Meets In Vivo Neuroimaging: Techniques, Benefits, and Limitations.

    PubMed

    Aswendt, Markus; Schwarz, Martin; Abdelmoula, Walid M; Dijkstra, Jouke; Dedeurwaerdere, Stefanie

    2017-02-01

    Magnetic resonance imaging, positron emission tomography, and optical imaging have emerged as key tools to understand brain function and neurological disorders in preclinical mouse models. They offer the unique advantage of monitoring individual structural and functional changes over time. What remained unsolved until recently was how to generate whole-brain microscopy data that can be correlated with the 3D in vivo neuroimaging data. Conventional histological sections are inadequate, especially for neuronal tracing or unbiased screening for molecular targets throughout the whole brain. As part of the European Society for Molecular Imaging (ESMI) meeting 2016 in Utrecht, the Netherlands, we addressed this issue in the Molecular Neuroimaging study group meeting. Presentations covered new brain clearing methods, light sheet microscopes for large samples, and automatic registration of microscopy to in vivo imaging data. In this article, we summarize the discussion; give an overview of the novel techniques; and discuss the practical needs, benefits, and limitations.

  15. Authentication of bee pollen grains in bright-field microscopy by combining one-class classification techniques and image processing.

    PubMed

    Chica, Manuel

    2012-11-01

    A novel method for authenticating pollen grains in bright-field microscopic images is presented in this work. This method has clear uses in many application fields, such as the bee-keeping sector, where laboratory experts need to identify fraudulent bee pollen samples against local known pollen types. Our system is based on image processing and one-class classification to reject unknown pollen grain objects. The latter classification technique allows us to tackle the major difficulty of the problem: the existence of many possible fraudulent pollen types and the impossibility of modeling all of them. Different one-class classification paradigms are compared to study the most suitable technique for solving the problem. In addition, feature selection algorithms are applied to reduce the complexity and increase the accuracy of the models. For each local pollen type, a one-class classifier is trained and aggregated into a multiclassifier model. This multiclassification scheme combines the output of all the one-class classifiers into a single final response. The proposed method is validated by authenticating pollen grains belonging to different Spanish bee pollen types. The overall accuracy of the system on classifying fraudulent microscopic pollen grain objects is 92.3%. The system is able to rapidly reject pollen grains that belong to nonlocal pollen types, reducing laboratory work and effort. This authentication method has numerous potential applications in the microscopy research field. Copyright © 2012 Wiley Periodicals, Inc.
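
    The per-type one-class scheme can be sketched with a stand-in classifier: a Gaussian distance threshold plays the role of each one-class model, and the aggregation rejects grains accepted by no model. All features, type names, and thresholds below are hypothetical; the paper itself compares more sophisticated one-class paradigms:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 2-D feature vectors (e.g. shape/texture descriptors) for
# two known local pollen types; a real system would use many more features.
local_types = {
    "type_A": rng.normal([0.0, 0.0], 0.3, size=(200, 2)),
    "type_B": rng.normal([3.0, 3.0], 0.3, size=(200, 2)),
}

# Train one one-class model per local type: a standardized-distance
# threshold accepting 99% of that type's training grains.
models = {}
for name, X in local_types.items():
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    radii = np.linalg.norm((X - mu) / sigma, axis=1)
    models[name] = (mu, sigma, np.quantile(radii, 0.99))

def classify(x):
    """Aggregate the one-class outputs; reject grains matching no type."""
    best, best_score = "reject", np.inf
    for name, (mu, sigma, thresh) in models.items():
        score = np.linalg.norm((x - mu) / sigma)
        if score <= thresh and score < best_score:
            best, best_score = name, score
    return best

print(classify(np.array([0.1, -0.1])))   # grain resembling type_A
print(classify(np.array([10.0, 10.0])))  # unknown type -> rejected
```

The key property, mirrored here, is that a grain from a pollen type never seen in training is rejected rather than forced into the nearest known class.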

  16. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model, together with a set of input-output relations, uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether or not the model is structurally identifiable. Another widely used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare, and contrast the three methods, we apply them to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper.
As method development of structural identifiability techniques for mixed-effects models has received very little attention, despite the wide use of mixed-effects models, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
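
    For orientation, the classical (non-mixed-effects) structural identifiability question can be posed symbolically: do the input-output data determine the parameters uniquely? The one-compartment model below is a standard textbook example, checked with a Taylor-series style argument in sympy; it is not one of the paper's mixed-effects models:

```python
import sympy as sp

# One-compartment model dx/dt = -k*x, y = x/V, x(0) = D (known dose).
k, V, D, t = sp.symbols("k V D t", positive=True)

x = D * sp.exp(-k * t)      # analytic state trajectory
y = x / V                   # observed output

# Successive output derivatives at t = 0 are what an idealised
# experiment provides.
c0 = y.subs(t, 0)                      # D/V
c1 = sp.diff(y, t).subs(t, 0)          # -k*D/V

# Can a second parameter set (k2, V2) reproduce the same coefficients?
k2, V2 = sp.symbols("k2 V2", positive=True)
sols = sp.solve(
    [sp.Eq(c0, D / V2), sp.Eq(c1, -k2 * D / V2)], [k2, V2], dict=True
)
print(sols)  # a unique solution means (k, V) is structurally identifiable
```

A single solution with k2 = k and V2 = V certifies identifiability of this simple model; the paper's contribution is extending this kind of analysis to models with random effects.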

  17. A Statistical Description of Neural Ensemble Dynamics

    PubMed Central

    Long, John D.; Carmena, Jose M.

    2011-01-01

    The growing use of multi-channel neural recording techniques in behaving animals has produced rich datasets that hold immense potential for advancing our understanding of how the brain mediates behavior. One limitation of these techniques is that they do not provide important information about the underlying anatomical connections among the recorded neurons within an ensemble. Inferring these connections is often intractable because the set of possible interactions grows exponentially with ensemble size. This is a fundamental challenge one confronts when interpreting these data. Unfortunately, the combination of expert knowledge and ensemble data is often insufficient for selecting a unique model of these interactions. Our approach shifts away from modeling the network diagram of the ensemble toward analyzing changes in the dynamics of the ensemble as they relate to behavior. Our contribution consists of adapting techniques from signal processing and Bayesian statistics to track the dynamics of ensemble data on time-scales comparable with those of behavior. We employ a Bayesian estimator to weigh prior information against the available ensemble data, and use an adaptive quantization technique to aggregate poorly estimated regions of the ensemble data space. Importantly, our method is capable of detecting changes in both the magnitude and structure of correlations among neurons that are missed by firing rate metrics. We show that this method is scalable across a wide range of time-scales and ensemble sizes. Lastly, the performance of this method on both simulated and real ensemble data is used to demonstrate its utility. PMID:22319486
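
    The idea of weighing prior information against ensemble data has a compact conjugate-prior illustration. The sketch below uses a Dirichlet prior over quantized ensemble states; the simulated state stream and pseudo-counts are assumptions for illustration, not the paper's estimator:

```python
import numpy as np

# Hypothetical stream of quantized ensemble states (bin indices) observed
# in a behavioural epoch; K states after adaptive quantization.
K = 8
rng = np.random.default_rng(1)
states = rng.integers(0, K, size=500)

# Dirichlet prior pseudo-counts encode prior information about the state
# distribution; the posterior mean weighs them against the data.
alpha = np.full(K, 2.0)                     # prior pseudo-counts
counts = np.bincount(states, minlength=K)   # observed ensemble data
posterior_mean = (alpha + counts) / (alpha.sum() + counts.sum())

print(posterior_mean.round(3))
```

With few observations the posterior stays near the prior; as data accumulate, the counts dominate, which is exactly the prior-versus-data balance the abstract describes.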

  18. Assessment of liver steatosis and fibrosis in rats using integrated coherent anti-Stokes Raman scattering and multiphoton imaging technique

    NASA Astrophysics Data System (ADS)

    Lin, Jian; Lu, Fake; Zheng, Wei; Xu, Shuoyu; Tai, Dean; Yu, Hanry; Huang, Zhiwei

    2011-11-01

    We report the implementation of a unique integrated coherent anti-Stokes Raman scattering (CARS), second-harmonic generation (SHG), and two-photon excitation fluorescence (TPEF) microscopy imaging technique developed for label-free monitoring of the progression of liver steatosis and fibrosis generated in a bile duct ligation (BDL) rat model. Of the 21 adult rats used in this study, 18 underwent BDL surgery and were sacrificed weekly from weeks 1 to 6 (n = 3 per week), whereas the 3 control rats were sacrificed at week 0. Colocalized imaging of the aggregated hepatic fats, collagen fibrils, and hepatocyte morphologies in liver tissue is realized by using the integrated CARS, SHG, and TPEF technique. The results show that there are significant accumulations of hepatic lipid droplets and collagen fibrils associated with severe hepatocyte necrosis in BDL rat liver as compared to normal liver tissue. The volume of normal hepatocytes decreases continuously and the collagen fiber content in BDL rat liver grows through week 6, whereas the hepatic fat content reaches a maximum at week 4 and appears to stop growing by week 6, indicating that liver steatosis and fibrosis induced in a BDL rat liver model may develop at different rates. This work demonstrates that the integrated CARS and multiphoton microscopy imaging technique has the potential to provide an effective means for early diagnosis and detection of liver steatosis and fibrosis without labeling.

  19. Robust Library Building for Autonomous Classification of Downhole Geophysical Logs Using Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Silversides, Katherine L.; Melkumyan, Arman

    2017-03-01

    Machine learning techniques such as Gaussian Processes can be used to identify stratigraphically important features in geophysical logs. The marker shales in the banded iron formation hosted iron ore deposits of the Hamersley Ranges, Western Australia, form distinctive signatures in the natural gamma logs. The identification of these marker shales is important for the stratigraphic identification of unit boundaries in geological modelling of the deposit. Each machine learning technique has unique properties that affect the results. For Gaussian Processes (GPs), the output values are inclined towards the mean value, particularly when there is insufficient information in the library. The impact that these inclinations have on the classification can vary depending on the parameter values selected by the user. Therefore, when applying machine learning techniques, care must be taken to fit the technique to the problem correctly. This study focuses on optimising the settings and choices for training a GP system to identify a specific marker shale. We show that the final results converge even when different but equally valid starting libraries are used for the training. To analyse the impact on feature identification, GP models were trained so that the output was inclined towards a positive, neutral, or negative output. For this type of classification, the best results were obtained when the pull was towards a negative output. We also show that the GP output can be adjusted by using a standard deviation coefficient that changes the balance between certainty and accuracy in the results.
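
    The pull of GP outputs toward the prior mean when the library holds no nearby information can be reproduced with a bare-bones GP regressor. The kernel, prior mean, and toy natural-gamma labels below are illustrative assumptions, not the study's trained models:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Toy 1-D training library labelled +1 (marker shale) and -1 (not marker).
X = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 1.0, -1.0])

prior_mean = -1.0          # a negative pull, as favoured in the study
noise = 1e-6

K = rbf(X, X) + noise * np.eye(len(X))
Xs = np.array([1.0, 10.0])             # one query near data, one far away
Ks = rbf(Xs, X)
alpha = np.linalg.solve(K, y - prior_mean)
mean = prior_mean + Ks @ alpha
var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)

# Far from the library the posterior collapses back to the prior mean and
# the standard deviation grows, flagging an uncertain classification.
print(mean.round(3))
print(np.sqrt(np.maximum(var, 0)).round(3))
```

Classifying on mean minus a coefficient times the standard deviation, rather than on the mean alone, is one way such a coefficient can trade certainty against accuracy.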

  20. Assessment of liver steatosis and fibrosis in rats using integrated coherent anti-Stokes Raman scattering and multiphoton imaging technique.

    PubMed

    Lin, Jian; Lu, Fake; Zheng, Wei; Xu, Shuoyu; Tai, Dean; Yu, Hanry; Huang, Zhiwei

    2011-11-01

    We report the implementation of a unique integrated coherent anti-Stokes Raman scattering (CARS), second-harmonic generation (SHG), and two-photon excitation fluorescence (TPEF) microscopy imaging technique developed for label-free monitoring of the progression of liver steatosis and fibrosis generated in a bile duct ligation (BDL) rat model. Of the 21 adult rats used in this study, 18 underwent BDL surgery and were sacrificed weekly from weeks 1 to 6 (n = 3 per week), whereas the 3 control rats were sacrificed at week 0. Colocalized imaging of the aggregated hepatic fats, collagen fibrils, and hepatocyte morphologies in liver tissue is realized by using the integrated CARS, SHG, and TPEF technique. The results show that there are significant accumulations of hepatic lipid droplets and collagen fibrils associated with severe hepatocyte necrosis in BDL rat liver as compared to normal liver tissue. The volume of normal hepatocytes decreases continuously and the collagen fiber content in BDL rat liver grows through week 6, whereas the hepatic fat content reaches a maximum at week 4 and appears to stop growing by week 6, indicating that liver steatosis and fibrosis induced in a BDL rat liver model may develop at different rates. This work demonstrates that the integrated CARS and multiphoton microscopy imaging technique has the potential to provide an effective means for early diagnosis and detection of liver steatosis and fibrosis without labeling.

  1. Electrochromic WO[subscript 3] Films: Nanotechnology Experiments in Instrumental Analysis and Physical Chemistry Laboratories

    ERIC Educational Resources Information Center

    Hepel, Maria

    2008-01-01

    This experiment teaches students the methodology of investigating novel properties of materials using new instrumental techniques: atomic force microscopy (AFM), electrochemical quartz crystal nanobalance (EQCN), voltammetric techniques (linear potential scan and chronoamperometry), and light reflectance measurements. The unique capabilities of…

  2. Teaching or Facilitating Learning? Selecting the Optimal Approach for Your Educational Objectives and Audience

    ERIC Educational Resources Information Center

    Wise, Dena

    2017-01-01

    Both teaching and facilitation are effective instructional techniques, but each is appropriate for unique educational objectives and scenarios. This article briefly distinguishes between teaching and facilitative techniques and provides guidelines for choosing the better method for a particular educational scenario.

  3. Nonhydrostatic icosahedral atmospheric model (NICAM) for global cloud resolving simulations

    NASA Astrophysics Data System (ADS)

    Satoh, M.; Matsuno, T.; Tomita, H.; Miura, H.; Nasuno, T.; Iga, S.

    2008-03-01

    A new type of ultra-high-resolution atmospheric global circulation model is developed. The new model is designed to perform "cloud resolving simulations" by directly calculating deep convection and meso-scale circulations, which play key roles not only in tropical circulations but also in the global circulation of the atmosphere. Since deep convective cores are only a few kilometres across, they have not been directly resolved by existing atmospheric general circulation models (AGCMs). In order to drastically enhance horizontal resolution, a new framework for a global atmospheric model is required; we adopted nonhydrostatic governing equations and icosahedral grids for the new model, and call it the Nonhydrostatic ICosahedral Atmospheric Model (NICAM). In this article, we review the governing equations and numerical techniques employed, and present results from the unique 3.5-km mesh global experiments, with O(10^9) computational nodes, using realistic topography and land/ocean surface thermal forcing. The results show realistic behaviors of multi-scale convective systems in the tropics, which have not been captured by AGCMs. We also discuss future perspectives on the role of the new model in next-generation atmospheric sciences.
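
    For orientation, the resolution of such an icosahedral grid follows from a standard geodesic-grid identity; assuming the usual recursive-division construction, an 11-level division reproduces the roughly 3.5-km mesh quoted above. This is a back-of-envelope sketch, not NICAM's grid generator:

```python
import math

EARTH_RADIUS_M = 6.371e6

def icosahedral_grid(glevel):
    """Vertex count and mean spacing for a recursively divided icosahedron.

    Each division level quadruples the triangles; the vertex count obeys
    the standard geodesic-grid identity 10 * 4**glevel + 2.
    """
    n_points = 10 * 4 ** glevel + 2
    area = 4.0 * math.pi * EARTH_RADIUS_M ** 2
    spacing_km = math.sqrt(area / n_points) / 1e3
    return n_points, spacing_km

# Division level 11 yields a mean horizontal spacing of about 3.5 km.
n, dx = icosahedral_grid(11)
print(n, round(dx, 1))
```

Multiplying the roughly 4 x 10^7 horizontal points by the vertical layers of a 3-D model is consistent with the O(10^9) computational nodes cited in the abstract.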

  4. The Baselines Project: Establishing Reference Environmental Conditions for Marine Habitats in the Gulf of Mexico using Forecast Models and Satellite Data

    NASA Astrophysics Data System (ADS)

    Jolliff, J. K.; Gould, R. W.; deRada, S.; Teague, W. J.; Wijesekera, H. W.

    2012-12-01

    We provide an overview of the NASA-funded project, "High-Resolution Subsurface Physical and Optical Property Fields in the Gulf of Mexico: Establishing Baselines and Assessment Tools for Resource Managers." Data-assimilative models, analysis fields, and multiple satellite data streams were used to construct temperature and photon flux climatologies for the Flower Garden Banks National Marine Sanctuary (FGBNMS) and similar habitats in the northwestern Gulf of Mexico, where geologic features provide a platform for unique coral reef ecosystems. Metrics comparing these products with in situ data collected during complementary projects are also examined. Similarly, high-resolution satellite data streams and advanced processing techniques were used to establish baseline suspended sediment load and turbidity conditions in selected northern Gulf of Mexico estuaries. The results demonstrate the feasibility of blending models and data into accessible web-based analysis products for resource managers, policy makers, and the public.

  5. An Ontology Based Approach to Information Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The semantic structuring of knowledge based on ontology approaches has been increasingly adopted by experts from diverse domains. Recently, ontologies have moved from the philosophical and metaphysical disciplines into the construction of models that describe a specific theory of a domain. The development and use of ontologies promote the creation of a unique standard for representing concepts within a specific knowledge domain. In the scope of information security systems, the use of an ontology to formalize and represent security concepts challenges the mechanisms and techniques currently used. This paper presents a conceptual implementation model of an ontology defined for the security domain. The model contains semantic concepts based on the information security standard ISO/IEC_JTC1, and their relationships to other concepts defined in a subset of the information security domain.

  6. Towards new generation spectroscopic models of cool stars

    NASA Astrophysics Data System (ADS)

    Bergemann, Maria

    2018-06-01

    Spectroscopy is a unique tool for determining the physical parameters of stars. Knowledge of stellar chemical abundances, masses, and ages is the key to understanding the evolution of their host populations. I will focus on the current outstanding problems in the spectroscopy of cool stars, the most useful objects for studying both our local Galactic neighborhood and very distant systems, such as faint dwarf spheroidal galaxies. Among the most debated issues is the extent to which we can trust techniques that rely on the classical assumptions of local thermodynamic equilibrium and hydrostatic balance. I will summarise the ongoing efforts to improve the models of cool stars, with emphasis on NLTE and 3D modelling. I will then discuss how these developments impact our knowledge of abundances in the Milky Way and in dSph systems, and present an outlook for future studies.

  7. Influence of Gridded Standoff Measurement Resolution on Numerical Bathymetric Inversion

    NASA Astrophysics Data System (ADS)

    Hesser, T.; Farthing, M. W.; Brodie, K.

    2016-02-01

    The bathymetry from the surf zone to the shoreline changes frequently and actively as wave energy interacts with the seafloor. Methodologies to measure bathymetry range from point-source in-situ instruments, vessel-mounted single-beam or multi-beam sonar surveys, and airborne bathymetric lidar to inversion techniques applied to standoff measurements of wave processes from video or radar imagery. Each type of measurement has unique error sources, spatial and temporal resolution, and availability. Numerical bathymetry estimation frameworks can use these disparate data types in combination with model-based inversion techniques to produce a "best estimate of bathymetry" at a given time. Understanding how the error sources and varying spatial or temporal resolution of each data type affect the end result is critical for determining best practices and, in turn, increasing the accuracy of bathymetry estimation techniques. In this work, we take an initial step in the development of a complete framework for estimating bathymetry in the nearshore by focusing on gridded standoff measurements and in-situ point observations in model-based inversion at the U.S. Army Corps of Engineers Field Research Facility in Duck, NC. The standoff measurement methods return wave parameters computed using linear wave theory from the direct measurements. These gridded datasets can have temporal and spatial resolutions that do not match the desired model parameters and therefore could reduce the accuracy of these methods. Specifically, we investigate the effect of numerical resolution on the accuracy of an Ensemble Kalman Filter bathymetric inversion technique in relation to the spatial and temporal resolution of the gridded standoff measurements. The accuracy of the bathymetric estimates is compared with both high-resolution Real Time Kinematic (RTK) single-beam surveys and alternative direct in-situ measurements using sonic altimeters.
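
    The analysis step of an Ensemble Kalman Filter inversion of the kind described above can be sketched for a single grid cell, with the shallow-water dispersion relation as a stand-in observation operator. All values, the observation error, and the one-cell setup are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 1-D example: estimate depth h at one grid cell from an observed
# wave celerity c, using the shallow-water relation c = sqrt(g*h).
g = 9.81
h_true = 4.0
n_ens = 100

h_ens = rng.normal(6.0, 1.5, n_ens)          # prior depth ensemble
obs = np.sqrt(g * h_true)                    # "measured" celerity
obs_err = 0.1
obs_pert = obs + rng.normal(0.0, obs_err, n_ens)  # perturbed observations

# EnKF analysis step: Kalman gain from ensemble covariances.
Hx = np.sqrt(g * np.clip(h_ens, 0.1, None))  # predicted observations
cov_hy = np.cov(h_ens, Hx)[0, 1]
var_y = Hx.var(ddof=1) + obs_err ** 2
K = cov_hy / var_y
h_analysis = h_ens + K * (obs_pert - Hx)

print(round(h_ens.mean(), 2), "->", round(h_analysis.mean(), 2))
```

Coarsening the gridded celerity observations in this setup directly degrades `Hx`, which is the mechanism by which standoff-measurement resolution propagates into the depth estimate.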

  8. Scalable graphene production: perspectives and challenges of plasma applications

    NASA Astrophysics Data System (ADS)

    Levchenko, Igor; Ostrikov, Kostya (Ken); Zheng, Jie; Li, Xingguo; Keidar, Michael; B. K. Teo, Kenneth

    2016-05-01

    Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable to scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential for scalability of plasma-based technologies, based on the scaling-related process characteristics. Among the processes compared, the greatest yield, 1 g h^-1 m^-2, was reached with the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas-based methods.
Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.

  9. Scalable graphene production: perspectives and challenges of plasma applications.

    PubMed

    Levchenko, Igor; Ostrikov, Kostya Ken; Zheng, Jie; Li, Xingguo; Keidar, Michael; B K Teo, Kenneth

    2016-05-19

    Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable to scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential for scalability of plasma-based technologies, based on the scaling-related process characteristics. Among the processes compared, the greatest yield, 1 g h^-1 m^-2, was reached with the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas-based methods.
Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.

  10. The low-frequency sound power measuring technique for an underwater source in a non-anechoic tank

    NASA Astrophysics Data System (ADS)

    Zhang, Yi-Ming; Tang, Rui; Li, Qi; Shang, Da-Jing

    2018-03-01

    In order to determine the radiated sound power of an underwater source below the Schroeder cut-off frequency in a non-anechoic tank, a low-frequency extension measuring technique is proposed. This technique is based on a unique relationship between the transmission characteristics of the enclosed field and those of the free field, which can be obtained as a correction term from previous measurements of a known simple source. The radiated sound power of an unknown underwater source in the free field can thereby be obtained accurately from measurements in a non-anechoic tank. To verify the validity of the proposed technique, a mathematical model of the enclosed field is established using normal-mode theory, and the relationship between the transmission characteristics of the enclosed and free fields is obtained. The radiated sound power of an underwater transducer source is tested in a glass tank using the proposed low-frequency extension measuring technique. Compared with the free field, the deviation of the measured radiated sound power level is found to be less than 3 dB for the narrowband spectrum and less than 1 dB for the 1/3 octave spectrum. The proposed testing technique can be used not only to extend the low-frequency applications of non-anechoic tanks, but also to measure the radiated sound power of complicated sources in non-anechoic tanks.
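
    The correction-term logic (measure a known simple source in both fields, then transfer the difference) reduces to band-wise subtraction in decibels. The band levels below are invented numbers purely to show the bookkeeping:

```python
import numpy as np

# Hypothetical third-octave band levels (dB) below the Schroeder frequency.
bands_hz = np.array([100.0, 125.0, 160.0, 200.0])

ref_free_db = np.array([120.0, 121.5, 122.0, 123.0])   # known simple source
ref_tank_db = np.array([128.5, 126.0, 127.2, 125.5])   # same source in tank

# The correction term is the tank-minus-free-field difference of the
# reference source, i.e. the transfer characteristic of the enclosure.
correction_db = ref_tank_db - ref_free_db

# An unknown source measured in the same tank is corrected band by band.
unknown_tank_db = np.array([131.0, 129.5, 130.0, 128.0])
unknown_free_db = unknown_tank_db - correction_db
print(unknown_free_db)
```

The assumption doing the work is that the enclosure's transfer characteristic is a property of the tank, not of the source, so the correction measured with the simple source transfers to the unknown one.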

  11. Estimating the Latent Number of Types in Growing Corpora with Reduced Cost-Accuracy Trade-Off

    ERIC Educational Resources Information Center

    Hidaka, Shohei

    2016-01-01

    The number of unique words in children's speech is one of the most basic statistics indicating their language development. We may, however, face difficulties when trying to accurately evaluate the number of unique words in a child's growing corpus over time with a limited sample size. This study proposes a novel technique to estimate the latent number…
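
    The truncated abstract does not specify the proposed estimator, but the classical capture-recapture-style baseline for this problem, the Chao1 lower bound on the latent number of types, is easy to state (the toy token sample is invented):

```python
from collections import Counter

def chao1(tokens):
    """Chao1 lower-bound estimate of the latent number of types.

    S_hat = S_obs + f1**2 / (2 * f2), where f1 and f2 are the numbers of
    types seen exactly once and exactly twice in the sample.
    """
    freqs = Counter(Counter(tokens).values())
    s_obs = sum(freqs.values())
    f1, f2 = freqs.get(1, 0), freqs.get(2, 0)
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected variant
    return s_obs + f1 ** 2 / (2.0 * f2)

sample = ["dog", "cat", "dog", "ball", "go", "go", "mama", "dog"]
print(chao1(sample))
```

Such estimators use the abundance of rarely seen types to infer how many types the limited sample has missed entirely.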

  12. Assessing non-uniqueness: An algebraic approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasco, Don W.

    Geophysical inverse problems are endowed with a rich mathematical structure. When discretized, most differential and integral equations of interest are algebraic (polynomial) in form. Techniques from algebraic geometry and computational algebra provide a means to address questions of existence and uniqueness for both linear and non-linear inverse problems. In a sense, these methods extend ideas which have proven fruitful in treating linear inverse problems.
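    A toy example (not from the abstract) illustrates why polynomial structure makes non-uniqueness explicit: when the forward model is polynomial in the unknown, the inverse problem becomes a root-finding problem whose full solution set can be enumerated.

```python
import numpy as np

# Toy polynomial inverse problem: forward model d = m**2 - 3*m, observed
# datum d = 4. Writing it as m**2 - 3*m - d = 0 lets algebraic machinery
# enumerate ALL models consistent with the datum, exposing non-uniqueness.
d_obs = 4.0
solutions = np.roots([1.0, -3.0, -d_obs])   # coefficients of m^2 - 3m - d
print(np.sort(solutions))                   # two distinct real models fit exactly
```

    Here both roots reproduce the datum, so the datum alone cannot single out a unique model; algebraic-geometric tools generalize this counting to systems of polynomial equations.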

  13. Early Oscillation Detection Technique for Hybrid DC/DC Converters

    NASA Technical Reports Server (NTRS)

    Wang, Bright L.

    2011-01-01

    Oscillation or instability must be avoided for reliable hybrid DC/DC converters. A real-time electronic measurement technique was developed to detect catastrophic oscillations in hybrid DC/DC converters at an early stage. It is capable of identifying low-level oscillation and determining the degree of oscillation at a unique frequency for every individual converter model without disturbing normal operation. The technique was developed specifically for hybrid DC/DC converters used in space, but it is also suitable for most commercial and military switching-mode power supplies. It is a weak-signal detection technique that senses converter oscillation as a specific noise signature at the power input pins. It is based on the principles of feedback-loop oscillation and RF signal modulation, and is realized using signal power spectral analysis. On the power spectrum, the channel power amplitude at the characteristic frequency (CPcf) and the channel power amplitude at the switching frequency (CPsw) are chosen as oscillation-level indicators. If the converter is stable, the CPcf is a very small pulse and the CPsw is a larger, clear, single pulse. At an early stage of oscillation, the CPcf rises to a certain level and the CPsw shows a small pair of sideband pulses around it. If the converter oscillates, the CPcf reaches a higher level and the CPsw shows more high-level sideband pulses. A comprehensive stability index (CSI) is adopted as a quantitative measure to accurately assign a degree of stability to a specific DC/DC converter. The CSI is a ratio of normal to abnormal power spectral density, and can be calculated from specified and measured CPcf and CPsw data.
The novel and unique feature of this technique is the use of channel power amplitudes at the characteristic frequency and the switching frequency to evaluate stability and identify oscillations at an early stage without interfering with the DC/DC converter's normal operation. The technique eliminates the probing problem of the gain/phase-margin method by connecting the power input directly to a spectrum analyzer. It can therefore evaluate stability for all kinds of hybrid DC/DC converters, with or without remote-sense pins, and is suitable for real-time and in-circuit testing. This frequency-domain technique is more sensitive in detecting oscillation at an early stage than the time-domain method using an oscilloscope.
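    The spectral indicators described above can be sketched with a synthetic signal. This is a hedged illustration, not the qualified test procedure: the waveform, frequencies, and modulation depth are invented. An oscillating feedback loop amplitude-modulates the switching ripple at the characteristic frequency f_c, which raises the channel power near f_c (the CPcf indicator) and creates sidebands around the switching frequency f_sw (visible in CPsw).

```python
import numpy as np

# Synthetic sketch of the CPcf/CPsw indicators (names follow the abstract;
# all signal parameters here are invented for illustration).
fs, n = 1.0e6, 100_000                  # sample rate (Hz), record length
t = np.arange(n) / fs
f_sw, f_c = 100e3, 5e3                  # switching / characteristic freqs (Hz)

def band_power(x, f0, bw=1e3):
    """Channel power in a bw-wide band centred on f0 (periodogram sum)."""
    spec = np.abs(np.fft.rfft(x))**2 / n
    f = np.fft.rfftfreq(n, 1/fs)
    return spec[(f >= f0 - bw/2) & (f <= f0 + bw/2)].sum()

stable = np.sin(2*np.pi*f_sw*t)         # clean switching ripple
oscillating = (0.2*np.sin(2*np.pi*f_c*t)                       # baseband term
               + (1 + 0.3*np.sin(2*np.pi*f_c*t)) * np.sin(2*np.pi*f_sw*t))

# (CPcf, upper sideband of CPsw) for each case.
results = {name: (band_power(x, f_c), band_power(x, f_sw + f_c))
           for name, x in [("stable", stable), ("oscillating", oscillating)]}
print(results)
```

    In the stable case both channel powers are essentially zero; in the oscillating case both indicators rise sharply, which is the behavior the CSI quantifies.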

  14. Psychological aspects of human cloning and genetic manipulation: the identity and uniqueness of human beings.

    PubMed

    Morales, N M

    2009-01-01

    Human cloning has become one of the most controversial debates about reproduction in Western civilization. Human cloning represents asexual reproduction, but the critics of human cloning argue that the result of cloning is not a new individual who is genetically unique. There is also awareness in the scientific community, including the medical community, that human cloning and the creation of clones are inevitable. Psychology and other social sciences, together with the natural sciences, will need to find ways to help the healthcare system prepare for the new challenges introduced by human cloning techniques. One of those challenges is to find specific standards of behaviour that could help potential parents interact properly with cloned babies or children created through genetic manipulation. In this paper, the concepts of personality, identity and uniqueness are discussed in relation to the contribution of twin studies in these areas. The author argues that an individual created by human cloning techniques or any other type of genetic manipulation will not show the donor's characteristics to the extent of compromising uniqueness. Therefore, claims to such an effect are needlessly alarmist.

  15. Appealing to Good Students in Introductory Economics.

    ERIC Educational Resources Information Center

    Jensen, Elizabeth J.; Owen, Ann L.

    2003-01-01

    Examines effective teaching techniques using a unique data set that allows matching student and instructor characteristics to assess their impact on student interest in economics. Finds that devoting more time to discussion is effective but varies by type of student. Determines that using many teaching techniques appeals to the learning styles adopted by good…

  16. Establishing Benchmarks for Outcome Indicators: A Statistical Approach to Developing Performance Standards.

    ERIC Educational Resources Information Center

    Henry, Gary T.; And Others

    1992-01-01

    A statistical technique is presented for developing performance standards based on benchmark groups. The benchmark groups are selected using a multivariate technique that relies on a squared Euclidean distance method. For each observation unit (a school district in the example), a unique comparison group is selected. (SLD)
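    The benchmark-group selection can be sketched in a few lines. This is a minimal illustration with invented data: for each observation unit (a district), all other units are ranked by squared Euclidean distance over standardized covariates, and the k nearest form that unit's unique comparison group.

```python
import numpy as np

# Minimal sketch of benchmark-group selection (data are invented):
# each district gets its own comparison group of the k nearest districts
# under the squared Euclidean distance on standardized covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                   # 8 districts, 3 covariates
Z = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize: comparable units

def benchmark_group(i, k=3):
    d2 = ((Z - Z[i])**2).sum(axis=1)          # squared Euclidean distances
    d2[i] = np.inf                            # exclude the district itself
    return np.argsort(d2)[:k]                 # indices of the k nearest

print(benchmark_group(0))
```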

  17. Some recent developments in headspace gas chromatography

    Treesearch

    J.Y. Zhu; X.-S. Chai

    2005-01-01

    In this study, recent developments in headspace gas chromatography (HSGC) are briefly reviewed. Several novel HSGC techniques developed recently are presented in detail. These techniques were developed using the unique characteristics of the headspace sampling process implemented in commercial HSGC systems and therefore can be easily applied in laboratory and...

  18. Epidural volume extension: A novel technique and its efficacy in high risk cases

    PubMed Central

    Tiwari, Akhilesh Kumar; Singh, Rajeev Ratan; Anupam, Rudra Pratap; Ganguly, S.; Tomar, Gaurav Singh

    2012-01-01

    We present a unique case series, restricted to high-risk cases from different specialities, of patients who underwent successful surgery in our Institute using the epidural volume extension technique with 1 mL of 0.5% ropivacaine and 25 μg of fentanyl. PMID:25885627

  19. Methodological Ambiguities of the Projective Technique: An Overview and Attempt to Clarify.

    ERIC Educational Resources Information Center

    Veiel, H.; Coles, E. M.

    1982-01-01

    Definitions of projective tests are critiqued. A distinction is made between projective tests and projective techniques. The unique feature of the latter is its scoring process: response categories are intensionally defined and comprise infinite sets of responses. A continuity from psychometric to projective tests is argued. Statistical…

  20. A New Femtosecond Laser-Based Three-Dimensional Tomography Technique

    NASA Astrophysics Data System (ADS)

    Echlin, McLean P.

    2011-12-01

    Tomographic imaging has dramatically changed science, most notably in the fields of medicine and biology, by producing 3D views of structures which are too complex to understand in any other way. Current tomographic techniques require extensive time both for post-processing and data collection. Femtosecond laser-based tomographic techniques have been developed both in standard atmosphere (the femtosecond laser-based serial sectioning technique, FSLSS) and in vacuum (the Tri-Beam system) for the fast collection (10^5 μm³/s) of mm³-sized 3D datasets. Both techniques use femtosecond laser pulses to selectively remove material layer by layer with low collateral damage and a negligible heat-affected zone. To the author's knowledge, femtosecond lasers had never before been used for serial sectioning, and these techniques were developed entirely by the author and his collaborators at the University of Michigan and the University of California, Santa Barbara. The FSLSS was applied to measure the 3D distribution of TiN particles in a 4330 steel. Single-pulse ablation morphologies and rates were measured and collected from the literature. Simultaneous two-phase ablation of TiN and the steel matrix was shown to occur at fluences of 0.9-2 J/cm². Laser scanning protocols were developed that minimize surface roughness to 0.1-0.4 μm for laser-based sectioning. The FSLSS technique was used to section and 3D-reconstruct TiN-containing 4330 steel. Statistical analysis of 3D TiN particle sizes, distribution parameters, and particle density was performed. A methodology was developed to use the 3D datasets to produce statistical volume elements (SVEs) for toughness modeling. Six FSLSS TiN datasets were sub-sampled into 48 SVEs for statistical analysis and toughness modeling using the Rice-Tracey and Garrison-Moody models. A two-parameter Weibull analysis was performed, and the variability in the toughness data agreed well with the bulk toughness measurements of Ruggieri et al.
The Tri-Beam system combines the benefits of laser based material removal (speed, low-damage, automated) with detectors that collect chemical, structural, and topological information. Multi-modal sectioning information was collected after many laser scanning passes demonstrating the capability of the Tri-Beam system.

  1. Pseudorandom Noise Code-Based Technique for Thin Cloud Discrimination with CO2 and O2 Absorption Measurements

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Prasad, Narasimha S.; Flood, Michael A.

    2011-01-01

    NASA Langley Research Center is working on a continuous wave (CW) laser based remote sensing scheme for the detection of CO2 and O2 from space-based platforms suitable for the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multi-frequency, intensity-modulated CW (IMCW) laser absorption spectrometer (LAS) operating at 1.57 microns for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies better than 0.3%. In this paper, we discuss the demonstration of a pseudorandom noise (PN) code based technique for cloud and aerosol discrimination applications. The possibility of using maximum-length (ML) sequences for range and absorption measurements is investigated, and a simple model for accomplishing this objective is formulated. Proof-of-concept experiments, carried out using a sonar-based lidar simulator built from simple audio hardware, provided promising results for extension into optical wavelengths.
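    The ranging property of ML sequences can be sketched briefly. This is an illustrative stand-in, not the ASCENDS hardware: a maximum-length PN code has a two-valued circular autocorrelation (N at zero lag, -1 elsewhere), so correlating a delayed echo of the code against the reference recovers the delay in chips.

```python
import numpy as np

# Sketch of ML-sequence ranging (parameters are illustrative only).
def mls127():
    """127-chip m-sequence from the primitive polynomial x^7 + x^6 + 1."""
    state = [1] * 7
    chips = []
    for _ in range(127):
        chips.append(1 - 2 * state[6])      # bit {0,1} -> chip {+1,-1}
        fb = state[6] ^ state[5]            # feedback taps 7 and 6
        state = [fb] + state[:-1]           # shift register update
    return np.array(chips)

code = mls127()
delay = 23
received = np.roll(code, delay)             # noiseless echo, 23-chip delay
corr = np.array([np.dot(received, np.roll(code, k)) for k in range(127)])
print(int(np.argmax(corr)))                 # the correlation peak sits at the delay
```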

  2. Using structural equation modeling for network meta-analysis.

    PubMed

    Tu, Yu-Kang; Wu, Yun-Chun

    2017-07-14

    Network meta-analysis overcomes the limitations of traditional pairwise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. As the random effect is explicitly modeled as a latent variable in SEM, it is very flexible for analysts to specify complex random effect structures and to place linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed and random effect network meta-analysis models can be easily implemented in SEM. It contains results of 26 studies that directly compared three treatment groups A, B and C for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the unrestricted weighted least squares (UWLS) method can be undertaken using SEM. For both the fixed and random effect network meta-analysis, SEM yielded similar coefficients and confidence intervals to those reported in the previous literature. The point estimates of the two UWLS models were identical to those of the fixed effect model, but the confidence intervals were wider. This is consistent with results from the traditional pairwise meta-analyses. Compared with the UWLS model with a common variance adjustment factor, the UWLS model with a unique variance adjustment factor has wider confidence intervals when the heterogeneity is larger in the pairwise comparison.
The UWLS model with a unique variance adjustment factor reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis remains to be explored.
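    The building block that the SEM and UWLS formulations generalize is ordinary inverse-variance pooling. The sketch below shows a fixed-effect pairwise meta-analysis with invented study data; it is not the article's network model, only the weighting principle it extends to a whole treatment network.

```python
import numpy as np

# Fixed-effect inverse-variance meta-analysis (invented data): studies are
# weighted by the reciprocal of their sampling variance and pooled.
effects = np.array([-0.30, -0.10, -0.25, -0.45])   # per-study log odds ratios
se = np.array([0.20, 0.15, 0.25, 0.30])            # their standard errors

w = 1.0 / se**2                                    # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)           # weighted mean effect
pooled_se = np.sqrt(1.0 / np.sum(w))               # SE of the pooled effect
ci = (pooled - 1.96*pooled_se, pooled + 1.96*pooled_se)
print(round(pooled, 3), tuple(round(c, 3) for c in ci))
```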

  3. Electronic implementation of associative memory based on neural network models

    NASA Technical Reports Server (NTRS)

    Moopenn, A.; Lambe, John; Thakoor, A. P.

    1987-01-01

    An electronic embodiment of a neural network based associative memory in the form of a binary connection matrix is described. The nature of false memory errors, their effect on the information storage capacity of binary connection matrix memories, and a novel technique to eliminate such errors with the help of asymmetrical extra connections are discussed. The stability of the matrix memory system incorporating a unique local inhibition scheme is analyzed in terms of local minimization of an energy function. The memory's stability, dynamic behavior, and recall capability are investigated using a 32-'neuron' electronic neural network memory with a 1024-programmable binary connection matrix.
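    The recall behavior of a connection-matrix associative memory can be illustrated compactly. This is a minimal Hopfield-style sketch using the standard outer-product (Hebbian) rule as a stand-in; the paper's binary clipped connections, asymmetrical extra connections, and local inhibition scheme are not modeled here.

```python
import numpy as np

# Minimal outer-product associative memory: store two patterns, then
# recover one of them from a corrupted probe by threshold updates.
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1, -1, -1,  1,  1, -1, -1],
])

W = patterns.T @ patterns              # Hebbian outer-product learning
np.fill_diagonal(W, 0)                 # no self-connections

def recall(probe, steps=3):
    s = probe.copy()
    for _ in range(steps):             # synchronous threshold updates
        s = np.where(W @ s >= 0, 1, -1)
    return s

noisy = patterns[0].copy()
noisy[0] = -noisy[0]                   # corrupt one 'neuron'
print(recall(noisy))                   # converges back to the stored pattern
```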

  4. The Fifth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Fifth Annual Thermal and Fluids Analysis Workshop was held at the Ohio Aerospace Institute, Brook Park, Ohio, cosponsored by NASA Lewis Research Center and the Ohio Aerospace Institute, 16-20 Aug. 1993. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with the information on widely used tools for thermal and fluid analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  5. Eddy current inspection of graphite fiber components

    NASA Technical Reports Server (NTRS)

    Workman, G. L.; Bryson, C. C.

    1990-01-01

    The recognition of defects in material properties still presents a number of problems for nondestructive testing in aerospace systems. This project attempts to utilize current capabilities in eddy current instrumentation, artificial intelligence, and robotics in order to provide insight into defining geometrical aspects of flaws in composite materials which can be evaluated using eddy current inspection techniques. The unique capabilities of E-probes and horseshoe probes for inspecting graphite fiber materials were evaluated and appear to hold great promise once the technology matures. Initial results of modeling eddy current interactions with certain flaws in graphite fiber samples are described.

  6. Computer aided flexible envelope designs

    NASA Technical Reports Server (NTRS)

    Resch, R. D.

    1975-01-01

    Computer aided design methods are presented for the design and construction of strong, lightweight structures which require complex and precise geometric definition. The first method, flexible structures, is a unique system for modeling folded plate structures and space frames. It is possible to continuously vary the geometry of a space frame to produce large, clear spans with curvature. The second method deals with developable surfaces, where both folding and bending are explored subject to the constraints of available building materials, so that minimal distortion yields maximum design capability. Alternative inexpensive fabrication techniques are being developed to achieve computer defined enclosures which are extremely lightweight and mathematically highly precise.

  7. Advanced Computational Methods for Thermal Radiative Heat Transfer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.

    2016-10-01

    Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to compute routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.

  8. Modeling spatially localized photonic nanojets from phase diffraction gratings

    NASA Astrophysics Data System (ADS)

    Geints, Yu. E.; Zemlyanov, A. A.

    2016-04-01

    We investigated numerically the specific spatially localized intense optical structure, a photonic nanojet (PNJ), formed in the near-field scattering of optical radiation at phase diffraction gratings. The finite-difference time-domain technique was employed to study the PNJ key parameters (length, width, focal distance, and intensity) produced by diffraction gratings with the saw-tooth, rectangle, and hemispheric line profiles. Our analysis showed that each type of diffraction gratings produces a photonic jet with unique characteristics. Based on the numerical calculations, we demonstrate that the PNJ could be manipulated in a wide range through the variation of period, duty cycle, and shape of diffraction grating rulings.

  9. Fermilab muon g-2 experiment

    NASA Astrophysics Data System (ADS)

    Gorringe, Tim

    2018-05-01

    The Fermilab muon g-2 experiment will measure the muon anomalous magnetic moment aμ to 140 ppb - a four-fold improvement over the earlier Brookhaven experiment. The measurement of aμ is well known as a unique test of the standard model with broad sensitivity to new interactions, particles and phenomena. The goal of 140 ppb is commensurate with ongoing improvements in the SM prediction of the anomalous moment and addresses the longstanding 3.5σ discrepancy between the BNL result and the SM prediction. In this article I discuss the physics motivation and experimental technique for measuring aμ, and the current status and the future work for the project.

  10. Experimental light scattering by ultrasonically controlled small particles - Implications for Planetary Science

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.; Penttilä, A.; Maconi, G.; Kassamakov, I.; Markkanen, J.; Martikainen, J.; Väisänen, T.; Helander, P.; Puranen, T.; Salmi, A.; Hæggström, E.; Muinonen, K.

    2017-09-01

    We present the results obtained with our newly developed 3D scatterometer - a setup for precise multi-angular measurements of light scattered by mm- to µm-sized samples held in place by sound. These measurements are cross-validated against the modeled light-scattering characteristics of the sample, i.e., the intensity and the degree of linear polarization of the reflected light, calculated with state-of-the-art electromagnetic techniques. We demonstrate a unique non-destructive approach to derive the optical properties of small grain samples which facilitates research on highly valuable planetary materials, such as samples returned from space missions or rare meteorites.

  11. Facilitating and securing offline e-medicine service through image steganography

    PubMed Central

    Islam, M. Mahfuzul

    2014-01-01

    E-medicine is a process to provide health care services to people using the Internet or any networking technology. In this Letter, a new idea is proposed to model the physical structure of the e-medicine system to better provide offline health care services. Smart cards are used to authenticate each user individually. A unique technique is also suggested to verify the card owner's identity and to embed secret data in the card while providing patients' reports either at booths or at the e-medicine server system. The simulation results of the card authentication and embedding procedures justify the proposed implementation. PMID:26609382

  12. A reliable facility location design model with site-dependent disruption in the imperfect information context

    PubMed Central

    Yun, Lifen; Wang, Xifu; Fan, Hongqiang; Li, Xiaopeng

    2017-01-01

    This paper proposes a reliable facility location design model under imperfect information with site-dependent disruptions; i.e., each facility is subject to a unique disruption probability that varies across the space. In the imperfect information contexts, customers adopt a realistic “trial-and-error” strategy to visit facilities; i.e., they visit a number of pre-assigned facilities sequentially until they arrive at the first operational facility or give up looking for the service. This proposed model aims to balance initial facility investment and expected long-term operational cost by finding the optimal facility locations. A nonlinear integer programming model is proposed to describe this problem. We apply a linearization technique to reduce the difficulty of solving the proposed model. A number of problem instances are studied to illustrate the performance of the proposed model. The results indicate that our proposed model can reveal a number of interesting insights into the facility location design with site-dependent disruptions, including the benefit of backup facilities and system robustness against variation of the loss-of-service penalty. PMID:28486564

  13. Animal models for rotator cuff repair.

    PubMed

    Lebaschi, Amir; Deng, Xiang-Hua; Zong, Jianchun; Cong, Guang-Ting; Carballo, Camila B; Album, Zoe M; Camp, Christopher; Rodeo, Scott A

    2016-11-01

    Rotator cuff (RC) injuries represent a significant source of pain, functional impairment, and morbidity. The large disease burden of RC pathologies necessitates rapid development of research methodologies to treat these conditions. Given their ability to model anatomic, biomechanical, cellular, and molecular aspects of the human RC, animal models have played an indispensable role in reducing injury burden and advancing this field of research for many years. The development of animal models in the musculoskeletal (MSK) research arena is uniquely different from that in other fields in that the similarity of macrostructures and functions is as critical to replicate as cellular and molecular functions. Traditionally, larger animals have been used because of their anatomic similarity to humans and the ease of carrying out realistic surgical procedures. However, refinement of current molecular methods, introduction of novel research tools, and advancements in microsurgical techniques have increased the applicability of small animal models in MSK research. In this paper, we review RC animal models and emphasize a murine model that may serve as a valuable instrument for future RC tendon repair investigations. © 2016 New York Academy of Sciences.

  14. Inverse estimation of parameters for an estuarine eutrophication model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, J.; Kuo, A.Y.

    1996-11-01

    An inverse model of an estuarine eutrophication model with eight state variables is developed. It provides a framework to estimate parameter values of the eutrophication model by assimilating concentration data for these state variables. The inverse model, which uses the variational technique in conjunction with a vertical two-dimensional eutrophication model, is general enough to be applicable to aid model calibration. The formulation is illustrated by conducting a series of numerical experiments for the tidal Rappahannock River, a western shore tributary of the Chesapeake Bay. Numerical experiments with short-period model simulations using different hypothetical data sets, and with long-period model simulations using limited hypothetical data sets, demonstrated that the inverse model can be used satisfactorily to estimate parameter values of the eutrophication model. The experiments also showed that the inverse model is useful for addressing some important questions, such as the uniqueness of the parameter estimation and the data requirements for model calibration. Because of the complexity of the eutrophication system, degradation of the speed of convergence may occur. Two major factors that cause this degradation are cross effects among parameters and the multiple scales involved in the parameter system.

  15. Counterconformity: An Attribution Model of Adolescents' Uniqueness-Seeking Behaviors in Dressing

    ERIC Educational Resources Information Center

    Ling, I-Ling

    2008-01-01

    This article explores how an attribution model will illustrate uniqueness-seeking behavior in dressing in the Taiwanese adolescent subculture. The study employed 443 senior high school students. Results show that the tendency of uniqueness-seeking behavior in dressing is moderate. However, using cluster analysis to segment the counterconformity…

  16. ITRF2014 Evaluation with ILRS Data and Products

    NASA Astrophysics Data System (ADS)

    Pavlis, E. C.; Luceri, V.; Kuzmicz-Cieslak, M.; König, D.; Bianco, G.

    2015-12-01

    The development and release of the new realization of the International Terrestrial Reference Frame—ITRF2014 requires elaborate testing to ensure the quality of the final product. The evaluation effort ensures that the ITRF is of the indicated quality by its error estimates and the combination has not compromised the contributing techniques' input. The International Laser Ranging Service (ILRS) contributes unique information that only Satellite Laser Ranging—SLR is sensitive to: the definition of the origin, and in equal parts with VLBI, the scale of the model. The ILRS analysts adopted a revision of the internal standards and procedures in developing our contribution to ITRF2014 from our eight Analysis Centers. Anticipating the release of ITRF2014 we worked on designing and executing tests using data and products unique to ILRS. In addition to the data contributed to ITRF2014, ILRS has several other targets, in lower and higher orbits, the SLR tracking data of which are used as independent data for the evaluation process. Since SLR data are primarily sensitive to the origin and scale definition of the TRF model, these model attributes are the best to be validated using SLR data. LAGEOS and ETALON data collected outside the span of data used in ITRF2014 can also evaluate the quality of the estimated velocity vectors. The use of independent SLR data evaluates the model throughout the period that such data are available. SLR data from low altitude missions can validate the performance of the model from the late '70s all the way to present (using e.g. STARLETTE and LARES data). This presentation will give an overview of the new model's evaluation using exclusively ILRS tracking data and other ILRS products.

  17. Multiscale Characterization of the Probability Density Functions of Velocity and Temperature Increment Fields

    NASA Astrophysics Data System (ADS)

    DeMarco, Adam Ward

    The turbulent motions within the atmospheric boundary layer exist over a wide range of spatial and temporal scales and are very difficult to characterize. Thus, to explore the behavior of such complex flow environments, it is customary to examine their properties from a statistical perspective. Utilizing the probability density functions of velocity and temperature increments, δu and δT, respectively, this work investigates their multiscale behavior to uncover unique traits that have yet to be thoroughly studied. Utilizing diverse datasets, including idealized wind tunnel experiments, atmospheric turbulence field measurements, multi-year ABL tower observations, and mesoscale model simulations, this study reveals remarkable similarities (and some differences) between the small and larger scale components of the increment fields' probability density functions. This comprehensive analysis also utilizes a set of statistical distributions to showcase their ability to capture features of the velocity and temperature increments' probability density functions (pdfs) across multiscale atmospheric motions. An approach is proposed for estimating these pdfs utilizing the maximum likelihood estimation (MLE) technique, which has not previously been applied to atmospheric data of this kind. Using this technique, we reveal the ability to estimate higher-order moments accurately with a limited sample size, which has been a persistent concern for atmospheric turbulence research. With the use of robust goodness-of-fit (GoF) metrics, we quantitatively assess the accuracy of the distributions against the diverse dataset. Through this analysis, it is shown that the normal inverse Gaussian (NIG) distribution is a prime candidate for estimating the increment pdf fields. Therefore, using the NIG model and its parameters, we display the variations in the increments over a range of scales, revealing unique scale-dependent qualities under various stability and flow conditions.
This novel approach provides a method of characterizing increment fields with the sole use of four pdf parameters. We also investigate the capability of current state-of-the-art mesoscale atmospheric models to predict these features and highlight their potential for future model development. With the knowledge gained in this study, a number of applications can benefit from using our methodology, including the wind energy and optical wave propagation fields.
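    The mechanics of building increment fields can be sketched briefly. This is an illustration only, using a synthetic random-walk signal in place of measured velocity: the increment δu(r) = u(x + r) − u(x) is formed at each separation r, and its pdf is then characterized by its moments (or, in the dissertation, by a fitted model such as the NIG).

```python
import numpy as np

# Sketch of increment-field construction (synthetic signal, not ABL data):
# compute delta_u(r) at several separations and summarize each pdf by its
# standard deviation and fourth standardized moment (kurtosis).
rng = np.random.default_rng(1)
u = np.cumsum(rng.normal(size=100_000))      # random-walk stand-in for u(x)

def increment_moments(u, r):
    du = u[r:] - u[:-r]                      # increment field at separation r
    m, s = du.mean(), du.std()
    kurt = np.mean(((du - m) / s)**4)        # 4th standardized moment
    return s, kurt

for r in (1, 10, 100):
    print(r, increment_moments(u, r))
```

    For this Gaussian random walk the increment width grows like sqrt(r) and the kurtosis stays near 3; real turbulence departs from this Gaussian baseline at small scales, which is exactly what the fitted pdf parameters track.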

  18. ASTM clustering for improving coal analysis by near-infrared spectroscopy.

    PubMed

    Andrés, J M; Bona, M T

    2006-11-15

    Multivariate analysis techniques have been applied to near-infrared (NIR) spectra of coals to investigate the relationship between nine coal properties (moisture (%), ash (%), volatile matter (%), fixed carbon (%), heating value (kcal/kg), carbon (%), hydrogen (%), nitrogen (%) and sulphur (%)) and the corresponding predictor variables. In this work, the whole set of coal samples was grouped into six more homogeneous clusters following the ASTM reference method for classification prior to the application of calibration methods to each coal set. The results obtained showed a considerable improvement in the determination error compared with the calibration for the whole sample set. For some groups, the established calibrations approached the quality required by the ASTM/ISO norms for laboratory analysis. To predict property values for a new coal sample, it is necessary to assign that sample to its respective group. Thus, the ability to discriminate and classify coal samples by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS) in the NIR range was also studied by applying Soft Independent Modelling of Class Analogy (SIMCA) and Linear Discriminant Analysis (LDA) techniques. Modelling of the groups by SIMCA led to overlapping models that cannot discriminate for unique classification. On the other hand, the application of Linear Discriminant Analysis improved the classification of the samples, but not enough to be satisfactory for every group considered.

  19. Nondestructive Evaluation of Metal Fatigue Using Nonlinear Acoustics

    NASA Technical Reports Server (NTRS)

    Cantrell, John H., Jr.

    2008-01-01

    Safe-life and damage-tolerant design philosophies of high performance structures have driven the development of various methods to evaluate nondestructively the accumulation of damage in such structures resulting from cyclic loading. Although many techniques have proven useful, none has been able to provide an unambiguous, quantitative assessment of damage accumulation at each stage of fatigue from the virgin state to fracture. A method based on nonlinear acoustics is shown to provide such a means to assess the state of metal fatigue. The salient features are presented of an analytical model of the microelastic-plastic nonlinearities resulting from the interaction of an acoustic wave with fatigue-generated dislocation substructures and cracks that predictably evolve during the metal fatigue process. The interaction is quantified by the material (acoustic) nonlinearity parameter extracted from acoustic harmonic generation measurements. The parameter typically increases monotonically by several hundred percent over the fatigue life of the metal, thus providing a unique measure of the state of fatigue. Application of the model to aluminum alloy 2024-T4, 410Cb stainless steel, and IN100 nickel-base superalloy specimens fatigued using different loading conditions yields good agreement between theory and experiment. Application of the model and measurement technique to the on-site inspection of steam turbine blades is discussed.
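The nonlinearity parameter in harmonic-generation measurements is conventionally estimated from the fundamental and second-harmonic displacement amplitudes via the standard longitudinal-wave relation beta = 8*A2 / (k^2 * x * A1^2). The sketch below uses that textbook relation with purely illustrative (not measured) numbers:

```python
import numpy as np

def beta_from_harmonics(A1, A2, freq, velocity, distance):
    """Acoustic nonlinearity parameter beta = 8*A2 / (k^2 * x * A1^2),
    with wavenumber k = 2*pi*f / c. Amplitudes in metres, freq in Hz,
    velocity in m/s, propagation distance x in m."""
    k = 2 * np.pi * freq / velocity
    return 8 * A2 / (k ** 2 * distance * A1 ** 2)

# Illustrative values: a 5 MHz tone over 20 mm in aluminium (c ~ 6320 m/s)
beta = beta_from_harmonics(A1=1e-9, A2=2e-13, freq=5e6,
                           velocity=6320.0, distance=0.02)
```

Tracking `beta` over cycling is what gives the monotonic fatigue indicator described in the abstract; the absolute numbers here are made up.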

  20. Formal Requirements-Based Programming for Complex Systems

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis

    2005-01-01

    Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission, currently under study and in preliminary formulation at NASA Goddard Space Flight Center.

  1. Mapping the ecological dimensions and potential distributions of endangered relic shrubs in western Ordos biodiversity center

    PubMed Central

    Zhu, Geng-Ping; Li, Hui-Qi; Zhao, Li; Man, Liang; Liu, Qiang

    2016-01-01

    Potential distributions of endemic relic shrubs in western Ordos were poorly mapped, which has hindered the implementation of proper conservation. Here we investigated the applicability of ecological niche modeling for endangered relic shrubs to detect areas of priority for biodiversity conservation and to analyze differences in the ecological niche spaces used by relic shrubs. We applied ordination and niche modeling techniques to assess the main environmental drivers of five endemic relic shrubs in western Ordos, namely, Ammopiptanthus mongolicus, Amygdalus mongolica, Helianthemum songaricum, Potaninia mongolica, and Tetraena mongolica. We calculated niche overlap metrics in gridded environmental spaces and compared geographical projections of ecological niches to determine similarities and differences among the niches occupied by relic shrubs. All studied taxa presented different responses to environmental factors, resulting in a unique combination of niche conditions for each. Precipitation availability and soil quality characteristics play important roles in the distributions of most shrubs. Each relic shrub is constrained by a unique set of environmental conditions; the distribution of one species cannot be inferred from that of another, highlighting the inadequacy of a one-size-fits-all conservation measure. Our stacked habitat suitability maps revealed regions around the Yellow River that are highly suitable for most species and therefore of high conservation value. PMID:27199260
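The niche-overlap computation on gridded suitability surfaces can be illustrated with Schoener's D, a widely used overlap statistic. The abstract does not specify which metric the authors used, so treat this as a generic sketch on toy grids:

```python
import numpy as np

def schoeners_d(s1, s2):
    """Schoener's D overlap between two gridded suitability surfaces:
    D = 1 - 0.5 * sum(|p1 - p2|) after normalising each surface to
    sum to 1. D = 1 means identical niches; D = 0 means no overlap."""
    p1 = np.asarray(s1, float).ravel()
    p2 = np.asarray(s2, float).ravel()
    p1 /= p1.sum()
    p2 /= p2.sum()
    return 1.0 - 0.5 * np.abs(p1 - p2).sum()

# Toy 2x2 suitability grids for two hypothetical shrubs
a = [[0.8, 0.2], [0.1, 0.1]]
b = [[0.7, 0.3], [0.2, 0.0]]
d_ab = schoeners_d(a, b)  # partial overlap
d_aa = schoeners_d(a, a)  # identical niches -> 1.0
```

Computing D pairwise over the five species would give the kind of overlap matrix the study compares across gridded environmental spaces.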

  2. Mapping the ecological dimensions and potential distributions of endangered relic shrubs in western Ordos biodiversity center.

    PubMed

    Zhu, Geng-Ping; Li, Hui-Qi; Zhao, Li; Man, Liang; Liu, Qiang

    2016-05-20

    Potential distributions of endemic relic shrubs in western Ordos were poorly mapped, which has hindered the implementation of proper conservation. Here we investigated the applicability of ecological niche modeling for endangered relic shrubs to detect areas of priority for biodiversity conservation and to analyze differences in the ecological niche spaces used by relic shrubs. We applied ordination and niche modeling techniques to assess the main environmental drivers of five endemic relic shrubs in western Ordos, namely, Ammopiptanthus mongolicus, Amygdalus mongolica, Helianthemum songaricum, Potaninia mongolica, and Tetraena mongolica. We calculated niche overlap metrics in gridded environmental spaces and compared geographical projections of ecological niches to determine similarities and differences among the niches occupied by relic shrubs. All studied taxa presented different responses to environmental factors, resulting in a unique combination of niche conditions for each. Precipitation availability and soil quality characteristics play important roles in the distributions of most shrubs. Each relic shrub is constrained by a unique set of environmental conditions; the distribution of one species cannot be inferred from that of another, highlighting the inadequacy of a one-size-fits-all conservation measure. Our stacked habitat suitability maps revealed regions around the Yellow River that are highly suitable for most species and therefore of high conservation value.

  3. A Parameter Subset Selection Algorithm for Mixed-Effects Models

    DOE PAGES

    Schmidt, Kathleen L.; Smith, Ralph C.

    2016-01-01

    Mixed-effects models are commonly used to statistically model phenomena that include attributes associated with a population or general underlying mechanism as well as effects specific to individuals or components of the general mechanism. This can include individual effects associated with data from multiple experiments. However, the parameterizations used to incorporate the population and individual effects are often unidentifiable in the sense that parameters are not uniquely specified by the data. As a result, the current literature focuses on model selection, by which insensitive parameters are fixed or removed from the model. Model selection methods that employ information criteria are applicable to both linear and nonlinear mixed-effects models, but such techniques are limited in that they are computationally prohibitive for large problems due to the number of possible models that must be tested. To limit the scope of possible models for model selection via information criteria, we introduce a parameter subset selection (PSS) algorithm for mixed-effects models, which orders the parameters by their significance. Finally, we provide examples to verify the effectiveness of the PSS algorithm and to test the performance of mixed-effects model selection that makes use of parameter subset selection.
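One common way to order parameters by significance, in the spirit of (but not identical to) the PSS algorithm above, is column-pivoted orthogonalization of a local sensitivity matrix: at each step, pick the parameter whose sensitivity column carries the most information not already explained by the parameters chosen so far. The matrix below is a made-up toy example:

```python
import numpy as np

def rank_parameters(S):
    """Greedy identifiability ordering of parameter columns of a
    sensitivity matrix S (rows = observations, cols = parameters):
    repeatedly select the column with the largest norm after
    projecting out already-selected columns (column-pivoted
    Gram-Schmidt, akin to a pivoted QR)."""
    S = np.asarray(S, float).copy()
    n = S.shape[1]
    order = []
    for _ in range(n):
        norms = np.linalg.norm(S, axis=0)
        norms[order] = -1.0  # exclude already-selected columns
        j = int(np.argmax(norms))
        order.append(j)
        q = S[:, j] / (np.linalg.norm(S[:, j]) + 1e-30)
        S -= np.outer(q, q @ S)  # remove component along chosen column
    return order

# Toy sensitivity matrix: parameter 1 dominates, and parameter 2 is
# nearly a copy of parameter 0 (so one of them ranks last).
S = np.array([[1.0, 3.0, 1.01],
              [0.5, 2.0, 0.50],
              [0.2, 1.0, 0.21]])
order = rank_parameters(S)
```

Parameters at the tail of `order` are the natural candidates to fix or remove before running model selection via information criteria.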

  4. Virtual reality neurosurgery: a simulator blueprint.

    PubMed

    Spicer, Mark A; van Velsen, Martin; Caffrey, John P; Apuzzo, Michael L J

    2004-04-01

    This article details preliminary studies undertaken to integrate the most relevant advancements across multiple disciplines in an effort to construct a highly realistic neurosurgical simulator based on a distributed computer architecture. Techniques based on modified computational modeling paradigms incorporating finite element analysis are presented, as are current and projected efforts directed toward the implementation of a novel bidirectional haptic device. Patient-specific data derived from noninvasive magnetic resonance imaging sequences are used to construct a computational model of the surgical region of interest. Magnetic resonance images of the brain may be coregistered with those obtained from magnetic resonance angiography, magnetic resonance venography, and diffusion tensor imaging to formulate models of varying anatomic complexity. The majority of the computational burden is encountered in the presimulation reduction of the computational model and allows realization of the required threshold rates for the accurate and realistic representation of real-time visual animations. Intracranial neurosurgical procedures offer an ideal testing site for the development of a totally immersive virtual reality surgical simulator when compared with the simulations required in other surgical subspecialties. The material properties of the brain as well as the typically small volumes of tissue exposed in the surgical field, coupled with techniques and strategies to minimize computational demands, provide unique opportunities for the development of such a simulator. Incorporation of real-time haptic and visual feedback is approached here and likely will be accomplished soon.

  5. Introducing quality improvement methods into local public health departments: structured evaluation of a statewide pilot project.

    PubMed

    Riley, William; Parsons, Helen; McCoy, Kim; Burns, Debra; Anderson, Donna; Lee, Suhna; Sainfort, François

    2009-10-01

    To test the feasibility and assess the preliminary impact of a unique statewide quality improvement (QI) training program designed for public health departments. One hundred and ninety-five public health employees/managers from 38 local health departments throughout Minnesota were selected to participate in a newly developed QI training program and 65 of those engaged in and completed eight expert-supported QI projects over a period of 10 months from June 2007 through March 2008. As part of the Minnesota Quality Improvement Initiative, a structured distance education QI training program was designed and deployed in a first large-scale pilot. To evaluate the preliminary impact of the program, a mixed-method evaluation design was used based on four dimensions: learner reaction, knowledge, intention to apply, and preliminary outcomes. Subjective ratings of three dimensions of training quality were collected from participants after each of the scheduled learning sessions. Pre- and post-QI project surveys were administered to collect participant reactions, knowledge, future intention to apply learning, and perceived outcomes. Monthly and final QI project reports were collected to further inform success and preliminary outcomes of the projects. The participants reported (1) high levels of satisfaction with the training sessions, (2) increased perception of the relevance of the QI techniques, (3) increased perceived knowledge of all specific QI methods and techniques, (4) increased confidence in applying QI techniques on future projects, (5) increased intention to apply techniques on future QI projects, and (6) high perceived success of, and satisfaction with, the projects. Finally, preliminary outcomes data show moderate to large improvements in quality and/or efficiency for six out of eight projects. 
QI methods and techniques can be successfully implemented in local public health agencies on a statewide basis using the collaborative model through distance training and expert facilitation. This unique training can improve both core and support processes and lead to favorable staff reactions, increased knowledge, and improved health outcomes. The program can be further improved and deployed and holds great promise to facilitate the successful dissemination of proven QI methods throughout local public health departments.

  6. Regional scale hydrology with a new land surface processes model

    NASA Technical Reports Server (NTRS)

    Laymon, Charles; Crosson, William

    1995-01-01

    Through the CaPE Hydrometeorology Project, we have developed an understanding of some of the unique data quality issues involved in assimilating data of disparate types for regional-scale hydrologic modeling within a GIS framework. Among others, the issues addressed here include the development of adequate validation of the surface water budget, implementation of the STATSGO soil data set, and implementation of a remote sensing-derived landcover data set to account for surface heterogeneity. A model of land surface processes has been developed and used in studies of the sensitivity of surface fluxes and runoff to soil and landcover characterization. Results of these experiments have raised many questions about how to treat the scale-dependence of land surface-atmosphere interactions on spatial and temporal variability. In light of these questions, additional modifications are being considered for the Marshall Land Surface Processes Model. It is anticipated that these techniques can be tested and applied in conjunction with GCIP activities over regional scales.

  7. Merging spatially variant physical process models under an optimized systems dynamics framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cain, William O.; Lowry, Thomas Stephen; Pierce, Suzanne A.

    The complexity of water resource issues, their interconnectedness with other systems, and the involvement of competing stakeholders often overwhelm decision-makers and inhibit the creation of clear management strategies. While a range of modeling tools and procedures exist to address these problems, they tend to be case specific and generally emphasize either a quantitative and overly analytic approach or present a qualitative dialogue-based approach lacking the ability to fully explore consequences of different policy decisions. The integration of these two approaches is needed to drive toward final decisions and engender effective outcomes. Given these limitations, the Computer Assisted Dispute Resolution system (CADRe) was developed to aid in stakeholder-inclusive resource planning. This modeling and negotiation system uniquely addresses resource concerns by developing a spatially varying system dynamics model as well as innovative global optimization search techniques to maximize outcomes from participatory dialogues. Ultimately, the core system architecture of CADRe also serves as the cornerstone upon which key scientific innovation and challenges can be addressed.

  8. Induced Pluripotent Stem Cells for Disease Modeling and Drug Discovery in Neurodegenerative Diseases.

    PubMed

    Cao, Lei; Tan, Lan; Jiang, Teng; Zhu, Xi-Chen; Yu, Jin-Tai

    2015-08-01

    Although most neurodegenerative diseases have been closely related to aberrant accumulation of aggregation-prone proteins in neurons, understanding their pathogenesis remains incomplete, and there is no treatment to delay the onset or slow the progression of many neurodegenerative diseases. The availability of induced pluripotent stem cells (iPSCs) in recapitulating the phenotypes of several late-onset neurodegenerative diseases marks the new era in in vitro modeling. The iPSC collection represents a unique and well-characterized resource to elucidate disease mechanisms in these diseases and provides a novel human stem cell platform for screening new candidate therapeutics. Modeling human diseases using iPSCs has created novel opportunities for both mechanistic studies as well as for the discovery of new disease therapies. In this review, we introduce iPSC-based disease modeling in neurodegenerative diseases, such as Alzheimer's disease, Parkinson's disease, Huntington's disease, and amyotrophic lateral sclerosis. In addition, we discuss the implementation of iPSCs in drug discovery associated with some new techniques.

  9. Data-driven modeling of background and mine-related acidity and metals in river basins

    USGS Publications Warehouse

    Friedel, Michael J

    2013-01-01

    A novel application of self-organizing map (SOM) and multivariate statistical techniques is used to model the nonlinear interaction among basin mineral-resources, mining activity, and surface-water quality. First, the SOM is trained using sparse measurements from 228 sample sites in the Animas River Basin, Colorado. The model performance is validated by comparing stochastic predictions of basin-alteration assemblages and mining activity at 104 independent sites. The SOM correctly predicts (>98%) the predominant type of basin hydrothermal alteration and presence (or absence) of mining activity. Second, application of the Davies–Bouldin criteria to k-means clustering of SOM neurons identified ten unique environmental groups. Median statistics of these groups define a nonlinear water-quality response along the spatiotemporal hydrothermal alteration-mining gradient. These results reveal that it is possible to differentiate among the continuum between inputs of background and mine-related acidity and metals, and it provides a basis for future research and empirical model development.
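The second stage described above, choosing a clustering by the Davies-Bouldin criterion, can be sketched in a few lines. This minimal NumPy version runs plain k-means (not a SOM) on synthetic two-dimensional data, so it is only an illustration of the cluster-selection step, not the paper's pipeline:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with deterministic farthest-point ("max-min")
    seeding, followed by Lloyd updates."""
    C = [X[0]]
    for _ in range(k - 1):
        d = ((X[:, None] - np.array(C)[None]) ** 2).sum(-1).min(axis=1)
        C.append(X[int(np.argmax(d))])
    C = np.array(C)
    for _ in range(iters):
        labels = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(axis=1)
        C = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                      else C[j] for j in range(k)])
    return labels, C

def davies_bouldin(X, labels, C):
    """Davies-Bouldin index: mean over clusters of the worst ratio
    (s_i + s_j) / d(c_i, c_j); lower is better."""
    k = len(C)
    s = np.array([np.linalg.norm(X[labels == i] - C[i], axis=1).mean()
                  for i in range(k)])
    return np.mean([max((s[i] + s[j]) / np.linalg.norm(C[i] - C[j])
                        for j in range(k) if j != i) for i in range(k)])

# Three well-separated synthetic "environmental groups"
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2))
               for m in ([0, 0], [5, 0], [0, 5])])
scores = {k: davies_bouldin(X, *kmeans(X, k)) for k in (2, 3, 4)}
best_k = min(scores, key=scores.get)  # lowest index wins
```

In the study the same criterion was applied to the trained SOM neurons rather than to raw samples, but the selection logic is the same: the k minimizing the index defines the environmental groups.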

  10. TOPICAL REVIEW: Modelling the interaction of electromagnetic fields (10 MHz-10 GHz) with the human body: methods and applications

    NASA Astrophysics Data System (ADS)

    Hand, J. W.

    2008-08-01

    Numerical modelling of the interaction between electromagnetic fields (EMFs) and the dielectrically inhomogeneous human body provides a unique way of assessing the resulting spatial distributions of internal electric fields, currents and rate of energy deposition. Knowledge of these parameters is of importance in understanding such interactions and is a prerequisite when assessing EMF exposure or when assessing or optimizing therapeutic or diagnostic medical applications that employ EMFs. In this review, computational methods that provide this information through full time-dependent solutions of Maxwell's equations are summarized briefly. This is followed by an overview of safety- and medical-related applications where modelling has contributed significantly to development and understanding of the techniques involved. In particular, applications in the areas of mobile communications, magnetic resonance imaging, hyperthermal therapy and microwave radiometry are highlighted. Finally, examples of modelling the potentially new medical applications of recent technologies such as ultra-wideband microwaves are discussed.
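The full time-dependent Maxwell solvers summarized in the review are commonly finite-difference time-domain (FDTD) schemes. As a toy illustration of the Yee update (not the dosimetry codes the review surveys), here is a 1D vacuum FDTD loop in normalised units at the Courant limit, with a hard sinusoidal source:

```python
import numpy as np

def fdtd_1d(nx=200, steps=400):
    """Minimal 1D Yee-scheme FDTD for an Ez/Hy pair in vacuum,
    normalised units, Courant number 1, hard source at the left edge."""
    Ez = np.zeros(nx)
    Hy = np.zeros(nx - 1)
    for t in range(steps):
        Hy += np.diff(Ez)                     # H update from curl of E
        Ez[1:-1] += np.diff(Hy)               # E update from curl of H
        Ez[0] = np.sin(2 * np.pi * 0.05 * t)  # hard sinusoidal source
    return Ez

field = fdtd_1d()
```

Realistic body models extend this to 3D with voxelized, dispersive tissue properties; this sketch only shows the leapfrog structure of the update.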

  11. On the uniqueness of measuring elastoplastic properties from indentation: The indistinguishable mystical materials

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Ogasawara, Nagahisa; Zhao, Manhong; Chiba, Norimasa

    2007-08-01

    Indentation is widely used to extract material elastoplastic properties from the measured force-displacement curves. One of the most well-established indentation techniques utilizes dual (or plural) sharp indenters (which have different apex angles) to deduce key parameters such as the elastic modulus, yield stress, and work-hardening exponent for materials that obey the power-law constitutive relationship. However, the uniqueness of such analysis is not yet systematically studied or challenged. Here we show the existence of "mystical materials", which have distinct elastoplastic properties yet they yield almost identical indentation behaviors, even when the indenter angle is varied in a large range. These mystical materials are, therefore, indistinguishable by many existing indentation analyses unless extreme (and often impractical) indenter angles are used. Explicit procedures of deriving these mystical materials are established, and the general characteristics of the mystical materials are discussed. In many cases, for a given indenter angle range, a material would have infinite numbers of mystical siblings, and the existence maps of the mystical materials are also obtained. Furthermore, we propose two alternative techniques to effectively distinguish these mystical materials. The study in this paper addresses the important question of the uniqueness of indentation test, as well as providing useful guidelines to properly use the indentation technique to measure material elastoplastic properties.

  12. The Pursuit of a Scalable Nanofabrication Platform for Use in Material and Life Science Applications

    PubMed Central

    GRATTON, STEPHANIE E. A.; WILLIAMS, STUART S.; NAPIER, MARY E.; POHLHAUS, PATRICK D.; ZHOU, ZHILIAN; WILES, KENTON B.; MAYNOR, BENJAMIN W.; SHEN, CLIFTON; OLAFSEN, TOVE; SAMULSKI, EDWARD T.; DESIMONE, JOSEPH M.

    2008-01-01

    CONSPECTUS In this Account, we describe the use of perfluoropolyether (PFPE)-based materials that are able to accurately mold and replicate micro- and nanosized features using traditional techniques such as embossing as well as new techniques that we developed to exploit the exceptional surface characteristics of fluorinated substrates. Because of the unique partial wetting and nonwetting characteristics of PFPEs, we were able to go beyond the usual molding and imprint lithography approaches and have created a technique called PRINT (Particle [or Pattern] Replication In Nonwetting Templates). PRINT is a distinctive “top-down” fabrication technique capable of generating isolated particles, arrays of particles, and arrays of patterned features for a plethora of applications in both nanomedicine and materials science. A particular strength of the PRINT technology is the high-resolution molding of well-defined particles with precise control over size, shape, deformability, and surface chemistry. The level of replication obtained showcases some of the unique characteristics of PFPE molding materials. In particular, these materials arise from very low surface energy precursors with positive spreading coefficients, can be photocured at ambient temperature, and are minimally adhesive, nonswelling, and conformable. These distinctive features enable the molding of materials with unique attributes and nanometer resolution that have unprecedented scientific and technological value. For example, in nanomedicine, the use of PFPE materials with the PRINT technique allows us to design particles in which we can tailor key therapeutic parameters such as bioavailability, biodistribution, target-specific cell penetration, and controlled cargo release. Similarly, in materials science, we can fabricate optical films and lens arrays, replicate complex, naturally occurring objects such as adenovirus particles, and create 2D patterned arrays of inorganic oxides. PMID:18720952

  13. A unique method of retention for gum stripper- a case report.

    PubMed

    Doddamani, Santosh S; T S, Priyanka

    2014-12-01

    Successful restoration of partially edentulous situations, especially Kennedy's Class I, II and IV, requires a range of contemporary and conventional treatment approaches. Semi-precision attachments play a major role in the retention of clinically challenging partially edentulous situations. Attachment-retained partial dentures can be a successful treatment option in prosthodontics. This article presents a unique technique for retaining a gum stripper using semi-precision attachments.

  14. Differential Topic Models.

    PubMed

    Chen, Changyou; Buntine, Wray; Ding, Nan; Xie, Lexing; Du, Lan

    2015-02-01

    In applications we may want to compare different document collections: they could have shared content but also different and unique aspects in particular collections. This task has been called comparative text mining or cross-collection modeling. We present a differential topic model for this application that models both topic differences and similarities. For this we use hierarchical Bayesian nonparametric models. Moreover, we found it was important to properly model power-law phenomena in topic-word distributions and thus we used the full Pitman-Yor process rather than just a Dirichlet process. Furthermore, we propose the transformed Pitman-Yor process (TPYP) to incorporate prior knowledge such as vocabulary variations in different collections into the model. To deal with the non-conjugate issue between model prior and likelihood in the TPYP, we propose an efficient sampling algorithm using a data augmentation technique based on the multinomial theorem. Experimental results show the model discovers interesting aspects of different collections. We also show the proposed MCMC-based algorithm achieves a dramatically reduced test perplexity compared to some existing topic models. Finally, we show our model outperforms the state-of-the-art for document classification/ideology prediction on a number of text collections.
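The power-law behaviour that motivates choosing the Pitman-Yor process over a plain Dirichlet process is easy to see in the Chinese-restaurant construction. The generic sketch below (not the authors' TPYP sampler) shows that a nonzero discount produces many more, heavier-tailed clusters than the Dirichlet-process special case (discount = 0):

```python
import random

def pitman_yor_crp(n, discount, strength, seed=0):
    """Sample table sizes from the Pitman-Yor Chinese restaurant
    process. P(existing table k) is proportional to (n_k - discount);
    P(new table) is proportional to (strength + t * discount), where
    t is the current number of tables. discount = 0 recovers the
    Dirichlet process."""
    rng = random.Random(seed)
    tables = []  # customer counts per table
    for _ in range(n):
        t = len(tables)
        weights = [c - discount for c in tables] + [strength + t * discount]
        j = rng.choices(range(t + 1), weights=weights)[0]
        if j == t:
            tables.append(1)
        else:
            tables[j] += 1
    return tables

tables_py = pitman_yor_crp(2000, discount=0.5, strength=1.0)  # power-law tail
tables_dp = pitman_yor_crp(2000, discount=0.0, strength=1.0)  # Dirichlet case
```

With 2000 customers, the Pitman-Yor run typically opens on the order of a hundred tables versus a handful for the Dirichlet case, mirroring the heavy-tailed topic-word distributions the abstract targets.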

  15. Preclinical Magnetic Resonance Imaging and Spectroscopy Studies of Memory, Aging, and Cognitive Decline

    PubMed Central

    Febo, Marcelo; Foster, Thomas C.

    2016-01-01

    Neuroimaging provides for non-invasive evaluation of brain structure and activity and has been employed to suggest possible mechanisms for cognitive aging in humans. However, these imaging procedures have limits in terms of defining cellular and molecular mechanisms. In contrast, investigations of cognitive aging in animal models have mostly utilized techniques that have offered insight on synaptic, cellular, genetic, and epigenetic mechanisms affecting memory. Studies employing magnetic resonance imaging and spectroscopy (MRI and MRS, respectively) in animal models have emerged as an integrative set of techniques bridging localized cellular/molecular phenomenon and broader in vivo neural network alterations. MRI methods are remarkably suited to longitudinal tracking of cognitive function over extended periods permitting examination of the trajectory of structural or activity related changes. Combined with molecular and electrophysiological tools to selectively drive activity within specific brain regions, recent studies have begun to unlock the meaning of fMRI signals in terms of the role of neural plasticity and types of neural activity that generate the signals. The techniques provide a unique opportunity to causally determine how memory-relevant synaptic activity is processed and how memories may be distributed or reconsolidated over time. The present review summarizes research employing animal MRI and MRS in the study of brain function, structure, and biochemistry, with a particular focus on age-related cognitive decline. PMID:27468264

  16. Active Correction of Aperture Discontinuities-Optimized Stroke Minimization. II. Optimization for Future Missions

    NASA Astrophysics Data System (ADS)

    Mazoyer, J.; Pueyo, L.; N'Diaye, M.; Fogarty, K.; Zimmerman, N.; Soummer, R.; Shaklan, S.; Norman, C.

    2018-01-01

    High-contrast imaging and spectroscopy provide unique constraints for exoplanet formation models as well as for planetary atmosphere models. Instrumentation techniques in this field have greatly improved over the last two decades, with the development of stellar coronagraphy, in parallel with specific methods of wavefront sensing and control. Next-generation space- and ground-based telescopes will enable the characterization of cold solar-system-like planets for the first time and maybe even in situ detection of bio-markers. However, the growth of primary mirror diameters, necessary for these detections, comes with an increase of their complexity (segmentation, secondary mirror features). These discontinuities in the aperture can greatly limit the performance of coronagraphic instruments. In this context, we introduced a new technique, Active Correction of Aperture Discontinuities-Optimized Stroke Minimization (ACAD-OSM), to correct for the diffractive effects of aperture discontinuities in the final image plane of a coronagraph, using deformable mirrors. In this paper, we present several tools that can be used to optimize the performance of this technique for its application to future large missions. In particular, we analyzed the influence of the deformable mirror setup (size and separation distance) and found that there is an optimal point for this setup, optimizing the performance of the instrument in contrast and throughput while minimizing the strokes applied to the deformable mirrors. These results will help us design future coronagraphic instruments to obtain the best performance.

  17. Science and Technology of Nanostructured Magnetic Materials

    DTIC Science & Technology

    1990-07-06

    galvano-magnetic and magneto-optic effects that can lead to future storage technologies. Ultrafine particles also show interesting and unique properties...areas including thin films, multilayers, disordered systems, ultrafine particles, intermetallic compounds, permanent magnets and magnetic imaging techniques. The development of new techniques for materials preparation

  18. Accommodation Strategies of College Students with Disabilities

    ERIC Educational Resources Information Center

    Barnard-Brak, Lucy; Lechtenberger, DeAnn; Lan, William Y.

    2010-01-01

    College students with disabilities develop and utilize strategies to facilitate their learning experiences due to their unique academic needs. Using a semi-structured interview technique to collect data and a technique based in grounded theory to analyze this data, the purpose of this study was to discern the meaning of disclosure for college…

  19. Mindfulness for Singers: The Effects of a Targeted Mindfulness Course on Learning Vocal Technique

    ERIC Educational Resources Information Center

    Czajkowski, Anne-Marie L.; Greasley, Alinka E.

    2015-01-01

    This paper reports the development and implementation of a unique Mindfulness for Singers (MfS) course designed to improve singers' vocal technique. Eight university students completed the intervention. Five Facet Mindfulness Questionnaire (FFMQ) scores showed general improvement across all five facets of mindfulness. Qualitative results showed…

  20. Performance Tasks: An Assessment Technique Used at TOSTP.

    ERIC Educational Resources Information Center

    Galway, Janis; Whittington, Andrew

    1984-01-01

    Discusses evolution of task performance assessment technique of the Toronto Office Skills Training Project (TOSTP), a 45-week training program for women from Vietnam, Laos, and Cambodia. Training in office skills, language, and life skills is uniquely integrated in a program designed to enable the women to overcome the obstacles of language…

  1. Techniques for Submitting Successful Proposals for SHAPE America National Conventions

    ERIC Educational Resources Information Center

    Stevens-Smith, Deborah

    2016-01-01

    This article covers the basic components of the submission process before submitting a proposal for the SHAPE America national convention. The article discusses various techniques specific to the process, including the unique discipline areas. Other issues addressed include an understanding of the SHAPE America review process and how it works,…

  2. Nuclear resonance scattering of synchrotron radiation as a unique electronic, structural and thermodynamic probe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alp, E. Ercan; Sturhahn, Wolfgang; Toellner, Thomas S.

    2012-05-09

    Discovery of the Moessbauer effect in a nuclear transition was a remarkable development. It revealed how long-lived nuclear states with relatively low energies in the kiloelectron volt (keV) region can be excited without recoil. This new effect had a unique feature involving a coupling between nuclear physics and solid-state physics, both in terms of physics and sociology. The physics coupling originates from the fact that recoilless emission and absorption (resonance) are only possible if the nuclei are bound in a lattice with quantized vibrational states, and that the finite electron density at the nucleus couples to nuclear degrees of freedom, leading to hyperfine interactions. Thus, Moessbauer spectroscopy allows peering into solid-state effects using unique nuclear transitions. The sociological aspects of this coupling have been equally startling and fruitful. The interaction between the diverse scientific communities who learned to use Moessbauer spectroscopy proved to be very valuable. For example, biologists, geologists, chemists, physicists, materials scientists, and archeologists, all sharing a common spectroscopic technique, also learned to appreciate the beauty and intricacies of each other's fields. As a laboratory-based technique, Moessbauer spectroscopy matured by the end of the 1970s. Further exciting developments took place when accelerator-based techniques were employed, such as synchrotron radiation or 'in-beam' Moessbauer experiments with implanted radioactive ions. More recently, two Moessbauer spectrometers on the surface of Mars have kept the technique vibrant and viable up to the present time. In this chapter, the authors look into some of the unique aspects of nuclear resonance excited with synchrotron radiation as a probe of condensed matter, including magnetism, valence, vibrations, and lattice dynamics, and review the development of nuclear resonance inelastic x-ray scattering (NRIXS) and synchrotron Moessbauer spectroscopy (SMS). To place these two techniques in perspective with respect to other methods that yield related information, they display their version of a frequently used momentum and energy transfer diagram in figure 17.1, in which various probes such as electrons, neutrons, or light (i.e., Brillouin or Raman scattering) and relatively newer forms of X-ray scattering are placed according to the range of energy and momentum transfer taking place during the measurements. Accordingly, NRIXS should be considered a complementary probe to inelastic neutron and X-ray scattering, while SMS occupies a unique space due to its sensitivity to magnetism, structural deformations, valence, and spin states.

  3. An integrative machine learning strategy for improved prediction of essential genes in Escherichia coli metabolism using flux-coupled features.

    PubMed

    Nandi, Sutanu; Subramanian, Abhishek; Sarkar, Ram Rup

    2017-07-25

    Prediction of essential genes helps to identify a minimal set of genes that are absolutely required for the appropriate functioning and survival of a cell. The available machine learning techniques for essential gene prediction have inherent problems, like imbalanced provision of training datasets, biased choice of the best model for a given balanced dataset, choice of a complex machine learning algorithm, and data-based automated selection of biologically relevant features for classification. Here, we propose a simple support vector machine-based learning strategy for the prediction of essential genes in Escherichia coli K-12 MG1655 metabolism that integrates a non-conventional combination of an appropriate sample balanced training set, a unique organism-specific genotype, phenotype attributes that characterize essential genes, and optimal parameters of the learning algorithm to generate the best machine learning model (the model with the highest accuracy among all the models trained for different sample training sets). For the first time, we also introduce flux-coupled metabolic subnetwork-based features for enhancing the classification performance. Our strategy proves to be superior as compared to previous SVM-based strategies in obtaining a biologically relevant classification of genes with high sensitivity and specificity. This methodology was also trained with datasets of other recent supervised classification techniques for essential gene classification and tested using reported test datasets. The testing accuracy was always high as compared to the known techniques, proving that our method outperforms known methods. Observations from our study indicate that essential genes are conserved among homologous bacterial species, demonstrate high codon usage bias, GC content and gene expression, and predominantly possess a tendency to form physiological flux modules in metabolism.
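The sample-balancing idea in this record can be sketched with scikit-learn's `SVC` as a stand-in; the features and values below are synthetic placeholders for attributes like codon usage bias or GC content, not the paper's actual genotype/phenotype data:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Imbalanced toy data: 200 "non-essential" genes vs 20 "essential" ones,
# each described by four numeric features (synthetic stand-ins).
X_neg = rng.normal(0.0, 1.0, size=(200, 4))
X_pos = rng.normal(1.5, 1.0, size=(20, 4))

# Sample-balance the training set: draw as many negatives as positives,
# mirroring the balanced-training-set idea from the abstract.
idx = rng.choice(len(X_neg), size=len(X_pos), replace=False)
X = np.vstack([X_neg[idx], X_pos])
y = np.array([0] * len(X_pos) + [1] * len(X_pos))

# Train an RBF-kernel SVM and check accuracy on the balanced set.
clf = SVC(kernel="rbf", C=1.0).fit(X, y)
acc = clf.score(X, y)
```

In the paper's strategy this training-and-scoring step would be repeated over many balanced samples, keeping the model with the highest accuracy.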

  4. A novel numerical model to predict the morphological behavior of magnetic liquid marbles using coarse grained molecular dynamics concepts

    NASA Astrophysics Data System (ADS)

    Polwaththe-Gallage, Hasitha-Nayanajith; Sauret, Emilie; Nguyen, Nam-Trung; Saha, Suvash C.; Gu, YuanTong

    2018-01-01

    Liquid marbles are liquid droplets coated with superhydrophobic powders whose morphology is governed by the gravitational and surface tension forces. Small liquid marbles take spherical shapes, while larger liquid marbles exhibit puddle shapes due to the dominance of gravitational forces. Liquid marbles coated with hydrophobic magnetic powders respond to an external magnetic field. This unique feature of magnetic liquid marbles is very attractive for digital microfluidics and drug delivery systems. Several experimental studies have reported the behavior of the liquid marbles. However, the complete behavior of liquid marbles under various environmental conditions is yet to be understood. Modeling techniques can be used to predict the properties and the behavior of the liquid marbles effectively and efficiently. A robust liquid marble model will inspire new experiments and provide new insights. This paper presents a novel numerical modeling technique to predict the morphology of magnetic liquid marbles based on coarse grained molecular dynamics concepts. The proposed model is employed to predict the changes in height of a magnetic liquid marble against its width and compared with the experimental data. The model predictions agree well with the experimental findings. Subsequently, the relationship between the morphology of a liquid marble with the properties of the liquid is investigated. Furthermore, the developed model is capable of simulating the reversible process of opening and closing of the magnetic liquid marble under the action of a magnetic force. The scaling analysis shows that the model predictions are consistent with the scaling laws. Finally, the proposed model is used to assess the compressibility of the liquid marbles. The proposed modeling approach has the potential to be a powerful tool to predict the behavior of magnetic liquid marbles serving as bioreactors.

  5. Quality assessment of a new surgical simulator for neuroendoscopic training.

    PubMed

    Filho, Francisco Vaz Guimarães; Coelho, Giselle; Cavalheiro, Sergio; Lyra, Marcos; Zymberg, Samuel T

    2011-04-01

    Ideal surgical training models should be entirely reliable, non-toxic, easy to handle, and, if possible, low cost. All available models have their advantages and disadvantages. The choice of one or another will depend on the type of surgery to be performed. The authors created an anatomical model called the S.I.M.O.N.T. (Sinus Model Oto-Rhino Neuro Trainer) Neurosurgical Endotrainer, which can provide reliable neuroendoscopic training. The aim of the present study was to assess both the quality of the model and the development of surgical skills by trainees. The S.I.M.O.N.T. is built of a synthetic thermoretractable, thermosensitive rubber called Neoderma, which, combined with different polymers, produces more than 30 different formulas. Quality assessment of the model was based on qualitative and quantitative data obtained from training sessions with 9 experienced and 13 inexperienced neurosurgeons. The techniques used for evaluation were face validation, retest and interrater reliability, and construct validation. The experts considered the S.I.M.O.N.T. capable of reproducing surgical situations as if they were real and presenting great similarity to the human brain. Surgical results of serial training showed that the model could be considered precise. Finally, development and improvement of surgical skills by the trainees were observed and considered relevant to further training. It was also observed that the probability of any single error decreased dramatically after each training session, with a mean reduction of 41.65% (range 38.7%-45.6%). Neuroendoscopic training has some specific requirements. A unique set of instruments is required, as is a model that can resemble real-life situations. The S.I.M.O.N.T. is a new alternative model specially designed for this purpose. Validation techniques followed by precision assessments attested to the model's feasibility.

  6. Uncertainty quantification in flux balance analysis of spatially lumped and distributed models of neuron-astrocyte metabolism.

    PubMed

    Calvetti, Daniela; Cheng, Yougan; Somersalo, Erkki

    2016-12-01

    Identifying feasible steady state solutions of a brain energy metabolism model is an inverse problem that allows infinitely many solutions. The characterization of the non-uniqueness, or the uncertainty quantification of the flux balance analysis, is tantamount to identifying the degrees of freedom of the solution. The degrees of freedom of multi-compartment mathematical models for energy metabolism of a neuron-astrocyte complex may offer a key to understand the different ways in which the energetic needs of the brain are met. In this paper we study the uncertainty in the solution, using techniques of linear algebra to identify the degrees of freedom in a lumped model, and Markov chain Monte Carlo methods in its extension to a spatially distributed case. The interpretation of the degrees of freedom in metabolic terms, more specifically, glucose and oxygen partitioning, is then leveraged to derive constraints on the free parameters to guarantee that the model is energetically feasible. We demonstrate how the model can be used to estimate the stoichiometric energy needs of the cells as well as the household energy based on the measured oxidative cerebral metabolic rate of glucose and glutamate cycling. Moreover, our analysis shows that in the lumped model the net direction of lactate dehydrogenase (LDH) in the cells can be deduced from the glucose partitioning between the compartments. The extension of the lumped model to a spatially distributed multi-compartment setting that includes diffusion fluxes from capillary to tissue increases the number of degrees of freedom, requiring the use of statistical sampling techniques. The analysis of the distributed model reveals that some of the conclusions valid for the spatially lumped model, e.g., concerning the LDH activity and glucose partitioning, may no longer hold.
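The "degrees of freedom" identified with linear algebra in this record correspond to the dimension of the null space of the stoichiometric matrix: every feasible steady-state flux vector lives there. A minimal numpy sketch (the toy network below is hypothetical, not the paper's neuron-astrocyte model):

```python
import numpy as np

# Hypothetical stoichiometric matrix S (3 metabolites x 5 reactions).
# Steady state requires S @ v = 0 for the flux vector v.
S = np.array([
    [1.0, -1.0,  0.0,  0.0,  0.0],
    [0.0,  1.0, -1.0, -1.0,  0.0],
    [0.0,  0.0,  0.0,  1.0, -1.0],
])

# Degrees of freedom = number of fluxes minus rank(S).
rank = np.linalg.matrix_rank(S)
dof = S.shape[1] - rank

# Orthonormal null-space basis via the SVD: feasible steady-state
# fluxes are exactly the linear combinations of these columns.
_, _, Vt = np.linalg.svd(S)
null_basis = Vt[rank:].T
residual = np.abs(S @ null_basis).max()
```

When the distributed model adds diffusion fluxes, this null space grows, which is why the paper turns to Markov chain Monte Carlo sampling instead of an explicit basis.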

  7. A split-step method to include electron–electron collisions via Monte Carlo in multiple rate equation simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huthmacher, Klaus; Molberg, Andreas K.; Rethfeld, Bärbel

    2016-10-01

    A split-step numerical method for calculating ultrafast free-electron dynamics in dielectrics is introduced. The two split steps, independently programmed in C++11 and FORTRAN 2003, are interfaced via the presented open source wrapper. The first step solves a deterministic extended multi-rate equation for the ionization, electron–phonon collisions, and single photon absorption by free carriers. The second step is stochastic and models electron–electron collisions using Monte Carlo techniques. This combination of deterministic and stochastic approaches is a unique and efficient method of calculating the nonlinear dynamics of 3D materials exposed to high-intensity ultrashort pulses. Results from simulations solving the proposed model demonstrate how electron–electron scattering relaxes the non-equilibrium electron distribution on the femtosecond time scale.
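The split-step structure, a deterministic rate-equation step alternated with a stochastic collision step, can be illustrated with a deliberately simplified two-level sketch; the rates and probabilities below are invented values, not the paper's physics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-level free-electron population (arbitrary units).
n = np.array([1.0, 0.0])   # low- and high-energy level occupations
dt, steps = 0.01, 100
pump = 0.5                 # deterministic excitation rate (assumed)
p_coll = 0.1               # per-step collision probability (assumed)

for _ in range(steps):
    # Step 1 (deterministic): rate equation moves density upward.
    transfer = pump * n[0] * dt
    n[0] -= transfer
    n[1] += transfer
    # Step 2 (stochastic): Monte Carlo collisions partially relax the
    # distribution toward equal sharing between the levels.
    if rng.random() < p_coll:
        mean = n.mean()
        n = n + 0.5 * (mean - n)

total = n.sum()  # both steps conserve total particle number
```

The real scheme operates on a full 3D electron distribution, but the alternation pattern, and the fact that each half-step conserves particle number, is the same.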

  8. Light manipulation with flat and conformal inhomogeneous dispersive impedance sheets: an efficient FDTD modeling.

    PubMed

    Jafar-Zanjani, Samad; Cheng, Jierong; Mosallaei, Hossein

    2016-04-10

    An efficient auxiliary differential equation method for incorporating 2D inhomogeneous dispersive impedance sheets in the finite-difference time-domain solver is presented. This unique proposed method can successfully solve optical problems of current interest involving 2D sheets. It eliminates the need for ultrafine meshing in the thickness direction, resulting in a significant reduction of computation time and memory requirements. We apply the method to characterize a novel broad-beam leaky-wave antenna created by cascading three sinusoidally modulated reactance surfaces, and also to study the effect of curvature on the radiation characteristics of a conformal impedance sheet holographic antenna. Considerable improvement in simulation time with our technique in comparison with the traditional volumetric model is reported. Both applications are of great interest in the field of antennas and 2D sheets.

  9. Genetic architecture of the Delis-Kaplan Executive Function System Trail Making Test: evidence for distinct genetic influences on executive function.

    PubMed

    Vasilopoulos, Terrie; Franz, Carol E; Panizzon, Matthew S; Xian, Hong; Grant, Michael D; Lyons, Michael J; Toomey, Rosemary; Jacobson, Kristen C; Kremen, William S

    2012-03-01

    To examine how genes and environments contribute to relationships among Trail Making Test (TMT) conditions and the extent to which these conditions have unique genetic and environmental influences. Participants included 1,237 middle-aged male twins from the Vietnam Era Twin Study of Aging. The Delis-Kaplan Executive Function System TMT included visual searching, number and letter sequencing, and set-shifting components. Phenotypic correlations among TMT conditions ranged from 0.29 to 0.60, and genes accounted for the majority (58-84%) of each correlation. Overall heritability ranged from 0.34 to 0.62 across conditions. Phenotypic factor analysis suggested a single factor. In contrast, genetic models revealed a single common genetic factor but also unique genetic influences separate from the common factor. Genetic variance (i.e., heritability) of number and letter sequencing was completely explained by the common genetic factor while unique genetic influences separate from the common factor accounted for 57% and 21% of the heritabilities of visual search and set shifting, respectively. After accounting for general cognitive ability, unique genetic influences accounted for 64% and 31% of those heritabilities. A common genetic factor, most likely representing a combination of speed and sequencing, accounted for most of the correlation among TMT 1-4. Distinct genetic factors, however, accounted for a portion of variance in visual scanning and set shifting. Thus, although traditional phenotypic shared variance analysis techniques suggest only one general factor underlying different neuropsychological functions in nonpatient populations, examining the genetic underpinnings of cognitive processes with twin analysis can uncover more complex etiological processes.

  10. Physics Mining of Multi-Source Data Sets

    NASA Technical Reports Server (NTRS)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures than ever before of environmental parameters by fusing synoptic imagery and time-series measurements. These techniques are general and relevant to observational data, including raster, vector, and scalar, and can be applied in all Earth- and environmental science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as artificial neural nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are being extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
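The contrast drawn here with black-box neural nets, producing an analytical model in parametric form whose coefficients can be inspected, can be illustrated by a least-squares fit of an explicit equation. The quadratic law and its coefficients below are invented for the sketch, not MineTool's actual output:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy observational data: y follows a quadratic law plus small noise.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 + 3.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.01, size=x.size)

# Fit the explicit parametric model y ~ c0 + c1*x + c2*x**2 by least
# squares. Unlike a neural net, the result is an analytic equation
# whose coefficients can be read off and physically interpreted.
A = np.vander(x, 3, increasing=True)   # columns: 1, x, x**2
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
c0, c1, c2 = coef
```

The recovered `c0, c1, c2` approximate the generating law (2.0, 3.0, -1.5), which is exactly the "physics-mining" property: the fitted equation, not just its predictions, carries the information.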

  11. Putting 3D modelling and 3D printing into practice: virtual surgery and preoperative planning to reconstruct complex post-traumatic skeletal deformities and defects

    PubMed Central

    Tetsworth, Kevin; Block, Steve; Glatt, Vaida

    2017-01-01

    3D printing technology has revolutionized and gradually transformed manufacturing across a broad spectrum of industries, including healthcare. Nowhere is this more apparent than in orthopaedics with many surgeons already incorporating aspects of 3D modelling and virtual procedures into their routine clinical practice. As a more extreme application, patient-specific 3D printed titanium truss cages represent a novel approach for managing the challenge of segmental bone defects. This review illustrates the potential indications of this innovative technique using 3D printed titanium truss cages in conjunction with the Masquelet technique. These implants are custom designed during a virtual surgical planning session with the combined input of an orthopaedic surgeon, an orthopaedic engineering professional and a biomedical design engineer. The ability to 3D model an identical replica of the original intact bone in a virtual procedure is of vital importance when attempting to precisely reconstruct normal anatomy during the actual procedure. Additionally, other important factors must be considered during the planning procedure, such as the three-dimensional configuration of the implant. Meticulous design is necessary to allow for successful implantation through the planned surgical exposure, while being aware of the constraints imposed by local anatomy and prior implants. This review will attempt to synthesize the current state of the art as well as discuss our personal experience using this promising technique. It will address implant design considerations including the mechanical, anatomical and functional aspects unique to each case. PMID:28220752

  12. Putting 3D modelling and 3D printing into practice: virtual surgery and preoperative planning to reconstruct complex post-traumatic skeletal deformities and defects.

    PubMed

    Tetsworth, Kevin; Block, Steve; Glatt, Vaida

    2017-01-01

    3D printing technology has revolutionized and gradually transformed manufacturing across a broad spectrum of industries, including healthcare. Nowhere is this more apparent than in orthopaedics with many surgeons already incorporating aspects of 3D modelling and virtual procedures into their routine clinical practice. As a more extreme application, patient-specific 3D printed titanium truss cages represent a novel approach for managing the challenge of segmental bone defects. This review illustrates the potential indications of this innovative technique using 3D printed titanium truss cages in conjunction with the Masquelet technique. These implants are custom designed during a virtual surgical planning session with the combined input of an orthopaedic surgeon, an orthopaedic engineering professional and a biomedical design engineer. The ability to 3D model an identical replica of the original intact bone in a virtual procedure is of vital importance when attempting to precisely reconstruct normal anatomy during the actual procedure. Additionally, other important factors must be considered during the planning procedure, such as the three-dimensional configuration of the implant. Meticulous design is necessary to allow for successful implantation through the planned surgical exposure, while being aware of the constraints imposed by local anatomy and prior implants. This review will attempt to synthesize the current state of the art as well as discuss our personal experience using this promising technique. It will address implant design considerations including the mechanical, anatomical and functional aspects unique to each case. © The Authors, published by EDP Sciences, 2017.

  13. Unique Outcomes in the Narratives of Young Adults Who Experienced Dating Violence as Adolescents.

    PubMed

    Draucker, Claire Burke; Smith, Carolyn; Mazurczyk, Jill; Thomas, Destini; Ramirez, Patricia; McNealy, Kim; Thomas, Jade; Martsolf, Donna S

    2016-01-01

    Narrative therapy, an approach based on the reauthoring of life narratives, may be a useful psychotherapeutic strategy for youth who have experienced dating violence. A cornerstone of narrative therapy is the concept of unique outcomes, which are moments that stand in contrast to a client's otherwise problem-saturated narratives. The purpose of this study was to identify and categorize unique outcomes embedded in narratives about adolescent dating violence. Text units representing unique outcomes were extracted from transcripts of interviews with 88 young adults who had experienced dating violence and were categorized using standard content analytic techniques. Six categories of unique outcome stories were identified: facing-facts stories, standing-up-for-myself stories, cutting-it-off stories, cutting-'em-loose stories, getting-back-on-track stories, and changing-it-up stories. This typology of unique outcomes can inform clinicians who work with clients who have a history of adolescent dating violence. © The Author(s) 2015.

  14. LISA Framework for Enhancing Gravitational Wave Signal Extraction Techniques

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Thirumalainambi, Rajkumar

    2006-01-01

    This paper describes the development of a Framework for benchmarking and comparing signal-extraction and noise-interference-removal methods that are applicable to interferometric Gravitational Wave detector systems. The primary use is towards comparing signal and noise extraction techniques at LISA frequencies from multiple (possibly confused) gravitational wave sources. The Framework includes extensive hybrid learning/classification algorithms, as well as post-processing regularization methods, and is based on a unique plug-and-play (component) architecture. Published methods for signal extraction and interference removal at LISA frequencies are being encoded, as well as multiple source noise models, so that the stiffness of GW Sensitivity Space can be explored under each combination of methods. Furthermore, synthetic datasets and source models can be created and imported into the Framework, and specific degraded numerical experiments can be run to test the flexibility of the analysis methods. The Framework also supports use of full current LISA testbeds, synthetic data systems, and simulators already in existence through plug-ins and wrappers, thus preserving those legacy codes and systems intact. Because of the component-based architecture, all selected procedures can be registered or de-registered at run-time, and are completely reusable, reconfigurable, and modular.

  15. Hoshin Kanri: a technique for strategic quality management.

    PubMed

    Tennant, C; Roberts, P A

    2000-01-01

    This paper describes a technique for Strategic Quality Management (SQM), known as Hoshin Kanri, which has been operated as a management system in many Japanese companies since the 1960s. It represents a core aspect of Japanese companies' management systems and is stated as: the means by which the overall control system and Total Quality Management (TQM) are deployed. Hoshin Kanri is not particularly unique in its concept of establishing and tracking individual goals and objectives, but the manner in which the objectives and the means to achieve them are developed and deployed is. The problem with applying the concept of SQM using Hoshin Kanri is that it tends to challenge the traditional authoritarian strategic planning models which have become the paradigms of modern business. Yet Hoshin Kanri provides an appropriate tool for declaration of the strategic vision for the business while integrating goals and targets in a single holistic model. There have been various adaptations of Hoshin Kanri to align the technique to Western thinking and management approaches, yet outside Japan its significance has gone largely unreported. It is proposed that Hoshin Kanri is an effective methodology for SQM, which has a number of benefits over the more conventional planning techniques. The benefits of Hoshin Kanri as a tool for SQM compared to conventional planning systems include: integration of strategic objectives with tactical daily management, the application of the plan-do-check-act cycle to business process management, parallel planning and execution methodology, a company-wide approach, improvements in communication, increased consensus and buy-in to goal setting, and cross-functional management integration.

  16. Compact Microscope Imaging System With Intelligent Controls Improved

    NASA Technical Reports Server (NTRS)

    McDowell, Mark

    2004-01-01

    The Compact Microscope Imaging System (CMIS) with intelligent controls is a diagnostic microscope analysis tool with intelligent controls for use in space, industrial, medical, and security applications. This compact miniature microscope, which can perform tasks usually reserved for conventional microscopes, has unique advantages in the fields of microscopy, biomedical research, inline process inspection, and space science. Its unique approach integrates a machine vision technique with an instrumentation and control technique that provides intelligence via the use of adaptive neural networks. The CMIS system was developed at the NASA Glenn Research Center specifically for interface detection used for colloid hard spheres experiments; biological cell detection for patch clamping, cell movement, and tracking; and detection of anode and cathode defects for laboratory samples using microscope technology.

  17. Design and Characterization of Next-Generation Micromirrors Fabricated in a Four-Level, Planarized Surface-Micromachined Polycrystalline Silicon Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalicek, M.A.; Comtois, J.H.; Barron, C.C.

    This paper describes the design and characterization of several types of micromirror devices, including process capabilities, device modeling, and test data resulting in deflection versus applied potential curves. These micromirror devices are the first to be fabricated in the state-of-the-art four-level planarized polysilicon process available at Sandia National Laboratories, known as the Sandia Ultra-planar Multi-level MEMS Technology (SUMMiT). This enabling process permits the development of micromirror devices with near-ideal characteristics which have previously been unrealizable in standard three-layer polysilicon processes. This paper describes such characteristics as elevated address electrodes, individual address wiring beneath the device, planarized mirror surfaces using Chemical Mechanical Polishing (CMP), unique post-process metallization, and the best active surface area to date. This paper presents the design, fabrication, modeling, and characterization of several variations of Flexure-Beam (FBMD) and Axial-Rotation Micromirror Devices (ARMD). The released devices are first metallized using a standard sputtering technique relying on metallization guards and masks that are fabricated next to the devices. Such guards are shown to enable the sharing of bond pads between numerous arrays of micromirrors in order to maximize the number of on-chip test arrays. The devices are modeled and then empirically characterized using a laser interferometer setup located at the Air Force Institute of Technology (AFIT) at Wright-Patterson AFB in Dayton, Ohio. Unique design considerations for these devices and the process are also discussed.

  18. Differentiation of Crataegus spp. guided by nuclear magnetic resonance spectrometry with chemometric analyses.

    PubMed

    Lund, Jensen A; Brown, Paula N; Shipley, Paul R

    2017-09-01

    For compliance with US Current Good Manufacturing Practice regulations for dietary supplements, manufacturers must provide the identity of source plant material. Despite the popularity of hawthorn as a dietary supplement, relatively little is known about the comparative phytochemistry of different hawthorn species, and in particular North American hawthorns. The combination of NMR spectrometry with chemometric analyses offers an innovative approach to differentiating hawthorn species and exploring their phytochemistry. Two European and two North American species, harvested from a farm trial in late summer 2008, were analyzed by standard 1D ¹H and J-resolved (JRES) experiments. The data were preprocessed and modelled by principal component analysis (PCA). A supervised model was then generated by partial least squares-discriminant analysis (PLS-DA) for classification and evaluated by cross validation. Supervised random forests models were constructed from the dataset to explore the potential of machine learning for identification of unique patterns across species. The 1D ¹H NMR data yielded greater differentiation than the JRES data. The random forests results correlated with the PLS-DA results and outperformed PLS-DA in classification accuracy. In all of these analyses, differentiation of the Crataegus spp. was best achieved by focusing on the NMR spectral region that contains signals unique to plant phenolic compounds. Identification of potentially significant metabolites for differentiation between species was approached using univariate techniques, including significance analysis of microarrays and Kruskal-Wallis tests. Copyright © 2017 Elsevier Ltd. All rights reserved.
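The unsupervised PCA step of such a chemometric pipeline can be sketched with a plain numpy SVD; the "spectra" below are synthetic two-species toys, not Crataegus data, and the supervised PLS-DA stage that would follow is omitted:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "NMR spectra": 2 species x 10 samples each x 50 spectral bins.
# Each species gets a distinct peak region, mimicking the phenolic
# signals that drove differentiation in the study.
spectra = rng.normal(0.0, 0.05, size=(20, 50))
spectra[:10, 10:15] += 1.0   # species A peak region
spectra[10:, 30:35] += 1.0   # species B peak region

# PCA via SVD of the mean-centred data matrix.
Xc = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s               # sample coordinates on principal components

# PC1 should separate the two species: their mean scores sit on
# opposite sides of zero.
pc1_a = scores[:10, 0].mean()
pc1_b = scores[10:, 0].mean()
separated = bool(np.sign(pc1_a) != np.sign(pc1_b))
```

Inspecting the loading vector `Vt[0]` then points back to the spectral bins (here the two peak regions) responsible for the separation, the analogue of focusing on the phenolic region.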

  19. Cancer treatment model with the Caputo-Fabrizio fractional derivative

    NASA Astrophysics Data System (ADS)

    Ali Dokuyucu, Mustafa; Celik, Ercan; Bulut, Hasan; Mehmet Baskonus, Haci

    2018-03-01

    In this article, a model for cancer treatment is examined. The model is first recast with the Caputo-Fabrizio fractional derivative in order to examine the existence of a solution. The uniqueness of the solution is then investigated, and the conditions under which the model admits a unique solution are identified.
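For reference, the Caputo-Fabrizio fractional derivative of order α ∈ (0, 1) in which such models are recast is commonly written as:

```latex
% Caputo-Fabrizio fractional derivative of order 0 < \alpha < 1,
% with a normalization function M(\alpha) satisfying M(0) = M(1) = 1.
{}^{CF}\!D_t^{\alpha} f(t)
  = \frac{M(\alpha)}{1-\alpha}
    \int_{0}^{t} f'(s)\,
    \exp\!\left(-\frac{\alpha\,(t-s)}{1-\alpha}\right) \mathrm{d}s
```

Its distinguishing feature is the non-singular exponential kernel, which replaces the singular power-law kernel of the classical Caputo derivative.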

  20. Metalworking Techniques Unlock a Unique Alloy

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Approached by West Hartford, Connecticut-based Abbott Ball Company, Glenn Research Center agreed to test an intriguing alloy called Nitinol 60 that had been largely unused for a half century. Using powder metallurgy, the partners developed a method for manufacturing and working with the material, which Abbott Ball has now commercialized. Nitinol 60 provides a unique combination of qualities that make it an excellent material for ball bearings, among other applications.

  1. A Unique Method of Retention for Gum Stripper- A Case Report

    PubMed Central

    T.S., Priyanka

    2014-01-01

    Successful restoration of partially edentulous situations, especially Kennedy's Class I, II, and IV, requires a range of contemporary and conventional treatment approaches. Semi-precision attachments play a major role in the retention of clinically challenging partially edentulous situations. Attachment-retained partial dentures can be one of the successful treatment options in prosthodontics. This article presents a unique technique for retaining a gum stripper using semi-precision attachments. PMID:25654046

  2. Imaging laminar structures in the gray matter with diffusion MRI.

    PubMed

    Assaf, Yaniv

    2018-01-05

    The cortical layers define the architecture of the gray matter and its neuroanatomical regions and are essential for brain function. Abnormalities in cortical layer development, growth patterns, organization, or size can affect brain physiology and cognition. Unfortunately, while large population studies are underway that will greatly increase our knowledge about these processes, current non-invasive techniques for characterizing the cortical layers remain inadequate. For decades, high-resolution T1- and T2-weighted Magnetic Resonance Imaging (MRI) has been the method of choice for gray matter and layer characterization. In the past few years, however, diffusion MRI has shown increasing promise for its unique insights into the fine structure of the cortex. Several different methods, including surface analysis, connectivity exploration, and sub-voxel component modeling, are now capable of exploring the diffusion characteristics of the cortex. In this review, we will discuss current advances in the application of diffusion imaging for cortical characterization and its unique features, with a particular emphasis on its spatial resolution, arguably its greatest limitation. In addition, we will explore the relationship between the diffusion MRI signal and the cellular components of the cortex, as visualized by histology. While the obstacles facing the widespread application of cortical diffusion imaging remain daunting, the information it can reveal may prove invaluable. Within the next few years, we predict a surge in the application of this technique and a concomitant expansion of our knowledge of cortical layers. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Functional mesoporous silica nanoparticles for bio-imaging applications.

    PubMed

    Cha, Bong Geun; Kim, Jaeyun

    2018-03-22

    Biomedical investigations using mesoporous silica nanoparticles (MSNs) have received significant attention because of their unique properties including controllable mesoporous structure, high specific surface area, large pore volume, and tunable particle size. These unique features make MSNs suitable for simultaneous diagnosis and therapy with unique advantages to encapsulate and load a variety of therapeutic agents, deliver these agents to the desired location, and release the drugs in a controlled manner. Among various clinical areas, nanomaterials-based bio-imaging techniques have advanced rapidly with the development of diverse functional nanoparticles. Due to the unique features of MSNs, an imaging agent supported by MSNs can be a promising system for developing targeted bio-imaging contrast agents with high structural stability and enhanced functionality that enable imaging of various modalities. Here, we review the recent achievements on the development of functional MSNs for bio-imaging applications, including optical imaging, magnetic resonance imaging (MRI), positron emission tomography (PET), computed tomography (CT), ultrasound imaging, and multimodal imaging for early diagnosis. With further improvement in noninvasive bio-imaging techniques, the MSN-supported imaging agent systems are expected to contribute to clinical applications in the future. This article is categorized under: Diagnostic Tools > In vivo Nanodiagnostics and Imaging Nanotechnology Approaches to Biology > Nanoscale Systems in Biology. © 2018 Wiley Periodicals, Inc.

  4. Experimental Techniques for Thermodynamic Measurements of Ceramics

    NASA Technical Reports Server (NTRS)

    Jacobson, Nathan S.; Putnam, Robert L.; Navrotsky, Alexandra

    1999-01-01

    Experimental techniques for thermodynamic measurements on ceramic materials are reviewed. For total molar quantities, calorimetry is used. Total enthalpies are determined with combustion calorimetry or solution calorimetry. Heat capacities and entropies are determined with drop calorimetry, differential thermal methods, and adiabatic calorimetry. Three major techniques for determining partial molar quantities are discussed: gas equilibration techniques, Knudsen cell methods, and electrochemical techniques. Throughout this report, issues unique to ceramics are emphasized. Ceramic materials encompass a wide range of stabilities, and this must be considered. In general, data at high temperatures are required, and the need for inert container materials presents a particular challenge.
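    The electrochemical technique mentioned above yields partial molar Gibbs energies directly from the measured cell EMF via ΔG = −nFE. A toy calculation with purely hypothetical numbers (the electron count and EMF are assumptions for illustration):

```python
# Gibbs energy of a cell reaction from a measured EMF: deltaG = -n * F * E.
F = 96485.332   # Faraday constant, C/mol
n = 2           # electrons transferred (assumed for this example)
E = 0.95        # measured cell EMF, V (hypothetical)

delta_G = -n * F * E                       # J/mol of reaction
print(round(delta_G / 1000, 1), "kJ/mol")  # -183.3 kJ/mol
```

    The temperature dependence of E then gives the partial molar entropy, which is why EMF cells are attractive for the high-temperature measurements the review emphasizes.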

  5. 3D Printing Based on Cardiac CT Assists Anatomic Visualization Prior to Transcatheter Aortic Valve Replacement

    PubMed Central

    Ripley, Beth; Kelil, Tatiana; Cheezum, Michael K.; Goncalves, Alexandra; Di Carli, Marcelo F.; Rybicki, Frank J.; Steigner, Mike; Mitsouras, Dimitrios; Blankstein, Ron

    2017-01-01

    Background: 3D printing is a promising technique that may have applications in medicine, and there is expanding interest in the use of patient-specific 3D models to guide surgical interventions. Objective: To determine the feasibility of using cardiac CT to print individual models of the aortic root complex for transcatheter aortic valve replacement (TAVR) planning, as well as to determine the ability to predict paravalvular aortic regurgitation (PAR). Methods: This retrospective study included 16 patients (9 with PAR identified on blinded interpretation of post-procedure trans-thoracic echocardiography and 7 age-, sex-, and valve size-matched controls with no PAR). 3D printed models of the aortic root were created from pre-TAVR cardiac computed tomography data. These models were fitted with printed valves, and predictions regarding post-implant PAR were made using a light transmission test. Results: Aortic root 3D models were highly accurate, with excellent agreement between annulus measurements made on 3D models and those made on corresponding 2D data (mean difference of −0.34 mm; 95% limits of agreement: ±1.3 mm). The 3D printed valve models were within 0.1 mm of their designed dimensions. Examination of the fit of valves within patient-specific aortic root models correctly predicted PAR in 6 of 9 patients (6 true positive, 3 false negative) and absence of PAR in 5 of 7 patients (5 true negative, 2 false positive). Conclusions: Pre-TAVR 3D printing based on cardiac CT provides a unique patient-specific method to assess the physical interplay of the aortic root and implanted valves. With additional optimization, 3D models may complement traditional techniques used for predicting which patients are more likely to develop PAR. PMID:26732862
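    The agreement statistics quoted above (mean difference with 95% limits of agreement) are standard Bland-Altman quantities. A minimal sketch on invented annulus measurement pairs, not the study's data:

```python
# Bland-Altman-style agreement between two measurement methods (mm).
# The six measurement pairs below are hypothetical.
import numpy as np

model_mm = np.array([23.1, 25.0, 21.8, 24.4, 22.7, 26.1])  # 3D model
ct_mm    = np.array([23.5, 25.2, 22.1, 24.9, 23.0, 26.4])  # 2D CT data

diff = model_mm - ct_mm
mean_diff = diff.mean()          # bias between the methods
loa = 1.96 * diff.std(ddof=1)    # half-width of the 95% limits of agreement
print(round(mean_diff, 2), round(loa, 2))
```

    The 95% limits of agreement are then mean_diff ± loa, the interval reported in the abstract.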

  6. An Investigation to Advance the Technology Readiness Level of the Centaur Derived On-orbit Propellant Storage and Transfer System

    NASA Astrophysics Data System (ADS)

    Silvernail, Nathan L.

    This research was carried out in collaboration with the United Launch Alliance (ULA), to advance an innovative Centaur-based on-orbit propellant storage and transfer system that takes advantage of rotational settling to simplify Fluid Management (FM), specifically enabling settled fluid transfer between two tanks and settled pressure control. This research consists of two specific objectives: (1) technique and process validation and (2) computational model development. In order to raise the Technology Readiness Level (TRL) of this technology, the corresponding FM techniques and processes must be validated in a series of experimental tests, including laboratory/ground testing, microgravity flight testing, suborbital flight testing, and orbital testing. Researchers from Embry-Riddle Aeronautical University (ERAU) have joined with the Massachusetts Institute of Technology (MIT) Synchronized Position Hold Engage and Reorient Experimental Satellites (SPHERES) team to develop a prototype FM system for operations aboard the International Space Station (ISS). Testing of the integrated system in a representative environment will raise the FM system to TRL 6. The tests will demonstrate the FM system and provide unique data pertaining to the vehicle's rotational dynamics while undergoing fluid transfer operations. These data sets provide insight into the behavior and physical tendencies of the on-orbit refueling system. Furthermore, they provide a baseline for comparison against the data produced by various computational models, thus verifying the accuracy of the model outputs and validating the modeling approach. Once these preliminary models have been validated, the parameters defined by them will provide the basis of development for accurate simulations of full-scale, on-orbit systems.
The completion of this project and the models being developed will accelerate the commercialization of on-orbit propellant storage and transfer technologies as well as all in-space technologies that utilize or will utilize similar FM techniques and processes.

  7. Theoretical and numerical investigations towards a new geoid model for the Mediterranean Sea - The GEOMED2 project

    NASA Astrophysics Data System (ADS)

    Barzaghi, Riccardo; Vergos, Georgios S.; Albertella, Alberta; Carrion, Daniela; Cazzaniga, Noemi; Tziavos, Ilias N.; Grigoriadis, Vassilios N.; Natsiopoulos, Dimitrios A.; Bruinsma, Sean; Bonvalot, Sylvain; Lequentrec-Lalancette, Marie-Françoise; Bonnefond, Pascal; Knudsen, Per; Andersen, Ole; Simav, Mehmet; Yildiz, Hasan; Basic, Tomislav; Gil, Antonio J.

    2016-04-01

    The unique features of the Mediterranean Sea, with its large gravity variations, complex circulation, and geodynamic peculiarities, have long made this semi-enclosed sea a unique geodetic, geodynamic, and oceanographic laboratory. The main scope of the GEOMED2 project is the collection of all available gravity, topography/bathymetry, and satellite altimetry data in order to improve the representation of the marine geoid and to estimate the Mean Dynamic sea surface Topography (MDT) and the circulation with higher accuracy and resolution. Within GEOMED2, the data employed are land and marine gravity data, GOCE/GRACE-based Global Geopotential Models, and a combination, after proper validation, of the MISTRAL, HOMONIM, and SRTM/bathymetry terrain models. In this work we present the results achieved for an inner test region spanning the Adriatic Sea, bounded by 36° < φ < 48° and 10° < λ < 22°. Within this test region, the available terrain/bathymetry models have been evaluated in terms of their contribution to geoid modeling, the processing methodologies have been tested in terms of the geoid accuracy they provide, and some preliminary results on the MDT determination have been compiled. These results will serve as the guide for the Mediterranean-wide marine geoid estimation. The processing methodology was based on the well-known remove-compute-restore method, following both stochastic and spectral approaches. Classic least-squares collocation (LSC) with errors has been employed, along with fast Fourier transform (FFT)-based techniques, the Least-Squares Modification of Stokes' Formula (KTH) method, and windowed LSC. All methods have been evaluated against in-situ collocated GPS/levelling geoid heights, using EGM2008 as a reference, in order to decide which one(s) to use for the basin-wide geoid evaluation.
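    The remove-compute-restore bookkeeping named above can be sketched in a few lines. Everything here is a hypothetical stand-in: the arrays are toy numbers, and the prediction callable stands in for whichever operator (LSC, FFT, KTH) is chosen:

```python
# Remove-compute-restore (RCR) skeleton: subtract global-model and terrain
# signals from the observations, predict the residual geoid, then restore
# both contributions. Units and values below are illustrative only.
import numpy as np

def rcr_geoid(observed_gravity, ggm_gravity, terrain_gravity,
              ggm_geoid, terrain_geoid, predict_residual):
    """All array inputs live on the same points; predict_residual is a
    callable standing in for the LSC/FFT residual-geoid prediction."""
    residual = observed_gravity - ggm_gravity - terrain_gravity  # remove
    residual_geoid = predict_residual(residual)                  # compute
    return ggm_geoid + terrain_geoid + residual_geoid            # restore

out = rcr_geoid(np.array([30.0, 28.0]),      # observed gravity anomalies
                np.array([25.0, 24.0]),      # global-geopotential-model part
                np.array([3.0, 2.5]),        # terrain part
                np.array([40.1, 40.3]),      # GGM geoid heights
                np.array([0.2, 0.1]),        # terrain geoid contribution
                lambda r: 0.01 * r)          # toy prediction operator
print(out)
```

    The point of the scheme is that only the smooth, small residual field has to be gridded or collocated; the long and short wavelengths are carried by the global model and the terrain model respectively.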

  8. A Novel mRNA Level Subtraction Method for Quick Identification of Target-Orientated Uniquely Expressed Genes Between Peanut Immature Pod and Leaf

    PubMed Central

    2010-01-01

    Subtraction techniques have been broadly applied for target gene discovery. However, most current protocols apply relative differential subtraction and yield large mixtures of clones of unique and differentially expressed genes, which makes it more difficult to identify uniquely or target-orientated expressed genes. In this study, we developed a novel method for subtraction at the mRNA level by integrating magnetic particle technology into driver preparation and tester-driver hybridization, facilitating the discovery of uniquely expressed genes between peanut immature pod and leaf through a single round of subtraction. The resulting target clones were further validated through polymerase chain reaction screening using peanut immature pod and leaf cDNA libraries as templates. This study identified several genes expressed uniquely in the immature peanut pod. These target genes can be used for future peanut functional genomics and genetic engineering research. PMID:21406066

  9. Ionospheric tomography using Faraday rotation of Automatic Dependent Surveillance-Broadcast (UHF) signals: Ionospheric Measurement from ADS-B Signals

    NASA Astrophysics Data System (ADS)

    Cushley, Alex Clay

    The proposed launch of a CubeSat carrying the first space-borne ADS-B receiver by RMCC will create a unique opportunity to study the modification of radio waves following propagation through the ionosphere as the signals propagate from the transmitting aircraft to the passive satellite receiver(s). Experimental work is described which successfully demonstrated that ADS-B data can be used to reconstruct two-dimensional electron density maps of the ionosphere using techniques from computerized tomography. Ray-tracing techniques are used to determine the characteristics of individual waves, including the wave path and the state of polarization at the satellite receiver. The modelled Faraday rotation is determined and converted to TEC along the ray-paths. The resulting TEC is used as input for CIT using ART. This study concentrated on meso-scale structures 100–1000 km in horizontal extent. The primary scientific interest of this thesis was to show the feasibility of a new method to image the ionosphere and obtain a better understanding of magneto-ionic wave propagation. Keywords: Automatic Dependent Surveillance-Broadcast (ADS-B), Faraday rotation, electromagnetic (EM) waves, radio frequency (RF) propagation, ionosphere (auroral, irregularities, instruments and techniques), electron density profile, total electron content (TEC), computer ionospheric tomography (CIT), algebraic reconstruction technique (ART).
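    The ART step named above is, at its core, Kaczmarz's method: each ray contributes one linear equation relating voxel electron densities (via path lengths) to the measured TEC. A minimal sketch with an invented two-voxel geometry and noise-free synthetic data:

```python
# Algebraic reconstruction technique (ART / Kaczmarz) on a toy TEC system.
# Rows of A are path lengths of each ray through each voxel (hypothetical).
import numpy as np

def art(A, tec, n_sweeps=50):
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a, t in zip(A, tec):          # project onto each ray's equation
            x += (t - a @ x) / (a @ a) * a
    return x

A = np.array([[1.0, 1.0],    # ray crossing both voxels (unit path lengths)
              [1.0, 0.0]])   # ray crossing only the first voxel
true_density = np.array([2.0, 3.0])
tec = A @ true_density       # synthetic, noise-free TEC observations
recovered = art(A, tec)
print(np.round(recovered, 3))
```

    Real CIT systems are large, sparse, and underdetermined, so regularization and a relaxation factor are added, but the sweep structure is the same.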

  10. A Passive System Reliability Analysis for a Station Blackout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David

    2015-05-03

    The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and the availability of power, the phenomenological foundations on which these systems are built require a novel approach to reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties, and analysis of results.

  11. Orthogonal Electric Field Measurements near the Green Fluorescent Protein Fluorophore through Stark Effect Spectroscopy and pKa Shifts Provide a Unique Benchmark for Electrostatics Models.

    PubMed

    Slocum, Joshua D; First, Jeremy T; Webb, Lauren J

    2017-07-20

    Measurement of the magnitude, direction, and functional importance of electric fields in biomolecules has been a long-standing experimental challenge. pKa shifts of titratable residues have been the most widely implemented measurements of the local electrostatic environment around the labile proton, and experimental data sets of pKa shifts in a variety of systems have been used to test and refine computational prediction capabilities of protein electrostatic fields. A more direct and increasingly popular technique to measure electric fields in proteins is Stark effect spectroscopy, where the change in absorption energy of a chromophore relative to a reference state is related to the change in electric field felt by the chromophore. While there are merits to both of these methods and they are both reporters of local electrostatic environment, they are fundamentally different measurements, and to our knowledge there has been no direct comparison of these two approaches in a single protein. We have recently demonstrated that green fluorescent protein (GFP) is an ideal model system for measuring changes in electric fields in a protein interior caused by amino acid mutations using both electronic and vibrational Stark effect chromophores. Here we report the changes in pKa of the GFP fluorophore in response to the same mutations and show that they are in excellent agreement with Stark effect measurements. This agreement in the results of orthogonal experiments reinforces our confidence in the experimental results of both Stark effect and pKa measurements and provides an excellent target data set to benchmark diverse protein electrostatics calculations. We used this experimental data set to test the pKa prediction ability of the adaptive Poisson-Boltzmann solver (APBS) and found that a simple continuum dielectric model of the GFP interior is insufficient to accurately capture the measured pKa and Stark effect shifts. We discuss some of the limitations of this continuum-based model in this system and offer this experimentally self-consistent data set as a target benchmark for electrostatics models, which could allow for a more rigorous test of pKa prediction techniques due to the unique environment of the water-filled GFP barrel compared to traditional globular proteins.
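    A back-of-envelope link between the two observables compared above (with an illustrative shift, not the paper's data): a pKa shift maps to a change in deprotonation free energy via ΔG = ln(10)·RT·ΔpKa, which is what makes pKa shifts and spectroscopic field measurements comparable energetic probes.

```python
# Free-energy equivalent of a pKa shift: deltaG = ln(10) * R * T * delta_pKa.
import math

R = 8.314462        # gas constant, J/(mol K)
T = 298.15          # temperature, K
delta_pKa = 1.0     # hypothetical mutation-induced pKa shift

delta_G = math.log(10) * R * T * delta_pKa   # J/mol
print(round(delta_G / 1000, 2), "kJ/mol")    # about 5.71 kJ/mol per pKa unit
```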

  12. Old Wine in New Bottles: The Quality of Work Life in Schools and School Districts.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.; Mitchell, Stephen M.

    This essay reviews quality of work life as a management technique and argues that quality-of-work-life programs, conceptualized multidimensionally, offer a unique mechanism for improving working conditions in schools and within districts. A brief analysis of major management ideologies concludes that some techniques advocated under the label of…

  13. Exploring the Nonformal Adult Educator in Twenty-First Century Contexts Using Qualitative Video Data Analysis Techniques

    ERIC Educational Resources Information Center

    Alston, Geleana Drew; Ellis-Hervey, Nina

    2015-01-01

    This study examined how "YouTube" creates a unique, nonformal cyberspace for Black females to vlog about natural hair. Specifically, we utilized qualitative video data analysis techniques to understand how using "YouTube" as a facilitation tool has the ability to collectively capture and maintain an audience of more than a…

  14. Ballet as Somatic Practice: A Case Study Exploring the Integration of Somatic Practices in Ballet Pedagogy

    ERIC Educational Resources Information Center

    Berg, Tanya

    2017-01-01

    This case study explores one teacher's integration of Alexander Technique and the work of neuromuscular retrainer Irene Dowd in ballet pedagogy to establish a somatic approach to teaching, learning, and performing ballet technique. This case study highlights the teacher's unique teaching method called IMAGE TECH for dancers (ITD) and offers…

  15. Teaching Web Search Skills: Techniques and Strategies of Top Trainers

    ERIC Educational Resources Information Center

    Notess, Greg R.

    2006-01-01

    Here is a unique and practical reference for anyone who teaches Web searching. Greg Notess shares his own techniques and strategies along with expert tips and advice from a virtual "who's who" of Web search training: Joe Barker, Paul Barron, Phil Bradley, John Ferguson, Alice Fulbright, Ran Hock, Jeff Humphrey, Diane Kovacs, Gary Price, Danny…

  16. The "Individualised Accounting Questions" Technique: Using Excel to Generate Quantitative Exercises for Large Classes with Unique Individual Answers

    ERIC Educational Resources Information Center

    Nnadi, Matthias; Rosser, Mike

    2014-01-01

    The "individualised accounting questions" (IAQ) technique set out in this paper encourages independent active learning. It enables tutors to set individualised accounting questions and construct an answer grid that can be used for any number of students, with numerical values for each student's answers based on their student enrolment…

  17. Can modeling of HIV treatment processes improve outcomes? Capitalizing on an operations research approach to the global pandemic

    PubMed Central

    Xiong, Wei; Hupert, Nathaniel; Hollingsworth, Eric B; O'Brien, Megan E; Fast, Jessica; Rodriguez, William R

    2008-01-01

    Background: Mathematical modeling has been applied to a range of policy-level decisions on resource allocation for HIV care and treatment. We describe the application of classic operations research (OR) techniques to address logistical and resource management challenges in HIV treatment scale-up activities in resource-limited countries. Methods: We review and categorize several of the major logistical and operational problems encountered over the last decade in the global scale-up of HIV care and antiretroviral treatment for people with AIDS. While there are unique features of HIV care and treatment that pose significant challenges to effective modeling and service improvement, we identify several analogous OR-based solutions that have been developed in the service, industrial, and health sectors. Results: HIV treatment scale-up includes many processes that are amenable to mathematical and simulation modeling, including forecasting future demand for services; locating and sizing facilities for maximal efficiency; and determining optimal staffing levels at clinical centers. Optimization of clinical and logistical processes through modeling may improve outcomes, but successful OR-based interventions will require contextualization of response strategies, including appreciation of both existing health care systems and limitations in local health workforces. Conclusion: The modeling techniques developed in the engineering field of operations research have wide potential application to the variety of logistical problems encountered in HIV treatment scale-up in resource-limited settings. Increasing the number of cross-disciplinary collaborations between engineering and public health will help speed the appropriate development and application of these tools. PMID:18680594
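    Staffing-level determination is one of the classic OR problems named above. As a hypothetical sketch (the arrival rate, visit length, and clinician count are invented), the Erlang C formula from M/M/c queueing theory gives the probability that an arriving patient must wait:

```python
# Erlang C: P(wait) in an M/M/c queue with offered load A = lambda / mu.
from math import factorial

def erlang_c(A, c):
    """Probability an arrival must queue; requires A < c for stability."""
    tail = (A ** c / factorial(c)) * (c / (c - A))
    series = sum(A ** k / factorial(k) for k in range(c))
    return tail / (series + tail)

# Toy clinic: 8 patients/hour, 30-minute average visits -> A = 4 erlangs,
# staffed with 5 clinicians.
p_wait = erlang_c(4.0, 5)
print(round(p_wait, 3))
```

    Sweeping c upward until p_wait drops below a service target is the standard way such a model informs staffing decisions.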

  18. Modeling L2,3-Edge X-ray Absorption Spectroscopy with Real-Time Exact Two-Component Relativistic Time-Dependent Density Functional Theory.

    PubMed

    Kasper, Joseph M; Lestrange, Patrick J; Stetina, Torin F; Li, Xiaosong

    2018-04-10

    X-ray absorption spectroscopy is a powerful technique to probe local electronic and nuclear structure. There has been extensive theoretical work modeling K-edge spectra from first principles. However, modeling L-edge spectra directly with density functional theory poses a unique challenge requiring further study. Spin-orbit coupling must be included in the model, and a noncollinear density functional theory is required. Using the real-time exact two-component method, we are able to variationally include one-electron spin-orbit coupling terms when calculating the absorption spectrum. The abilities of different basis sets and density functionals to model spectra for both closed- and open-shell systems are investigated using SiCl4 and three transition metal complexes: TiCl4, CrO2Cl2, and [FeCl6]3−. Although we are working in the real-time framework, individual molecular orbital transitions can still be recovered by projecting the density onto the ground-state molecular orbital space and separating contributions to the time-evolving dipole moment.

  19. Supervised Gamma Process Poisson Factorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dylan Zachary

    This thesis develops the supervised gamma process Poisson factorization (S-GPPF) framework, a novel supervised topic model for joint modeling of count matrices and document labels. S-GPPF is fully generative and nonparametric: document labels and count matrices are modeled under a unified probabilistic framework, and the number of latent topics is controlled automatically via a gamma process prior. The framework provides for multi-class classification of documents using a generative max-margin classifier. Several recent data augmentation techniques are leveraged to provide for exact inference using a Gibbs sampling scheme. The first portion of this thesis reviews supervised topic modeling and several key mathematical devices used in the formulation of S-GPPF. The thesis then introduces the S-GPPF generative model and derives the conditional posterior distributions of the latent variables for posterior inference via Gibbs sampling. The S-GPPF is shown to exhibit state-of-the-art performance for joint topic modeling and document classification on a dataset of conference abstracts, beating out competing supervised topic models. The unique properties of S-GPPF along with its competitive performance make it a novel contribution to supervised topic modeling.

  20. On the Relations among Regular, Equal Unique Variances, and Image Factor Analysis Models.

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Bentler, Peter M.

    2000-01-01

    Investigated the conditions under which the matrix of factor loadings from the factor analysis model with equal unique variances will give a good approximation to the matrix of factor loadings from the regular factor analysis model. Extends the results to the image factor analysis model. Discusses implications for practice. (SLD)
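    A small numerical illustration (with made-up loadings) of why the equal-unique-variances model discussed above behaves so much like principal components: when Σ = ΛΛᵀ + σ²I, the loadings follow from the leading eigenpairs of Σ, with σ² estimated from the trailing eigenvalues.

```python
# Equal-unique-variances factor analysis via eigendecomposition.
# The one-factor loading vector and sigma^2 below are invented for the demo.
import numpy as np

lam_true = np.array([[0.9], [0.8], [0.7]])   # one factor, three variables
sigma2 = 0.2
Sigma = lam_true @ lam_true.T + sigma2 * np.eye(3)

vals, vecs = np.linalg.eigh(Sigma)
vals, vecs = vals[::-1], vecs[:, ::-1]       # sort descending
sigma2_hat = vals[1:].mean()                 # trailing eigenvalues give sigma^2
loadings = vecs[:, :1] * np.sqrt(vals[0] - sigma2_hat)
print(np.round(np.abs(loadings.ravel()), 3))  # recovers [0.9 0.8 0.7]
```

    With unequal unique variances this shortcut fails, which is exactly where the regular and image factor analysis models begin to diverge from the equal-variances case.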

  1. Four dimensional studies in earth space

    NASA Technical Reports Server (NTRS)

    Mather, R. S.

    1972-01-01

    A system of reference that is directly related to observations is proposed for four-dimensional studies in earth space. The global control network and polar wandering are defined. The determination of variations in the earth's gravitational field with time also forms part of such a system. Techniques are outlined for uniquely defining the motion of the geocenter and the changes in the location of the axis of rotation of an instantaneous earth model, in relation to values at some epoch of reference. The instantaneous system referred to is directly related to a fundamental equation in geodynamics. The reference system defined would provide an unambiguous frame for long-period studies in earth space, provided the scale of the space were specified.

  2. On the Solutions of a 2+1-Dimensional Model for Epitaxial Growth with Axial Symmetry

    NASA Astrophysics Data System (ADS)

    Lu, Xin Yang

    2018-04-01

    In this paper, we study the evolution equation derived by Xu and Xiang (SIAM J Appl Math 69(5):1393-1414, 2009) to describe heteroepitaxial growth in 2+1 dimensions with elastic forces on vicinal surfaces, in the radial case with uniform mobility. This equation is strongly nonlinear and contains two elliptic integrals defined via Cauchy principal values. We first derive a formally equivalent parabolic evolution equation (i.e., full equivalence when sufficient regularity is assumed); the main aim is to prove existence, uniqueness, and regularity of strong solutions. We extensively use techniques from the theory of evolution equations governed by maximal monotone operators in Banach spaces.

  3. Watchdog activity monitor (WAM) for use with high coverage processor self-test

    NASA Technical Reports Server (NTRS)

    Tulpule, Bhalchandra R. (Inventor); Crosset, III, Richard W. (Inventor); Versailles, Richard E. (Inventor)

    1988-01-01

    A high fault coverage, instruction modeled self-test for a signal processor in a user environment is disclosed. The self-test executes a sequence of sub-tests and issues a state transition signal upon the execution of each sub-test. The self-test may be combined with a watchdog activity monitor (WAM) which provides a test-failure signal in the presence of a counted number of state transitions not agreeing with an expected number. An independent measure of time may be provided in the WAM to increase fault coverage by checking the processor's clock. Additionally, redundant processor systems are protected from inadvertent unsevering of a severed processor using a unique unsever arming technique and apparatus.
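    The core WAM idea above can be sketched in a few lines. Everything here is a hypothetical illustration (the sub-test names and expected count are invented), reduced to the count-comparison logic the abstract describes:

```python
# Watchdog activity monitor sketch: the self-test emits one state-transition
# signal per sub-test; the WAM raises a test-failure signal when the counted
# transitions disagree with the expected number.
def wam_check(transition_signals, expected_count):
    """Return True (test-failure signal) when the transition count is wrong."""
    return len(transition_signals) != expected_count

healthy_run = ["sub_test_%d_done" % i for i in range(16)]
hung_run = healthy_run[:11]        # processor stalled after sub-test 11

print(wam_check(healthy_run, 16))  # False: all 16 transitions observed
print(wam_check(hung_run, 16))     # True: failure flagged
```

    The patent's independent time measurement adds a second check on the same principle: a correct count arriving at the wrong rate also indicates a faulty clock.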

  4. Polarimetric measurements of natural surfaces at 95 GHz

    NASA Astrophysics Data System (ADS)

    Chang, Paul S.; McIntosh, Robert E.

    1992-08-01

    A high power 95 GHz radar system, developed at the University of Massachusetts, was used to make polarimetric measurements of natural surfaces. Over the two year period of this grant, the following items were accomplished: (1) The 95 GHz radar was configured into a unique system capable of simultaneously making coherent and incoherent Mueller matrix measurements; (2) The equivalence of the coherent and incoherent measurement techniques was demonstrated; (3) The polarimetric properties of various foliage targets were characterized, including the weeping willow, the sugar maple, and the white pine tree species; (4) The polarimetric properties of various snowcover types were characterized; and (5) Mueller matrix models for wet and dry snow were developed.

  5. Mobile robot dynamic path planning based on improved genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Yong; Zhou, Heng; Wang, Ying

    2017-08-01

    In a dynamic, unknown environment, path planning for mobile robots is a difficult problem. In this paper, a dynamic path planning method based on a genetic algorithm is proposed: a reward value model is designed to estimate the probability of dynamic obstacles on the path, and the reward value function is applied within the genetic algorithm. Unique coding techniques reduce the computational complexity of the algorithm. The fitness function of the genetic algorithm fully considers three factors: the safety of the path, the shortest distance of the path, and the reward value of the path. The simulation results show that the proposed genetic algorithm is efficient in all kinds of complex dynamic environments.
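    A toy version of the multi-term fitness described above. The weights, waypoints, and obstacle-probability table (standing in for the "reward value model") are invented for illustration; the paper's actual encoding and weighting are not reproduced here:

```python
# GA path fitness combining safety (expected dynamic-obstacle encounters,
# from the reward-value estimate) with path length. Higher fitness = better.
import math

def path_length(path):
    # Total Euclidean length over consecutive waypoints
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def fitness(path, obstacle_prob, w_safe=2.0, w_len=1.0):
    # obstacle_prob maps a waypoint to the estimated probability that a
    # dynamic obstacle occupies it
    risk = sum(obstacle_prob.get(p, 0.0) for p in path)
    return -(w_safe * risk + w_len * path_length(path))

safe_path  = [(0, 0), (1, 1), (2, 2)]
risky_path = [(0, 0), (1, 0), (2, 2)]
prob = {(1, 0): 0.9}   # a dynamic obstacle is likely at (1, 0)
print(fitness(safe_path, prob) > fitness(risky_path, prob))  # True
```

    In the GA itself this score drives selection, while crossover and mutation operate on the encoded waypoint sequences.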

  6. A parametric LQ approach to multiobjective control system design

    NASA Technical Reports Server (NTRS)

    Kyr, Douglas E.; Buchner, Marc

    1988-01-01

    The synthesis of a constant-parameter output feedback control law of constrained structure is set in a multiple objective linear quadratic regulator (MOLQR) framework. The use of intuitive objective functions, such as model-following ability and closed-loop trajectory sensitivity, allows multiple objective decision making techniques, such as the surrogate worth tradeoff method, to be applied. For the continuous-time deterministic problem with an infinite time horizon, dynamic compensators as well as static output feedback controllers can be synthesized using a descent Anderson-Moore algorithm modified to impose linear equality constraints on the feedback gains by moving in feasible directions. Results of three different examples are presented, including a unique reformulation of the sensitivity reduction problem.

  7. Fermilab Muon g-2 Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorringe, Tim

    The Fermilab muon g-2 experiment will measure the muon anomalous magnetic momentmore » $$a_{\\mu}$$ to 140 ppb – a four-fold improvement over the earlier Brookhaven experiment. The measurement of $$a_{\\mu}$$ is well known as a unique test of the standard model with broad sensitivity to new interactions, particles and phenomena. The goal of 140 ppb is commensurate with ongoing improvements in the SM prediction of the anomalous moment and addresses the longstanding 3.5$$\\sigma$$ discrepancy between the BNL result and the SM prediction. In this article I discuss the physics motivation and experimental technique for measuring $$a_{\\mu}$$, and the current status and the future work for the project.« less

  8. Modeling spatially localized photonic nanojets from phase diffraction gratings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geints, Yu. E., E-mail: ygeints@iao.ru; Tomsk State University, 36, Lenina Avenue, Tomsk 634050; Zemlyanov, A. A.

    2016-04-21

    We investigated numerically a specific spatially localized intense optical structure, a photonic nanojet (PNJ), formed in the near-field scattering of optical radiation at phase diffraction gratings. The finite-difference time-domain technique was employed to study the PNJ key parameters (length, width, focal distance, and intensity) produced by diffraction gratings with saw-tooth, rectangular, and hemispheric line profiles. Our analysis showed that each type of diffraction grating produces a photonic jet with unique characteristics. Based on the numerical calculations, we demonstrate that the PNJ can be manipulated over a wide range through variation of the period, duty cycle, and shape of the diffraction grating rulings.

  9. Transvenous Embolization of a Spontaneous Femoral AVF 5 Years After an Incomplete Treatment with Arterial Stent-Grafts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peynircioglu, Bora; Ozkan, Murat; Dogan, Omer Faruk

    2008-03-15

    A 66-year-old man with a complex left femoral arteriovenous fistula (AVF) was first diagnosed after a deep venous thrombosis incident approximately 5 years ago. Partial treatment was performed by means of endografts along the superficial femoral artery, which remained patent for 5 years. The patient had been doing well until a couple of months ago, when he developed severe venous stasis and ulcers of the left cruris due to a high-flow, nonhealing complex AVF with additional iliac vein occlusion. Therefore, definitive treatment was performed by a unique endovascular technique combined with surgical venous bypass (femoro-femoral crossover saphenous bypass, the Palma operation). A novel percutaneous transvenous technique for occlusion of a complex high-flow AVF is reported with a review of the literature. The case is unique for its spontaneous AVF, transvenous embolization with detachable coils and ONYX, the hybrid treatment technique, and the long-term patency of the superficial femoral artery stent-grafts.

  10. Broad-search algorithms for finding triple- and quadruple-satellite-aided captures at Jupiter from 2020 to 2080

    NASA Astrophysics Data System (ADS)

    Lynam, Alfred E.

    2015-04-01

    Multiple-satellite-aided capture is an efficient technique for capturing a spacecraft into orbit at Jupiter. However, finding the times when the Galilean moons of Jupiter align such that three or four of them can be encountered in a single pass is difficult using standard astrodynamics algorithms such as Lambert's problem. In this paper, we present simple but powerful techniques that simplify the dynamics and geometry of the Galilean satellites so that many of these triple- and quadruple-satellite-aided capture sequences can be found quickly over an extended 60-year time period from 2020 to 2080. The techniques find many low-fidelity trajectories that could be used as initial guesses for future high-fidelity optimization. Results indicate the existence of approximately 3,100 unique triple-satellite-aided capture trajectories and 6 unique quadruple-satellite-aided capture trajectories during the 60-year time period. The entire search takes less than one minute of computational time.

  11. Existence of a coupled system of fractional differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Rabha W.; Siri, Zailan

    2015-10-22

    We establish the existence and uniqueness of solutions for a fractional coupled system containing Schrödinger equations. Such a system appears in quantum mechanics. We confirm that the fractional system under consideration admits a global solution in appropriate functional spaces, and the solution is shown to be unique. The method is based on analytic techniques from fixed point theory. The fractional differential operator is taken in the sense of the Riemann-Liouville differential operator.
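
For context, the two standard ingredients behind such an argument, stated generically (this is textbook material, not the paper's specific operator or spaces): the Riemann-Liouville derivative and the Banach contraction principle.

```latex
% Riemann-Liouville fractional derivative of order \alpha, with n-1 < \alpha \le n:
D^{\alpha} u(t) \;=\; \frac{1}{\Gamma(n-\alpha)}\,
  \frac{d^{n}}{dt^{n}} \int_{0}^{t} (t-s)^{\,n-\alpha-1}\, u(s)\, ds

% Banach fixed point theorem: if the solution operator T maps a complete
% metric space into itself and is a contraction,
\| T u - T v \| \;\le\; k \, \| u - v \|, \qquad 0 \le k < 1,
% then T has exactly one fixed point, which yields existence and
% uniqueness of the solution to the coupled system.
```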

  12. Auditory and Vestibular Issues Related to Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Danielson, Richard W.; Wood, Scott J.

    2009-01-01

    Human spaceflight provides unique opportunities to study human vestibular and auditory systems. This session will discuss 1) vestibular adaptive processes reflected by pronounced perceptual and motor coordination problems during, and after, space missions; 2) vestibular diagnostic and rehabilitative techniques (used to promote recovery after living in altered gravity environments) that may be relevant to treatment of vestibular disorders on earth; and 3) unique acoustical challenges to hearing loss prevention and crew performance during spaceflight missions.

  13. CellTree: an R/bioconductor package to infer the hierarchical structure of cell populations from single-cell RNA-seq data.

    PubMed

    duVerle, David A; Yotsukura, Sohiya; Nomura, Seitaro; Aburatani, Hiroyuki; Tsuda, Koji

    2016-09-13

    Single-cell RNA sequencing is fast becoming one of the standard methods for gene expression measurement, providing unique insights into cellular processes. A number of methods, based on general dimensionality reduction techniques, have been suggested to help infer and visualise the underlying structure of cell populations from single-cell expression levels, yet their models generally lack proper biological grounding and struggle at identifying complex differentiation paths. Here we introduce cellTree: an R/Bioconductor package that uses a novel statistical approach, based on document analysis techniques, to produce tree structures outlining the hierarchical relationship between single-cell samples, while identifying latent groups of genes that can provide biological insights. With cellTree, we provide experimentalists with an easy-to-use tool, based on statistically and biologically sound algorithms, to efficiently explore and visualise single-cell RNA data. The cellTree package is publicly available in the online Bioconductor repository at http://bioconductor.org/packages/cellTree/.

  14. Course-based undergraduate research experiences in molecular biosciences-patterns, trends, and faculty support.

    PubMed

    Wang, Jack T H

    2017-08-15

    Inquiry-driven learning, research internships and course-based undergraduate research experiences all represent mechanisms through which educators can engage undergraduate students in scientific research. In life sciences education, the benefits of undergraduate research have been thoroughly evaluated, but limitations in infrastructure and training can prevent widespread uptake of these practices. It is not clear how faculty members can integrate complex laboratory techniques and equipment into their unique context, while finding the time and resources to implement undergraduate research according to best practice guidelines. This review goes through the trends and patterns in inquiry-based undergraduate life science projects, with particular emphasis on the molecular biosciences: the research-aligned disciplines of biochemistry, molecular cell biology, microbiology, and genomics and bioinformatics. It provides instructors with an overview of the model organisms, laboratory techniques and research questions that are adaptable for semester-long projects, and serves as starting guidelines for course-based undergraduate research. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Heat conduction in periodic laminates with probabilistic distribution of material properties

    NASA Astrophysics Data System (ADS)

    Ostrowski, Piotr; Jędrysiak, Jarosław

    2017-04-01

    This contribution deals with a problem of heat conduction in a two-phase laminate made of micro-laminas distributed periodically along one direction. In general, Fourier's law describing heat conduction in such a composite has highly oscillating and discontinuous coefficients. Therefore, the tolerance averaging technique (cf. Woźniak et al. in Thermomechanics of microheterogeneous solids and structures. Monografie - Politechnika Łódzka, Wydawnictwo Politechniki Łódzkiej, Łódź, 2008) is applied. Based on this technique, the averaged differential equations of a tolerance-asymptotic model are derived and solved analytically for given initial-boundary conditions. The second part of this contribution investigates the effect of the material properties ratio ω of the two components on the total temperature field θ, under the assumption that the conductivities of the micro-laminas are not necessarily uniquely determined. Numerical experiments (Monte Carlo simulation) are executed under the assumption that ω is a random variable with a fixed probability distribution. Finally, based on the obtained results, a crucial hypothesis is formulated.
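
The Monte Carlo step can be sketched as follows. The effective-conductivity formula below is a stand-in toy model (two layers in series with equal volume fractions), not the tolerance-averaged equations of the paper, and the uniform distribution for ω is an assumption:

```python
import random
import statistics

def effective_conductivity(k1, omega):
    # Toy laminate model: layer 2 has conductivity omega * k1, and the
    # through-thickness effective conductivity is the harmonic (series) mean.
    k2 = omega * k1
    return 2.0 * k1 * k2 / (k1 + k2)

random.seed(0)
# Draw omega from a fixed probability distribution and propagate it
# through the model to see the induced spread in the derived quantity.
samples = [effective_conductivity(1.0, random.uniform(0.5, 1.5))
           for _ in range(10_000)]
mean_keff = statistics.mean(samples)
```

Repeating this for different distributions of ω shows how uncertainty in the component properties propagates into the averaged temperature solution.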

  16. Radiomics: a new application from established techniques

    PubMed Central

    Parekh, Vishwa; Jacobs, Michael A.

    2016-01-01

    The increasing use of biomarkers in cancer has led to the concept of personalized medicine for patients. Personalized medicine makes better diagnosis and treatment options available to clinicians. Radiological imaging techniques provide an opportunity to deliver unique data on different types of tissue. However, obtaining useful information from all radiological data is challenging in the era of “big data”. Recent advances in computational power and the use of genomics have generated a new area of research termed radiomics. Radiomics is defined as the high-throughput extraction of quantitative imaging features or texture from imaging to decode tissue pathology, creating a high-dimensional data set for analysis. Radiomic features provide information about gray-scale patterns and inter-pixel relationships. In addition, shape and spectral properties can be extracted within the same regions of interest on radiological images. Moreover, these features can be further used to develop computational models using advanced machine learning algorithms that may serve as a tool for personalized diagnosis and treatment guidance. PMID:28042608
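
As a concrete illustration of "quantitative imaging features", here is a minimal first-order feature extractor (mean intensity and gray-level entropy) over a region of interest; real radiomics pipelines compute hundreds of such features, including texture and shape descriptors, so this is only a sketch:

```python
import math
from collections import Counter

def first_order_features(roi):
    # roi: flat list of integer gray levels inside a region of interest.
    n = len(roi)
    mean = sum(roi) / n
    counts = Counter(roi)
    # Shannon entropy of the gray-level histogram, in bits.
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return {"mean": mean, "entropy": entropy}

feats = first_order_features([0, 0, 1, 1, 2, 2, 3, 3])
# Four equally likely gray levels give an entropy of 2 bits.
```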

  17. Acculturation, Income and Vegetable Consumption Behaviors Among Latino Adults in the U.S.: A Mediation Analysis with the Bootstrapping Technique.

    PubMed

    López, Erick B; Yamashita, Takashi

    2017-02-01

    This study examined whether household income mediates the relationship between acculturation and vegetable consumption among Latino adults in the U.S. Data from the 2009-2010 National Health and Nutrition Examination Survey were analyzed. A vegetable consumption index was created based on the frequencies of intake of five kinds of vegetables. Acculturation was measured by the degree of English language use at home. A path model with the bootstrapping technique was employed for the mediation analysis. A significant partial mediation relationship was identified. Greater acculturation was associated with higher income and, in turn, greater vegetable consumption [95 % bias-corrected bootstrap confidence interval (BCBCI) = (0.02, 0.33)]. At the same time, greater acculturation was associated with lower vegetable consumption [95 % BCBCI = (-0.88, -0.07)]. Findings regarding income as a mediator of the acculturation-dietary behavior relationship inform unique intervention programs and policy changes to address health disparities by race/ethnicity.
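
The mediation logic can be sketched as follows: the indirect effect is the product of the acculturation→income slope (a) and the income→vegetable-consumption slope (b), and its confidence interval comes from resampling cases with replacement. The interval below is a plain percentile bootstrap rather than the bias-corrected interval used in the study, and the data are made up:

```python
import random

def slope(xs, ys):
    # Ordinary least-squares slope of ys on xs.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def bootstrap_indirect(acc, inc, veg, n_boot=2000, seed=1):
    rng = random.Random(seed)
    n = len(acc)
    effects = []
    for _ in range(n_boot):
        s = [rng.randrange(n) for _ in range(n)]            # resample cases
        a = slope([acc[i] for i in s], [inc[i] for i in s])  # acculturation -> income
        b = slope([inc[i] for i in s], [veg[i] for i in s])  # income -> vegetables
        effects.append(a * b)                                # indirect effect a*b
    effects.sort()
    # 95 % percentile bootstrap interval for the indirect effect.
    return effects[int(0.025 * n_boot)], effects[int(0.975 * n_boot)]

acc = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]       # acculturation score (toy data)
veg = [2, 3, 5, 6, 10, 11, 12, 16, 17, 20]  # vegetable consumption (toy data)
inc = [1, 3, 4, 7, 9, 10, 13, 15, 16, 19]   # household income (toy data)
lo, hi = bootstrap_indirect(acc, inc, veg)
# The interval excludes zero here, indicating mediation in this toy data.
```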

  18. Neutral atom traps of rare isotopes

    NASA Astrophysics Data System (ADS)

    Mueller, Peter

    2016-09-01

    Laser cooling and trapping techniques offer exquisite control of an atom's external and internal degrees of freedom. The species of interest can be selectively captured, cooled close to absolute zero, and observed with a high signal-to-noise ratio. Moreover, the atom's electronic and magnetic state populations can be precisely manipulated and interrogated. Applied in nuclear physics, these techniques are ideal for precision measurements in the fields of fundamental interactions and symmetries, nuclear structure studies, and isotopic trace analysis. In particular, they offer unique opportunities in the quest for physics beyond the standard model. I will briefly review the basics of this approach and the state of the field, and then cover in more detail recent results from two such efforts: the search for a permanent electric dipole moment in 225Ra and the beta-neutrino angular correlation measurement with laser-trapped 6He. This work is supported by the U.S. DOE, Office of Science, Office of Nuclear Physics, under Contract DE-AC02-06CH11357.

  19. Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem.

    PubMed

    Rajeswari, M; Amudhavel, J; Pothula, Sujatha; Dhavachelvan, P

    2017-01-01

    The Nurse Rostering Problem (NRP) is an NP-hard combinatorial optimization scheduling problem that assigns a set of nurses to shifts each day while considering both hard and soft constraints. A novel metaheuristic technique is required for solving the NRP. This work proposes a metaheuristic technique called the Directed Bee Colony Optimization Algorithm, using the Modified Nelder-Mead Method, for solving the NRP. To solve the NRP, the authors use a multiobjective mathematical programming model and propose a methodology for the adaptation of a Multiobjective Directed Bee Colony Optimization (MODBCO). MODBCO is used successfully for solving the multiobjective problem of optimizing scheduling problems, and integrates deterministic local search, a multiagent particle system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard dataset INRC2010, which reflects many real-world cases that vary in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria.

  20. Analysis and automatic identification of sleep stages using higher order spectra.

    PubMed

    Acharya, U Rajendra; Chua, Eric Chern-Pin; Chua, Kuang Chua; Min, Lim Choo; Tamura, Toshiyo

    2010-12-01

    Electroencephalogram (EEG) signals are widely used to study the activity of the brain, such as to determine sleep stages. These EEG signals are nonlinear and non-stationary in nature, and it is difficult to perform sleep staging by visual interpretation and linear techniques. Thus, we use a nonlinear technique, higher order spectra (HOS), to extract hidden information in the sleep EEG signal. In this study, unique bispectrum and bicoherence plots for various sleep stages are proposed; these can be used as a visual aid for various diagnostic applications. A number of HOS-based features were extracted from these plots during the various sleep stages (wakefulness, Rapid Eye Movement (REM), and Stages 1-4 of non-REM sleep), and they were found to be statistically significant, with p-values lower than 0.001 using an ANOVA test. These features were fed to a Gaussian mixture model (GMM) classifier for automatic identification. Our results indicate that the proposed system is able to identify sleep stages with an accuracy of 88.7%.

  1. A single cell high content assay detects mitochondrial dysfunction in iPSC-derived neurons with mutations in SNCA.

    PubMed

    Little, Daniel; Luft, Christin; Mosaku, Olukunbi; Lorvellec, Maëlle; Yao, Zhi; Paillusson, Sébastien; Kriston-Vizi, Janos; Gandhi, Sonia; Abramov, Andrey Y; Ketteler, Robin; Devine, Michael J; Gissen, Paul

    2018-06-13

    Mitochondrial dysfunction is implicated in many neurodegenerative diseases including Parkinson's disease (PD). Induced pluripotent stem cells (iPSCs) provide a unique cell model for studying neurological diseases. We have established a high-content assay that can simultaneously measure mitochondrial function, morphology and cell viability in iPSC-derived dopaminergic neurons. iPSCs from PD patients with mutations in SNCA and unaffected controls were differentiated into dopaminergic neurons, seeded in 384-well plates and stained with the mitochondrial membrane potential dependent dye TMRM, alongside Hoechst-33342 and Calcein-AM. Images were acquired using an automated confocal screening microscope and single cells were analysed using automated image analysis software. PD neurons displayed reduced mitochondrial membrane potential and altered mitochondrial morphology compared to control neurons. This assay demonstrates that high content screening techniques can be applied to the analysis of mitochondria in iPSC-derived neurons. This technique could form part of a drug discovery platform to test potential new therapeutics for PD and other neurodegenerative diseases.

  2. Resolving bathymetry from airborne gravity along Greenland fjords

    USGS Publications Warehouse

    Boghosian, Alexandra; Tinto, Kirsty; Cochran, James R.; Porter, David; Elieff, Stefan; Burton, Bethany L.; Bell, Robin E.

    2015-01-01

    Recent glacier mass loss in Greenland has been attributed to encroaching warming waters, but knowledge of fjord bathymetry is required to investigate this mechanism. The bathymetry in many Greenland fjords is unmapped and difficult to measure. From 2010 to 2012, National Aeronautics and Space Administration's Operation IceBridge collected a unique set of airborne gravity, magnetic, radar, and lidar data along the major outlet glaciers and fjords in Greenland. We applied a consistent technique using the IceBridge gravity data to create 90 bathymetric profiles along 54 Greenland fjords. We also used this technique to recover subice topography where warm or crevassed ice prevents the radar system from imaging the bed. Here we discuss our methodology, basic assumptions and error analysis. We present the new bathymetry data and discuss observations in six major regions of Greenland covered by IceBridge. The gravity models provide a total of 1950 line kilometers of bathymetry, 875 line kilometers of subice topography, and 12 new grounding line depths.

  3. Improving quantitative structure-activity relationship models using Artificial Neural Networks trained with dropout.

    PubMed

    Mendenhall, Jeffrey; Meiler, Jens

    2016-02-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery pose unique challenges for ML techniques, such as heavily biased dataset composition, and relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both enrichment false positive rate and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22-46 % over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods.
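
A minimal sketch of the dropout mechanism itself (inverted dropout, framework-free; the surrounding ANN is omitted): each unit is zeroed with probability p during training, and survivors are rescaled by 1/(1-p) so the expected activation is unchanged.

```python
import random

def dropout(activations, p, rng):
    # Inverted dropout: drop each unit with probability p and scale the
    # survivors so that E[output] == input on average.
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

rng = random.Random(42)
out = dropout([1.0] * 1000, p=0.25, rng=rng)
kept_fraction = sum(1 for a in out if a != 0.0) / len(out)
# Roughly 75% of units survive, each scaled to 1/0.75.
```

At inference time the dropout layer is simply skipped; the inverted scaling during training is what makes that valid.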

  4. Improving Quantitative Structure-Activity Relationship Models using Artificial Neural Networks Trained with Dropout

    PubMed Central

    Mendenhall, Jeffrey; Meiler, Jens

    2016-01-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery (LB-CADD) pose unique challenges for ML techniques, such as heavily biased dataset composition, and relatively large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both Enrichment false positive rate (FPR) and log-scaled area under the receiver-operating characteristic curve (logAUC) by 22–46% over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods. PMID:26830599

  5. Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem

    PubMed Central

    Amudhavel, J.; Pothula, Sujatha; Dhavachelvan, P.

    2017-01-01

    The Nurse Rostering Problem is an NP-hard combinatorial optimization, scheduling problem for assigning a set of nurses to shifts per day by considering both hard and soft constraints. A novel metaheuristic technique is required for solving Nurse Rostering Problem (NRP). This work proposes a metaheuristic technique called Directed Bee Colony Optimization Algorithm using the Modified Nelder-Mead Method for solving the NRP. To solve the NRP, the authors used a multiobjective mathematical programming model and proposed a methodology for the adaptation of a Multiobjective Directed Bee Colony Optimization (MODBCO). MODBCO is used successfully for solving the multiobjective problem of optimizing the scheduling problems. This MODBCO is an integration of deterministic local search, multiagent particle system environment, and honey bee decision-making process. The performance of the algorithm is assessed using the standard dataset INRC2010, and it reflects many real-world cases which vary in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on assessment criteria. PMID:28473849

  6. Cranioplasty with individual titanium implants

    NASA Astrophysics Data System (ADS)

    Mishinov, S.; Stupak, V.; Sadovoy, M.; Mamonova, E.; Koporushko, N.; Larkin, V.; Novokshonov, A.; Dolzhenko, D.; Panchenko, A.; Desyatykh, I.; Krasovsky, I.

    2017-09-01

    Cranioplasty is the second procedure in the history of neurosurgery after trepanation, and it remains relevant despite the development of civilization and progress in medicine. Each cranioplasty operation is unique because no two patients have identical defects of the skull bones. The development of the Direct Metal Laser Sintering (DMLS) technique opened up the possibility of directly printing implants in titanium, a biocompatible metal used in medicine. This eliminates the need to produce any intermediate products to create the desired implant. We have produced 8 patient-specific titanium implants using this technique for patients who underwent different decompressive craniectomies associated with bone tumors. Follow-up duration ranged from 6 to 12 months. We observed no implant-related reactions or complications. In all cases of reconstructive neurosurgery we achieved good clinical and aesthetic results. The analysis of the literature and our own experience in three-dimensional modeling, prototyping, and printing suggests that direct laser sintering of titanium is the optimal method to produce biocompatible surgical implants.

  7. A method for measuring aircraft height and velocity using dual television cameras

    NASA Technical Reports Server (NTRS)

    Young, W. R.

    1977-01-01

    A unique electronic optical technique, consisting of two closed circuit television cameras and timing electronics, was devised to measure an aircraft's horizontal velocity and height above ground without the need for airborne cooperative devices. The system is intended to be used where the aircraft has a predictable flight path and a height of less than 660 meters (2,000 feet) at or near the end of an air terminal runway, but is suitable for greater aircraft altitudes whenever the aircraft remains visible. Two television cameras, pointed at zenith, are placed in line with the expected path of travel of the aircraft. Velocity is determined by measuring the time it takes the aircraft to travel the measured distance between cameras. Height is determined by correlating this speed with the time required to cross the field of view of either camera. Preliminary tests with a breadboard version of the system and a small model aircraft indicate the technique is feasible.
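
The two measurements combine in a short worked example (all numbers below are illustrative, not from the report): velocity follows from the camera spacing and the transit time between cameras, and height follows from the velocity and the time to cross one camera's field of view.

```python
import math

camera_spacing_m = 300.0     # measured ground distance between the two cameras
transit_time_s = 4.0         # time for the aircraft to travel between them
fov_half_angle = math.radians(15.0)  # half-angle of one camera's field of view
crossing_time_s = 1.2        # time to cross that camera's field of view

# Velocity from the measured distance and transit time.
velocity = camera_spacing_m / transit_time_s  # 75 m/s

# The field of view spans 2 * h * tan(half_angle) at height h; the aircraft
# covers that span in crossing_time_s at the measured velocity, so:
height = velocity * crossing_time_s / (2.0 * math.tan(fov_half_angle))
```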

  8. Investigation of oscillating cascade aerodynamics by an experimental influence coefficient technique

    NASA Technical Reports Server (NTRS)

    Buffum, Daniel H.; Fleeter, Sanford

    1988-01-01

    Fundamental experiments are performed in the NASA Lewis Transonic Oscillating Cascade Facility to investigate the torsion-mode unsteady aerodynamics of a biconvex airfoil cascade at realistic values of the reduced frequency, for all interblade phase angles, at a specified mean flow condition. In particular, an unsteady aerodynamic influence coefficient technique is developed and utilized in which only one airfoil in the cascade is oscillated at a time and the resulting airfoil surface unsteady pressure distribution is measured on one dynamically instrumented airfoil. The unsteady aerodynamics of an equivalent cascade with all airfoils oscillating at a specified interblade phase angle are then determined through a vector summation of these data. These influence-coefficient-determined oscillating cascade data are correlated with data obtained in this cascade with all airfoils oscillating at several interblade phase angle values. The influence coefficients are then utilized to determine the unsteady aerodynamics of the cascade for all interblade phase angles, with these unique data subsequently correlated with predictions from a linearized unsteady cascade model.
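
The vector summation at the heart of the influence coefficient technique can be sketched as follows; the coefficients are made-up numbers, and real data would include influence terms from neighbors on both sides of the reference blade:

```python
import cmath

def cascade_response(coeffs, sigma):
    # coeffs[n]: complex unsteady-pressure influence of oscillating blade n
    # on the reference blade; sigma: interblade phase angle in radians.
    # The all-blades-oscillating response is the phased vector sum.
    return sum(c * cmath.exp(1j * n * sigma) for n, c in enumerate(coeffs))

coeffs = [1.0 + 0.5j, 0.2 - 0.1j, 0.05 + 0.02j]    # hypothetical measurements
r_inphase = cascade_response(coeffs, 0.0)          # all blades in phase
r_alternating = cascade_response(coeffs, cmath.pi) # adjacent blades out of phase
```

In an actual cascade of N blades the interblade phase angle takes the discrete values sigma = 2*pi*k/N, so one set of measured influence coefficients yields the cascade response at every admissible sigma.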

  9. Waste biomass toward hydrogen fuel supply chain management for electricity: Malaysia perspective

    NASA Astrophysics Data System (ADS)

    Zakaria, Izatul Husna; Ibrahim, Jafni Azhan; Othman, Abdul Aziz

    2016-08-01

    Green energy is becoming an important aspect for every country in the world, contributing to energy security by reducing dependence on imported fossil fuels and to a better quality of life through a healthy environment. This conceptual paper works toward determining the physical flow characteristics of waste wood biomass from large-scale plantations for producing gas fuel for electricity using the gasification technique. The scope of this study is supply chain management of syngas fuel derived from wood waste biomass using direct gasification conversion technology, drawing on a literature review covering energy security, Malaysia's energy mix, biomass supply chain management, and conversion technology. This paper uses the theoretical framework of a model of transportation (Lumsden, 2006) and the function of the terminal (Hulten, 1997). To incorporate the unique properties of biomass, Biomass Element Life Cycle Analysis (BELCA), a novel technique developed to understand the behaviour of biomass supply, is applied. The theoretical frameworks used to answer the research questions are the Supply Chain Operations Reference (SCOR) framework and the sustainable strategy development in supply chain management framework.

  10. Longitudinal Monitoring of Antibody Responses against Tumor Cells Using Magneto-nanosensors with a Nanoliter of Blood.

    PubMed

    Lee, Jung-Rok; Chan, Carmel T; Ruderman, Daniel; Chuang, Hui-Yen; Gaster, Richard S; Atallah, Michelle; Mallick, Parag; Lowe, Scott W; Gambhir, Sanjiv S; Wang, Shan X

    2017-11-08

    Each immunoglobulin isotype has unique immune effector functions. The contribution of these functions in the elimination of pathogens and tumors can be determined by monitoring quantitative temporal changes in isotype levels. Here, we developed a novel technique using magneto-nanosensors based on the effect of giant magnetoresistance (GMR) for longitudinal monitoring of total and antigen-specific isotype levels with high precision, using as little as 1 nL of serum. Combining in vitro serologic measurements with in vivo imaging techniques, we investigated the role of the antibody response in the regression of firefly luciferase (FL)-labeled lymphoma cells in spleen, kidney, and lymph nodes in a syngeneic Burkitt's lymphoma mouse model. Regression status was determined by whole body bioluminescent imaging (BLI). The magneto-nanosensors revealed that anti-FL IgG2a and total IgG2a were elevated and sustained in regression mice compared to non-regression mice (p < 0.05). This platform shows promise for monitoring immunotherapy, vaccination, and autoimmunity.

  11. Pseudorandom Noise Code-Based Technique for Cloud and Aerosol Discrimination Applications

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Prasad, Narasimha S.; Flood, Michael A.; Harrison, Fenton Wallace

    2011-01-01

    NASA Langley Research Center is working on a continuous wave (CW) laser based remote sensing scheme for the detection of CO2 and O2 from space-based platforms, suitable for the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multi-frequency, intensity-modulated CW (IMCW) laser absorption spectrometer (LAS) operating at 1.57 micron for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies better than 0.3%. In this paper, we discuss the demonstration of a PN-code-based technique for cloud and aerosol discrimination applications. The possibility of using maximum-length (ML) sequences for range and absorption measurements is investigated. A simple model for accomplishing this objective is formulated. Proof-of-concept experiments, carried out using a sonar-based lidar simulator built from simple audio hardware, provided promising results for extension to optical wavelengths. Keywords: ASCENDS, CO2 sensing, O2 sensing, PN codes, CW lidar
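
A maximum-length sequence and the correlation property that makes it useful for ranging can be demonstrated in a few lines; the 4-bit register, the tap set (4, 3) for the primitive polynomial x^4 + x^3 + 1, and the seed are illustrative choices, not the mission's parameters:

```python
def ml_sequence(taps, n_bits, seed=1):
    # Fibonacci LFSR: emit the LSB each step, feed the XOR of the tapped
    # bits back into the MSB. With a primitive polynomial, the period is
    # maximal: 2**n_bits - 1 chips.
    state = seed
    out = []
    for _ in range(2 ** n_bits - 1):
        out.append(state & 1)
        fb = 0
        for t in taps:  # tap position t corresponds to bit shift n_bits - t
            fb ^= (state >> (n_bits - t)) & 1
        state = (state >> 1) | (fb << (n_bits - 1))
    return out

def circular_autocorrelation(x, lag):
    return sum(x[i] * x[(i + lag) % len(x)] for i in range(len(x)))

seq = ml_sequence(taps=(4, 3), n_bits=4)  # 15-chip m-sequence
bipolar = [2 * b - 1 for b in seq]        # map {0, 1} -> {-1, +1}
# The autocorrelation peaks at lag 0 and is flat (-1) at every other lag,
# which is what gives a PN-code lidar its sharp range discrimination.
```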

  12. NON-INVASIVE EVALUATION OF NERVE CONDUCTION IN SMALL DIAMETER FIBERS IN THE RAT.

    PubMed

    Zotova, Elena G; Arezzo, Joseph C

    2013-01-01

    A novel non-invasive technique was applied to measure velocity within slow conducting axons in the distal extreme of the sciatic nerve (i.e., digital nerve) in a rat model. The technique is based on the extraction of rectified multiple unit activity (MUA) from in vivo whole nerve compound responses. This method reliably identifies compound action potentials in thinly myelinated fibers conducting at a range of 9-18 m/s (Aδ axons), as well as in a subgroup of unmyelinated C fibers conducting at approximately 1-2 m/s. The sensitivity of the method to C-fiber conduction was confirmed by the progressive decrement of the responses in the 1-2 m/s range over a 20-day period following the topical application of capsaicin (ANOVA p <0.03). Increasing the frequency of applied repetitive stimulation over a range of 0.75 Hz to 6.0 Hz produced slowing of conduction and a significant decrease in the magnitude of the compound C-fiber response (ANOVA p <0.01). This technique offers a unique opportunity for the non-invasive, repeatable, and quantitative assessment of velocity in the subsets of Aδ and C fibers in parallel with evaluation of fast nerve conduction.

  13. Splitting a colon geometry with multiplanar clipping

    NASA Astrophysics Data System (ADS)

    Ahn, David K.; Vining, David J.; Ge, Yaorong; Stelts, David R.

    1998-06-01

    Virtual colonoscopy, a recent three-dimensional (3D) visualization technique, has provided radiologists with a unique diagnostic tool. Using this technique, a radiologist can examine the internal morphology of a patient's colon by navigating through a surface-rendered model that is constructed from helical computed tomography image data. Virtual colonoscopy can be used to detect early forms of colon cancer in a way that is less invasive and less expensive than conventional endoscopy. However, the common approach of 'flying' through the colon lumen to visually search for polyps is tedious and time-consuming, especially when a radiologist loses his or her orientation within the colon. Furthermore, a radiologist's field of view is often limited by the 3D camera position located inside the colon lumen. We have developed a new technique, called multi-planar geometry clipping, that addresses these problems. Our algorithm divides a complex colon anatomy into several smaller segments, and then splits each of these segments in half for display on a static medium. Multi-planar geometry clipping eliminates virtual colonoscopy's dependence upon expensive, real-time graphics workstations by enabling radiologists to globally inspect the entire internal surface of the colon from a single viewpoint.

  14. Metabolic adaptations of overwintering European common lizards (Lacerta vivipara).

    PubMed

    Voituron, Y; Hérold, J P; Grenot, C

    2000-01-01

    The European common lizard Lacerta vivipara, a reptile of cold-temperate climates, provides an interesting model of low-temperature adaptation. Indeed, its unique cold-hardiness strategy, which employs both freeze tolerance and freeze avoidance, may be seen as the primary reason for its large distribution, which extends from Spain to beyond the Arctic Circle. To study the metabolism supporting this capacity, we used three techniques: two calorimetric techniques (oxygen consumption and thermogenesis) and nuclear magnetic resonance spectroscopy. These techniques were used to examine the metabolic balance and the different molecular pathways used in three different periods of the year (September, January, and May). The results show a significant 20% augmentation of winter anaerobic metabolism compared to other periods of the year, mainly because of an activation of the lactic fermentation pathway leading to an increase in lactate concentration (>34% in winter). Furthermore, glucose, which increases some 245% in winter, is used both as an antifreeze and as a metabolic substrate. This study also provides evidence that the physiological adaptations of the common lizard differ from those of other ectotherms such as Rana sylvatica: concentrations of alanine and glycerol, commonly used as antifreezes by many overwintering ectotherms, do not increase during winter.

  15. A practical method to assess model sensitivity and parameter uncertainty in C cycle models

    NASA Astrophysics Data System (ADS)

    Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy

    2015-04-01

    The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells, to several hundred years for the exchange between the atmosphere and the deep ocean, and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improve models and predictions using inverse modelling techniques. A generic inverse problem consists of finding an n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well posed if the following three conditions hold: 1) a solution exists, 2) the solution is unique, and 3) the solution depends continuously on the input data. If at least one of these conditions is violated, the problem is said to be ill-posed. Inverse problems are often ill-posed, so a regularization method is required to replace the original problem with a well-posed one; a solution strategy then amounts to 1) constructing a solution x, 2) assessing the validity of the solution, and 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merit of various inverse modelling strategies (MCMC, EnKF) for estimating model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed that parameters and initial stocks directly related to fast processes were best estimated, with narrow confidence intervals, whereas those related to slow processes were poorly estimated, with very large uncertainties.
While other studies have tried to overcome this difficulty by adding complementary data streams or by considering longer observation windows, no systematic analysis has been carried out so far to explain the large differences among results. We consider adjoint-based methods to investigate inverse problems using DALEC and various data streams. Using resolution matrices, we study the nature of the inverse problems (solution existence, uniqueness, and stability) and show how standard regularization techniques affect resolution and stability properties. Instead of using standard prior information as a penalty term in the cost function to regularize the problems, we constrain the parameter space using ecological balance conditions and inequality constraints. The efficiency and speed of this approach allow us to compute ensembles of solutions to the inverse problems, from which we can establish the robustness of the variational method and obtain non-Gaussian posterior distributions for the model parameters and initial carbon stocks.
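
    A minimal sketch of the well-posedness and resolution ideas above, using a toy ill-conditioned linear problem with Tikhonov regularization in place of DALEC itself (the matrix, singular values, and noise level are all assumptions for illustration):

```python
import numpy as np

# Build a toy ill-conditioned forward model H with a mix of
# well-observed ("fast") and poorly observed ("slow") directions.
rng = np.random.default_rng(1)
n = 6
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
singular_values = np.array([1.0, 0.5, 0.2, 1e-2, 1e-4, 1e-6])
H = U @ np.diag(singular_values) @ V.T

x_true = np.ones(n)
y = H @ x_true + 1e-3 * rng.standard_normal(n)

lam = 1e-3  # regularization weight (assumed value)
# Tikhonov solution: x = (H^T H + lam I)^{-1} H^T y
A = H.T @ H + lam * np.eye(n)
x_hat = np.linalg.solve(A, H.T @ y)

# Model resolution matrix: x_hat ~ R x_true; R -> I means every
# parameter is independently resolved, small diagonal entries flag
# directions the data cannot constrain (the "slow" processes).
R = np.linalg.solve(A, H.T @ H)
print("diag(R):", np.round(np.diag(R), 3))
```

Here trace(R) counts the effective number of resolved parameters; only the directions whose singular values exceed sqrt(lam) contribute, mirroring the fast/slow contrast reported above.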

  16. An efficient approach to ARMA modeling of biological systems with multiple inputs and delays

    NASA Technical Reports Server (NTRS)

    Perrott, M. H.; Cohen, R. J.

    1996-01-01

    This paper presents a new approach to AutoRegressive Moving Average (ARMA or ARX) modeling which automatically seeks the best model order to represent investigated linear, time-invariant systems using their input/output data. The algorithm seeks the ARMA parameterization which accounts for variability in the output of the system due to input activity and contains the fewest number of parameters required to do so. The unique characteristics of the proposed system identification algorithm are its simplicity and efficiency in handling systems with delays and multiple inputs. We present results of applying the algorithm to simulated data and experimental biological data. In addition, a technique for assessing the error associated with the impulse responses calculated from estimated ARMA parameterizations is presented. The mapping from ARMA coefficients to impulse response estimates is nonlinear, which complicates any effort to construct confidence bounds for the obtained impulse responses. Here a method for obtaining a linearization of this mapping is derived, which leads to a simple procedure to approximate the confidence bounds.
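
    The ARX identification and the coefficient-to-impulse-response mapping can be sketched as follows; the second-order system and noise level are hypothetical stand-ins for the paper's physiological data:

```python
import numpy as np

# Simulate a hypothetical 2nd-order ARX system driven by white noise
rng = np.random.default_rng(2)
N = 2000
u = rng.standard_normal(N)          # input signal
a_true = [1.2, -0.5]                # AR coefficients (stable poles)
b_true = [0.8, 0.3]                 # input (X) coefficients
y = np.zeros(N)
for n in range(2, N):
    y[n] = (a_true[0]*y[n-1] + a_true[1]*y[n-2]
            + b_true[0]*u[n-1] + b_true[1]*u[n-2]
            + 0.01*rng.standard_normal())

# Least-squares fit: y[n] ~ [y[n-1], y[n-2], u[n-1], u[n-2]] theta
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
a_hat, b_hat = theta[:2], theta[2:]

# Nonlinear map from ARX coefficients to the impulse response:
# feed a unit pulse through the fitted recursion.
h = np.zeros(20)
pulse = np.zeros(20); pulse[0] = 1.0
for n in range(20):
    h[n] = sum(a_hat[i]*h[n-1-i] for i in range(2) if n-1-i >= 0) \
         + sum(b_hat[j]*pulse[n-1-j] for j in range(2) if n-1-j >= 0)
print(np.round(h[:4], 3))  # first taps ~ [0, 0.8, 1.26, 1.11]
```

The recursion makes the nonlinearity of the coefficient-to-impulse-response map concrete: each tap of h depends on products of the estimated coefficients, which is why confidence bounds require the linearization derived in the paper.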

  17. Surgical Models of Roux-en-Y Gastric Bypass Surgery and Sleeve Gastrectomy in Rats and Mice

    PubMed Central

    Bruinsma, Bote G.; Uygun, Korkut; Yarmush, Martin L.; Saeidi, Nima

    2015-01-01

    Bariatric surgery is the only definitive solution currently available for the present obesity pandemic. These operations typically involve reconfiguration of gastrointestinal tract anatomy and confer profound metabolic and physiological benefits, such as substantially reducing body weight and ameliorating type II diabetes. Therefore, animal models of these surgeries offer unique and exciting opportunities to delineate the underlying mechanisms that contribute to the resolution of obesity and diabetes. Here we describe standardized procedures for mouse and rat models of Roux-en-Y gastric bypass (80–90 minutes operative time) and sleeve gastrectomy (30–45 minutes operative time), which closely resemble the corresponding operations in humans. We also provide detailed protocols for both pre- and post-operative techniques that ensure a high success rate in the operations. These protocols provide the opportunity to mechanistically investigate the systemic effects of the surgical interventions, such as regulation of body weight, glucose homeostasis, and the gut microbiome. PMID:25719268

  18. Development and Implementation of Dynamic Scripts to Execute Cycled GSI/WRF Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Xuanli; Watson, Leela

    2014-01-01

    The Weather Research and Forecasting (WRF) numerical weather prediction (NWP) model and Gridpoint Statistical Interpolation (GSI) data assimilation (DA) are the operational systems that make up the North American Mesoscale (NAM) model and the NAM Data Assimilation System (NDAS) analysis used by National Weather Service forecasters. The Developmental Testbed Center (DTC) manages and distributes the code for the WRF and GSI, but it is up to individual researchers to link the systems together and write scripts to run the systems, which can take considerable time for those not familiar with the code. The objective of this project is to develop and disseminate a set of dynamic scripts that mimic the unique cycling configuration of the operational NAM to enable researchers to develop new modeling and data assimilation techniques that can be easily transferred to operations. The current version of the SPoRT GSI/WRF Scripts (v3.0.1) is compatible with WRF v3.3 and GSI v3.0.

  19. Accounting for observed small angle X-ray scattering profile in the protein-protein docking server ClusPro.

    PubMed

    Xia, Bing; Mamonov, Artem; Leysen, Seppe; Allen, Karen N; Strelkov, Sergei V; Paschalidis, Ioannis Ch; Vajda, Sandor; Kozakov, Dima

    2015-07-30

    The protein-protein docking server ClusPro is used by thousands of laboratories, and models built by the server have been reported in over 300 publications. Although the structures generated by the docking include near-native ones for many proteins, selecting the best model is difficult due to the uncertainty in scoring. Small angle X-ray scattering (SAXS) is an experimental technique for obtaining low resolution structural information in solution. While not sufficient on its own to uniquely predict complex structures, accounting for SAXS data improves the ranking of models and facilitates the identification of the most accurate structure. Although SAXS profiles are currently available only for a small number of complexes, due to its simplicity the method is becoming increasingly popular. Since combining docking with SAXS experiments will provide a viable strategy for fairly high-throughput determination of protein complex structures, the option of using SAXS restraints is added to the ClusPro server. © 2015 Wiley Periodicals, Inc.

  20. Extension of the hole-drilling method to birefringent composites

    NASA Technical Reports Server (NTRS)

    Prabhakaran, R.

    1982-01-01

    A complete stress analysis and reliable failure criteria are essential for important structural applications of composites in order to fully utilize their unique properties. The inhomogeneity, anisotropy and inelasticity of many composites make the use of experimental methods indispensable. Among the experimental techniques, transmission photoelasticity has been extended to birefringent composites in recent years. The extension is not straightforward, in view of the complex nature of the photoelastic response of such model materials. This paper very briefly reviews the important developments in the subject and then describes the theoretical basis for a new method of determining the individual values of principal stresses in composite models. The method consists of drilling very small holes at points where the state of stress has to be determined. Experiments are then described which verify the theoretical predictions. The limitations of the method are pointed out and it is concluded that valuable information concerning the state of stress in a composite model can be obtained through the suggested method.

  1. Ares-I-X Stability and Control Flight Test: Analysis and Plans

    NASA Technical Reports Server (NTRS)

    Brandon, Jay M.; Derry, Stephen D.; Heim, Eugene H.; Hueschen, Richard M.; Bacon, Barton J.

    2008-01-01

    The flight test of the Ares I-X vehicle provides a unique opportunity to reduce risk of the design of the Ares I vehicle and test out design, math modeling, and analysis methods. One of the key features of the Ares I design is the significant static aerodynamic instability coupled with the relatively flexible vehicle - potentially resulting in a challenging controls problem to provide adequate flight path performance while also providing adequate structural mode damping and preventing adverse control coupling to the flexible structural modes. Another challenge is to obtain enough data from the single flight to be able to conduct analysis showing the effectiveness of the controls solutions and have data to inform design decisions for Ares I. This paper will outline the modeling approaches and control system design to conduct this flight test, and also the system identification techniques developed to extract key information such as control system performance (gain/phase margins, for example), structural dynamics responses, and aerodynamic model estimations.

  2. Longitudinal Evaluation of Fatty Acid Metabolism in Normal and Spontaneously Hypertensive Rat Hearts with Dynamic MicroSPECT Imaging

    DOE PAGES

    Reutter, Bryan W.; Huesman, Ronald H.; Brennan, Kathleen M.; ...

    2011-01-01

    The goal of this project is to develop radionuclide molecular imaging technologies using a clinical pinhole SPECT/CT scanner to quantify changes in cardiac metabolism using the spontaneously hypertensive rat (SHR) as a model of hypertension-related pathophysiology. This paper quantitatively compares fatty acid metabolism in hearts of SHR and Wistar-Kyoto normal rats as a function of age and thereby tracks physiological changes associated with the onset and progression of heart failure in the SHR model. The fatty acid analog 123I-labeled BMIPP was used in longitudinal metabolic pinhole SPECT imaging studies performed every seven months for 21 months. The uniqueness of this project is the development of techniques for estimating the blood input function from projection data acquired by a slowly rotating camera imaging fast circulation, and the quantification of the kinetics of 123I-BMIPP by fitting compartmental models to the blood and tissue time-activity curves.

  3. Unique quadruple immunofluorescence assay demonstrates mitochondrial respiratory chain dysfunction in osteoblasts of aged and PolgA(-/-) mice.

    PubMed

    Dobson, Philip F; Rocha, Mariana C; Grady, John P; Chrysostomou, Alexia; Hipps, Daniel; Watson, Sharon; Greaves, Laura C; Deehan, David J; Turnbull, Doug M

    2016-08-24

    Fragility fractures caused by osteoporosis affect millions of people worldwide every year with significant levels of associated morbidity, mortality and costs to the healthcare economy. The pathogenesis of declining bone mineral density is poorly understood but it is inherently related to increasing age. Growing evidence in recent years, especially that provided by mouse models, suggest that accumulating somatic mitochondrial DNA mutations may cause the phenotypic changes associated with the ageing process including osteoporosis. Methods to study mitochondrial abnormalities in individual osteoblasts, osteoclasts and osteocytes are limited and impair our ability to assess the changes seen with age and in animal models of ageing. To enable the assessment of mitochondrial protein levels, we have developed a quadruple immunofluorescence method to accurately quantify the presence of mitochondrial respiratory chain components within individual bone cells. We have applied this technique to a well-established mouse model of ageing and osteoporosis and show respiratory chain deficiency.

  4. Induction heating process of ferromagnetic filled carbon nanotubes based on 3-D model

    NASA Astrophysics Data System (ADS)

    Wiak, Sławomir; Firych-Nowacka, Anna; Smółka, Krzysztof; Pietrzak, Łukasz; Kołaciński, Zbigniew; Szymański, Łukasz

    2017-12-01

    Since their discovery by Iijima in 1991 [1], carbon nanotubes have sparked unwavering interest among researchers all over the world. This is due to the unique properties of carbon nanotubes (CNTs). Carbon nanotubes have excellent mechanical and electrical properties with high chemical and thermal stability. In addition, carbon nanotubes have a very large surface area and are hollow inside. This gives a very broad spectrum of nanotube applications, such as in combination with polymers as polymer composites in the automotive, aerospace or textile industries. At present, many methods of nanotube synthesis are known [2, 3, 4, 5, 6]. It is also possible to use carbon nanotubes in biomedical applications [7, 8, 9, 10, 11, 12, 13, 14], including the destruction of cancer cells using iron-filled carbon nanotubes in the hyperthermia process. Computer modelling results for the induction heating of iron-filled carbon nanotubes (Fe-CNTs) are presented in the paper. The Fe-CNTs used to create the computer model were synthesized by the authors using the CCVD technique.

  5. Precision-cut intestinal slices: alternative model for drug transport, metabolism, and toxicology research.

    PubMed

    Li, Ming; de Graaf, Inge A M; Groothuis, Geny M M

    2016-01-01

    The absorption, distribution, metabolism, excretion and toxicity (ADME-tox) processes of drugs are of importance and require preclinical investigation in the intestine in addition to the liver. Various models have been developed for prediction of ADME-tox in the intestine. In this review, precision-cut intestinal slices (PCIS) are discussed and highlighted as a model for ADME-tox studies. This review provides an overview of the applications and an update of the most recent research on PCIS as an ex vivo model to study the transport, metabolism and toxicology of drugs and other xenobiotics. The unique features of PCIS and the differences with other models as well as the translational aspects are also discussed. PCIS are a simple, fast, and reliable ex vivo model for drug ADME-tox research. Therefore, PCIS are expected to become an indispensable link in the in vitro-ex vivo-in vivo extrapolation, and a bridge in the translation of animal data to the human situation. In the future, this model may be helpful to study the effects of interorgan interactions, intestinal bacteria, excipients and drug formulations on the ADME-tox properties of drugs. The optimization of the culture medium and the development of a (cryo)preservation technique require more research.

  6. Model For Marketing Strategy Decision Based On Multicriteria Decision Making: A Case Study In Batik Madura Industry

    NASA Astrophysics Data System (ADS)

    Anna, I. D.; Cahyadi, I.; Yakin, A.

    2018-01-01

    Selection of a marketing strategy is a prominent source of competitive advantage for small and medium enterprise business development. The selection process is a multiple criteria decision-making problem, which includes evaluation of various attributes or criteria in a process of strategy formulation. The objective of this paper is to develop a model for the selection of a marketing strategy in the Batik Madura industry. The current study proposes an integrated approach based on the analytic network process (ANP) and the technique for order preference by similarity to ideal solution (TOPSIS) to determine the best strategy for Batik Madura marketing problems. Based on the results of a group decision-making technique, this study selected fourteen criteria, including consistency, cost, trend following, customer loyalty, business volume, uniqueness, manpower, customer numbers, promotion, branding, business network, outlet location, credibility, and innovation, as the Batik Madura marketing strategy evaluation criteria. A survey questionnaire developed from a literature review was distributed to a sample frame of Batik Madura SMEs in Pamekasan. In the decision procedure step, expert evaluators were asked to establish the decision matrix by comparing the marketing strategy alternatives under each of the individual criteria. Then, considerations obtained from the ANP and TOPSIS methods were applied to build the specific criteria constraints and the range of the launch strategy in the model. The model in this study demonstrates that, under the current business situation, a straight-focus marketing strategy is the best marketing strategy for Batik Madura SMEs in Pamekasan.
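
    The TOPSIS step of such an ANP/TOPSIS approach can be sketched as follows; the decision matrix and weights are invented for illustration (in the study the weights would come from the ANP stage, not be set uniformly):

```python
import numpy as np

# Rows: three hypothetical marketing-strategy alternatives;
# columns: three benefit criteria scored by expert evaluators.
X = np.array([[7., 9., 6.],
              [8., 6., 7.],
              [6., 8., 9.]])
w = np.array([0.5, 0.3, 0.2])            # criterion weights (assumed)

V = w * X / np.linalg.norm(X, axis=0)    # weighted, vector-normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # all criteria treated as benefits
d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to the ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)    # distance to the anti-ideal
closeness = d_neg / (d_pos + d_neg)      # relative closeness; higher is better
best = int(np.argmax(closeness))
print("closeness:", np.round(closeness, 3), "best alternative:", best)
```

Cost-type criteria would use the column minimum as the ideal; the ranking by relative closeness is what selects the final strategy.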

  7. Meniscus repair using mesenchymal stem cells - a comprehensive review.

    PubMed

    Yu, Hana; Adesida, Adetola B; Jomha, Nadr M

    2015-04-30

    The menisci are a pair of semilunar fibrocartilage structures that play an essential role in maintaining normal knee function. Injury to the menisci can disrupt joint stability and lead to debilitating results. Because natural meniscal healing is limited, an efficient method of repair is necessary. Tissue engineering (TE) combines the principles of life sciences and engineering to restore the unique architecture of the native meniscus. Mesenchymal stem cells (MSCs) have been investigated for their therapeutic potential both in vitro and in vivo. This comprehensive review examines the English literature identified through a database search using Medline, Embase, Engineering Village, and SPORTDiscus. The search results were classified based on MSC type, animal model, and method of MSC delivery/culture. A variety of MSC types, including bone marrow-derived, synovium-derived, adipose-derived, and meniscus-derived MSCs, has been examined. Research results were categorized into and discussed by the different animal models used; namely murine, leporine, porcine, caprine, bovine, ovine, canine, equine, and human models of meniscus defect/repair. Within each animal model, studies were categorized further according to MSC delivery/culture techniques. These techniques included direct application, fibrin glue/gel/clot, intra-articular injection, scaffold, tissue-engineered construct, meniscus tissue, pellets/aggregates, and hydrogel. The purpose of this review is to inform the reader about the current state and advances in meniscus TE using MSCs. Future directions of MSC-based meniscus TE are also suggested to help guide prospective research.

  8. Theorists and Techniques: Connecting Education Theories to Lamaze Teaching Techniques

    PubMed Central

    Podgurski, Mary Jo

    2016-01-01

    ABSTRACT Should childbirth educators connect education theory to technique? Is there more to learning about theorists than memorizing facts for an assessment? Are childbirth educators uniquely poised to glean wisdom from theorists and enhance their classes with interactive techniques inspiring participant knowledge and empowerment? Yes, yes, and yes. This article will explore how an awareness of education theory can enhance retention of material through interactive learning techniques. Lamaze International childbirth classes already prepare participants for the childbearing year by using positive group dynamics; theory will empower childbirth educators to address education through well-studied avenues. Childbirth educators can provide evidence-based learning techniques in their classes and create true behavioral change. PMID:26848246

  9. A Novel Solution-Technique Applied to a Novel WAAS Architecture

    NASA Technical Reports Server (NTRS)

    Bavuso, J.

    1998-01-01

    The Federal Aviation Administration has embarked on an historic task of modernizing and significantly improving the national air transportation system. One system that uses the Global Positioning System (GPS) to determine aircraft navigational information is called the Wide Area Augmentation System (WAAS). This paper describes a reliability assessment of one candidate system architecture for the WAAS. A unique aspect of this study concerns the modeling and solution of a candidate system that allows a novel cold sparing scheme. The cold spare is a WAAS communications satellite that is fabricated and launched after a predetermined number of orbiting satellite failures have occurred and after some stochastic fabrication time transpires. Because these satellites are complex systems with redundant components, they exhibit an increasing failure rate with a Weibull time to failure distribution. Moreover, the cold spare satellite build time is Weibull-distributed, and upon launch the spare is considered to be a good-as-new system, again with an increasing failure rate and a Weibull time to failure distribution. The reliability model for this system is non-Markovian because three distinct system clocks are required: the time to failure of the orbiting satellites, the build time for the cold spare, and the time to failure for the launched spare satellite. A powerful dynamic fault tree modeling notation and Monte Carlo simulation technique with importance sampling are shown to arrive at a reliability prediction for a 10 year mission.
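
    The cold-spare Monte Carlo idea can be sketched as follows; the Weibull shapes and scales, the two-satellite fleet, and the failure rule are assumptions for illustration, not the WAAS architecture's actual parameters (and plain Monte Carlo stands in for the paper's importance-sampled simulation):

```python
import numpy as np

rng = np.random.default_rng(3)
mission = 10.0                   # mission length, years
k_sat, scale_sat = 2.0, 12.0     # orbiting-satellite Weibull (assumed; k>1
                                 # gives the increasing failure rate)
k_build, scale_build = 3.0, 2.0  # spare fabrication-time Weibull (assumed)

def simulate_once():
    """One mission history with two orbiting satellites and one cold spare.
    The spare build starts at the first failure; the launched spare is
    good-as-new with the same Weibull failure law."""
    t_fail = np.sort(scale_sat * rng.weibull(k_sat, size=2))
    build_done = t_fail[0] + scale_build * rng.weibull(k_build)
    spare_life = scale_sat * rng.weibull(k_sat)
    # Mission fails if both originals die before the spare is ready...
    if t_fail[1] < mission and t_fail[1] < build_done:
        return False
    # ...or if the spare was needed but dies before the mission ends.
    if t_fail[1] < mission and build_done + spare_life < mission:
        return False
    return True

trials = 20000
reliability = np.mean([simulate_once() for _ in range(trials)])
print(round(float(reliability), 3))
```

The three independent clocks named in the abstract (orbit time, build time, spare time) appear here as the three separate Weibull draws, which is exactly what makes the model non-Markovian.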

  10. Remote Distributed Vibration Sensing Through Opaque Media Using Permanent Magnets

    DOE PAGES

    Chen, Yi; Mazumdar, Anirban; Brooks, Carlton F.; ...

    2018-04-05

    Vibration sensing is critical for a variety of applications, from structural fatigue monitoring to understanding the modes of airplane wings. In particular, remote sensing techniques are needed for measuring the vibrations of multiple points simultaneously, assessing vibrations inside opaque metal vessels, and sensing through smoke clouds and other optically challenging environments. In this paper, we propose a method which measures high-frequency displacements remotely using changes in the magnetic field generated by permanent magnets. We leverage the unique nature of vibration tracking and use a calibrated local model technique developed specifically to improve the frequency-domain estimation accuracy. The results show that two-dimensional local models surpass the dipole model in tracking high-frequency motions. A theoretical basis for understanding the effects of electronic noise and error due to correlated variables is developed in order to predict the performance of experiments prior to implementation. Simultaneous measurements of up to three independent vibrating components are shown. The relative accuracy of the magnet-based displacement tracking with respect to video tracking ranges from 40 to 190 μm when the maximum displacements approach ±5 mm and sensor-to-magnet distances vary from 25 to 36 mm. Finally, vibration sensing inside an opaque metal vessel and mode shape changes due to damage on an aluminum beam are also studied using the wireless permanent-magnet vibration sensing scheme.
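
    The point-dipole baseline that the calibrated local models are compared against can be sketched as follows; the magnet moment, geometry, and restriction to 1-D axial motion are illustrative assumptions:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T·m/A

def dipole_field(m_vec, r_vec):
    """Magnetic flux density of a point dipole m_vec at offset r_vec."""
    r = np.linalg.norm(r_vec)
    r_hat = r_vec / r
    return MU0 / (4 * np.pi * r**3) * (3 * np.dot(m_vec, r_hat) * r_hat - m_vec)

m_vec = np.array([0.0, 0.0, 1.0])      # dipole moment, A·m^2 (assumed)
sensor = np.array([0.0, 0.0, 0.030])   # sensor 30 mm above the magnet

# Field change per metre of axial magnet motion (numerical sensitivity);
# inverting it turns a measured field change into a displacement estimate.
eps = 1e-6
dBdz = (dipole_field(m_vec, sensor - [0, 0, eps])[2]
        - dipole_field(m_vec, sensor)[2]) / eps

true_dz = 1e-4                         # 0.1 mm vibration amplitude
dB = (dipole_field(m_vec, sensor - [0, 0, true_dz])[2]
      - dipole_field(m_vec, sensor)[2])
dz_est = dB / dBdz
print(round(dz_est * 1e3, 4), "mm")
```

The linearized inversion is accurate only while the displacement is small relative to the standoff distance; the locally calibrated models in the paper are reported to outperform this dipole baseline for larger, faster motions.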

  12. Data mining and visualization techniques

    DOEpatents

    Wong, Pak Chung [Richland, WA; Whitney, Paul [Richland, WA; Thomas, Jim [Richland, WA

    2004-03-23

    Disclosed are association rule identification and visualization methods, systems, and apparatus. An association rule in data mining is an implication of the form X.fwdarw.Y where X is a set of antecedent items and Y is the consequent item. A unique visualization technique that provides multiple antecedent, consequent, confidence, and support information is disclosed to facilitate better presentation of large quantities of complex association rules.
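
    The support and confidence measures underlying such rules X → Y can be sketched with toy transactions (the patent's own data and visualization layout are not reproduced here):

```python
# Toy market-basket transactions (hypothetical items)
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent) estimated from the transactions."""
    return support(antecedent | consequent) / support(antecedent)

# Rule {bread, milk} -> {butter}: multiple antecedent items, one consequent,
# as in the visualization technique described above.
rule = (frozenset({"bread", "milk"}), frozenset({"butter"}))
print(support(rule[0] | rule[1]), confidence(*rule))  # 0.25 0.5
```

A visualization of many such rules then maps antecedents, consequent, confidence, and support onto separate visual channels, which is what the disclosed technique organizes at scale.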

  13. Visual and x-ray inspection characteristics of eutectic and lead free assemblies

    NASA Technical Reports Server (NTRS)

    Ghaffarian, R.

    2003-01-01

    For high reliability applications, visual inspection has been the key technique for most conventional electronic package assemblies. Now, the use of X-ray techniques has become an additional inspection requirement for quality control and for the detection of unique defects arising from the manufacture of advanced electronic array packages such as ball grid arrays (BGAs) and chip scale packages (CSPs).

  14. Electron microscopy of the nuclear membrane of Amoeba proteus.

    PubMed

    FRAJOLA, W J; GREIDER, M H; KOSTIR, W J

    1956-07-25

    An electron microscope study of the nuclear membrane of Amoeba proteus by thin sectioning techniques has revealed an ultrastructure in the outer layer of the membrane that is homologous to the pores and annuli observed in the nuclear membranes of many other cell types studied by these techniques. An inner honeycombed layer apparently unique to Amoeba proteus is also described.

  15. Airglow studies using observations made with the GLO instrument on the Space Shuttle

    NASA Astrophysics Data System (ADS)

    Alfaro Suzan, Ana Luisa

    2009-12-01

    Our understanding of Earth's upper atmosphere has advanced tremendously over the last few decades due to our enhanced capacity for making remote observations from space. Space-based observations of Earth's daytime and nighttime airglow emissions are a prime example. The terrestrial nighttime airglow, or nightglow, is barely discernible to the naked eye as viewed from Earth's surface; however, it is clearly visible from space, as many astronauts have been amazed to report. The nightglow consists of emissions of ultraviolet, visible and near-infrared radiation from electronically excited oxygen molecules and atoms and from vibrationally excited OH molecules. It emanates mostly from a layer about 10 km thick located roughly 100 km above Earth's surface. Various photochemical models have been proposed to explain the production of the emitting species. In this study, unique observations of Earth's nightglow made with the GLO instrument on NASA's Space Shuttle are analyzed to assess the proposed excitation models. Previous analyses of these observations by Broadfoot and Gardner (2001), performed using a 1-D inversion technique, indicated significant spatial structures and raised serious questions about the proposed nightglow excitation models. However, the observation of such strong spatial structures itself calls into question the appropriateness of the adopted 1-D inversion technique and, therefore, the validity of the conclusions. In this study a more rigorous 2-D tomographic inversion technique is developed and applied to the available GLO data to determine whether some of the apparent discrepancies can be explained by the limitations of the previously applied 1-D approach. The results still reveal some potentially serious inadequacies in the proposed photochemical models; however, alternative explanations for the discrepancies between the GLO observations and the model expectations are suggested, including upper atmospheric tidal effects and possible errors in the pointing of the GLO instrument.
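    At its core, a tomographic inversion of limb-scan data reduces to solving a linear system: each ray's measured brightness is the sum of emission in the cells it crosses. A toy sketch with a made-up 2x2 grid and straight-line rays (not the GLO geometry or the authors' algorithm), solved by least squares:

```python
import numpy as np

# Rays through a 2x2 grid of unknown emission values [a, b, c, d]:
# each row of A marks the cells a ray passes through; y holds the
# measured line integrals (path lengths taken as 1 for simplicity).
A = np.array([[1, 1, 0, 0],    # top row
              [0, 0, 1, 1],    # bottom row
              [1, 0, 1, 0],    # left column
              [0, 1, 0, 1],    # right column
              [1, 0, 0, 1]])   # one diagonal ray, to break the rank deficiency

x_true = np.array([1.0, 2.0, 3.0, 4.0])  # synthetic "true" emission field
y = A @ x_true                           # simulated measurements

# Least-squares tomographic reconstruction of the emission field
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
```

A 1-D inversion, by contrast, assumes the field varies only with altitude, which is exactly the assumption the study questions when strong horizontal structure is present.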

  16. Comparison of Multivariate Spatial Dependence Structures of DPIL and Flowmeter Hydraulic Conductivity Data Sets at the MADE Site

    NASA Astrophysics Data System (ADS)

    Xiao, B.; Haslauer, C. P.; Bohling, G. C.; Bárdossy, A.

    2017-12-01

    The spatial arrangement of hydraulic conductivity (K) determines water flow and solute transport behaviour in groundwater systems. This presentation demonstrates three advances over commonly used geostatistical methods by integrating measurements from novel measurement techniques with multivariate non-Gaussian dependence models: (1) The spatial dependence structure of K was analysed using both data sets of K. Previously encountered similarities were confirmed in low-dimensional dependence; these similarities become less stringent and deviate more from symmetric Gaussian dependence in dimensions larger than two. (2) Measurements of small and large K values are more uncertain than medium K values due to decreased sensitivity of the measurement devices at both ends of the K scale. Nevertheless, these measurements contain useful information, which is included in the estimation of the marginal distribution and the spatial dependence structure as "censored measurements" estimated jointly, without the common assumption of independence. (3) The spatial dependence structures of the two data sets and their cross-covariances are used to infer the spatial dependence and the bias between the two data sets. In this way, a single spatial model for K is constructed that reflects the characteristics of both measurement techniques and is used for simulation. The concept of the presented methodology is to use all available information for the estimation of a stochastic model of the primary parameter (K) at the highly heterogeneous Macrodispersion Experiment (MADE) site. The primary parameter has been measured by two independent measurement techniques whose sets of locations do not overlap, and the site offers a uniquely large quantity of K measurements (31,123 direct push injection logging based measurements and 2,611 flowmeter based measurements). This improved dependence structure of K will be included in the estimated non-Gaussian dependence models and is expected to reproduce observed solute concentrations at the site better than existing dependence models of K.
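    The "censored measurements" idea above, keeping low- and high-K readings as inequality information rather than discarding them, can be sketched as a censored maximum-likelihood fit: observed values contribute a density term, censored ones a tail-probability term. Everything here (normal ln K model, detection limit, sample size) is a made-up illustration, not the authors' estimator:

```python
import numpy as np
from scipy import stats, optimize

# Synthetic ln(K) data with a lower detection limit (all values hypothetical)
rng = np.random.default_rng(0)
log_k = rng.normal(-4.0, 1.0, size=200)
det_limit = -5.0                       # device loses sensitivity below this
censored = log_k < det_limit
obs = log_k[~censored]                 # values actually recorded
n_cens = int(censored.sum())           # only "below limit" is known for these

def neg_log_lik(theta):
    mu, sigma = theta[0], np.exp(theta[1])   # log-parametrise sigma > 0
    ll = stats.norm.logpdf(obs, mu, sigma).sum()
    # Censored points contribute the probability mass below the limit
    ll += n_cens * stats.norm.logcdf(det_limit, mu, sigma)
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Dropping the censored points instead would bias the fitted mean upward; including their tail probability recovers the full marginal, which is the spirit of the treatment described in the abstract.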

  17. NASA Technical Management Report (533Q)

    NASA Technical Reports Server (NTRS)

    Klosko, S. M.; Sanchez, B. (Technical Monitor)

    2001-01-01

    The objective of this task is analytical support of the NASA Satellite Laser Ranging (SLR) program in the areas of SLR data analysis, software development, assessment of SLR station performance, development of improved models for atmospheric propagation and interpretation of station calibration techniques, and science coordination and analysis functions for the NASA-led Central Bureau of the International Laser Ranging Service (ILRS). The contractor shall in each year of the five-year contract: (1) Provide software development and analysis support to the NASA SLR program and the ILRS. Attend and make analysis reports at the monthly meetings of the Central Bureau of the ILRS covering data received during the previous period. Provide support to the Analysis Working Group of the ILRS, including special tiger teams that are established to handle unique analysis problems. Support the updating of the SLR Bibliography contained on the ILRS web site; (2) Perform special assessments of SLR station performance from available data to determine unique biases and technical problems at the station; (3) Develop improvements to models of atmospheric propagation and for handling pre- and post-pass calibration data provided by global network stations; (4) Provide review presentation of overall ILRS network data results at one major scientific meeting per year; (5) Contribute to and support the publication of NASA SLR and ILRS reports highlighting the results of SLR analysis activity.

  18. A new approach to aid the characterisation and identification of metabolites of a model drug; partial isotope enrichment combined with novel formula elucidation software.

    PubMed

    Hobby, Kirsten; Gallagher, Richard T; Caldwell, Patrick; Wilson, Ian D

    2009-01-01

    This work describes the identification of 'isotopically enriched' metabolites of 4-cyanoaniline using the unique features of the software package 'Spectral Simplicity'. The software is capable of creating theoretical mass spectra for partially isotope-enriched compounds and subsequently performing an elemental composition analysis to give the elemental formula of the 'isotopically enriched' metabolite. A novel mass spectral correlation method, called 'FuzzyFit', was employed. 'FuzzyFit' utilises the expected experimental distribution of errors in both mass accuracy and isotope pattern, enabling discrimination between statistically probable and improbable candidate formulae. The software correctly determined the molecular formulae of ten previously described metabolites of 4-cyanoaniline, confirming that the technique of partial isotope enrichment can produce results analogous to standard methodologies. Six previously unknown species were also identified, based on the presence of the unique 'designer' isotope ratio. Three of the unknowns were tentatively identified as N-acetylglutamine, O-methyl-N-acetylglucuronide and a putative fatty acid conjugate. The discovery of a significant number of unknown species of a model drug with a comprehensive history of investigation highlights the potential for enhancement of the analytical process through the use of 'designer' isotope ratio compounds. The 'FuzzyFit' methodology significantly aided the elucidation of candidate formulae by providing a vastly simplified candidate formula data set. Copyright (c) 2008 John Wiley & Sons, Ltd.
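    The theoretical isotope patterns such software predicts follow, in the simplest single-element case, a binomial model: each atom is independently "heavy" with some probability. A minimal sketch for the seven carbons of 4-cyanoaniline (C7H6N2), comparing natural 13C abundance with a hypothetical 30 % enrichment level (the enrichment figure is invented for illustration):

```python
from math import comb

def isotope_pattern(n_atoms, p_heavy, max_extra=4):
    """Relative intensities of the M, M+1, ... peaks for n_atoms of one
    element, each independently heavy with probability p_heavy (binomial)."""
    return [comb(n_atoms, k) * p_heavy**k * (1 - p_heavy)**(n_atoms - k)
            for k in range(max_extra + 1)]

# Natural carbon (~1.07 % 13C) vs a hypothetical 30 % 13C enrichment
natural = isotope_pattern(7, 0.0107)
enriched = isotope_pattern(7, 0.30)
```

Under enrichment the M+1, M+2, ... peaks grow dramatically relative to M, which is what makes a 'designer' isotope ratio stand out against ordinary background ions during metabolite screening.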

  19. STOL and STOVL hot gas ingestion and airframe heating tests in the NASA Lewis 9- by 15-foot low-speed wind tunnel

    NASA Technical Reports Server (NTRS)

    Johns, Albert L.

    1989-01-01

    Short takeoff and landing (STOL) and advanced short takeoff and vertical landing (STOVL) aircraft are being pursued for deployment near the end of this century. These concepts offer unique capabilities not seen in conventional aircraft: for example, shorter takeoff distances and the ability to operate from damaged runways and remote sites. However, special technology is critical to the development of this unique class of aircraft. Some of the real issues associated with these concepts are hot gas ingestion and airframe heating while in ground effects. Over the past nine years, NASA Lewis Research Center has been involved in several cooperative programs in the 9- by 15-Foot Low-Speed Wind Tunnel (LSWT) to establish a database for hot gas ingestion and airframe heating. The modifications made to the 9- by 15-Foot LSWT are presented, including the evolution of the ground plane, model support system, and tunnel sidewalls, as well as flow visualization techniques, instrumentation, test procedures, and test results. The 9- by 15-Foot LSWT tests were conducted at full-scale exhaust nozzle pressure ratios. The headwind velocities varied from 8 to 120 kn depending on the concept (STOL or STOVL). Typical compressor-face distortions (pressure and temperature), ground plane contours, and model surface temperature profiles are presented.

  20. Biomaterial Substrate-Mediated Multicellular Spheroid Formation and Their Applications in Tissue Engineering.

    PubMed

    Tseng, Ting-Chen; Wong, Chui-Wei; Hsieh, Fu-Yu; Hsu, Shan-Hui

    2017-12-01

    Three-dimensional (3D) multicellular aggregates (spheroids), compared to traditional 2D monolayer cultured cells, are physiologically more similar to cells in vivo. Various techniques now exist to generate 3D spheroids, and spheroids obtained from different methods have already been applied to regenerative medicine and cancer research. Among the cell spheroids created by different methods, the substrate-derived spheroids and their forming mechanism are unique. This review focuses on the formation of biomaterial substrate-mediated multicellular spheroids and their applications in tissue engineering and tumor models. First, the authors will describe the special chitosan substrate-derived mesenchymal stem cell (MSC) spheroids and their greater regenerative capacities in various tissues. Second, the authors will describe tumor spheroids derived on chitosan and hyaluronan substrates, which serve as a simple in vitro platform to study 3D tumor models or to perform cancer drug screening. Finally, the authors will mention the self-assembly process for substrate-derived multiple cell spheroids (co-spheroids), which may recapitulate the heterotypic cell-cell interaction for co-cultured cells or crosstalk between different types of cells. These unique multicellular mono-spheroids or co-spheroids represent a category of 3D cell culture with advantages of biomimetic cell-cell interaction, better functionalities, and imaging possibilities. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
