A statistical mechanical approach to restricted integer partition functions
NASA Astrophysics Data System (ADS)
Zhou, Chi-Chun; Dai, Wu-Sheng
2018-05-01
The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating functions of restricted integer partition functions, which count the number of partitions, i.e., ways of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of a restricted integer partition function is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition function corresponding to general statistics, a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this one. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is just a symmetric function, a class of functions invariant under the action of permutation groups. Using this approach, we provide expressions for several restricted integer partition functions as examples.
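As a concrete illustration of the bridge between partition counting and quantum-gas partition functions (standard identities, not the paper's general-statistics results):

```latex
% Generating functions of integer partitions written as quantum-gas products,
% with x playing the role of the Boltzmann factor e^{-\beta}:
\sum_{N\ge 0} p(N)\,x^{N} = \prod_{n\ge 1}\frac{1}{1-x^{n}}
  \quad\text{(Bose gas: parts may repeat)}, \qquad
\sum_{N\ge 0} q(N)\,x^{N} = \prod_{n\ge 1}\left(1+x^{n}\right)
  \quad\text{(Fermi gas: distinct parts)}
```

Gentile statistics, in which each level holds at most q particles, interpolates between these two cases through the factor (1 - x^{(q+1)n})/(1 - x^n), corresponding to partitions in which each part is repeated at most q times.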
Exploiting the functional and taxonomic structure of genomic data by probabilistic topic modeling.
Chen, Xin; Hu, Xiaohua; Lim, Tze Y; Shen, Xiajiong; Park, E K; Rosen, Gail L
2012-01-01
In this paper, we present a method that enables both homology-based and composition-based approaches to further study the functional core (i.e., the microbial core and the gene core, respectively). In the proposed method, the identification of major functionality groups is achieved by generative topic modeling, which is able to extract useful information from unlabeled data. We first show that a generative topic model can be used to model the taxon abundance information obtained by the homology-based approach and to study the microbial core. The model considers each sample as a “document,” which has a mixture of functional groups, while each functional group (also known as a “latent topic”) is a weighted mixture of species. Therefore, estimating the generative topic model for taxon abundance data uncovers the distribution over latent functions (latent topics) in each sample. Second, we show that the generative topic model can also be used to study the genome-level composition of “N-mer” features (DNA subreads obtained by composition-based approaches). The model considers each genome as a mixture of latent genetic patterns (latent topics), while each latent genetic pattern is a weighted mixture of the “N-mer” features; thus the existence of core genomes can be indicated by a set of common N-mer features. After studying the mutual information between latent topics and gene regions, we provide an explanation of the functional roles of the uncovered latent genetic patterns. The experimental results demonstrate the effectiveness of the proposed method.
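A minimal sketch of the first modeling step (topic modeling on a samples-by-taxa abundance matrix) is shown below, using scikit-learn's LDA implementation; the matrix shapes, topic count and synthetic counts are placeholders, not the authors' data or code.

```python
# Sketch of generative topic modeling on taxon abundance data (not the authors' code).
# Samples play the role of "documents" and taxa the role of "words"; latent topics
# correspond to putative functional groups. Matrix shapes are illustrative only.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
abundance = rng.poisson(lam=3.0, size=(50, 200))     # 50 samples x 200 taxa (toy counts)

lda = LatentDirichletAllocation(n_components=10, random_state=0)
sample_topics = lda.fit_transform(abundance)          # per-sample distribution over latent functional groups
topic_taxa = lda.components_ / lda.components_.sum(axis=1, keepdims=True)  # per-topic weights over taxa

print(sample_topics.shape, topic_taxa.shape)          # (50, 10), (10, 200)
```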
Optical Spatial integration methods for ambiguity function generation
NASA Technical Reports Server (NTRS)
Tamura, P. N.; Rebholz, J. J.; Daehlin, O. T.; Lee, T. C.
1981-01-01
A coherent optical spatial integration approach to ambiguity function generation is described. It uses one-dimensional acousto-optic Bragg cells as input transducers in conjunction with a space-variant linear phase shifter, a passive optical element, to generate the two-dimensional ambiguity function in one exposure. Results of a real-time implementation of this system are shown.
Precision medicine for cancer with next-generation functional diagnostics.
Friedman, Adam A; Letai, Anthony; Fisher, David E; Flaherty, Keith T
2015-12-01
Precision medicine is about matching the right drugs to the right patients. Although this approach is technology agnostic, in cancer there is a tendency to make precision medicine synonymous with genomics. However, genome-based cancer therapeutic matching is limited by incomplete biological understanding of the relationship between phenotype and cancer genotype. This limitation can be addressed by functional testing of live patient tumour cells exposed to potential therapies. Recently, several 'next-generation' functional diagnostic technologies have been reported, including novel methods for tumour manipulation, molecularly precise assays of tumour responses and device-based in situ approaches; these address the limitations of the older generation of chemosensitivity tests. The promise of these new technologies suggests a future diagnostic strategy that integrates functional testing with next-generation sequencing and immunoprofiling to precisely match combination therapies to individual cancer patients.
NASA Technical Reports Server (NTRS)
Chase, W. D.
1976-01-01
The use of blue and red color in out-of-window cockpit displays, in full-spectrum calligraphic computer-generated display systems, is studied with attention given to pilot stereographic depth perception and response to visual cues. Displays for vertical approach, with dynamic and frozen-range landing approach and perspective arrays, are analyzed. Pilot transfer function and the transfer function associated with the contrasted approach and perspective arrays are discussed. Out-of-window blue lights are perceived by pilots as indicating greater distance depth, red lights as indicating proximity. The computer-generated chromatic display was adapted to flight simulators for the tests.
Modeling thrombin generation: plasma composition based approach.
Brummel-Ziedins, Kathleen E; Everse, Stephen J; Mann, Kenneth G; Orfeo, Thomas
2014-01-01
Thrombin has multiple functions in blood coagulation and its regulation is central to maintaining the balance between hemorrhage and thrombosis. Empirical and computational methods that capture thrombin generation can provide advancements to current clinical screening of the hemostatic balance at the level of the individual. In any individual, procoagulant and anticoagulant factor levels together act to generate a unique coagulation phenotype (net balance) that is reflective of the sum of its developmental, environmental, genetic, nutritional and pharmacological influences. Defining such thrombin phenotypes may provide a means to track disease progression pre-crisis. In this review we briefly describe thrombin function, methods for assessing thrombin dynamics as a phenotypic marker, computationally derived thrombin phenotypes versus determined clinical phenotypes, the boundaries of normal range thrombin generation using plasma composition based approaches and the feasibility of these approaches for predicting risk.
Witt, Elke
2008-12-01
The question of how organisms obtain their specific complex and functional forms was widely discussed during the eighteenth century. The theory of preformation, which was the dominant theory of generation, was challenged by different alternative epigenetic theories. By the end of the century it was the vitalist approach, most famously advocated by Johann Friedrich Blumenbach, that prevailed. Yet the alternative theory of generation brought forward by Caspar Friedrich Wolff was an important contribution to the treatment of this question. He turned his attention from the properties of matter and the forces acting on it towards the processes of generation themselves in order to explain the constitution of organismic forms. By regarding organic structures and forms as the result of the lawfulness of ongoing processes, he opened up the possibility of a functional but non-teleological explanation of generation, and thereby provided an important complement to materialist and vitalist approaches.
Computerized Design and Generation of Low-noise Helical Gears with Modified Surface Topology
NASA Technical Reports Server (NTRS)
Litvin, F. L.; Chen, N. X.; Lu, J.; Handschuh, R. F.
1994-01-01
An approach for design and generation of low-noise helical gears with localized bearing contact is proposed. The approach is applied to double circular arc helical gears and modified involute helical gears. The reduction of noise and vibration is achieved by application of a predesigned parabolic function of transmission errors that is able to absorb a discontinuous linear function of transmission errors caused by misalignment. The localization of the bearing contact is achieved by the mismatch of pinion-gear tooth surfaces. Computerized simulation of meshing and contact of the designed gears demonstrated that the proposed approach will produce a pair of gears that has a parabolic transmission error function even when misalignment is present. Numerical examples for illustration of the developed approach are given.
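The absorption property rests on a simple algebraic identity (a sketch of the standard argument; the full gear geometry is of course more involved): a linear transmission error added to a predesigned parabolic one yields another parabola of the same curvature, merely shifted.

```latex
% Parabolic transmission-error function absorbing a linear error b*phi_1
% caused by misalignment (same parabola, shifted apex):
\Delta\phi_2(\phi_1) = -a\phi_1^{2} + b\phi_1
  = -a\left(\phi_1 - \frac{b}{2a}\right)^{2} + \frac{b^{2}}{4a}
```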
Wang, J; Hao, Z; Wang, H
2018-01-01
The human brain can be characterized as a set of functional networks. Therefore, it is important to subdivide the brain appropriately in order to construct reliable networks. Resting-state functional connectivity-based parcellation is a commonly used technique to fulfill this goal. Here we propose a novel individual subject-level parcellation approach based on whole-brain resting-state functional magnetic resonance imaging (fMRI) data. We first used a supervoxel method known as simple linear iterative clustering directly on resting-state fMRI time series to generate supervoxels, and then combined similar supervoxels into clusters using a clustering method known as graph-without-cut (GWC). The GWC approach incorporates spatial information and multiple features of the supervoxels by energy minimization, simultaneously yielding an optimal graph and brain parcellation. Meanwhile, it theoretically guarantees that the actual cluster number is exactly equal to the initialized cluster number. By comparing the results of the GWC approach with those of the random GWC approach, we demonstrated that GWC does not rely heavily on spatial structures, thus avoiding the challenges encountered in some previous whole-brain parcellation approaches. In addition, by comparing the GWC approach to two competing approaches, we showed that GWC achieved better parcellation performance in terms of different evaluation metrics. The proposed approach can be used to generate individualized brain atlases for applications related to cognition, development, aging, disease, personalized medicine, etc. The major source codes of this study have been made publicly available at https://github.com/yuzhounh/GWC.
Application of Lagrangian blending functions for grid generation around airplane geometries
NASA Technical Reports Server (NTRS)
Abolhassani, Jamshid S.; Sadrehaghighi, Ideen; Tiwari, Surendra N.
1990-01-01
A simple procedure was developed and applied for the grid generation around an airplane geometry. This approach is based on a transfinite interpolation with Lagrangian interpolation for the blending functions. A monotonic rational quadratic spline interpolation was employed for the grid distributions.
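For reference, the two-dimensional form of transfinite interpolation with linear (first-order Lagrange) blending functions is shown below; the paper's approach uses the same Boolean-sum construction in three dimensions around the airplane surface, with Lagrangian blending functions.

```latex
% Transfinite interpolation (Boolean sum) with linear Lagrange blending functions:
\mathbf{x}(\xi,\eta) =
    (1-\eta)\,\mathbf{x}(\xi,0) + \eta\,\mathbf{x}(\xi,1)
  + (1-\xi)\,\mathbf{x}(0,\eta) + \xi\,\mathbf{x}(1,\eta)
  - \big[(1-\xi)(1-\eta)\,\mathbf{x}(0,0) + \xi(1-\eta)\,\mathbf{x}(1,0)
  + (1-\xi)\eta\,\mathbf{x}(0,1) + \xi\eta\,\mathbf{x}(1,1)\big]
```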
Teotia, Pooja; Chopra, Divyan A; Dravid, Shashank Manohar; Van Hook, Matthew J; Qiu, Fang; Morrison, John; Rizzino, Angie; Ahmad, Iqbal
2017-03-01
Glaucoma is a complex group of diseases wherein a selective degeneration of retinal ganglion cells (RGCs) leads to irreversible loss of vision. A comprehensive approach to glaucomatous RGC degeneration may include stem cells to functionally replace dead neurons through transplantation and to understand RGC vulnerability using a disease-in-a-dish stem cell model. Both approaches require the directed generation of stable, functional, and target-specific RGCs from renewable sources of cells, that is, embryonic stem cells and induced pluripotent stem cells. Here, we demonstrate a rapid and safe, stage-specific, chemically defined protocol that selectively generates RGCs across species, including human, by recapitulating the developmental mechanism. The de novo generated RGCs from pluripotent cells are similar to native RGCs at the molecular, biochemical, and functional levels. They also express axon guidance molecules, discriminate between specific and nonspecific targets, and are nontumorigenic. Stem Cells 2017;35:572-585. © 2016 AlphaMed Press.
USDA-ARS?s Scientific Manuscript database
Data assimilation and regression are two commonly used methods for predicting agricultural yield from remote sensing observations. Data assimilation is a generative approach because it requires explicit approximations of the Bayesian prior and likelihood to compute the probability density function...
Applications of Lagrangian blending functions for grid generation around airplane geometries
NASA Technical Reports Server (NTRS)
Abolhassani, Jamshid S.; Sadrehaghighi, Ideen; Tiwari, Surendra N.; Smith, Robert E.
1990-01-01
A simple procedure has been developed and applied for the grid generation around an airplane geometry. This approach is based on a transfinite interpolation with Lagrangian interpolation for the blending functions. A monotonic rational quadratic spline interpolation has been employed for the grid distributions.
Generating multi-double-scroll attractors via nonautonomous approach.
Hong, Qinghui; Xie, Qingguo; Shen, Yi; Wang, Xiaoping
2016-08-01
It is a common phenomenon that multi-scroll attractors are realized by introducing various nonlinear functions with multiple breakpoints into double scroll chaotic systems. Differently, we present a nonautonomous approach for generating multi-double-scroll attractors (MDSA) without changing the original nonlinear functions. By using the multi-level-logic pulse excitation technique in double scroll chaotic systems, MDSA can be generated. A Chua's circuit, a Jerk circuit, and a modified Lorenz system are given as design examples, and the Matlab simulation results are presented. Furthermore, the corresponding realization circuits are designed. The Pspice results are in agreement with the numerical simulation results, which verifies the availability and feasibility of this method.
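A minimal numerical sketch of the idea for the Chua's circuit case is given below; the parameter values, pulse levels and the injection point of the pulse are illustrative assumptions, not the paper's exact design.

```python
# Sketch: double-scroll Chua system driven by a multi-level-logic pulse (illustrative only).
import numpy as np
from scipy.integrate import solve_ivp

alpha, beta = 9.0, 100.0 / 7.0
m0, m1 = -8.0 / 7.0, -5.0 / 7.0

def f(x):
    # Piecewise-linear Chua diode characteristic.
    return m1 * x + 0.5 * (m0 - m1) * (abs(x + 1.0) - abs(x - 1.0))

def pulse(t, levels=(-4.0, 0.0, 4.0), period=50.0):
    # Multi-level-logic pulse: cycles through the given levels, one per period.
    return levels[int(t // period) % len(levels)]

def rhs(t, s):
    x, y, z = s
    u = pulse(t)                      # nonautonomous forcing (assumed injection into the x-equation)
    return [alpha * (y - x - f(x)) + u,
            x - y + z,
            -beta * y]

sol = solve_ivp(rhs, (0.0, 300.0), [0.1, 0.0, 0.0], max_step=0.01)
print(sol.y.shape)   # trajectory samples; plotting x vs z reveals the scroll structure
```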
Generation of genetically modified mice using CRISPR/Cas9 and haploid embryonic stem cell systems
JIN, Li-Fang; LI, Jin-Song
2016-01-01
With the development of high-throughput sequencing technology in the post-genomic era, researchers have concentrated their efforts on elucidating the relationships between genes and their corresponding functions. Recently, important progress has been achieved in the generation of genetically modified mice based on CRISPR/Cas9 and haploid embryonic stem cell (haESC) approaches, which provide new platforms for gene function analysis, human disease modeling, and gene therapy. Here, we review the CRISPR/Cas9 and haESC technology for the generation of genetically modified mice and discuss the key challenges in the application of these approaches. PMID:27469251
Trajectory phase transitions and dynamical Lee-Yang zeros of the Glauber-Ising chain.
Hickey, James M; Flindt, Christian; Garrahan, Juan P
2013-07-01
We examine the generating function of the time-integrated energy for the one-dimensional Glauber-Ising model. At long times, the generating function takes on a large-deviation form and the associated cumulant generating function has singularities corresponding to continuous trajectory (or "space-time") phase transitions between paramagnetic trajectories and ferromagnetically or antiferromagnetically ordered trajectories. In the thermodynamic limit, the singularities make up a whole curve of critical points in the complex plane of the counting field. We evaluate analytically the generating function by mapping the generator of the biased dynamics to a non-Hermitian Hamiltonian of an associated quantum spin chain. We relate the trajectory phase transitions to the high-order cumulants of the time-integrated energy which we use to extract the dynamical Lee-Yang zeros of the generating function. This approach offers the possibility to detect continuous trajectory phase transitions from the finite-time behavior of measurable quantities.
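In standard notation (consistent with the abstract, but written out here for clarity), the central objects are the moment generating function of the time-integrated energy, its large-deviation form, and the cumulants from which the dynamical Lee-Yang zeros are extracted:

```latex
% Generating function of the time-integrated energy E(t), its large-deviation form,
% and the cumulants obtained from it (s is the counting field):
Z(s,t) = \left\langle e^{-s E(t)} \right\rangle
       \;\sim\; e^{t\,\theta(s)} \quad (t \to \infty), \qquad
\langle\!\langle E^{n}(t) \rangle\!\rangle
       = (-1)^{n}\,\frac{\partial^{n}\ln Z(s,t)}{\partial s^{n}}\bigg|_{s=0}
```

Here θ(s) is the scaled cumulant generating function, whose singularities in the complex plane of s mark the trajectory phase transitions described above.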
Pervasive Transcription of a Herpesvirus Genome Generates Functionally Important RNAs
Canny, Susan P.; Reese, Tiffany A.; Johnson, L. Steven; Zhang, Xin; Kambal, Amal; Duan, Erning; Liu, Catherine Y.; Virgin, Herbert W.
2014-01-01
ABSTRACT Pervasive transcription is observed in a wide range of organisms, including humans, mice, and viruses, but the functional significance of the resulting transcripts remains uncertain. Current genetic approaches are often limited by their emphasis on protein-coding open reading frames (ORFs). We previously identified extensive pervasive transcription from the murine gammaherpesvirus 68 (MHV68) genome outside known ORFs and antisense to known genes (termed expressed genomic regions [EGRs]). Similar antisense transcripts have been identified in many other herpesviruses, including Kaposi’s sarcoma-associated herpesvirus and human and murine cytomegalovirus. Despite their prevalence, whether these RNAs have any functional importance in the viral life cycle is unknown, and one interpretation is that these are merely “noise” generated by functionally unimportant transcriptional events. To determine whether pervasive transcription of a herpesvirus genome generates RNA molecules that are functionally important, we used a strand-specific functional approach to target transcripts from thirteen EGRs in MHV68. We found that targeting transcripts from six EGRs reduced viral protein expression, proving that pervasive transcription can generate functionally important RNAs. We characterized transcripts emanating from EGRs 26 and 27 in detail using several methods, including RNA sequencing, and identified several novel polyadenylated transcripts that were enriched in the nuclei of infected cells. These data provide the first evidence of the functional importance of regions of pervasive transcription emanating from MHV68 EGRs. Therefore, studies utilizing mutation of a herpesvirus genome must account for possible effects on RNAs generated by pervasive transcription. PMID:24618256
SCOS 2: ESA's new generation of mission control system
NASA Technical Reports Server (NTRS)
Jones, M.; Head, N. C.; Keyte, K.; Howard, P.; Lynenskjold, S.
1994-01-01
New mission-control infrastructure is currently being developed by ESOC, which will constitute the second generation of the Spacecraft Control Operations system (SCOS 2). The financial, functional and strategic requirements lying behind the new development are explained, and the SCOS 2 approach is described. The technological implications of this approach are described: in particular, it is explained how it leads to the use of object-oriented techniques to provide the required 'building block' approach. The paper summarizes the way in which the financial, functional and strategic requirements have been met through this combination of solutions. Finally, the paper outlines the development process to date, noting how risk reduction was achieved in the approach to new technologies, and summarizes the current status and future plans.
NASA Astrophysics Data System (ADS)
Nie, Xiaokai; Coca, Daniel
2018-01-01
The paper introduces a matrix-based approach to estimate the unique one-dimensional discrete-time dynamical system that generated a given sequence of probability density functions whilst subjected to an additive stochastic perturbation with known density.
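Concretely, writing the unknown map as S and the known noise density as g (notation introduced here for illustration), the observed densities evolve under the stochastically perturbed Frobenius-Perron operator, and the matrix-based approach recovers S from a finite-dimensional representation of this operator:

```latex
% Density evolution for x_{n+1} = S(x_n) + \xi_n, with \xi_n drawn from the known density g:
\rho_{n+1}(y) = \int g\big(y - S(x)\big)\,\rho_n(x)\,dx
```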
Nie, Xiaokai; Coca, Daniel
2018-01-01
The paper introduces a matrix-based approach to estimate the unique one-dimensional discrete-time dynamical system that generated a given sequence of probability density functions whilst subjected to an additive stochastic perturbation with known density.
Generating multi-double-scroll attractors via nonautonomous approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Qinghui; Xie, Qingguo, E-mail: qgxie@mail.hust.edu.cn; Shen, Yi
It is a common phenomenon that multi-scroll attractors are realized by introducing various nonlinear functions with multiple breakpoints into double scroll chaotic systems. Differently, we present a nonautonomous approach for generating multi-double-scroll attractors (MDSA) without changing the original nonlinear functions. By using the multi-level-logic pulse excitation technique in double scroll chaotic systems, MDSA can be generated. A Chua's circuit, a Jerk circuit, and a modified Lorenz system are given as design examples, and the Matlab simulation results are presented. Furthermore, the corresponding realization circuits are designed. The Pspice results are in agreement with the numerical simulation results, which verifies the availability and feasibility of this method.
Development of numerical model for predicting heat generation and temperatures in MSW landfills.
Hanson, James L; Yeşiller, Nazli; Onnen, Michael T; Liu, Wei-Lien; Oettle, Nicolas K; Marinos, Janelle A
2013-10-01
A numerical modeling approach has been developed for predicting temperatures in municipal solid waste landfills. Model formulation and details of boundary conditions are described. Model performance was evaluated using field data from a landfill in Michigan, USA. The numerical approach was based on finite element analysis incorporating transient conductive heat transfer. Heat generation functions representing decomposition of wastes were empirically developed and incorporated to the formulation. Thermal properties of materials were determined using experimental testing, field observations, and data reported in literature. The boundary conditions consisted of seasonal temperature cycles at the ground surface and constant temperatures at the far-field boundary. Heat generation functions were developed sequentially using varying degrees of conceptual complexity in modeling. First a step-function was developed to represent initial (aerobic) and residual (anaerobic) conditions. Second, an exponential growth-decay function was established. Third, the function was scaled for temperature dependency. Finally, an energy-expended function was developed to simulate heat generation with waste age as a function of temperature. Results are presented and compared to field data for the temperature-dependent growth-decay functions. The formulations developed can be used for prediction of temperatures within various components of landfill systems (liner, waste mass, cover, and surrounding subgrade), determination of frost depths, and determination of heat gain due to decomposition of wastes. Copyright © 2013 Elsevier Ltd. All rights reserved.
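As a schematic of the growth-decay stage described above (an assumed generic form for illustration; the paper's fitted coefficients, exact functional forms and the energy-expended formulation are not reproduced here), the heat generation rate can be written as an exponential growth-decay term scaled by a temperature factor:

```latex
% Illustrative growth-decay heat generation rate with temperature scaling
% (A, k_1 > k_2 and the scaling factor f(T) would be calibrated to field data):
H(t,T) = A\left(e^{-k_2 t} - e^{-k_1 t}\right) f(T)
```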
1988-05-25
theoretical approaches used in developing the programs. The introduction of the report (Section 1) gives general background of the concepts and... [remainder of the abstract is garbled; the recoverable table-of-contents fragments list sections on CREW CHIEF generation functions, workplace design, the CREW CHIEF maintenance analysis programs, getting started, options, and a quick reference covering the CREW CHIEF generation (@CCGEN) and initialization (CCINIT) functions.]
Flavin, Kevin; Chaur, Manuel N; Echegoyen, Luis; Giordani, Silvia
2010-02-19
A novel versatile approach for the functionalization of multilayer fullerenes (carbon nano-onions) has been developed, which involves the facile introduction of a variety of simple functionalities onto their surface by treatment with in situ generated diazonium compounds. This approach is complemented by use of "click" chemistry which was used for the covalent introduction of more complex porphyrin molecules.
NASA Astrophysics Data System (ADS)
Peng, Yonggang; Xie, Shijie; Zheng, Yujun; Brown, Frank L. H.
2009-12-01
Generating function calculations are extended to allow for laser pulse envelopes of arbitrary shape in numerical applications. We investigate photon emission statistics for two-level and V- and Λ-type three-level systems under time-dependent excitation. Applications relevant to electromagnetically induced transparency and photon emission from single quantum dots are presented.
Generation of Functional Thyroid Tissue Using 3D-Based Culture of Embryonic Stem Cells.
Antonica, Francesco; Kasprzyk, Dominika Figini; Schiavo, Andrea Alex; Romitti, Mírian; Costagliola, Sabine
2017-01-01
During the last decade, three-dimensional (3D) cultures of pluripotent stem cells have been intensively used to understand morphogenesis and the molecular signaling important for the embryonic development of many tissues. In addition, pluripotent stem cells have been shown to be a valid tool for the in vitro modeling of several congenital or chronic human diseases, opening new possibilities to study their physiopathology without using animal models. Even more interestingly, 3D culture has proved to be a powerful and versatile tool to successfully generate functional tissues ex vivo. Using similar approaches, we here describe a protocol for the generation of functional thyroid tissue using mouse embryonic stem cells and give all the details and references for its characterization and analysis both in vitro and in vivo. This model is a valid approach to study the expression and function of genes involved in the correct morphogenesis of the thyroid gland, to elucidate the mechanisms of production and secretion of thyroid hormones, and to test anti-thyroid drugs.
Mobile Manipulators for Assisted Living in Residential Settings
2007-01-01
expected to grow dramatically over the next decade as the baby boom generation approaches 65 years of age. The UMass/Smith ASSIST framework aims to... functions. The vast majority of robotic rehabilitation work takes a hands-on approach where, for example, the robot aids the movement of a patient’s limb
ERIC Educational Resources Information Center
Lee, Kerry; Ng, Ee Lynn; Ng, Swee Fong
2009-01-01
Solving algebraic word problems involves multiple cognitive phases. The authors used a multitask approach to examine the extent to which working memory and executive functioning are associated with generating problem models and producing solutions. They tested 255 11-year-olds on working memory (Counting Recall, Letter Memory, and Keep Track),…
Green’s functions for a volume source in an elastic half-space
Zabolotskaya, Evgenia A.; Ilinskii, Yurii A.; Hay, Todd A.; Hamilton, Mark F.
2012-01-01
Green’s functions are derived for elastic waves generated by a volume source in a homogeneous isotropic half-space. The context is sources at shallow burial depths, for which surface (Rayleigh) and bulk waves, both longitudinal and transverse, can be generated with comparable magnitudes. Two approaches are followed. First, the Green’s function is expanded with respect to eigenmodes that correspond to Rayleigh waves. While bulk waves are thus ignored, this approximation is valid on the surface far from the source, where the Rayleigh wave modes dominate. The second approach employs an angular spectrum that accounts for the bulk waves and yields a solution that may be separated into two terms. One is associated with bulk waves, the other with Rayleigh waves. The latter is proved to be identical to the Green’s function obtained following the first approach. The Green’s function obtained via angular spectrum decomposition is analyzed numerically in the time domain for different burial depths and distances to the receiver, and for parameters relevant to seismo-acoustic detection of land mines and other buried objects. PMID:22423682
Blastocyst complementation generates exogenic pancreas in vivo in apancreatic cloned pigs
Matsunari, Hitomi; Nagashima, Hiroshi; Watanabe, Masahito; Umeyama, Kazuhiro; Nakano, Kazuaki; Nagaya, Masaki; Kobayashi, Toshihiro; Yamaguchi, Tomoyuki; Sumazaki, Ryo; Herzenberg, Leonard A.; Nakauchi, Hiromitsu
2013-01-01
In the field of regenerative medicine, one of the ultimate goals is to generate functioning organs from pluripotent cells, such as ES cells or induced pluripotent stem cells (PSCs). We have recently generated functional pancreas and kidney from PSCs in pancreatogenesis- or nephrogenesis-disabled mice, providing proof of principle for organogenesis from PSCs in an embryo unable to form a specific organ. Key when applying the principles of in vivo generation to human organs is compensation for an empty developmental niche in large nonrodent mammals. Here, we show that the blastocyst complementation system can be applied in the pig using somatic cell cloning technology. Transgenic approaches permitted generation of porcine somatic cell cloned embryos with an apancreatic phenotype. Complementation of these embryos with allogenic blastomeres then created functioning pancreata in the vacant niches. These results clearly indicate that a missing organ can be generated from exogenous cells when functionally normal pluripotent cells chimerize a cloned dysorganogenetic embryo. The feasibility of blastocyst complementation using cloned porcine embryos allows experimentation toward the in vivo generation of functional organs from xenogenic PSCs in large animals. PMID:23431169
Blastocyst complementation generates exogenic pancreas in vivo in apancreatic cloned pigs.
Matsunari, Hitomi; Nagashima, Hiroshi; Watanabe, Masahito; Umeyama, Kazuhiro; Nakano, Kazuaki; Nagaya, Masaki; Kobayashi, Toshihiro; Yamaguchi, Tomoyuki; Sumazaki, Ryo; Herzenberg, Leonard A; Nakauchi, Hiromitsu
2013-03-19
In the field of regenerative medicine, one of the ultimate goals is to generate functioning organs from pluripotent cells, such as ES cells or induced pluripotent stem cells (PSCs). We have recently generated functional pancreas and kidney from PSCs in pancreatogenesis- or nephrogenesis-disabled mice, providing proof of principle for organogenesis from PSCs in an embryo unable to form a specific organ. Key when applying the principles of in vivo generation to human organs is compensation for an empty developmental niche in large nonrodent mammals. Here, we show that the blastocyst complementation system can be applied in the pig using somatic cell cloning technology. Transgenic approaches permitted generation of porcine somatic cell cloned embryos with an apancreatic phenotype. Complementation of these embryos with allogenic blastomeres then created functioning pancreata in the vacant niches. These results clearly indicate that a missing organ can be generated from exogenous cells when functionally normal pluripotent cells chimerize a cloned dysorganogenetic embryo. The feasibility of blastocyst complementation using cloned porcine embryos allows experimentation toward the in vivo generation of functional organs from xenogenic PSCs in large animals.
Modeling Renewable Penetration Using a Network Economic Model
NASA Astrophysics Data System (ADS)
Lamont, A.
2001-03-01
This paper evaluates the accuracy of a network economic modeling approach in designing energy systems having renewable and conventional generators. The network approach models the system as a network of processes such as demands, generators, markets, and resources. The model reaches a solution by exchanging prices and quantity information between the nodes of the system. This formulation is very flexible and takes very little time to build and modify models. This paper reports an experiment designing a system with photovoltaic and base and peak fossil generators. The level of PV penetration as a function of its price and the capacities of the fossil generators were determined using the network approach and using an exact, analytic approach. It is found that the two methods agree very closely in terms of the optimal capacities and are nearly identical in terms of annual system costs.
NASA Astrophysics Data System (ADS)
Gaydecki, P.
2009-07-01
A system is described for the design, downloading and execution of arbitrary functions, intended for use with acoustic and low-frequency ultrasonic transducers in condition monitoring and materials testing applications. The instrumentation comprises a software design tool and a powerful real-time digital signal processor unit, operating at 580 million multiplication-accumulations per second (MMACs). The embedded firmware employs both an established look-up table approach and a new function interpolation technique to generate the real-time signals with very high precision and flexibility. Using total harmonic distortion (THD) analysis, the purity of the waveforms has been compared with that of waveforms generated using traditional analogue function generators; this analysis has confirmed that the new instrument has a consistently superior signal-to-noise ratio.
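The two ingredients named above, look-up-table synthesis with interpolation and THD-based evaluation, can be sketched as follows (an illustrative host-side model, not the instrument's embedded firmware):

```python
# Sketch: look-up-table waveform synthesis with linear interpolation, and a simple
# THD estimate from the FFT (illustrative of the evaluation described, not the firmware).
import numpy as np

table = np.sin(2 * np.pi * np.arange(256) / 256)       # one cycle stored in a 256-entry LUT

def synth(f_out, f_sample, n_samples):
    # Phase-accumulator synthesis with linear interpolation between LUT entries.
    phase = (np.arange(n_samples) * f_out / f_sample * len(table)) % len(table)
    i0 = phase.astype(int)
    frac = phase - i0
    i1 = (i0 + 1) % len(table)
    return (1 - frac) * table[i0] + frac * table[i1]

def thd(signal, f_sample, f0, n_harmonics=10):
    # Ratio of harmonic amplitude to fundamental amplitude from the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / f_sample)
    def peak(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = peak(f0)
    harmonics = np.sqrt(sum(peak(k * f0) ** 2 for k in range(2, n_harmonics + 1)))
    return harmonics / fundamental

x = synth(f_out=1000.0, f_sample=48000.0, n_samples=48000)
print(f"THD ~ {100 * thd(x, 48000.0, 1000.0):.4f} %")
```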
System and method for key generation in security tokens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Philip G.; Humble, Travis S.; Paul, Nathanael R.
Functional randomness in security tokens (FRIST) may achieve improved security in two-factor authentication hardware tokens by improving on the algorithms used to securely generate random data. A system and method in one embodiment according to the present invention may allow for security of a token based on storage cost and computational security. This approach may enable communication where security is no longer based solely on one-time pads (OTPs) generated from a single cryptographic function (e.g., SHA-256).
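For contrast, a minimal sketch of the baseline the abstract refers to, an OTP-style keystream derived from a single cryptographic function (SHA-256 over a shared secret and counter), is shown below; it is not the FRIST construction itself, and the secret shown is a placeholder.

```python
# Baseline illustration: one-time-pad-style keystream from a single hash function
# (SHA-256 over secret || counter). Not the FRIST scheme; shown for contrast only.
import hashlib

def keystream(secret: bytes, n_bytes: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < n_bytes:
        out += hashlib.sha256(secret + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n_bytes])

def xor_pad(data: bytes, secret: bytes) -> bytes:
    ks = keystream(secret, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

token_secret = b"shared-token-secret"          # hypothetical shared secret
ciphertext = xor_pad(b"challenge-response", token_secret)
print(xor_pad(ciphertext, token_secret))       # round-trips to the plaintext
```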
A Genetic Algorithm for the Generation of Packetization Masks for Robust Image Communication
Zapata-Quiñones, Katherine; Duran-Faundez, Cristian; Gutiérrez, Gilberto; Lecuire, Vincent; Arredondo-Flores, Christopher; Jara-Lipán, Hugo
2017-01-01
Image interleaving has proven to be an effective solution for providing robustness in image communication systems when resource limitations make reliable protocols unsuitable (e.g., in wireless camera sensor networks); however, the search for optimal interleaving patterns is scarcely tackled in the literature. In 2008, Rombaut et al. presented an interesting approach introducing a packetization mask generator based on Simulated Annealing (SA), including a cost function that allows assessing the suitability of a packetization pattern while avoiding extensive simulations. In this work, we present a complementary study of the non-trivial problem of generating optimal packetization patterns. We propose a genetic algorithm, as an alternative to the cited work, adopting the mentioned cost function, and then compare it to the SA approach and a torus automorphism interleaver. In addition, we address the validation of the cost function and provide results attempting to draw conclusions about its implications for the quality of reconstructed images. Several scenarios based on visual sensor network applications were tested in a computer application. Results in terms of the selected cost function and the image quality metric PSNR show that our algorithm yields results similar to those of the other approaches. Finally, we discuss the obtained results and comment on open research challenges. PMID:28452934
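A skeleton of a genetic algorithm over packetization masks is sketched below; the cost function of Rombaut et al. is replaced by a simple placeholder that penalizes spatially adjacent pixels assigned to the same packet, and all sizes and rates are illustrative.

```python
# Skeleton GA over packetization masks (illustrative only).
# A mask assigns each pixel of an N x N block to one of P packets; the real cost
# function from Rombaut et al. is replaced here by a simple adjacency penalty.
import random

N, P = 8, 4                      # toy block size and number of packets
POP, GENS, MUT = 40, 200, 0.05

def random_mask():
    return [random.randrange(P) for _ in range(N * N)]

def cost(mask):
    # Placeholder cost: count spatially adjacent pixels that share a packet.
    bad = 0
    for r in range(N):
        for c in range(N):
            if c + 1 < N and mask[r * N + c] == mask[r * N + c + 1]:
                bad += 1
            if r + 1 < N and mask[r * N + c] == mask[(r + 1) * N + c]:
                bad += 1
    return bad

def crossover(a, b):
    cut = random.randrange(1, N * N - 1)
    return a[:cut] + b[cut:]

def mutate(mask):
    return [random.randrange(P) if random.random() < MUT else g for g in mask]

population = [random_mask() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=cost)
    parents = population[: POP // 2]                  # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = min(population, key=cost)
print("best cost:", cost(best))
```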
ERIC Educational Resources Information Center
Kleemann, Gary L.
2005-01-01
The author reviews the evolution of Web services--from information sharing to transactional to relationship building--and the progression from first-generation to fourth-generation Web sites. (Contains 3 figures.)
Automated Testcase Generation for Numerical Support Functions in Embedded Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Schnieder, Stefan-Alexander
2014-01-01
We present a tool for the automatic generation of test stimuli for small numerical support functions, e.g., code for trigonometric functions, quaternions, filters, or table lookup. Our tool is based on KLEE to produce a set of test stimuli for full path coverage. We use a method of iterative deepening over abstractions to deal with floating-point values. During actual testing the stimuli exercise the code against a reference implementation. We illustrate our approach with results of experiments with low-level trigonometric functions, interpolation routines, and mathematical support functions from an open source UAS autopilot.
Functional connectomics from resting-state fMRI
Smith, Stephen M; Vidaurre, Diego; Beckmann, Christian F; Glasser, Matthew F; Jenkinson, Mark; Miller, Karla L; Nichols, Thomas E; Robinson, Emma; Salimi-Khorshidi, Gholamreza; Woolrich, Mark W; Barch, Deanna M; Uğurbil, Kamil; Van Essen, David C
2014-01-01
Spontaneous fluctuations in activity in different parts of the brain can be used to study functional brain networks. We review the use of resting-state functional MRI for the purpose of mapping the macroscopic functional connectome. After describing MRI acquisition and image processing methods commonly used to generate data in a form amenable to connectomics network analysis, we discuss different approaches for estimating network structure from that data. Finally, we describe new possibilities resulting from the high-quality rfMRI data being generated by the Human Connectome Project, and highlight some upcoming challenges in functional connectomics. PMID:24238796
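As a minimal illustration of the network-estimation step discussed in the review (a full-correlation network matrix from parcellated time series; partial correlation and the other estimators mentioned are not shown, and the data here are synthetic):

```python
# Minimal functional-connectivity estimate: node-by-node correlation matrix
# from parcellated resting-state time series (shapes are illustrative).
import numpy as np

rng = np.random.default_rng(0)
timeseries = rng.standard_normal((1200, 50))    # 1200 time points x 50 parcels (toy data)

netmat = np.corrcoef(timeseries, rowvar=False)  # 50 x 50 full-correlation network matrix
np.fill_diagonal(netmat, 0.0)
edges = np.argwhere(np.triu(np.abs(netmat) > 0.3, k=1))  # threshold for a sparse graph
print(netmat.shape, len(edges))
```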
Gazijahani, Farhad Samadi; Ravadanegh, Sajad Najafi; Salehi, Javad
2018-02-01
The inherent volatility and unpredictable nature of renewable generation and load demand pose considerable challenges for the energy exchange optimization of microgrids (MG). To address these challenges, this paper proposes a new risk-based multi-objective energy exchange optimization for networked MGs from economic and reliability standpoints under load consumption and renewable power generation uncertainties. In so doing, three different risk-based strategies are distinguished by using the conditional value at risk (CVaR) approach. The proposed model is formulated with two distinct objective functions. The first function minimizes the operation and maintenance costs, the cost of power transactions between the upstream network and MGs, as well as the power loss cost, whereas the second function minimizes the energy not supplied (ENS) value. Furthermore, a stochastic scenario-based approach is incorporated in order to handle the uncertainty. Also, the Kantorovich distance scenario reduction method has been implemented to reduce the computational burden. Finally, a non-dominated sorting genetic algorithm (NSGA-II) is applied to minimize the objective functions simultaneously, and the best solution is extracted by a fuzzy satisfying method with respect to the risk-based strategies. To indicate the performance of the proposed model, it is applied to the modified IEEE 33-bus distribution system, and the obtained results show that the presented approach can be considered an efficient tool for the optimal energy exchange of MGs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
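The risk measure used in the strategies above has the standard Rockafellar-Uryasev form (a general definition; the paper's specific cost terms and scenario weights are not reproduced):

```latex
% Conditional value at risk of a cost (loss) L at confidence level \alpha:
\mathrm{CVaR}_{\alpha}(L) = \min_{\zeta \in \mathbb{R}}
  \left\{ \zeta + \frac{1}{1-\alpha}\,\mathbb{E}\big[(L-\zeta)^{+}\big] \right\},
\qquad (x)^{+} = \max(x,0)
```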
Amartya Sen's Capability Approach and Education
ERIC Educational Resources Information Center
Walker, Melanie
2005-01-01
The human capabilities approach developed by the economist Amartya Sen links development, quality of life and freedom. This article explores the key ideas in the capability approach of: capability, functioning, agency, human diversity and public participation in generating valued capabilities. It then considers how these ideas relate specifically…
NASA Astrophysics Data System (ADS)
Cavagnetto, Andy; Hand, Brian M.; Norton-Meier, Lori
2010-03-01
This case study aimed to determine the nature of student interactions in small groups in an elementary classroom utilizing the Science Writing Heuristic approach. Fifth grade students were audio-recorded over four units of study while working in small groups to generate knowledge claims after conducting student-directed investigations. Analysis consisted of (1) identifying amount of on/off task talk, (2) categorizing on-task talk as generative (talk associated with generating an argument) or representational (talk associated with representing an argument in a final written form), (3) characterizing the generative components of argument, and (4) determining the functions of language used. Results indicate that students were on task 98% of the time. Students engaged in generative talk an average of 25% of the time and representational talk an average of 71% of the time. Students engaged in components of Toulmin's model of argument, but challenging of each other's ideas was not commonplace. Talk was dominated by the informative function (representing one's ideas) of language as it was found 78.3% of the time and to a lesser extent (11.7%) the heuristic function (inquiring through questions). These functions appear to be intimately tied to the task of generating knowledge claims in small groups. The results suggest that both talking and writing are critical to using science discourse as an embedded strategy to learning science. Further, nature and structure of the task are important pedagogical considerations when moving students toward participation in science discourse.
Manipulating neural activity in physiologically classified neurons: triumphs and challenges
Gore, Felicity; Schwartz, Edmund C.; Salzman, C. Daniel
2015-01-01
Understanding brain function requires knowing both how neural activity encodes information and how this activity generates appropriate responses. Electrophysiological, imaging and immediate early gene immunostaining studies have been instrumental in identifying and characterizing neurons that respond to different sensory stimuli, events and motor actions. Here we highlight approaches that have manipulated the activity of physiologically classified neurons to determine their role in the generation of behavioural responses. Previous experiments have often exploited the functional architecture observed in many cortical areas, where clusters of neurons share response properties. However, many brain structures do not exhibit such functional architecture. Instead, neurons with different response properties are anatomically intermingled. Emerging genetic approaches have enabled the identification and manipulation of neurons that respond to specific stimuli despite the lack of discernable anatomical organization. These approaches have advanced understanding of the circuits mediating sensory perception, learning and memory, and the generation of behavioural responses by providing causal evidence linking neural response properties to appropriate behavioural output. However, significant challenges remain for understanding cognitive processes that are probably mediated by neurons with more complex physiological response properties. Currently available strategies may prove inadequate for determining how activity in these neurons is causally related to cognitive behaviour. PMID:26240431
Functional genomics platform for pooled screening and mammalian genetic interaction maps
Kampmann, Martin; Bassik, Michael C.; Weissman, Jonathan S.
2014-01-01
Systematic genetic interaction maps in microorganisms are powerful tools for identifying functional relationships between genes and defining the function of uncharacterized genes. We have recently implemented this strategy in mammalian cells as a two-stage approach. First, genes of interest are robustly identified in a pooled genome-wide screen using complex shRNA libraries. Second, phenotypes for all pairwise combinations of hit genes are measured in a double-shRNA screen and used to construct a genetic interaction map. Our protocol allows for rapid pooled screening under various conditions without a requirement for robotics, in contrast to arrayed approaches. Each stage of the protocol can be implemented in ~2 weeks, with additional time for analysis and generation of reagents. We discuss considerations for screen design, and present complete experimental procedures as well as a full computational analysis suite for identification of hits in pooled screens and generation of genetic interaction maps. While the protocols outlined here were developed for our original shRNA-based approach, they can be applied more generally, including to CRISPR-based approaches. PMID:24992097
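Genetic interaction maps of this kind typically score each gene pair by the deviation of the double-perturbation phenotype from the expectation under independence; the common multiplicative form is shown below (the exact metric used in the protocol may differ):

```latex
% Genetic interaction score for genes A and B, with single- and double-knockdown
% phenotypes W_A, W_B and W_{AB}, under a multiplicative null model:
\varepsilon_{AB} = W_{AB} - W_A\,W_B
```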
Yang, Yanzheng; Zhu, Qiuan; Peng, Changhui; Wang, Han; Xue, Wei; Lin, Guanghui; Wen, Zhongming; Chang, Jie; Wang, Meng; Liu, Guobin; Li, Shiqing
2016-01-01
Increasing evidence indicates that current dynamic global vegetation models (DGVMs) have suffered from insufficient realism and are difficult to improve, particularly because they are built on plant functional type (PFT) schemes. Therefore, new approaches, such as plant trait-based methods, are urgently needed to replace PFT schemes when predicting the distribution of vegetation and investigating vegetation sensitivity. As an important direction towards constructing next-generation DGVMs based on plant functional traits, we propose a novel approach for modelling vegetation distributions and analysing vegetation sensitivity through trait-climate relationships in China. The results demonstrated that a Gaussian mixture model (GMM) trained with a LMA-Nmass-LAI data combination yielded an accuracy of 72.82% in simulating vegetation distribution, providing more detailed parameter information regarding community structures and ecosystem functions. The new approach also performed well in analyses of vegetation sensitivity to different climatic scenarios. Although the trait-climate relationship is not the only candidate useful for predicting vegetation distributions and analysing climatic sensitivity, it sheds new light on the development of next-generation trait-based DGVMs. PMID:27052108
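A minimal sketch of the trait-based classification step (fitting a Gaussian mixture to an LMA-Nmass-LAI trait table and assigning samples to mixture components) is given below; the data are synthetic placeholders, not the authors' observations.

```python
# Sketch: Gaussian mixture model over a trait combination (LMA, Nmass, LAI).
# Data here are synthetic placeholders; the authors trained on observed trait data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
traits = np.column_stack([
    rng.lognormal(mean=4.5, sigma=0.3, size=1000),   # LMA (g m-2), toy values
    rng.normal(loc=2.0, scale=0.5, size=1000),       # Nmass (%), toy values
    rng.gamma(shape=2.0, scale=1.5, size=1000),      # LAI, toy values
])

gmm = GaussianMixture(n_components=6, covariance_type="full", random_state=0)
labels = gmm.fit_predict(traits)       # each sample assigned to a trait-defined vegetation class
print(np.bincount(labels))
```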
Dickinson, Patsy S; Qu, Xuan; Stanhope, Meredith E
2016-12-01
Central pattern generators are subject to modulation by peptides, allowing for flexibility in patterned output. Current techniques used to characterize peptides include mass spectrometry and transcriptomics. In recent years, hundreds of neuropeptides have been sequenced from crustaceans; mass spectrometry has been used to identify peptides and to determine their levels and locations, setting the stage for comparative studies investigating the physiological roles of peptides. Such studies suggest that there is some evolutionary conservation of function, but also divergence of function even within a species. With current baseline data, it should be possible to begin using comparative approaches to ask fundamental questions about why peptides are encoded the way that they are and how this affects nervous system function. Copyright © 2016 Elsevier Ltd. All rights reserved.
Quantum Dynamics with Short-Time Trajectories and Minimal Adaptive Basis Sets.
Saller, Maximilian A C; Habershon, Scott
2017-07-11
Methods for solving the time-dependent Schrödinger equation via basis set expansion of the wave function can generally be categorized as having either static (time-independent) or dynamic (time-dependent) basis functions. We have recently introduced an alternative simulation approach which represents a middle road between these two extremes, employing dynamic (classical-like) trajectories to create a static basis set of Gaussian wavepackets in regions of phase-space relevant to future propagation of the wave function [J. Chem. Theory Comput., 11, 8 (2015)]. Here, we propose and test a modification of our methodology which aims to reduce the size of basis sets generated in our original scheme. In particular, we employ short-time classical trajectories to continuously generate new basis functions for short-time quantum propagation of the wave function; to avoid the continued growth of the basis set describing the time-dependent wave function, we employ Matching Pursuit to periodically minimize the number of basis functions required to accurately describe the wave function. Overall, this approach generates a basis set which is adapted to evolution of the wave function while also being as small as possible. In applications to challenging benchmark problems, namely a 4-dimensional model of photoexcited pyrazine and three different double-well tunnelling problems, we find that our new scheme enables accurate wave function propagation with basis sets which are around an order-of-magnitude smaller than our original trajectory-guided basis set methodology, highlighting the benefits of adaptive strategies for wave function propagation.
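The pruning step can be illustrated with a generic Matching Pursuit routine over a fixed dictionary (shown below); the actual method works with Gaussian wavepacket overlaps and quantum amplitudes rather than the random real vectors used here.

```python
# Generic Matching Pursuit: greedily select dictionary vectors that reconstruct a
# target to a tolerance (illustrates the basis-pruning idea only; the real method
# operates on Gaussian wavepacket overlaps, not random vectors).
import numpy as np

rng = np.random.default_rng(1)
dictionary = rng.standard_normal((64, 400))
dictionary /= np.linalg.norm(dictionary, axis=0)      # unit-norm atoms (columns)
target = dictionary[:, [3, 50, 200]] @ np.array([1.0, -0.5, 0.8])

residual, selected, coeffs = target.copy(), [], []
while np.linalg.norm(residual) > 1e-6 and len(selected) < 20:
    scores = dictionary.T @ residual                   # projections onto every atom
    k = int(np.argmax(np.abs(scores)))
    selected.append(k)
    coeffs.append(scores[k])
    residual = residual - scores[k] * dictionary[:, k]

print(sorted(set(selected)), f"residual norm {np.linalg.norm(residual):.2e}")
```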
Wang, Bing; Westerhoff, Lance M.; Merz, Kenneth M.
2008-01-01
We have generated docking poses for the FKBP-GPI complex using eight docking programs and compared their scoring functions with scoring based on NMR chemical shift perturbations (NMRScore). Because the chemical shift perturbation (CSP) is exquisitely sensitive to the orientation of the ligand inside the binding pocket, NMRScore offers an accurate and straightforward approach to scoring different poses. All scoring functions were assessed on their ability to rank native-like structures highly and to separate them from decoy poses generated for a protein-ligand complex. The overall performance of NMRScore is much better than that of the energy-based scoring functions associated with the docking programs in both respects. In summary, we find that the combination of docking programs with NMRScore results in an approach that can robustly determine the binding site structure for a protein-ligand complex, thereby providing a new tool facilitating the structure-based drug discovery process. PMID:17867664
Initial Approaches for Discovery of Undocumented Functionality in FPGAs
2017-03-01
commercial pressures such as IP protection, support cost, and time to market, modern COTS devices contain many functions that are not exposed to the... market pressures have increased, industry increasingly uses the current generation device to do trial runs of next-generation architecture features...the product of industry operating in a highly cost-competitive market, and are not inserted with malicious intent, however, this does not preclude
From Verified Models to Verifiable Code
NASA Technical Reports Server (NTRS)
Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
Single-Layer Metasurface with Controllable Multiwavelength Functions.
Shi, Zhujun; Khorasaninejad, Mohammadreza; Huang, Yao-Wei; Roques-Carmes, Charles; Zhu, Alexander Y; Chen, Wei Ting; Sanjeev, Vyshakh; Ding, Zhao-Wei; Tamagnone, Michele; Chaudhary, Kundan; Devlin, Robert C; Qiu, Cheng-Wei; Capasso, Federico
2018-04-11
In this paper, we report dispersion-engineered metasurfaces with distinct functionalities controlled by wavelength. Unlike previous approaches based on spatial multiplexing or vertical stacking of metasurfaces, we utilize a single phase profile with wavelength dependence encoded in the phase shifters' dispersion. We designed and fabricated a multiwavelength achromatic metalens (MAM) with achromatic focusing for blue (B), green (G), yellow (Y), and red (R) light and two wavelength-controlled beam generators (WCBG): one focuses light with orbital angular momentum (OAM) states (l = 0, 1, 2) corresponding to three primary colors; the other produces ordinary focal spots (l = 0) for red and green light, while generating a vortex beam (l = 1) in the blue. A full color (RGB) hologram is also demonstrated in simulation. Our approach opens a path to applications ranging from near-eye displays and holography to compact multiwavelength beam generation.
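The requirement being engineered can be stated with the ideal lens phase profile (a standard expression, not taken from the paper): an achromatic metalens must impose it simultaneously at each design wavelength through the dispersion of its phase shifters rather than by spatial multiplexing.

```latex
% Ideal phase profile for diffraction-limited focusing at focal length f,
% to be satisfied at every design wavelength \lambda_i:
\varphi(r,\lambda_i) = -\frac{2\pi}{\lambda_i}\left(\sqrt{r^{2}+f^{2}}-f\right)
```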
Functional Proteomics to Identify Moderators of CD8+ T Cell Function in Melanoma
2015-05-01
identified 17 phage that selectively bind TIL rather than effector cells. However, none of these phage influenced CD8+ TIL expansion or function in vitro... Using a novel next-generation sequencing approach, we have further defined another 1,000,000 phage that selectively bind TIL, of which 100,000 are unique... Using the original approach outlined in the application, we identified a total of 17 unique phage that selectively bind CD8+ TIL but not effector or
Random mutagenesis by error-prone pol plasmid replication in Escherichia coli.
Alexander, David L; Lilly, Joshua; Hernandez, Jaime; Romsdahl, Jillian; Troll, Christopher J; Camps, Manel
2014-01-01
Directed evolution is an approach that mimics natural evolution in the laboratory with the goal of modifying existing enzymatic activities or of generating new ones. The identification of mutants with desired properties involves the generation of genetic diversity coupled with a functional selection or screen. Genetic diversity can be generated using PCR or using in vivo methods such as chemical mutagenesis or error-prone replication of the desired sequence in a mutator strain. In vivo mutagenesis methods facilitate iterative selection because they do not require cloning, but generally produce a low mutation density with mutations not restricted to specific genes or areas within a gene. For this reason, this approach is typically used to generate new biochemical properties when large numbers of mutants can be screened or selected. Here we describe protocols for an advanced in vivo mutagenesis method that is based on error-prone replication of a ColE1 plasmid bearing the gene of interest. Compared to other in vivo mutagenesis methods, this plasmid-targeted approach allows increased mutation loads and facilitates iterative selection approaches. We also describe the mutation spectrum for this mutagenesis methodology in detail, and, using cycle 3 GFP as a target for mutagenesis, we illustrate the phenotypic diversity that can be generated using our method. In sum, error-prone Pol I replication is a mutagenesis method that is ideally suited for the evolution of new biochemical activities when a functional selection is available.
Functional test generation for digital circuits described with a declarative language: LUSTRE
NASA Astrophysics Data System (ADS)
Almahrous, Mazen
1990-08-01
A functional approach to the test generation problem, starting from a high-level description, is proposed. The circuit under test is modeled using the LUSTRE high-level data flow description language. The different LUSTRE primitives are translated into a SATAN-format graph in order to evaluate the testability of the circuit and to generate test sequences. A second method is defined for testing complex circuits comprising an operative part and a control part; it consists of checking experiments for the control part, observed through the operative part, and was applied to the automata generated from a LUSTRE description of the circuit.
NASA Astrophysics Data System (ADS)
Li, Y. B.; Yang, Z. X.; Chen, W.; He, Q. Y.
2017-11-01
The functional performance of disc permanent magnet synchronous generators (PMSGs), such as magnetic flux leakage, power density and efficiency, is related to their structural characteristics and design technique. A Halbach-array-based magnetic circuit structure is developed, and a Maxwell3D simulation analysis approach for the PMSG is proposed in this paper for an integrated starter generator (ISG). The magnetization directions of adjacent permanent magnets differ by 45 degrees, focusing the flux toward the air-gap side and improving the performance of the generator. The magnetic field distribution and functional performance under load and no-load conditions are simulated with the Maxwell3D module. The proposed approach is verified by simulation analysis: the air-gap flux density is 0.66 T, and the phase voltage curve is close to sinusoidal, with a voltage amplitude of 335 V that meets the design requirements while the disc coreless PMSG operates at rated speed. The developed magnetic circuit structure can therefore be used for the engineering design of disc coreless PMSGs for integrated starter generators.
Otey, Christopher R; Silberg, Jonathan J; Voigt, Christopher A; Endelman, Jeffrey B; Bandara, Geethani; Arnold, Frances H
2004-03-01
Recombination generates chimeric proteins whose ability to fold depends on minimizing structural perturbations that result when portions of the sequence are inherited from different parents. These chimeric sequences can display functional properties characteristic of the parents or acquire entirely new functions. Seventeen chimeras were generated from two CYP102 members of the functionally diverse cytochrome p450 family. Chimeras predicted to have limited structural disruption, as defined by the SCHEMA algorithm, displayed CO binding spectra characteristic of folded p450s. Even this small population exhibited significant functional diversity: chimeras displayed altered substrate specificities, a wide range in thermostabilities, up to a 40-fold increase in peroxidase activity, and ability to hydroxylate a substrate toward which neither parent heme domain shows detectable activity. These results suggest that SCHEMA-guided recombination can be used to generate diverse p450s for exploring function evolution within the p450 structural framework.
Next-Generation High-Throughput Functional Annotation of Microbial Genomes.
Baric, Ralph S; Crosson, Sean; Damania, Blossom; Miller, Samuel I; Rubin, Eric J
2016-10-04
Host infection by microbial pathogens cues global changes in microbial and host cell biology that facilitate microbial replication and disease. The complete maps of thousands of bacterial and viral genomes have recently been defined; however, the rate at which physiological or biochemical functions have been assigned to genes has greatly lagged. The National Institute of Allergy and Infectious Diseases (NIAID) addressed this gap by creating functional genomics centers dedicated to developing high-throughput approaches to assign gene function. These centers require broad-based and collaborative research programs to generate and integrate diverse data to achieve a comprehensive understanding of microbial pathogenesis. High-throughput functional genomics can lead to new therapeutics and better understanding of the next generation of emerging pathogens by rapidly defining new general mechanisms by which organisms cause disease and replicate in host tissues and by facilitating the rate at which functional data reach the scientific community. Copyright © 2016 Baric et al.
[Sex differentiation in plants. Terms and notions].
Godin, V N
2007-01-01
There are two methodological approaches to the study of sex in plants: the descriptive-morphological approach and the quantitative approach. The former is based exclusively on external morphological peculiarities of the generative organs of the flower; the latter is based on the functioning of individuals as parents of the coming generation. It has been suggested to recognize three flower types: staminate, pistillate, and complete. Depending on the distribution pattern of the flowers of different sex types, there are monomorphic populations (all individuals form flowers of the same type) and heteromorphic populations (individuals have flowers of different types). Monomorphic populations include monoclinous, monoecious, gynomonoecious, andromonoecious, and polygamomonoecious ones. Among heteromorphic populations, dioecious, polygamodioecious, subdioecious, paradioecious, and trioecious ones are recognized. It is desirable to give up the usage of such terms as "bisexual", "polygamous", "functionally female", and "functionally male" flowers, "temporary dioecy", and some others. The notion "gender" has been established in English-language works for describing sex quantitatively; two additional terms have been proposed: "phenotypic gender" and "functional gender". The recently developed quantitative approach is at present in the process of accumulating material and needs further elaboration of its methodological base. Analysis of the principal notions shows the necessity of forming their integrated structure and of correcting the usage of existing and new terms.
Govindan, Siva Shangari; Agamuthu, P
2014-10-01
Waste management can be regarded as a cross-cutting environmental 'mega-issue'. Sound waste management practices support the provision of basic needs for general health, such as clean air, clean water and a safe supply of food. In addition, climate change mitigation can be advanced by reducing greenhouse gas emissions from waste management operations, such as landfills. Landfills generate landfill gas, especially methane, as a result of anaerobic degradation of the degradable components of municipal solid waste. Evaluating the mode of generation and collection of landfill gas has posed a challenge over time. Landfill gas generation rates are presently estimated using numerical models. In this study the Intergovernmental Panel on Climate Change's Waste Model is used to estimate the methane generated from a Malaysian sanitary landfill. Key parameters of the model, the decay rate and the degradable organic carbon, are analysed with two different approaches: the bulk waste approach and the waste composition approach. The model is then validated using error function analysis, and the optimum decay rate and degradable organic carbon are obtained for both approaches. The best-fitting values for the bulk waste approach are a decay rate of 0.08 y(-1) and a degradable organic carbon value of 0.12; for the waste composition approach the decay rate is 0.09 y(-1) and the degradable organic carbon value is 0.08. From this validation exercise, the estimated error was reduced by 81% and 69% for the bulk waste and waste composition approaches, respectively. In conclusion, this type of modelling could constitute a sensible starting point for careful planning of efficient gas recovery in individual landfills. © The Author(s) 2014.
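The estimation step described above follows the first-order decay (FOD) logic underlying the IPCC Waste Model. The Python sketch below is a minimal, hedged illustration of that logic, not the Waste Model itself: the decay rate, degradable organic carbon and remaining factors are placeholder values, and fitting them against measured gas recovery, as done in the study, would amount to minimizing an error function over the decay rate and degradable organic carbon.

import numpy as np

def methane_generation(waste_tonnes, k=0.08, doc=0.12, doc_f=0.5, mcf=1.0, f_ch4=0.5):
    """First-order decay (FOD) estimate of annual CH4 generation.

    waste_tonnes : waste deposited in each year (tonnes)
    k            : decay rate (1/year), illustrative value
    doc          : degradable organic carbon fraction, illustrative value
    doc_f        : fraction of DOC that actually decomposes (assumed)
    mcf          : methane correction factor for the site (assumed)
    f_ch4        : fraction of CH4 in the landfill gas (assumed)
    Returns the CH4 generated in each year (tonnes).
    """
    waste_tonnes = np.asarray(waste_tonnes, dtype=float)
    n_years = len(waste_tonnes)
    # Methane generation potential per tonne of waste (tonnes CH4 per tonne waste).
    l0 = mcf * doc * doc_f * f_ch4 * (16.0 / 12.0)
    ch4 = np.zeros(n_years)
    for t in range(n_years):
        for x in range(t + 1):
            # Contribution of waste deposited in year x, decayed up to year t.
            ch4[t] += waste_tonnes[x] * l0 * k * np.exp(-k * (t - x))
    return ch4

if __name__ == "__main__":
    deposits = [100_000] * 10  # hypothetical landfill: 100,000 tonnes per year
    print(methane_generation(deposits).round(1))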
NASA Astrophysics Data System (ADS)
Rodriguez Marco, Albert
Battery management systems (BMS) require computationally simple but highly accurate models of the battery cells they are monitoring and controlling. Historically, empirical equivalent-circuit models have been used, but researchers are increasingly focusing their attention on physics-based models because of their greater predictive capabilities. These models have high intrinsic computational complexity and so must undergo some kind of order-reduction process to make their use in a BMS feasible: we favor methods based on a transfer-function approach to battery-cell dynamics. In prior works, transfer functions have been found from full-order PDE models via two simplifying assumptions: (1) a linearization assumption, which is a fundamental necessity for deriving transfer functions, and (2) an assumption made out of expedience that decouples the electrolyte-potential and electrolyte-concentration PDEs in order to make the transfer functions solvable from the PDEs. This dissertation improves the fidelity of physics-based models by eliminating the need for the second assumption and by linearizing the nonlinear dynamics around different constant currents. Electrochemical transfer functions are infinite-order and cannot be expressed as a ratio of polynomials in the Laplace variable s. Thus, for practical use, these systems need to be approximated by reduced-order models that capture the most significant dynamics. This dissertation improves the generation of physics-based reduced-order models by introducing different realization algorithms, which produce a low-order model from the infinite-order electrochemical transfer functions. Physics-based reduced-order models are linear and describe cell dynamics only when operated near the setpoint at which they were generated. Hence, multiple physics-based reduced-order models need to be generated at different setpoints (i.e., state-of-charge, temperature and C-rate) in order to extend the cell operating range. This dissertation improves the implementation of physics-based reduced-order models by introducing different blending approaches that combine the pre-computed models generated (offline) at different setpoints in order to produce good electrochemical estimates (online) across the cell's state-of-charge, temperature and C-rate range.
Aircraft landing control system
NASA Technical Reports Server (NTRS)
Lambregts, Antonius A. (Inventor); Hansen, Rolf (Inventor)
1982-01-01
Upon aircraft landing approach, flare path command signals of altitude, vertical velocity and vertical acceleration are generated as functions of aircraft position and velocity with respect to the ground. The command signals are compared with corresponding actual values to generate error signals which are used to control the flight path.
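As a hedged sketch of the scheme described above, the Python fragment below generates flare commands from an assumed exponential flare path and blends the three error signals into a single control output; the flare time constant, gains and initial altitude are illustrative assumptions and are not taken from the patent.

import numpy as np

# Illustrative flare-law parameters; the actual command generation and gains of the
# patented system are not given in the abstract, so everything here is assumed.
H0 = 15.0     # m, altitude at flare initiation (assumed)
TAU = 2.0     # s, exponential flare time constant (assumed)
K = np.array([0.08, 0.5, 0.2])   # weights on altitude / sink-rate / acceleration errors

def flare_commands(t):
    """Commanded altitude, vertical velocity and vertical acceleration along an
    assumed exponential flare path h_cmd(t) = H0 * exp(-t / TAU)."""
    h_cmd = H0 * np.exp(-t / TAU)
    return np.array([h_cmd, -h_cmd / TAU, h_cmd / TAU**2])

def control_signal(t, measured):
    """Error signals (commands minus measured values) blended into one control output."""
    errors = flare_commands(t) - np.asarray(measured, dtype=float)
    return float(K @ errors)

# 1 s into the flare: measured altitude 9.5 m, sink rate -4.2 m/s, vertical accel 1.0 m/s^2
print(control_signal(1.0, [9.5, -4.2, 1.0]))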
NASA Technical Reports Server (NTRS)
Kaul, Upender K.
2008-01-01
A procedure for generating smooth uniformly clustered single-zone grids using enhanced elliptic grid generation has been demonstrated here for the Mars Science Laboratory (MSL) geometries such as aeroshell and canopy. The procedure obviates the need for generating multizone grids for such geometries, as reported in the literature. This has been possible because the enhanced elliptic grid generator automatically generates clustered grids without manual prescription of decay parameters needed with the conventional approach. In fact, these decay parameters are calculated as decay functions as part of the solution, and they are not constant over a given boundary. Since these decay functions vary over a given boundary, orthogonal grids near any arbitrary boundary can be clustered automatically without having to break up the boundaries and the corresponding interior domains into various zones for grid generation.
Functional Analyses of the Problems in Non-English Majors' Writings
ERIC Educational Resources Information Center
Li, Shun-ying
2010-01-01
Problems in generating and organizing ideas, in coherence and language competence are common in non-English majors' writings, which decrease non-English majors' ability to use English as a tool to realize its pragmatic functions and meta-functions. The exam-centered objective, the product-oriented approach, the inefficient mode of instruction, the…
Functional Genomics Using the Saccharomyces cerevisiae Yeast Deletion Collections.
Nislow, Corey; Wong, Lai Hong; Lee, Amy Huei-Yi; Giaever, Guri
2016-09-01
Constructed by a consortium of 16 laboratories, the Saccharomyces genome-wide deletion collections have, for the past decade, provided a powerful, rapid, and inexpensive approach for functional profiling of the yeast genome. Loss-of-function deletion mutants were systematically created using a polymerase chain reaction (PCR)-based gene deletion strategy to generate a start-to-stop codon replacement of each open reading frame by homologous recombination. Each strain carries two molecular barcodes that serve as unique strain identifiers, enabling their growth to be analyzed in parallel and the fitness contribution of each gene to be quantitatively assessed by hybridization to high-density oligonucleotide arrays or through the use of next-generation sequencing technologies. Functional profiling of the deletion collections, using either strain-by-strain or parallel assays, provides an unbiased approach to systematically survey the yeast genome. The Saccharomyces yeast deletion collections have proved immensely powerful in contributing to the understanding of gene function, including functional relationships between genes and genetic pathways in response to diverse genetic and environmental perturbations. © 2016 Cold Spring Harbor Laboratory Press.
Computational functional genomics-based approaches in analgesic drug discovery and repurposing.
Lippmann, Catharina; Kringel, Dario; Ultsch, Alfred; Lötsch, Jörn
2018-06-01
Persistent pain is a major healthcare problem affecting a fifth of adults worldwide with still limited treatment options. The search for new analgesics increasingly includes the novel research area of functional genomics, which combines data derived from various processes related to DNA sequence, gene expression or protein function and uses advanced methods of data mining and knowledge discovery with the goal of understanding the relationship between the genome and the phenotype. Its use in drug discovery and repurposing for analgesic indications has so far been performed using knowledge discovery in gene function and drug target-related databases; next-generation sequencing; and functional proteomics-based approaches. Here, we discuss recent efforts in functional genomics-based approaches to analgesic drug discovery and repurposing and highlight the potential of computational functional genomics in this field including a demonstration of the workflow using a novel R library 'dbtORA'.
Automated Tutoring in Interactive Environments: A Task-Centered Approach.
ERIC Educational Resources Information Center
Wolz, Ursula; And Others
1989-01-01
Discusses tutoring and consulting functions in interactive computer environments. Tutoring strategies are considered, the expert model and the user model are described, and GENIE (Generated Informative Explanations)--an answer generating system for the Berkeley Unix Mail system--is explained as an example of an automated consulting system. (33…
Designing and Testing Functional RNA Nanoparticles | Center for Cancer Research
Recent advances in nanotechnology have generated excitement that nanomaterials may provide novel approaches for the diagnosis and treatment of deadly diseases, such as cancer. However, the use of synthetic materials to generate nanoparticles can present challenges with endotoxin content, sterility, or biocompatibility. Employing biological materials may overcome these issues
Broccoli, Vania; Rubio, Alicia; Taverna, Stefano; Yekhlef, Latefa
2015-06-01
The advent of cell reprogramming technologies has opened up the possibility of direct access to human neurons for experimental and biomedical applications. Human pluripotent stem cells can be instructed in vitro to generate specific neuronal cell types as well as different glial cells. Moreover, new approaches of direct neuronal cell reprogramming can strongly accelerate the generation of different neuronal lineages. However, genetic heterogeneity, reprogramming fidelity, and time in culture of the starting cells can still significantly bias their differentiation efficiency and the quality of the neuronal progenies. In addition, reprogrammed human neurons exhibit a very slow pace in gaining a full spectrum of functional properties, including physiological levels of membrane excitability, sustained and prolonged action potential firing, mature synaptic currents and synaptic plasticity. This delay poses serious limitations on their significance as a biological experimental model and screening platform. We will discuss new approaches of neuronal cell differentiation and reprogramming as well as methods to accelerate the maturation and functional activity of the converted human neurons. © 2015 by the Society for Experimental Biology and Medicine.
Towards integrated hygiene and food safety management systems: the Hygieneomic approach.
Armstrong, G D
1999-09-15
Integrated hygiene and food safety management systems in food production can give rise to exceptional improvements in food safety performance, but require high level commitment and full functional involvement. A new approach, named hygieneomics, has been developed to assist management in their introduction of hygiene and food safety systems. For an effective introduction, the management systems must be designed to fit with the current generational state of an organisation. There are, broadly speaking, four generational states of an organisation in their approach to food safety. They comprise: (i) rules setting; (ii) ensuring compliance; (iii) individual commitment; (iv) interdependent action. In order to set up an effective integrated hygiene and food safety management system a number of key managerial requirements are necessary. The most important ones are: (a) management systems must integrate the activities of key functions from research and development through to supply chain and all functions need to be involved; (b) there is a critical role for the senior executive, in communicating policy and standards; (c) responsibilities must be clearly defined, and it should be clear that food safety is a line management responsibility not to be delegated to technical or quality personnel; (d) a thorough and effective multi-level audit approach is necessary; (e) key activities in the system are HACCP and risk management, but it is stressed that these are ongoing management activities, not once-off paper generating exercises; and (f) executive management board level review is necessary of audit results, measurements, status and business benefits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katanin, A. A., E-mail: katanin@mail.ru
We consider formulations of the functional renormalization-group (fRG) flow for correlated electronic systems with the dynamical mean-field theory as a starting point. We classify the corresponding renormalization-group schemes into those neglecting one-particle irreducible six-point vertices (with respect to the local Green's functions) and those neglecting one-particle reducible six-point vertices. The former class is represented by the recently introduced DMF²RG approach [31], but also by the scale-dependent generalization of the one-particle irreducible representation (with respect to local Green's functions, 1PI-LGF) of the generating functional [20]. The second class is represented by the fRG flow within the dual fermion approach [16, 32]. We compare formulations of the fRG approach in each of these cases and suggest their further application to the study of 2D systems within the Hubbard model.
Saupe, Jörn; Kunz, Oliver; Haustedt, Lars Ole; Jakupovic, Sven; Mang, Christian
2017-09-04
Macrocycles are a structural class bearing great promise for future challenges in medicinal chemistry. Nevertheless, there are few flexible approaches for the rapid generation of structurally diverse macrocyclic compound collections. Here, an efficient method for the generation of novel macrocyclic peptide-based scaffolds is reported. The process, named here as "MacroEvoLution", is based on a cyclization screening approach that gives reliable access to novel macrocyclic architectures. Classification of building blocks into specific pools ensures that scaffolds with orthogonally addressable functionalities are generated, which can easily be used for the generation of structurally diverse compound libraries. The method grants rapid access to novel scaffolds with scalable synthesis (multi gram scale) and the introduction of further diversity at a late stage. Despite being developed for peptidic systems, the approach can easily be extended for the synthesis of systems with a decreased peptidic character. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
Fermionic Approach to Weighted Hurwitz Numbers and Topological Recursion
NASA Astrophysics Data System (ADS)
Alexandrov, A.; Chapuy, G.; Eynard, B.; Harnad, J.
2017-12-01
A fermionic representation is given for all the quantities entering in the generating function approach to weighted Hurwitz numbers and topological recursion. This includes: KP and 2D Toda {τ} -functions of hypergeometric type, which serve as generating functions for weighted single and double Hurwitz numbers; the Baker function, which is expanded in an adapted basis obtained by applying the same dressing transformation to all vacuum basis elements; the multipair correlators and the multicurrent correlators. Multiplicative recursion relations and a linear differential system are deduced for the adapted bases and their duals, and a Christoffel-Darboux type formula is derived for the pair correlator. The quantum and classical spectral curves linking this theory with the topological recursion program are derived, as well as the generalized cut-and-join equations. The results are detailed for four special cases: the simple single and double Hurwitz numbers, the weakly monotone case, corresponding to signed enumeration of coverings, the strongly monotone case, corresponding to Belyi curves and the simplest version of quantum weighted Hurwitz numbers.
Minimalism in radiation synthesis of biomedical functional nanogels.
Dispenza, Clelia; Sabatino, Maria Antonietta; Grimaldi, Natascia; Bulone, Donatella; Bondì, Maria Luisa; Casaletto, Maria Pia; Rigogliuso, Salvatrice; Adamo, Giorgia; Ghersi, Giulio
2012-06-11
A scalable, single-step, synthetic approach for the manufacture of biocompatible, functionalized micro- and nanogels is presented. In particular, poly(N-vinyl pyrrolidone)-grafted-(aminopropyl)methacrylamide microgels and nanogels were generated through e-beam irradiation of PVP aqueous solutions in the presence of a primary amino-group-carrying monomer. Particles with different hydrodynamic diameters and surface charge densities were obtained by varying the irradiation conditions. The chemical structure was investigated by different spectroscopic techniques. Fluorescent variants were generated through fluorescein isothiocyanate attachment to the primary amino groups grafted to PVP, both to quantify the functional groups available for bioconjugation and to follow nanogel localization in cell cultures. Finally, a model protein, bovine serum albumin, was conjugated to the nanogels to demonstrate the attachment of biologically relevant molecules for targeting purposes in drug delivery. The described approach provides a novel strategy to fabricate biohybrid nanogels with very promising potential in nanomedicine.
Spinelli, Sherry L.; Lannan, Katie L.; Loelius, Shannon G.
2017-01-01
Human blood platelets are major hemostatic regulators in the circulation and important in the mediation of chronic inflammation and immunomodulation. They are key elements that promote cardiovascular pathogenesis that leads to atherosclerosis, thrombosis, myocardial infarction, and stroke. New information on tobacco use and platelet dysregulation shows that these highly understudied vascular cells are dysregulated by tobacco smoke. Thus, platelet function studies should be an important consideration for the evaluation of existing and next-generation tobacco and non-tobacco products. Novel in vitro approaches are being sought to investigate these products and their influence on platelet function. Platelets are ideally suited for product assessment, as robust and novel in vitro translational methods are available to assess platelet function. Furthermore, the use of human biological systems has the advantage that risk predictions will better reflect the human condition.
Direct α-C-H bond functionalization of unprotected cyclic amines
NASA Astrophysics Data System (ADS)
Chen, Weijie; Ma, Longle; Paul, Anirudra; Seidel, Daniel
2018-02-01
Cyclic amines are ubiquitous core structures of bioactive natural products and pharmaceutical drugs. Although the site-selective abstraction of C-H bonds is an attractive strategy for preparing valuable functionalized amines from their readily available parent heterocycles, this approach has largely been limited to substrates that require protection of the amine nitrogen atom. In addition, most methods rely on transition metals and are incompatible with the presence of amine N-H bonds. Here we introduce a protecting-group-free approach for the α-functionalization of cyclic secondary amines. An operationally simple one-pot procedure generates products via a process that involves intermolecular hydride transfer to generate an imine intermediate that is subsequently captured by a nucleophile, such as an alkyl or aryl lithium compound. Reactions are regioselective and stereospecific and enable the rapid preparation of bioactive amines, as exemplified by the facile synthesis of anabasine and (-)-solenopsin A.
Generating Test Templates via Automated Theorem Proving
NASA Technical Reports Server (NTRS)
Kancherla, Mani Prasad
1997-01-01
Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems. This method applies to systems with functional rather than state-based behaviors. This approach allows for the use of incomplete specifications to aid in the generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use on the analysis of a spacecraft scheduling system.
Phosphorylation-dependent cleavage regulates von Hippel Lindau proteostasis and function
German, Peter; Bai, Shanshan; Liu, Xian-De; Sun, Mianen; Zhou, Lijun; Kalra, Sarathi; Zhang, Xuesong; Minelli, Rosalba; Scott, Kenneth L.; Mills, Gordon B.; Jonasch, Eric; Ding, Zhiyong
2016-01-01
Loss of von Hippel Lindau (VHL) protein function is a key driver of VHL diseases, including sporadic and inherited clear cell renal cell carcinoma. Modulation of the proteostasis of VHL, especially missense point-mutated VHL, is a promising approach to augmenting VHL levels and function. VHL proteostasis is regulated by multiple mechanisms including folding, chaperone binding, complex formation, and phosphorylation. Nevertheless, many details underlying the regulations of VHL proteostasis are unknown. VHL is expressed as two variants, VHL30 and VHL19. Furthermore, the long form variant of VHL was often detected as multiple bands by Western blotting. However, how these multiple species of VHL are generated and whether the process regulates VHL proteostasis and function are unknown. We hypothesized that the two major species are generated by VHL protein cleavage, and the cleavage regulates VHL proteostasis and subsequent function. We characterized VHL species using genetic and pharmacologic approaches and showed that VHL was first cleaved at the N-terminus by chymotrypsin C before being directed for proteasomal degradation. Casein kinase 2-mediated phosphorylation at VHL N-terminus was required for the cleavage. Furthermore, inhibition of cleavage stabilized VHL protein, thereby promoting HIF downregulation. Our study reveals a novel mechanism regulating VHL proteostasis and function, which is significant for identifying new drug targets and developing new therapeutic approaches targeting VHL deficiency in VHL diseases.
A probabilistic approach to photovoltaic generator performance prediction
NASA Astrophysics Data System (ADS)
Khallat, M. A.; Rahman, S.
1986-09-01
A method for predicting the performance of a photovoltaic (PV) generator based on long term climatological data and expected cell performance is described. The equations for cell model formulation are provided. Use of the statistical model for characterizing the insolation level is discussed. The insolation data is fitted to appropriate probability distribution functions (Weibull, beta, normal). The probability distribution functions are utilized to evaluate the capacity factors of PV panels or arrays. An example is presented revealing the applicability of the procedure.
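As a hedged illustration of the statistical step described above, the Python sketch below fits a Weibull distribution to an insolation sample and integrates a simple power model over the fitted density to estimate a capacity factor; the linear power model, the rated insolation of 1000 W/m2 and the synthetic data are assumptions chosen for illustration and do not reproduce the paper's cell model.

import numpy as np
from scipy import stats

def capacity_factor(insolation_w_m2, rated_insolation=1000.0):
    """Estimate a PV capacity factor from an insolation sample.

    Fits a Weibull distribution to the daytime insolation values and integrates
    an assumed linear power model P/P_rated = min(G / G_rated, 1) over the
    fitted probability density.
    """
    g = np.asarray(insolation_w_m2, dtype=float)
    daytime = g[g > 0]                                  # fit daytime values only
    shape, loc, scale = stats.weibull_min.fit(daytime, floc=0.0)
    dist = stats.weibull_min(shape, loc=loc, scale=scale)

    grid = np.linspace(0.0, daytime.max() * 1.5, 2000)
    power_frac = np.clip(grid / rated_insolation, 0.0, 1.0)
    daytime_cf = np.sum(power_frac * dist.pdf(grid)) * (grid[1] - grid[0])

    # Scale by the fraction of hours with nonzero insolation.
    return daytime_cf * len(daytime) / len(g)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hours = 12 * 365
    sample = np.concatenate([np.zeros(hours),                     # night hours
                             rng.weibull(2.0, hours) * 450.0])    # synthetic daytime data
    print(f"estimated capacity factor: {capacity_factor(sample):.2f}")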
A comparison of different methods to implement higher order derivatives of density functionals
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Dam, Hubertus J.J.
Density functional theory is the dominant approach in electronic structure methods today. To calculate properties, higher order derivatives of the density functionals are required. These derivatives might be implemented manually, by automatic differentiation, or by symbolic algebra programs. Different authors have cited different reasons for using the particular method of their choice. This paper presents work where all three approaches were used, and the strengths and weaknesses of each approach are considered. It is found that all three methods produce code that is sufficiently performant for practical applications, despite the fact that our symbolic-algebra-generated code and our automatic differentiation code still have scope for significant optimization. The automatic differentiation approach is the best option for producing readable and maintainable code.
A new approach for describing glass transition kinetics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasin, N. M.; Shchelkachev, M. G.; Vinokur, V. M.
2010-04-01
We use a functional integral technique generalizing the Keldysh diagram technique to describe glass transition kinetics. We show that the Keldysh functional approach takes the dynamical determinant arising in the glass dynamics into account exactly and generalizes the traditional approach based on the supersymmetric dynamic generating functional method. In contrast to the supersymmetric method, this approach allows avoiding additional Grassmannian fields and tracking the violation of the fluctuation-dissipation theorem explicitly. We use this method to describe the dynamics of an Edwards-Anderson soft-spin-glass-type model near the paramagnet-glass transition. We show that a Vogel-Fulcher-type dynamics arises in the fluctuation region only if the fluctuation-dissipation theorem is violated in the process of dynamical renormalization of the Keldysh action in the replica space.
A Parvovirus B19 synthetic genome: sequence features and functional competence.
Manaresi, Elisabetta; Conti, Ilaria; Bua, Gloria; Bonvicini, Francesca; Gallinella, Giorgio
2017-08-01
Central to genetic studies for Parvovirus B19 (B19V) is the availability of genomic clones that may possess functional competence and ability to generate infectious virus. In our study, we established a new model genetic system for Parvovirus B19. A synthetic approach was followed, by design of a reference genome sequence, by generation of a corresponding artificial construct and its molecular cloning in a complete and functional form, and by setup of an efficient strategy to generate infectious virus, via transfection in UT7/EpoS1 cells and amplification in erythroid progenitor cells. The synthetic genome was able to generate virus with biological properties paralleling those of native virus, its infectious activity being dependent on the preservation of self-complementarity and sequence heterogeneity within the terminal regions. A virus of defined genome sequence, obtained from controlled cell culture conditions, can constitute a reference tool for investigation of the structural and functional characteristics of the virus. Copyright © 2017 Elsevier Inc. All rights reserved.
Subject Expression in L2 Spanish: Convergence of Generative and Usage-Based Perspectives?
ERIC Educational Resources Information Center
Zyzik, Eve
2017-01-01
The extensive literature on subject expression in Spanish makes for rich comparisons between generative (formal) and usage-based (functional) approaches to language acquisition. This article explores how the problem of subject expression has been conceptualized within each research tradition, as well as unanswered questions that both approaches…
Nair, Smita K; Driscoll, Timothy; Boczkowski, David; Schmittling, Robert; Reynolds, Renee; Johnson, Laura A; Grant, Gerald; Fuchs, Herbert; Bigner, Darell D; Sampson, John H; Gururangan, Sridharan; Mitchell, Duane A
2015-10-01
Generation of patient-derived, autologous dendritic cells (DCs) is a critical component of cancer immunotherapy with ex vivo-generated, tumor antigen-loaded DCs. An important factor in the ability to generate DCs is the potential impact of prior therapies on DC phenotype and function. We investigated the ability to generate DCs using cells harvested from pediatric patients with medulloblastoma for potential evaluation of a DC-RNA-based vaccination approach in this patient population. Cells harvested from medulloblastoma patient leukapheresis following induction chemotherapy and granulocyte colony stimulating factor mobilization were cryopreserved prior to use in DC generation. DCs were generated from the adherent CD14+ monocytes using standard procedures and analyzed for cell recovery, phenotype and function. To summarize, 4 out of 5 patients (80%) had sufficient monocyte recovery to permit DC generation, and we were able to generate DCs from 3 out of these 4 patient samples (75%). Overall, we successfully generated DCs that met phenotypic requisites for DC-based cancer therapy from 3 out of 5 (60%) patient samples and met both phenotypic and functional requisites from 2 out of 5 (40%) patient samples. This study highlights the potential to generate functional DCs for further clinical treatments from refractory patients that have been heavily pretreated with myelosuppressive chemotherapy. Here we demonstrate the utility of evaluating the effect of the currently employed standard-of-care therapies on the ex vivo generation of DCs for DC-based clinical studies in cancer patients.
Conjugate gradient minimisation approach to generating holographic traps for ultracold atoms.
Harte, Tiffany; Bruce, Graham D; Keeling, Jonathan; Cassettari, Donatella
2014-11-03
Direct minimisation of a cost function can in principle provide a versatile and highly controllable route to computational hologram generation. Here we show that the careful design of cost functions, combined with numerically efficient conjugate gradient minimisation, establishes a practical method for the generation of holograms for a wide range of target light distributions. This results in a guided optimisation process, with a crucial advantage illustrated by the ability to circumvent optical vortex formation during hologram calculation. We demonstrate the implementation of the conjugate gradient method for both discrete and continuous intensity distributions and discuss its applicability to optical trapping of ultracold atoms.
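The Python sketch below is a minimal, hedged illustration of hologram generation by direct cost-function minimisation with a conjugate gradient routine. It optimises a phase-only hologram whose far field is modelled by an FFT, uses a simple quadratic cost on the intensity mismatch, and relies on SciPy's generic 'CG' minimiser with numerical gradients; the analytic gradients, tailored cost functions and vortex-avoidance measures of the paper are not reproduced here.

import numpy as np
from scipy.optimize import minimize

N = 16                                    # small grid so numerical gradients stay cheap
target = np.zeros((N, N))
target[5:11, 5:11] = 1.0                  # target intensity: a bright square
target /= target.sum()

def cost(phase_flat):
    """Quadratic mismatch between the normalised far-field intensity and the target."""
    phase = phase_flat.reshape(N, N)
    far_field = np.fft.fft2(np.exp(1j * phase)) / N
    intensity = np.abs(far_field) ** 2
    intensity /= intensity.sum()
    return np.sum((intensity - target) ** 2)

rng = np.random.default_rng(1)
phase0 = rng.uniform(0.0, 2.0 * np.pi, N * N)

# Conjugate-gradient minimisation of the cost over the hologram phase pattern.
result = minimize(cost, phase0, method="CG", options={"maxiter": 100})
print("initial cost:", cost(phase0), "final cost:", result.fun)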
Tissue Engineering of the Corneal Endothelium: A Review of Carrier Materials
Teichmann, Juliane; Valtink, Monika; Nitschke, Mirko; Gramm, Stefan; Funk, Richard H.W.; Engelmann, Katrin; Werner, Carsten
2013-01-01
Functional impairment of the human corneal endothelium can lead to corneal blindness. In order to meet the high demand for transplants with an appropriate human corneal endothelial cell density as a prerequisite for corneal function, several tissue engineering techniques have been developed to generate transplantable endothelial cell sheets. These approaches range from the use of natural membranes, biological polymers and biosynthetic material compositions, to completely synthetic materials as matrices for corneal endothelial cell sheet generation. This review gives an overview about currently used materials for the generation of transplantable corneal endothelial cell sheets with a special focus on thermo-responsive polymer coatings.
NASA Astrophysics Data System (ADS)
Kidon, Lyran; Wilner, Eli Y.; Rabani, Eran
2015-12-01
The generalized quantum master equation provides a powerful tool to describe the dynamics in quantum impurity models driven away from equilibrium. Two complementary approaches, one based on the Nakajima-Zwanzig-Mori time-convolution (TC) formulation and the other on the Tokuyama-Mori time-convolutionless (TCL) formulation, provide a starting point to describe the time evolution of the reduced density matrix. A key step in both approaches is to obtain the so-called "memory kernel" or "generator," going beyond second- or fourth-order perturbation techniques. While numerically converged techniques are available for the TC memory kernel, the canonical approach to obtain the TCL generator is based on inverting a super-operator in the full Hilbert space, which is difficult to perform; thus, nearly all applications of the TCL approach rely on a perturbative scheme of some sort. Here, the TCL generator is expressed using a reduced system propagator which can be obtained from system observables alone and requires the calculation of super-operators and their inverses in the reduced Hilbert space rather than the full one. This makes the formulation amenable to quantum impurity solvers or to diagrammatic techniques, such as the nonequilibrium Green's function. We implement the TCL approach for the resonant level model driven away from equilibrium and compare the time scales for the decay of the generator with that of the memory kernel in the TC approach. Furthermore, the effects of temperature, source-drain bias, and gate potential on the TCL/TC generators are discussed.
Gonzalez-Vargas, Jose; Dosen, Strahinja; Amsuess, Sebastian; Yu, Wenwei; Farina, Dario
2015-01-01
Modern assistive devices are very sophisticated systems with multiple degrees of freedom. However, an effective and user-friendly control of these systems is still an open problem since conventional human-machine interfaces (HMI) cannot easily accommodate the system's complexity. In HMIs, the user is responsible for generating unique patterns of command signals directly triggering the device functions. This approach can be difficult to implement when there are many functions (necessitating many command patterns) and/or the user has a considerable impairment (limited number of available signal sources). In this study, we propose a novel concept for a general-purpose HMI where the controller and the user communicate bidirectionally to select the desired function. The system first presents possible choices to the user via electro-tactile stimulation; the user then acknowledges the desired choice by generating a single command signal. Therefore, the proposed approach simplifies the user communication interface (one signal to generate), decoding (one signal to recognize), and allows selecting from a number of options. To demonstrate the new concept the method was used in one particular application, namely, to implement the control of all the relevant functions in a state of the art commercial prosthetic hand without using any myoelectric channels. We performed experiments in healthy subjects and with one amputee to test the feasibility of the novel approach. The results showed that the performance of the novel HMI concept was comparable or, for some outcome measures, better than the classic myoelectric interfaces. The presented approach has a general applicability and the obtained results point out that it could be used to operate various assistive systems (e.g., prosthesis vs. wheelchair), or it could be integrated into other control schemes (e.g., myoelectric control, brain-machine interfaces) in order to improve the usability of existing low-bandwidth HMIs.
Physics-based real time ground motion parameter maps: the Central Mexico example
NASA Astrophysics Data System (ADS)
Ramirez Guzman, L.; Contreras Ruiz Esparza, M. G.; Quiroz Ramirez, A.; Carrillo Lucia, M. A.; Perez Yanez, C.
2013-12-01
We present the use of near-real-time ground motion simulations in the generation of ground motion parameter maps for Central Mexico. Simple algorithmic approaches to predicting ground motion parameters of interest to civil protection and risk engineering are based on the use of observed instrumental values, reported macroseismic intensities and their correlations, and ground motion prediction equations (GMPEs). A remarkable example of the use of this approach is the worldwide ShakeMap generation program of the United States Geological Survey (USGS). Nevertheless, simple approaches rely strongly on the availability of instrumental and macroseismic intensity reports, as well as on the accuracy of the GMPEs and the site-effect amplification calculation. In regions where information is scarce, most of the ground motion information comes from the GMPEs, which provide reference values only in a mean sense, together with site-effect amplification computed with simple parametric approaches (e.g., the use of Vs30), and accurate estimates have proven elusive. Here we propose an approach that includes physics-based ground motion predictions (PBGMP) corrected with instrumental information using a Bayesian Kriging approach (Kitanidis, 1983) and apply it to the central region of Mexico. The method assumes: 1) the availability of a large database of low- and high-frequency Green's functions developed for the region of interest, using fully three-dimensional and representative one-dimensional models; 2) enough real-time data to obtain the centroid moment tensor and a slip rate function; and 3) a computational infrastructure that can be used to compute the source parameters and generate broadband synthetics in near real time, which will be combined with recorded instrumental data. Using a recently developed velocity model of Central Mexico and an efficient finite-element octree-based implementation, we generate a database of source-receiver Green's functions, valid to 0.5 Hz, that covers 160 km x 300 km x 700 km of Mexico, including a large portion of the Pacific Mexican subduction zone. A subset of the velocity and strong ground motion data available in real time is processed to obtain the source parameters needed to generate broadband ground motions on a dense grid (10 km x 10 km cells). These are later interpolated with instrumental values using a Bayesian Kriging method. Peak ground velocity and acceleration maps, as well as SA (T = 0.1, 0.5, 1 and 2 s) maps, are generated for a small set of medium- to large-magnitude Mexican earthquakes (Mw = 5 to 7.4). We evaluate each map by comparing against stations not considered in the computation.
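The blending step described above can be sketched with a simplified residual-kriging update (a stand-in, not the full Bayesian Kriging of Kitanidis, 1983): the simulated grid acts as the prior mean, station residuals (observed minus simulated) are interpolated with an assumed exponential covariance, and the interpolated residual corrects the grid. The covariance parameters and station layout below are purely illustrative.

import numpy as np

def exp_cov(d, sigma2=0.2, length=40.0):
    """Assumed exponential covariance of log-ground-motion residuals (distances in km)."""
    return sigma2 * np.exp(-d / length)

def krige_correction(grid_xy, sta_xy, residuals, nugget=0.01):
    """Simple-kriging interpolation of station residuals onto the grid."""
    d_ss = np.linalg.norm(sta_xy[:, None, :] - sta_xy[None, :, :], axis=-1)
    d_gs = np.linalg.norm(grid_xy[:, None, :] - sta_xy[None, :, :], axis=-1)
    c_ss = exp_cov(d_ss) + nugget * np.eye(len(sta_xy))
    c_gs = exp_cov(d_gs)
    weights = np.linalg.solve(c_ss, residuals)     # C_ss^{-1} r
    return c_gs @ weights                          # kriged residual at the grid points

# Illustrative example: a 10 km grid and three stations with known residuals.
xs, ys = np.meshgrid(np.arange(0, 100, 10.0), np.arange(0, 100, 10.0))
grid = np.column_stack([xs.ravel(), ys.ravel()])
stations = np.array([[20.0, 30.0], [55.0, 60.0], [80.0, 20.0]])
residuals = np.array([0.15, -0.10, 0.05])          # ln(observed PGA) - ln(simulated PGA)

simulated_ln_pga = np.full(len(grid), np.log(0.05))  # placeholder simulated field
corrected_ln_pga = simulated_ln_pga + krige_correction(grid, stations, residuals)
print(np.exp(corrected_ln_pga).reshape(xs.shape).round(3))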
Movahedi, Kiavash; Wiegmann, Robert; De Vlaminck, Karen; Van Ginderachter, Jo A; Nikolaev, Viacheslav O
2018-07-01
Functional mosaic analysis allows for the direct comparison of mutant cells with differentially marked control cells in the same organism. While this offers a powerful approach for elucidating the role of specific genes or signalling pathways in cell populations of interest, genetic strategies for generating functional mosaicism remain challenging. We describe a novel and streamlined approach for functional mosaic analysis, which combines stochastic Cre/lox recombination with gene targeting in the ROSA26 locus. With the RoMo strategy a cell population of interest is randomly split into a cyan fluorescent and red fluorescent subset, of which the latter overexpresses a chosen transgene. To integrate this approach into high-throughput gene targeting initiatives, we developed a procedure that utilizes Gateway cloning for the generation of new targeting vectors. RoMo can be used for gain-of-function experiments or for altering signaling pathways in a mosaic fashion. To demonstrate this, we developed RoMo-dnGs mice, in which Cre-recombined red fluorescent cells co-express a dominant-negative Gs protein. RoMo-dnGs mice allowed us to inhibit G protein-coupled receptor activation in a fraction of cells, which could then be directly compared to differentially marked control cells in the same animal. We demonstrate how RoMo-dnGs mice can be used to obtain mosaicism in the brain and in peripheral organs for various cell types. RoMo offers an efficient new approach for functional mosaic analysis that extends the current toolbox and may reveal important new insights into in vivo gene function. © 2018 Wiley Periodicals, Inc.
Dekker, Job; Belmont, Andrew S; Guttman, Mitchell; Leshyk, Victor O; Lis, John T; Lomvardas, Stavros; Mirny, Leonid A; O'Shea, Clodagh C; Park, Peter J; Ren, Bing; Politz, Joan C Ritland; Shendure, Jay; Zhong, Sheng
2017-09-13
The 4D Nucleome Network aims to develop and apply approaches to map the structure and dynamics of the human and mouse genomes in space and time with the goal of gaining deeper mechanistic insights into how the nucleus is organized and functions. The project will develop and benchmark experimental and computational approaches for measuring genome conformation and nuclear organization, and investigate how these contribute to gene regulation and other genome functions. Validated experimental technologies will be combined with biophysical approaches to generate quantitative models of spatial genome organization in different biological states, both in cell populations and in single cells.
Dekker, Job; Belmont, Andrew S.; Guttman, Mitchell; Leshyk, Victor O.; Lis, John T.; Lomvardas, Stavros; Mirny, Leonid A.; O’Shea, Clodagh C.; Park, Peter J.; Ren, Bing; Ritland Politz, Joan C.; Shendure, Jay; Zhong, Sheng
2017-01-01
The 4D Nucleome Network aims to develop and apply approaches to map the structure and dynamics of the human and mouse genomes in space and time with the goal of gaining deeper mechanistic understanding of how the nucleus is organized and functions. The project will develop and benchmark experimental and computational approaches for measuring genome conformation and nuclear organization, and investigate how these contribute to gene regulation and other genome functions. Validated experimental approaches will be combined with biophysical modeling to generate quantitative models of spatial genome organization in different biological states, both in cell populations and in single cells.
Synthetic biology: Novel approaches for microbiology.
Padilla-Vaca, Felipe; Anaya-Velázquez, Fernando; Franco, Bernardo
2015-06-01
In the past twenty years, molecular genetics has created powerful tools for the genetic manipulation of living organisms. Whole genome sequencing has provided the information necessary to assess gene function and protein networks. In addition, new tools make it possible to modify organisms to perform desired tasks. Gene function analysis is sped up by novel approaches that couple high-throughput data generation and mining. Synthetic biology is an emerging field that uses tools for generating novel gene networks and for whole genome synthesis and engineering. New applications in biotechnological, pharmaceutical and biomedical research are envisioned for synthetic biology. In recent years these new strategies have opened up possibilities for studying gene and genome editing and for creating novel tools for functional studies in viruses, parasites and pathogenic bacteria. There is also the possibility of re-designing organisms to generate vaccine subunits or to produce new pharmaceuticals to combat multi-drug-resistant pathogens. In this review we provide our opinion on the applicability of synthetic biology strategies for functional studies of pathogenic organisms and some applications, such as genome editing and gene network studies, to further comprehend virulence factors and determinants in pathogenic organisms. We also discuss what we consider important ethical issues for this field of molecular biology, especially the potential misuse of the new technologies. Copyright© by the Spanish Society for Microbiology and Institute for Catalan Studies.
Elbasha, Elamin H
2005-05-01
The availability of patient-level data from clinical trials has spurred a lot of interest in developing methods for quantifying and presenting uncertainty in cost-effectiveness analysis (CEA). Although the majority has focused on developing methods for using sample data to estimate a confidence interval for an incremental cost-effectiveness ratio (ICER), a small strand of the literature has emphasized the importance of incorporating risk preferences and the trade-off between the mean and the variance of returns to investment in health and medicine (mean-variance analysis). This paper shows how the exponential utility-moment-generating function approach is a natural extension to this branch of the literature for modelling choices from healthcare interventions with uncertain costs and effects. The paper assumes an exponential utility function, which implies constant absolute risk aversion, and is based on the fact that the expected value of this function results in a convenient expression that depends only on the moment-generating function of the random variables. The mean-variance approach is shown to be a special case of this more general framework. The paper characterizes the solution to the resource allocation problem using standard optimization techniques and derives the summary measure researchers need to estimate for each programme, when the assumption of risk neutrality does not hold, and compares it to the standard incremental cost-effectiveness ratio. The importance of choosing the correct distribution of costs and effects and the issues related to estimation of the parameters of the distribution are also discussed. An empirical example to illustrate the methods and concepts is provided. Copyright 2004 John Wiley & Sons, Ltd
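To make the central step concrete, the short derivation below is a standard textbook result rather than a formula taken from the paper: with exponential utility, expected utility depends only on the moment-generating function of the uncertain net benefit, and the normal case recovers the mean-variance rule.

% Exponential utility over an uncertain net benefit B with risk aversion r > 0:
%   u(B) = -exp(-rB).
\begin{align}
  \mathbb{E}[u(B)] &= -\,\mathbb{E}\!\left[e^{-rB}\right] = -M_B(-r),
\end{align}
so expected utility depends only on the moment-generating function $M_B$. If $B \sim \mathcal{N}(\mu,\sigma^2)$, then $M_B(-r) = \exp\!\left(-r\mu + \tfrac{1}{2}r^2\sigma^2\right)$ and the certainty equivalent is
\begin{align}
  CE(B) = -\tfrac{1}{r}\,\ln\!\left(-\mathbb{E}[u(B)]\right) = \mu - \tfrac{r}{2}\,\sigma^2,
\end{align}
which shows the mean-variance criterion as the special (normal) case of the exponential utility-moment-generating function framework described in the abstract.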
Pre-Proposal Assessment of Reliability for Spacecraft Docking with Limited Information
NASA Technical Reports Server (NTRS)
Brall, Aron
2013-01-01
This paper addresses the problem of estimating the reliability of a critical system function, as well as its impact on the system reliability, when limited information is available. The approach addresses the basic function reliability and then the impact of multiple attempts to accomplish the function. The dependence of subsequent attempts on prior failure to accomplish the function is also addressed. The autonomous docking of two spacecraft was the specific example that generated the inquiry, and the resulting impact on total reliability generated substantial interest in presenting the results, because overall performance proved relatively insensitive to the basic function reliability and to moderate degradation, given sufficient attempts to accomplish the required goal. The application of the methodology allows proper emphasis on the characteristics that can be estimated with some knowledge, and insulates the integrity of the design from those characteristics that cannot be properly estimated with any rational value of uncertainty. The nature of NASA's missions contains a great deal of uncertainty due to the pursuit of new science or operations. This approach can be applied to any function where multiple attempts at success, with or without degradation, are allowed.
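As a hedged illustration of the effect described above, the sketch below computes the probability of accomplishing a function within n attempts when each failed attempt degrades the per-attempt success probability by an assumed factor; the numbers are illustrative and are not the paper's spacecraft docking estimates.

def mission_success(p_first, n_attempts, degradation=0.9):
    """Probability that at least one of n attempts succeeds.

    p_first     : success probability of the first attempt
    degradation : multiplicative penalty applied to the success probability
                  after each failed attempt (1.0 = independent retries); assumed value
    """
    p_fail_all = 1.0
    p = p_first
    for _ in range(n_attempts):
        p_fail_all *= (1.0 - p)
        p *= degradation   # later attempts assumed slightly less likely to succeed
    return 1.0 - p_fail_all

# Overall success is fairly insensitive to the basic attempt reliability
# once several (possibly degraded) retries are allowed.
for p0 in (0.70, 0.85, 0.95):
    print(p0, [round(mission_success(p0, n), 4) for n in (1, 2, 3, 5)])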
Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Moradkhani, Hamid
2017-03-01
Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence in hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function that is developed based on a bivariate joint distribution between the observations and the simulations in the historical period. The transfer function is used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of the precipitation ensemble forecast. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the outputs from an existing procedure, i.e. the mixed-type meta-Gaussian distribution. Monthly precipitation from the Climate Forecast System Reanalysis (CFS) and gridded observations from the Parameter-Elevation Relationships on Independent Slopes Model (PRISM) have been employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized in order to evaluate the outputs from the proposed technique. The distribution of seasonal precipitation for the generated ensemble from the copula-based technique is compared to the observations and raw forecasts for three sub-basins located in the Western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, the COP-EPP demonstrates considerable improvement in the ensemble forecast in both deterministic and probabilistic verification, in particular in characterizing extreme events in wet seasons.
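The transfer-function idea can be sketched with a simple Gaussian-copula post-processor, used here as a hedged stand-in; the paper's COP-EPP method and the meta-Gaussian benchmark involve additional modelling choices not reproduced below. Historical forecast-observation pairs are mapped to normal scores, their correlation defines the copula, and the conditional distribution given a new forecast is sampled and mapped back to precipitation space through the observed climatology.

import numpy as np
from scipy import stats

def fit_gaussian_copula(fcst_hist, obs_hist):
    """Estimate the copula correlation from normal scores of the historical pairs."""
    u_f = stats.rankdata(fcst_hist) / (len(fcst_hist) + 1.0)
    u_o = stats.rankdata(obs_hist) / (len(obs_hist) + 1.0)
    z_f, z_o = stats.norm.ppf(u_f), stats.norm.ppf(u_o)
    return np.corrcoef(z_f, z_o)[0, 1]

def postprocess(fcst_new, fcst_hist, obs_hist, rho, n_members=20, seed=0):
    """Generate an ensemble of observation-space values conditional on a new forecast."""
    rng = np.random.default_rng(seed)
    # Normal score of the new forecast relative to the historical forecast climatology.
    u_new = (np.searchsorted(np.sort(fcst_hist), fcst_new) + 0.5) / (len(fcst_hist) + 1.0)
    z_new = stats.norm.ppf(u_new)
    # Conditional Gaussian copula: z_obs | z_fcst ~ N(rho * z_fcst, 1 - rho^2).
    z_members = rho * z_new + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_members)
    u_members = stats.norm.cdf(z_members)
    # Map back through the empirical quantiles of the observed climatology.
    return np.quantile(obs_hist, u_members)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 30.0, 300)                 # synthetic monthly precipitation (mm)
fcst = 0.7 * obs + rng.gamma(2.0, 10.0, 300)    # biased, noisy synthetic forecasts
rho = fit_gaussian_copula(fcst, obs)
print("copula correlation:", round(rho, 2))
print("post-processed ensemble (mm):", postprocess(80.0, fcst, obs, rho).round(1))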
Generalized serial search code acquisition - The equivalent circular state diagram approach
NASA Technical Reports Server (NTRS)
Polydoros, A.; Simon, M. K.
1984-01-01
A transform-domain method for deriving the generating function of the acquisition process resulting from an arbitrary serial search strategy is presented. The method relies on equivalent circular state diagrams, uses Mason's formula from flow-graph theory, and employs a minimum number of required parameters. The transform-domain approach is briefly described and the concept of equivalent circular state diagrams is introduced and exploited to derive the generating function and resulting mean acquisition time for three particular cases of interest, the continuous/center Z search, the broken/center Z search, and the expanding window search. An optimization of the latter technique is performed whereby the number of partial windows which minimizes the mean acquisition time is determined. The numerical results satisfy certain intuitive predictions and provide useful design guidelines for such systems.
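A generic relation used in such generating-function analyses, stated here as a standard property of probability generating functions rather than a result taken from the paper, links the mean acquisition time (and its variance) to derivatives of the acquisition generating function at z = 1; the dwell time per search cell, denoted tau below, is an assumed normalization.

% Let U(z) = \sum_{n \ge 0} \Pr\{T_{\mathrm{acq}} = n\tau\}\, z^{n} be the generating
% function of the acquisition process obtained from the equivalent circular state
% diagram via Mason's formula, with \tau the dwell time per cell (assumed).
\begin{align}
  \overline{T}_{\mathrm{acq}} &= \tau \left.\frac{dU(z)}{dz}\right|_{z=1}, \\
  \operatorname{var}\!\left(T_{\mathrm{acq}}\right) &= \tau^{2}\left[
      \left.\frac{d^{2}U(z)}{dz^{2}}\right|_{z=1}
      + \left.\frac{dU(z)}{dz}\right|_{z=1}
      - \left(\left.\frac{dU(z)}{dz}\right|_{z=1}\right)^{2}\right].
\end{align}
% The choice of search strategy (continuous/center Z, broken/center Z, or expanding
% window) changes only the form of U(z), not these moment relations.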
NASA Astrophysics Data System (ADS)
Poddubny, Alexander N.; Sukhorukov, Andrey A.
2015-09-01
The practical development of quantum plasmonic circuits incorporating non-classical interference [1] and sources of entangled states calls for a versatile quantum theoretical framework which can fully describe the generation and detection of entangled photons and plasmons. However, the majority of presently used theoretical approaches are limited to toy models that assume lossless and nondispersive elements or include just a few resonant modes. Here, we present a rigorous Green function approach describing entangled photon-plasmon state generation through spontaneous wave mixing in realistic metal-dielectric nanostructures. Our approach is based on the local Huttner-Barnett quantization scheme [2], which enables problem formulation in terms of a Hermitian Hamiltonian where the losses and dispersion are fully encoded in the electromagnetic Green functions. Hence, the problem can be addressed by standard quantum mechanical perturbation theory, overcoming the mathematical difficulties associated with other quantization schemes. We derive explicit expressions with clear physical meaning for the spatially dependent two-photon detection probability, the single-photon detection probability and the single-photon density matrix. In the limiting case of low-loss nondispersive waveguides our approach reproduces the previous results [3,4]. Importantly, our technique is far more general and can quantitatively describe the generation and detection of spatially entangled photons in arbitrary metal-dielectric structures, taking into account actual losses and dispersion. This is essential for the design and optimization of plasmonic structures for the generation and control of quantum entangled states. [1] J.S. Fakonas, H. Lee, Y.A. Kelaita and H.A. Atwater, Nature Photonics 8, 317 (2014). [2] W. Vogel and D.-G. Welsch, Quantum Optics, Wiley (2006). [3] D.A. Antonosyan, A.S. Solntsev and A.A. Sukhorukov, Phys. Rev. A 90, 043845 (2014). [4] L.-G. Helt, J.E. Sipe and M.J. Steel, arXiv:1407.4219.
Next-generation libraries for robust RNA interference-based genome-wide screens
Kampmann, Martin; Horlbeck, Max A.; Chen, Yuwen; Tsai, Jordan C.; Bassik, Michael C.; Gilbert, Luke A.; Villalta, Jacqueline E.; Kwon, S. Chul; Chang, Hyeshik; Kim, V. Narry; Weissman, Jonathan S.
2015-01-01
Genetic screening based on loss-of-function phenotypes is a powerful discovery tool in biology. Although the recent development of clustered regularly interspaced short palindromic repeats (CRISPR)-based screening approaches in mammalian cell culture has enormous potential, RNA interference (RNAi)-based screening remains the method of choice in several biological contexts. We previously demonstrated that ultracomplex pooled short-hairpin RNA (shRNA) libraries can largely overcome the problem of RNAi off-target effects in genome-wide screens. Here, we systematically optimize several aspects of our shRNA library, including the promoter and microRNA context for shRNA expression, selection of guide strands, and features relevant for postscreen sample preparation for deep sequencing. We present next-generation high-complexity libraries targeting human and mouse protein-coding genes, which we grouped into 12 sublibraries based on biological function. A pilot screen suggests that our next-generation RNAi library performs comparably to current CRISPR interference (CRISPRi)-based approaches and can yield complementary results with high sensitivity and high specificity. PMID:26080438
Liu, Jian; Miller, William H
2011-03-14
We show the exact expression of the quantum mechanical time correlation function in the phase space formulation of quantum mechanics. The trajectory-based dynamics that conserves the quantum canonical distribution, namely the equilibrium Liouville dynamics (ELD) proposed in Paper I, is then used to approximately evaluate the exact expression. It gives exact thermal correlation functions (even of nonlinear operators, i.e., nonlinear functions of position or momentum operators) in the classical, high temperature, and harmonic limits. Various methods have been presented for the implementation of ELD. Numerical tests of the ELD approach in the Wigner or Husimi phase space have been made for a harmonic oscillator and two strongly anharmonic model problems; for each potential, autocorrelation functions of both linear and nonlinear operators have been calculated. The results suggest that ELD can be a potentially useful approach for describing quantum effects in complex systems in the condensed phase.
A machine learning approach for efficient uncertainty quantification using multiscale methods
NASA Astrophysics Data System (ADS)
Chan, Shing; Elsheikh, Ahmed H.
2018-02-01
Several multiscale methods account for sub-grid scale features using coarse scale basis functions. For example, in the Multiscale Finite Volume method the coarse scale basis functions are obtained by solving a set of local problems over dual-grid cells. We introduce a data-driven approach for the estimation of these coarse scale basis functions. Specifically, we employ a neural network predictor fitted using a set of solution samples from which it learns to generate subsequent basis functions at a lower computational cost than solving the local problems. The computational advantage of this approach is realized for uncertainty quantification tasks where a large number of realizations has to be evaluated. We attribute the ability to learn these basis functions to the modularity of the local problems and the redundancy of the permeability patches between samples. The proposed method is evaluated on elliptic problems yielding very promising results.
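A hedged sketch of the surrogate idea follows: a neural network is fitted to map local (log-)permeability patches to coarse-scale basis-function values so that, once trained, new realizations can be processed without solving the local problems. The patch size, network architecture, and the synthetic targets below are illustrative assumptions, not the paper's setup.

# Hedged sketch: learn a map from local permeability patches to coarse-scale
# basis-function values, in the spirit of replacing local problem solves.
# The data here are synthetic; in practice the targets come from solving
# the local (dual-grid) problems for a subset of realizations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, patch = 2000, 8               # 8x8 permeability patches (hypothetical size)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, patch * patch))

# Placeholder targets standing in for basis-function nodal values obtained
# from local solves; a real workflow would compute these with an MsFV solver.
y = np.tanh(X @ rng.standard_normal((patch * patch, patch * patch)) * 0.01)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=500, random_state=0)
surrogate.fit(np.log(X_tr), y_tr)        # log-permeability is a common input transform
print("held-out R^2:", surrogate.score(np.log(X_te), y_te))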
Nanotools for Neuroscience and Brain Activity Mapping
Alivisatos, A. Paul; Andrews, Anne M.; Boyden, Edward S.; Chun, Miyoung; Church, George M.; Deisseroth, Karl; Donoghue, John P.; Fraser, Scott E.; Lippincott-Schwartz, Jennifer; Looger, Loren L.; Masmanidis, Sotiris; McEuen, Paul L.; Nurmikko, Arto V.; Park, Hongkun; Peterka, Darcy S.; Reid, Clay; Roukes, Michael L.; Scherer, Axel; Schnitzer, Mark; Sejnowski, Terrence J.; Shepard, Kenneth L.; Tsao, Doris; Turrigiano, Gina; Weiss, Paul S.; Xu, Chris; Yuste, Rafael; Zhuang, Xiaowei
2013-01-01
Neuroscience is at a crossroads. Great effort is being invested into deciphering specific neural interactions and circuits. At the same time, there exist few general theories or principles that explain brain function. We attribute this disparity, in part, to limitations in current methodologies. Traditional neurophysiological approaches record the activities of one neuron or a few neurons at a time. Neurochemical approaches focus on single neurotransmitters. Yet, there is an increasing realization that neural circuits operate at emergent levels, where the interactions between hundreds or thousands of neurons, utilizing multiple chemical transmitters, generate functional states. Brains function at the nanoscale, so tools to study brains must ultimately operate at this scale, as well. Nanoscience and nanotechnology are poised to provide a rich toolkit of novel methods to explore brain function by enabling simultaneous measurement and manipulation of activity of thousands or even millions of neurons. We and others refer to this goal as the Brain Activity Mapping Project. In this Nano Focus, we discuss how recent developments in nanoscale analysis tools and in the design and synthesis of nanomaterials have generated optical, electrical, and chemical methods that can readily be adapted for use in neuroscience. These approaches represent exciting areas of technical development and research. Moreover, unique opportunities exist for nanoscientists, nanotechnologists, and other physical scientists and engineers to contribute to tackling the challenging problems involved in understanding the fundamentals of brain function. PMID:23514423
Integrating Microtissues in Nanofiber Scaffolds for Regenerative Nanomedicine
Keller, Laetitia; Wagner, Quentin; Offner, Damien; Eap, Sandy; Musset, Anne-Marie; Arruebo, Manuel; Kelm, Jens M.; Schwinté, Pascale; Benkirane-Jessel, Nadia
2015-01-01
A new generation of biomaterials focuses on smart materials incorporating cells. Here, we describe a novel generation of synthetic nanofibrous implants functionalized with living microtissues for regenerative nanomedicine. The strategy designed here enhances the effectiveness of therapeutic implants compared to current approaches used in the clinic today based on single cells added to the implant. PMID:28793604
Sverdlov, Serge; Thompson, Elizabeth A.
2013-01-01
In classical quantitative genetics, the correlation between the phenotypes of individuals with unknown genotypes and a known pedigree relationship is expressed in terms of probabilities of IBD states. In existing approaches to the inverse problem where genotypes are observed but pedigree relationships are not, dependence between phenotypes is either modeled as Bayesian uncertainty or mapped to an IBD model via inferred relatedness parameters. Neither approach yields a relationship between genotypic similarity and phenotypic similarity with a probabilistic interpretation corresponding to a generative model. We introduce a generative model for diploid allele effect based on the classic infinite allele mutation process. This approach motivates the concept of IBF (Identity by Function). The phenotypic covariance between two individuals given their diploid genotypes is expressed in terms of functional identity states. The IBF parameters define a genetic architecture for a trait without reference to specific alleles or population. Given full genome sequences, we treat a gene-scale functional region, rather than a SNP, as a QTL, modeling patterns of dominance for multiple alleles. Applications demonstrated by simulation include phenotype and effect prediction and association, and estimation of heritability and classical variance components. A simulation case study of the Missing Heritability problem illustrates a decomposition of heritability under the IBF framework into Explained and Unexplained components. PMID:23851163
Arakawa, Christopher K; Badeau, Barry A; Zheng, Ying; DeForest, Cole A
2017-10-01
A photodegradable material-based approach to generate endothelialized 3D vascular networks within cell-laden hydrogel biomaterials is introduced. Exploiting multiphoton lithography, microchannel networks spanning nearly all size scales of native human vasculature are readily generated with unprecedented user-defined 4D control. Intraluminal channel architectures of synthetic vessels are fully customizable, providing new opportunities for next-generation microfluidics and directed cell function. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Transfer Function Identification Using Orthogonal Fourier Transform Modeling Functions
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2013-01-01
A method for transfer function identification, including both model structure determination and parameter estimation, was developed and demonstrated. The approach uses orthogonal modeling functions generated from frequency domain data obtained by Fourier transformation of time series data. The method was applied to simulation data to identify continuous-time transfer function models and unsteady aerodynamic models. Model fit error, estimated model parameters, and the associated uncertainties were used to show the effectiveness of the method for identifying accurate transfer function models from noisy data.
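The following Python sketch illustrates the general pipeline of frequency-domain transfer-function identification: Fourier-transform the input/output records, form the empirical frequency response, and fit a low-order rational model by least squares. The orthogonal modeling functions of the paper are not reproduced; the first-order plant, sampling rate, and noise level are assumptions for illustration.

# Hedged sketch: Fourier-transform input/output records, form the empirical
# frequency response, then least-squares fit a low-order rational model.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
dt, n = 0.01, 4096
t = np.arange(n) * dt
u = rng.standard_normal(n)                         # broadband input
# "True" first-order system y' = (K*u - y)/tau, simulated by Euler steps.
K_true, tau_true = 2.0, 0.5
y = np.zeros(n)
for k in range(1, n):
    y[k] = y[k-1] + dt * (K_true * u[k-1] - y[k-1]) / tau_true
y += 0.01 * rng.standard_normal(n)                 # measurement noise

freqs = np.fft.rfftfreq(n, dt)
U, Y = np.fft.rfft(u), np.fft.rfft(y)
band = (freqs > 0.05) & (freqs < 5.0)              # keep a well-excited band
H_data = Y[band] / U[band]
w = 2.0 * np.pi * freqs[band]

def residual(p):
    K, tau = p
    H_model = K / (1.0 + 1j * w * tau)
    r = H_model - H_data
    return np.concatenate([r.real, r.imag])

fit = least_squares(residual, x0=[1.0, 1.0])
print("estimated K, tau:", fit.x)                  # should approach (2.0, 0.5)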
Trade-Off Analysis between Concerns Based on Aspect-Oriented Requirements Engineering
NASA Astrophysics Data System (ADS)
Laurito, Abelyn Methanie R.; Takada, Shingo
The identification of functional and non-functional concerns is an important activity during requirements analysis. However, there may be conflicts between the identified concerns, and they must be discovered and resolved through trade-off analysis. Aspect-Oriented Requirements Engineering (AORE) has trade-off analysis as one of its goals, but most AORE approaches do not actually offer support for trade-off analysis; they focus on describing concerns and generating their composition. This paper proposes an approach for trade-off analysis based on AORE using use cases and the Requirements Conflict Matrix (RCM) to represent compositions. RCM shows the positive or negative effect of non-functional concerns over use cases and other non-functional concerns. Our approach is implemented within a tool called E-UCEd (Extended Use Case Editor). We also show the results of evaluating our tool.
Exploring novel objective functions for simulating muscle coactivation in the neck.
Mortensen, J; Trkov, M; Merryweather, A
2018-04-11
Musculoskeletal modeling allows for analysis of individual muscles in various situations. However, current techniques to realistically simulate muscle response when significant amounts of intentional coactivation are required are inadequate. This would include stiffening the neck or spine through muscle coactivation in preparation for perturbations or impacts. Muscle coactivation has been modeled previously in the neck and spine using optimization techniques that seek to maximize joint stiffness by maximizing total muscle activation or muscle force. These approaches have not sought to replicate human response, but rather to explore the possible effects of active muscle. Coactivation remains a challenging feature to include in musculoskeletal models, and may be improved by extracting optimization objective functions from experimental data. However, the components of such an objective function must be known before fitting to experimental data. This study explores the effect of components in several objective functions, in order to recommend components to be used for fitting to experimental data. Four novel approaches to modeling coactivation through optimization techniques are presented, two of which produce greater levels of stiffness than previous techniques. Simulations were performed using OpenSim and MATLAB cooperatively. Results show that maximizing the moment generated by a particular muscle appears analogous to maximizing joint stiffness. The approach of optimizing for the maximum moment generated by individual muscles may be a good candidate for developing objective functions that accurately simulate muscle coactivation in complex joints. This new approach will be the focus of future studies with human subjects. Copyright © 2018 Elsevier Ltd. All rights reserved.
Chen, Z; Lönnberg, T; Lahesmaa, R
2013-08-01
Current knowledge of helper T cell differentiation largely relies on data generated from mouse studies. To develop therapeutic strategies for combating human diseases, understanding the molecular mechanisms by which human naïve T cells differentiate into functionally distinct T helper (Th) subsets, as well as studying human differentiated Th cell subsets, is particularly valuable. Systems biology approaches provide a holistic view of the processes of T helper differentiation, enable the discovery of new factors and pathways involved, and generate new hypotheses to be tested to improve our understanding of human Th cell differentiation and immune-mediated diseases. Here, we summarize studies where high-throughput systems biology approaches have been applied to human primary T cells. These studies reveal new factors and signalling pathways influencing T cell differentiation towards distinct subsets, important for immune regulation. Such information provides new insights into T cell biology and into targeting the immune system for therapeutic interventions. © 2013 John Wiley & Sons Ltd.
Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs
NASA Astrophysics Data System (ADS)
Purba, J. H.
2018-02-01
Initiating events of a nuclear power plant being evaluated need to be identified prior to applying probabilistic safety assessment to that plant. Various types of master logic diagrams (MLDs) have been proposed for searching for initiating events of the next generation of nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis used for developing them. This study proposes another type of MLD approach to find high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps starting from the top event, representing the final objective of the safety functions, to the basic event, representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirm that the proposed MLD is feasible for finding HTGR initiating events.
Understanding the intentional acoustic behavior of humpback whales: a production-based approach.
Cazau, Dorian; Adam, Olivier; Laitman, Jeffrey T; Reidenberg, Joy S
2013-09-01
Following a production-based approach, this paper deals with the acoustic behavior of humpback whales. This approach investigates various physical factors, which are either internal (e.g., physiological mechanisms) or external (e.g., environmental constraints) to the respiratory tractus of the whale, for their implications in sound production. This paper aims to describe a functional scenario of this tractus for the generation of vocal sounds. To do so, a division of this tractus into three different configurations is proposed, based on the air recirculation process which determines air sources and laryngeal valves. Then, assuming a vocal function (in sound generation or modification) for several specific anatomical components, an acoustic characterization of each of these configurations is proposed to link different spectral features, namely, fundamental frequencies and formant structures, to specific vocal production mechanisms. A discussion around the question of whether the whale is able to fully exploit the acoustic potential of its respiratory tractus is eventually provided.
Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.
2014-01-01
The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems; it uses high-level abstractions to capture functional requirements in an executable manner and automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging and very promising application area for embedded systems. However, there is a lack of tools in this area that would allow an application developer to model a WSN application by using high-level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083
dftools: Distribution function fitting
NASA Astrophysics Data System (ADS)
Obreschkow, Danail
2018-05-01
dftools, written in R, finds the most likely P parameters of a D-dimensional distribution function (DF) generating N objects, where each object is specified by D observables with measurement uncertainties. For instance, if the objects are galaxies, it can fit a mass function (D=1), a mass-size distribution (D=2) or the mass-spin-morphology distribution (D=3). Unlike most common fitting approaches, this method accurately accounts for measurement uncertainties and complex selection functions.
Generation of Synthetic Copolymer Libraries by Combinatorial Assembly on Nucleic Acid Templates.
Kong, Dehui; Yeung, Wayland; Hili, Ryan
2016-07-11
Recent advances in nucleic acid-templated copolymerization have expanded the scope of sequence-controlled synthetic copolymers beyond the molecular architectures witnessed in nature. This has enabled the power of molecular evolution to be applied to synthetic copolymer libraries to evolve molecular function ranging from molecular recognition to catalysis. This Review seeks to summarize different approaches available to generate sequence-defined monodispersed synthetic copolymer libraries using nucleic acid-templated polymerization. Key concepts and principles governing nucleic acid-templated polymerization, as well as the fidelity of various copolymerization technologies, will be described. The Review will focus on methods that enable the combinatorial generation of copolymer libraries and their molecular evolution for desired function.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kidon, Lyran; The Sackler Center for Computational Molecular and Materials Science, Tel Aviv University, Tel Aviv 69978; Wilner, Eli Y.
2015-12-21
The generalized quantum master equation provides a powerful tool to describe the dynamics in quantum impurity models driven away from equilibrium. Two complementary approaches, one based on Nakajima–Zwanzig–Mori time-convolution (TC) and the other on the Tokuyama–Mori time-convolutionless (TCL) formulations, provide a starting point to describe the time-evolution of the reduced density matrix. A key in both approaches is to obtain the so-called "memory kernel" or "generator," going beyond second or fourth order perturbation techniques. While numerically converged techniques are available for the TC memory kernel, the canonical approach to obtain the TCL generator is based on inverting a super-operator in the full Hilbert space, which is difficult to perform and thus nearly all applications of the TCL approach rely on a perturbative scheme of some sort. Here, the TCL generator is expressed using a reduced system propagator which can be obtained from system observables alone and requires the calculation of super-operators and their inverse in the reduced Hilbert space rather than the full one. This makes the formulation amenable to quantum impurity solvers or to diagrammatic techniques, such as the nonequilibrium Green's function. We implement the TCL approach for the resonant level model driven away from equilibrium and compare the time scales for the decay of the generator with that of the memory kernel in the TC approach. Furthermore, the effects of temperature, source-drain bias, and gate potential on the TCL/TC generators are discussed.
NASA Astrophysics Data System (ADS)
Smith, R. C.; Collins, G. S.; Hill, J.; Piggott, M. D.; Mouradian, S. L.
2015-12-01
Numerical modelling informs risk assessment of tsunami generated by submarine slides; however, for large-scale slides modelling can be complex and computationally challenging. Many previous numerical studies have approximated slides as rigid blocks that moved according to prescribed motion. However, wave characteristics are strongly dependent on the motion of the slide and previous work has recommended that more accurate representation of slide dynamics is needed. We have used the finite-element, adaptive-mesh CFD model Fluidity, to perform multi-material simulations of deformable submarine slide-generated waves at real world scales for a 2D scenario in the Gulf of Mexico. Our high-resolution approach represents slide dynamics with good accuracy, compared to other numerical simulations of this scenario, but precludes tracking of wave propagation over large distances. To enable efficient modelling of further propagation of the waves, we investigate an approach to extract information about the slide evolution from our multi-material simulations in order to drive a single-layer wave propagation model, also using Fluidity, which is much less computationally expensive. The extracted submarine slide geometry and position as a function of time are parameterised using simple polynomial functions. The polynomial functions are used to inform a prescribed velocity boundary condition in a single-layer simulation, mimicking the effect the submarine slide motion has on the water column. The approach is verified by successful comparison of wave generation in the single-layer model with that recorded in the multi-material, multi-layer simulations. We then extend this approach to 3D for further validation of this methodology (using the Gulf of Mexico scenario proposed by Horrillo et al., 2013) and to consider the effect of lateral spreading. This methodology is then used to simulate a series of hypothetical submarine slide events in the Arctic Ocean (based on evidence of historic slides) and examine the hazard posed to the UK coast.
A distributed control approach for power and energy management in a notional shipboard power system
NASA Astrophysics Data System (ADS)
Shen, Qunying
The main goal of this thesis is to present a power control module (PCON) based approach for power and energy management and to examine its control capability in shipboard power system (SPS). The proposed control scheme is implemented in a notional medium voltage direct current (MVDC) integrated power system (IPS) for electric ship. To realize the control functions such as ship mode selection, generator launch schedule, blackout monitoring, and fault ride-through, a PCON based distributed power and energy management system (PEMS) is developed. The control scheme is proposed as two-layer hierarchical architecture with system level on the top as the supervisory control and zonal level on the bottom as the decentralized control, which is based on the zonal distribution characteristic of the notional MVDC IPS that was proposed as one of the approaches for Next Generation Integrated Power System (NGIPS) by Norbert Doerry. Several types of modules with different functionalities are used to derive the control scheme in detail for the notional MVDC IPS. Those modules include the power generation module (PGM) that controls the function of generators, the power conversion module (PCM) that controls the functions of DC/DC or DC/AC converters, etc. Among them, the power control module (PCON) plays a critical role in the PEMS. It is the core of the control process. PCONs in the PEMS interact with all the other modules, such as power propulsion module (PPM), energy storage module (ESM), load shedding module (LSHED), and human machine interface (HMI) to realize the control algorithm in PEMS. The proposed control scheme is implemented in real time using the real time digital simulator (RTDS) to verify its validity. To achieve this, a system level energy storage module (SESM) and a zonal level energy storage module (ZESM) are developed in RTDS to cooperate with PCONs to realize the control functionalities. In addition, a load shedding module which takes into account the reliability of power supply (in terms of quality of service) is developed. This module can supply uninterruptible power to the mission critical loads. In addition, a multi-agent system (MAS) based framework is proposed to implement the PCON based PEMS through a hardware setup that is composed of MAMBA boards and FPGA interface. Agents are implemented using Java Agent DEvelopment Framework (JADE). Various test scenarios were tested to validate the approach.
Application of genetic algorithm in integrated setup planning and operation sequencing
NASA Astrophysics Data System (ADS)
Kafashi, Sajad; Shakeri, Mohsen
2011-01-01
Process planning is an essential component for linking design and the manufacturing process. Setup planning and operation sequencing are two main tasks in process planning. Much research has solved these two problems separately. Considering the fact that the two functions are complementary, it is necessary to integrate them more tightly so that the performance of a manufacturing system can be improved economically and competitively. This paper presents a generative system and a genetic algorithm (GA) approach to process planning of a given part. The proposed approach and optimization methodology analyse the TAD (tool approach direction), tolerance relations between features, and feature precedence relations to generate all possible setups and operations using a workshop resource database. Based on these technological constraints, the GA approach, which adopts a feature-based representation, optimizes the setup plan and the sequence of operations using cost indices. A case study shows that the developed system can generate satisfactory results in optimizing setup planning and operation sequencing simultaneously under feasible conditions.
Boundary-Layer Receptivity and Integrated Transition Prediction
NASA Technical Reports Server (NTRS)
Chang, Chau-Lyan; Choudhari, Meelan
2005-01-01
The adjoint parabolized stability equations (PSE) formulation is used to calculate the boundary-layer receptivity to localized surface roughness and suction for compressible boundary layers. Receptivity efficiency functions predicted by the adjoint PSE approach agree well with results based on other nonparallel methods, including linearized Navier-Stokes equations, for both Tollmien-Schlichting waves and crossflow instability in swept-wing boundary layers. The receptivity efficiency function can be regarded as the Green's function for the disturbance amplitude evolution in a nonparallel (growing) boundary layer. Given the Fourier-transformed geometry factor distribution along the chordwise direction, the linear disturbance amplitude evolution for a finite-size, distributed nonuniformity can be computed by evaluating the integral effects of both disturbance generation and linear amplification. The synergistic approach via the linear adjoint PSE for receptivity and the nonlinear PSE for disturbance evolution downstream of the leading edge forms the basis for an integrated transition prediction tool. Eventually, such physics-based, high-fidelity prediction methods could simulate the transition process from disturbance generation through nonlinear breakdown in a holistic manner.
Ghalyan, Najah F; Miller, David J; Ray, Asok
2018-06-12
Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition via the minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather of the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and a failure application in a polycrystalline alloy material.
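For orientation only, the sketch below symbolizes a chaotic time series by iteratively assigning delay vectors to their nearest reconstruction values and updating those values, a Lloyd-style descent that mirrors the flavour of iterative nearest-neighbour symbol assignment. It is not the Hirata et al. (2004) discrepancy objective nor the locally optimal algorithm developed in the paper; the logistic map, embedding dimension, and alphabet size are assumptions.

# Simplified sketch: symbolize a chaotic time series by assigning delay vectors
# to reconstruction values with iterative nearest-neighbour updates.
# Illustration only; not the paper's objective or algorithm.
import numpy as np

rng = np.random.default_rng(3)
# Logistic map time series.
x = np.empty(5000)
x[0] = 0.4
for k in range(1, len(x)):
    x[k] = 4.0 * x[k-1] * (1.0 - x[k-1])

m, n_symbols = 3, 2                                 # embedding dimension, alphabet size
delays = np.column_stack([x[i:len(x) - m + 1 + i] for i in range(m)])

# Initialize reconstruction values (cluster centres) from random delay vectors.
centres = delays[rng.choice(len(delays), n_symbols, replace=False)]
symbols = np.zeros(len(delays), dtype=int)

for _ in range(50):
    # Nearest-neighbour symbol assignment.
    d = np.linalg.norm(delays[:, None, :] - centres[None, :, :], axis=2)
    new_symbols = d.argmin(axis=1)
    if np.array_equal(new_symbols, symbols):
        break
    symbols = new_symbols
    # Update reconstruction values to the mean of their assigned vectors.
    for s in range(n_symbols):
        if np.any(symbols == s):
            centres[s] = delays[symbols == s].mean(axis=0)

print("symbol sequence head:", symbols[:20])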
Membership generation using multilayer neural network
NASA Technical Reports Server (NTRS)
Kim, Jaeseok
1992-01-01
There has been intensive research in neural network applications to pattern recognition problems. Particularly, the back-propagation network has attracted many researchers because of its outstanding performance in pattern recognition applications. In this section, we describe a new method to generate membership functions from training data using a multilayer neural network. The basic idea behind the approach is as follows. The output values of a sigmoid activation function of a neuron bear remarkable resemblance to membership values. Therefore, we can regard the sigmoid activation values as the membership values in fuzzy set theory. Thus, in order to generate class membership values, we first train a suitable multilayer network using a training algorithm such as the back-propagation algorithm. After the training procedure converges, the resulting network can be treated as a membership generation network, where the inputs are feature values and the outputs are membership values in the different classes. This method allows fairly complex membership functions to be generated because the network is highly nonlinear in general. Also, it is to be noted that the membership functions are generated from a classification point of view. For pattern recognition applications, this is highly desirable, although the membership values may not be indicative of the degree of typicality of a feature value in a particular class.
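A minimal sketch of the idea, assuming a toy one-dimensional two-class problem: a single-hidden-layer network with sigmoid outputs is trained by plain gradient descent on squared error, and the trained output activations are then read directly as class-membership values.

# Minimal sketch: one-hidden-layer network with sigmoid outputs; after training,
# the output activations are interpreted as class-membership values.
import numpy as np

rng = np.random.default_rng(4)
# Toy 1-D feature with two overlapping classes.
X = np.concatenate([rng.normal(-1.0, 0.7, 200), rng.normal(1.0, 0.7, 200)])[:, None]
T = np.zeros((400, 2))
T[:200, 0] = 1.0                                   # class-0 targets
T[200:, 1] = 1.0                                   # class-1 targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = 0.5 * rng.standard_normal((1, 8)), np.zeros(8)
W2, b2 = 0.5 * rng.standard_normal((8, 2)), np.zeros(2)
lr = 0.1

for epoch in range(2000):
    H = sigmoid(X @ W1 + b1)                       # hidden activations
    Y = sigmoid(H @ W2 + b2)                       # outputs ~ membership values
    dY = (Y - T) * Y * (1.0 - Y)                   # squared-error backprop
    dH = (dY @ W2.T) * H * (1.0 - H)
    W2 -= lr * H.T @ dY / len(X); b2 -= lr * dY.mean(axis=0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)

# Membership values for a few feature values spanning the overlap region.
probe = np.array([[-2.0], [0.0], [2.0]])
print(sigmoid(sigmoid(probe @ W1 + b1) @ W2 + b2))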
Engineering Visual Arrestin-1 with Special Functional Characteristics*
Vishnivetskiy, Sergey A.; Chen, Qiuyan; Palazzo, Maria C.; Brooks, Evan K.; Altenbach, Christian; Iverson, Tina M.; Hubbell, Wayne L.; Gurevich, Vsevolod V.
2013-01-01
Arrestin-1 preferentially binds active phosphorylated rhodopsin. Previously, a mutant with enhanced binding to unphosphorylated active rhodopsin (Rh*) was shown to partially compensate for lack of rhodopsin phosphorylation in vivo. Here we showed that reengineering of the receptor binding surface of arrestin-1 further improves the binding to Rh* while preserving protein stability. In mammals, arrestin-1 readily self-associates at physiological concentrations. The biological role of this phenomenon can only be elucidated by replacing wild type arrestin-1 in living animals with a non-oligomerizing mutant retaining all other functions. We demonstrate that constitutively monomeric forms of arrestin-1 are sufficiently stable for in vivo expression. We also tested the idea that individual functions of arrestin-1 can be independently manipulated to generate mutants with the desired combinations of functional characteristics. Here we showed that this approach is feasible; stable forms of arrestin-1 with high Rh* binding can be generated with or without the ability to self-associate. These novel molecular tools open the possibility of testing the biological role of arrestin-1 self-association and pave the way to elucidating the full potential of the compensational approach to gene therapy of gain-of-function receptor mutations. PMID:23250748
Large deviation function for a driven underdamped particle in a periodic potential
NASA Astrophysics Data System (ADS)
Fischer, Lukas P.; Pietzonka, Patrick; Seifert, Udo
2018-02-01
Employing large deviation theory, we explore current fluctuations of underdamped Brownian motion for the paradigmatic example of a single particle in a one-dimensional periodic potential. Two different approaches to the large deviation function of the particle current are presented. First, we derive an explicit expression for the large deviation functional of the empirical phase space density, which replaces the level 2.5 functional used for overdamped dynamics. Using this approach, we obtain several bounds on the large deviation function of the particle current. We compare these to bounds for overdamped dynamics that have recently been derived, motivated by the thermodynamic uncertainty relation. Second, we provide a method to calculate the large deviation function via the cumulant generating function. We use this method to assess the tightness of the bounds in a numerical case study for a cosine potential.
Computing exact bundle compliance control charts via probability generating functions.
Chen, Binchao; Matis, Timothy; Benneyan, James
2016-06-01
Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
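The core of the probability-generating-function computation can be sketched in a few lines of Python: the PGF of each bundle element is (1 - p_i) + p_i*z, and multiplying these PGFs (i.e., convolving their coefficient vectors) yields the exact distribution of the number of compliant elements. The compliance rates below are hypothetical.

# Exact distribution of the number of compliant bundle elements via probability
# generating functions: convolve the per-element PGF coefficient vectors,
# then read off exact probabilities and tails.
import numpy as np

p = [0.95, 0.90, 0.85, 0.92, 0.88]        # hypothetical per-element compliance rates

pmf = np.array([1.0])                     # PGF coefficients of the empty product
for pi in p:
    pmf = np.convolve(pmf, [1.0 - pi, pi])

# pmf[k] is now P(exactly k of the elements are compliant).
print("P(all compliant)    =", pmf[-1])
print("P(at most 3 comply) =", pmf[:4].sum())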
ERIC Educational Resources Information Center
Kurubacak, Gulsun
2006-01-01
The main purpose of this article is to generate a functional model of evaluation with which EMSs can empower online communications characterized by imperative decision-making tasks. The evaluation process of EMSs must merge the multicultural strategies of the theory of Media Richness and the ethical concerns of the critical approach. Media…
ERIC Educational Resources Information Center
Kielty, Michele L.; Gilligan, Tammy D.; Staton, A. Renee
2017-01-01
With any intervention program, involving all stakeholders in a joint effort toward implementation is most likely to lead to success. Whole-school approaches that involve school personnel, students, families, and local communities have been associated with positive, sustained outcomes. For mindfulness training programs to generate the most…
ERIC Educational Resources Information Center
Dhingra, Sunita; Angrish, Chetna
2011-01-01
Qualitative organic analysis of an unknown compound is an integral part of the university chemistry laboratory curriculum. This type of training is essential as students learn to approach a problem systematically and to interpret the results logically. However, considerable quantities of waste are generated by using conventional methods of…
Case Markers in Mongolian: A Means for Encoding Null Constituents in Noun Phrase and Relative Clause
ERIC Educational Resources Information Center
Otgonsuren, Tseden
2017-01-01
This paper focuses on the capacity of the case markers in the Mongolian language, as a relative element, to generate any finite noun phrase or relative clause based on their syntactic function or relationship. In Mongolian, there are two different approaches to generate noun phrases: parataxis and hypotaxis. According to my early observation, if…
Quantum-kinetic theory of photocurrent generation via direct and phonon-mediated optical transitions
NASA Astrophysics Data System (ADS)
Aeberhard, U.
2011-07-01
A quantum kinetic theory of direct and phonon-mediated indirect optical transitions is developed within the framework of the nonequilibrium Green’s function formalism. After validation against the standard Fermi golden rule approach in the bulk case, it is used in the simulation of photocurrent generation in ultrathin crystalline silicon p-i-n junction devices.
Fast computation of the electrolyte-concentration transfer function of a lithium-ion cell model
NASA Astrophysics Data System (ADS)
Rodríguez, Albert; Plett, Gregory L.; Trimboli, M. Scott
2017-08-01
One approach to creating physics-based reduced-order models (ROMs) of battery-cell dynamics requires first generating linearized Laplace-domain transfer functions of all cell internal electrochemical variables of interest. Then, the resulting infinite-dimensional transfer functions can be reduced by various means in order to find an approximate low-dimensional model. These methods include Padé approximation or the Discrete-Time Realization algorithm. In a previous article, Lee and colleagues developed a transfer function of the electrolyte concentration for a porous-electrode pseudo-two-dimensional lithium-ion cell model. Their approach used separation of variables and Sturm-Liouville theory to compute an infinite-series solution to the transfer function, which they then truncated to a finite number of terms for reasons of practicality. Here, we instead use a variation-of-parameters approach to arrive at a different representation of the identical solution that does not require a series expansion. The primary benefits of the new approach are speed of computation of the transfer function and the removal of the requirement to approximate the transfer function by truncating the number of terms evaluated. Results show that the speedup of the new method can be more than 3800.
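The Padé reduction step mentioned above can be illustrated with SciPy on a generic diffusion-type transcendental transfer function, H(s) = tanh(sqrt(s))/sqrt(s); this is an assumed stand-in, not the paper's electrolyte-concentration transfer function, and the order of the approximant is chosen arbitrarily.

# Hedged sketch of a Pade reduction of a transcendental transfer function.
# Low-order rational approximations like this are what make a ROM finite-dimensional.
import numpy as np
from scipy.interpolate import pade

# Ascending Taylor coefficients of tanh(sqrt(s))/sqrt(s) in powers of s.
taylor = [1.0, -1.0/3.0, 2.0/15.0, -17.0/315.0, 62.0/2835.0]
p_num, p_den = pade(taylor, 2)            # [2/2] Pade approximant

s = 1j * np.logspace(-2, 1, 5)            # a few points on the imaginary axis
H_exact = np.tanh(np.sqrt(s)) / np.sqrt(s)
H_pade = p_num(s) / p_den(s)
print("max |H_exact - H_pade| on test points:", np.max(np.abs(H_exact - H_pade)))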
GPU based contouring method on grid DEM data
NASA Astrophysics Data System (ADS)
Tan, Liheng; Wan, Gang; Li, Feng; Chen, Xiaohui; Du, Wenlong
2017-08-01
This paper presents a novel method to generate contour lines from grid DEM data based on the programmable GPU pipeline. Previous contouring approaches often use the CPU to construct a finite element mesh from the raw DEM data and then extract contour segments from the elements. They also need a tracing or sorting strategy to generate the final continuous contours. These approaches can be heavily CPU-intensive and time-consuming, and the generated contours can be unsmooth if the raw data are sparsely distributed. Unlike the CPU approaches, we employ the GPU's vertex shader to generate a triangular mesh with arbitrary user-defined density, in which the height of each vertex is calculated through a third-order Cardinal spline function. Then, in the same frame, segments are extracted from the triangles by the geometry shader and transferred to the CPU side, with an internal order, in the GPU's transform feedback stage. Finally, we propose a "Grid Sorting" algorithm to achieve continuous contour lines by traversing the segments only once. Our method makes use of multiple stages of the GPU pipeline for computation, can generate smooth contour lines, and is significantly faster than the previous CPU approaches. The algorithm can be easily implemented with the OpenGL 3.3 API or higher on consumer-level PCs.
Scalable, full-colour and controllable chromotropic plasmonic printing
Xue, Jiancai; Zhou, Zhang-Kai; Wei, Zhiqiang; Su, Rongbin; Lai, Juan; Li, Juntao; Li, Chao; Zhang, Tengwei; Wang, Xue-Hua
2015-01-01
Plasmonic colour printing has drawn wide attention as a promising candidate for the next-generation colour-printing technology. However, an efficient approach to realize full colour and scalable fabrication is still lacking, which prevents plasmonic colour printing from practical applications. Here we present a scalable and full-colour plasmonic printing approach by combining conjugate twin-phase modulation with a plasmonic broadband absorber. More importantly, our approach also demonstrates controllable chromotropic capability, that is, the ability of reversible colour transformations. This chromotropic capability affords enormous potential in building functionalized prints for anticounterfeiting, special labels, and high-density data encryption storage. With such excellent performance in functional colour applications, this colour-printing approach could pave the way for plasmonic colour printing in real-world commercial utilization. PMID:26567803
Yeast Two-Hybrid: State of the Art
Beyaert, Rudi
1999-01-01
Genome projects are approaching completion and are saturating sequence databases. This paper discusses the role of the two-hybrid system as a generator of hypotheses. Apart from this rather exhaustive, financially and labour intensive procedure, more refined functional studies can be undertaken. Indeed, by making hybrids of two-hybrid systems, customised approaches can be developed in order to attack specific function-related problems. For example, one could set-up a "differential" screen by combining a forward and a reverse approach in a three-hybrid set-up. Another very interesting project is the use of peptide libraries in two-hybrid approaches. This could enable the identification of peptides with very high specificity comparable to "real" antibodies. With the technology available, the only limitation is imagination. PMID:12734586
Conceptual design optimization study
NASA Technical Reports Server (NTRS)
Hollowell, S. J.; Beeman, E. R., II; Hiyama, R. M.
1990-01-01
The feasibility of applying multilevel functional decomposition and optimization techniques to conceptual design of advanced fighter aircraft was investigated. Applying the functional decomposition techniques to the conceptual design phase appears to be feasible. The initial implementation of the modified design process will optimize wing design variables. A hybrid approach, combining functional decomposition techniques for generation of aerodynamic and mass properties linear sensitivity derivatives with existing techniques for sizing mission performance and optimization, is proposed.
Jayasinghe, Suwan N
2013-04-21
Recent years have seen interest in approaches for directly generating fibers and scaffolds, following a rising trend for their exploration in the health sciences. In this review the author wishes to briefly highlight the many approaches explored to date for generating such structures, while underlining their advantages and disadvantages and their contribution in particular to the biomedical sciences. Such structures have been demonstrated as having implications in both the laboratory and the clinic, as they mimic the native extracellular matrix. Interestingly, the only materials investigated until very recently for generating fibrous architectures employed either natural or synthetic polymers, with or without the addition of functional molecule(s). Arguably, although such constructs have been demonstrated to have many applications, they lack the one component most important for directly reconstructing a three-dimensional functional tissue, namely living cells. Therefore, recent findings have demonstrated the ability to directly form cell-laden fibers and scaffolds in useful quantities from which functional three-dimensional living tissues can be conceived. These recent developments have far-reaching ramifications for many areas of research and development, a few of which range from tissue engineering and regenerative medicine, to a novel approach for analyzing cell behavior and function in real time in three dimensions, to the advanced controlled and targeted delivery of experimental and/or medical cells and/or genes for localized treatment. At present these developments have passed all in vitro and in vivo mouse-model-based challenge trials and are now spearheading their journey towards initiating human clinical trials.
A functional perspective on social marketing: insights from Israel's bicycle helmet campaign.
Ressler, W H; Toledo, E
1997-01-01
This article examines the functional approach to attitudes for its potential contribution to improving models of attitude-behavior consistency and to demonstrate its potential application to social marketing. To this end, a study of children's attitudes toward bicycle helmets is reported on and its results examined. The study was undertaken to plan Israel's first-ever media campaign to encourage the use of helmets by children. Responses of the 783 Israeli children (ages 7 to 14 years) who participated in the study are analyzed to test the hypothesis generated by this application of functional theory--that children's attitudes toward wearing bicycle helmets serve primarily an expressive function. The results suggest cautious support for the functional hypothesis. In conclusion, possible extensions of this approach to other areas of social marketing are discussed.
Custom implant design for large cranial defects.
Marreiros, Filipe M M; Heuzé, Y; Verius, M; Unterhofer, C; Freysinger, W; Recheis, W
2016-12-01
The aim of this work was to introduce a computer-aided design (CAD) tool that enables the design of large skull defect (>100 [Formula: see text]) implants. Functional and aesthetically correct custom implants are extremely important for patients with large cranial defects. For these cases, preoperative fabrication of implants is recommended to avoid problems of donor site morbidity and the sufficiency and quality of donor material. Finally, crafting the correct shape is a non-trivial task that becomes increasingly complicated with defect size. We present a CAD tool to design such implants for the neurocranium. A combination of geometric morphometrics and radial basis functions, namely thin-plate splines, allows semiautomatic implant generation. The method uses symmetry and the best-fitting shape to estimate missing data directly within the radiologic volume data. In addition, this approach delivers correct implant fitting via a boundary-fitting approach. The method generates a smooth implant surface, free of sharp edges, that follows the main contours of the boundary, enabling accurate implant placement in the defect site intraoperatively. The present approach is evaluated and compared to existing methods. A mean of 89.29% (range 72.64-100%) of missing landmarks were estimated with an error of less than or equal to 1 mm. In conclusion, the results show that our CAD tool can generate patient-specific implants with high accuracy.
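As a toy analogue of the thin-plate-spline step, the Python sketch below fits a TPS through height samples surrounding a circular defect and evaluates it over the defect interior. It is a height-field simplification under assumed geometry, not the paper's full 3D geometric-morphometrics pipeline.

# Toy sketch: fit a thin-plate spline to surface samples around a defect and
# evaluate the estimated implant surface inside the defect region.
import numpy as np
from scipy.interpolate import Rbf

rng = np.random.default_rng(5)
# Hypothetical "known" surface samples surrounding a circular defect.
theta = np.linspace(0.0, 2.0 * np.pi, 80, endpoint=False)
r = rng.uniform(1.0, 1.6, size=theta.size)
x, y = r * np.cos(theta), r * np.sin(theta)
z = 1.0 - 0.2 * (x**2 + y**2) + 0.01 * rng.standard_normal(x.size)  # dome-like surface

tps = Rbf(x, y, z, function="thin_plate", smooth=1e-3)

# Evaluate the implant surface over the defect interior (r < 0.9).
gx, gy = np.meshgrid(np.linspace(-0.9, 0.9, 40), np.linspace(-0.9, 0.9, 40))
inside = gx**2 + gy**2 < 0.81
implant_z = tps(gx[inside], gy[inside])
print("estimated implant surface points:", implant_z.shape)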
Protein Structure and Function Prediction Using I-TASSER
Yang, Jianyi; Zhang, Yang
2016-01-01
I-TASSER is a hierarchical protocol for automated protein structure prediction and structure-based function annotation. Starting from the amino acid sequence of target proteins, I-TASSER first generates full-length atomic structural models from multiple threading alignments and iterative structural assembly simulations followed by atomic-level structure refinement. The biological functions of the protein, including ligand-binding sites, enzyme commission number, and gene ontology terms, are then inferred from known protein function databases based on sequence and structure profile comparisons. I-TASSER is freely available as both an on-line server and a stand-alone package. This unit describes how to use the I-TASSER protocol to generate structure and function prediction and how to interpret the prediction results, as well as alternative approaches for further improving the I-TASSER modeling quality for distant-homologous and multi-domain protein targets. PMID:26678386
Simple area determination of strongly overlapping ion mobility peaks.
Borovcová, Lucie; Hermannová, Martina; Pauk, Volodymyr; Šimek, Matěj; Havlíček, Vladimír; Lemr, Karel
2017-08-15
Coupling of ion mobility with mass spectrometry has brought new frontiers in the separation and quantitation of a wide range of isobaric/isomeric compounds. Ion mobility spectrometry may separate ions possessing the identical molecular formula but having different molecular shapes. The separation space in most commercially available instruments is limited, and the mobility resolving power rarely exceeds one hundred. From this perspective, new approaches allowing individual compound signals to be extracted out of a more complex mixture are needed. In this work we present a new simple analytical approach based on fitting of arrival time distribution (ATD) profiles by Gaussian functions and generating ATD functions. These ATD functions describe well even distorted ion mobility peaks of individual compounds and allow their peaks to be extracted from mobilograms of mixtures. Contrary to classical integration, our approach works well with irregular overlapping peaks. Using mobilograms of standards to generate ATD functions, poorly separated compounds, e.g. isomers, with identical mass spectra, representing a hard-to-solve task for various chemometric methods, can be easily distinguished by our procedure. Alternatively, ATD functions can be obtained from ATD profiles of ions unique to individual mixture components (if such ions exist), and mobilograms of standards are then not required. On a set of hyaluronan-derived oligosaccharides we demonstrated excellent ATD repeatability enabling the resolution of binary mixtures, including mixtures with a minor-component level of about 5%. Ion mobility quantitative data for isomers were confirmed by high performance liquid chromatography. Copyright © 2017 Elsevier B.V. All rights reserved.
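The core fitting step can be sketched as follows: model a synthetic arrival-time distribution of two strongly overlapping species as a sum of Gaussians, fit it with SciPy's curve_fit, and report the fractional areas of the components. The drift times, widths, and noise level are assumptions for illustration.

# Sketch of the core fitting step: two overlapping Gaussian ATD components
# fitted with curve_fit, then fractional areas reported.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)
t = np.linspace(5.0, 9.0, 400)                       # arrival time (ms), hypothetical

def gauss(t, a, mu, sig):
    return a * np.exp(-0.5 * ((t - mu) / sig) ** 2)

def two_gauss(t, a1, mu1, s1, a2, mu2, s2):
    return gauss(t, a1, mu1, s1) + gauss(t, a2, mu2, s2)

# Synthetic overlapping ATD (two isomers 0.3 ms apart) with noise.
y = two_gauss(t, 1.0, 6.8, 0.18, 0.4, 7.1, 0.18) + 0.01 * rng.standard_normal(t.size)

p0 = [1.0, 6.7, 0.2, 0.5, 7.2, 0.2]                  # rough initial guesses
popt, _ = curve_fit(two_gauss, t, y, p0=p0)

areas = np.array([popt[0] * popt[2], popt[3] * popt[5]]) * np.sqrt(2.0 * np.pi)
print("component fractions:", areas / areas.sum())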
NASA Astrophysics Data System (ADS)
Liu, Zhangjun; Liu, Zenghui
2018-06-01
This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured by just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), it becomes possible to implement dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced response of the structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
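For reference, the classical spectral-representation step that the hybrid approach builds on can be sketched as a sum of cosines with random phases drawn from a target one-sided power spectrum. The random-function constraint that collapses the many phase variables to two elementary random variables is not reproduced here; the spectrum below is an assumed example, not the turbulence wind spectrum of the paper.

# Classical spectral-representation sketch: synthesize a zero-mean stationary
# process from a one-sided PSD as a sum of cosines with random phases.
import numpy as np

rng = np.random.default_rng(7)
dw, n_w = 0.05, 400                                  # frequency step and count (rad/s)
w = dw * (np.arange(n_w) + 0.5)
S = 1.0 / (1.0 + w**2)                               # hypothetical one-sided PSD
amp = np.sqrt(2.0 * S * dw)

dt, n_t = 0.05, 2048
t = dt * np.arange(n_t)

def sample_path():
    phi = rng.uniform(0.0, 2.0 * np.pi, n_w)         # independent random phases
    return (amp * np.cos(np.outer(t, w) + phi)).sum(axis=1)

x = np.array([sample_path() for _ in range(200)])
print("ensemble variance:", x.var(), " target:", (S * dw).sum())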
Hsu, Wei-Feng; Lin, Shih-Chih
2018-01-01
This paper presents a novel approach to optimizing the design of phase-only computer-generated holograms (CGH) for the creation of binary images in an optical Fourier transform system. Optimization begins by selecting an image pixel with a temporal change in amplitude. The modulated image function undergoes an inverse Fourier transform followed by the imposition of a CGH constraint and the Fourier transform to yield an image function associated with the change in amplitude of the selected pixel. In iterations where the quality of the image is improved, that image function is adopted as the input for the next iteration. In cases where the image quality is not improved, the image function before the pixel changed is used as the input. Thus, the proposed approach is referred to as the pixelwise hybrid input-output (PHIO) algorithm. The PHIO algorithm was shown to achieve image quality far exceeding that of the Gerchberg-Saxton (GS) algorithm. The benefits were particularly evident when the PHIO algorithm was equipped with a dynamic range of image intensities equivalent to the amplitude freedom of the image signal. The signal variation of images reconstructed from the GS algorithm was 1.0223, but only 0.2537 when using PHIO, i.e., a 75% improvement. Nonetheless, the proposed scheme resulted in a 10% degradation in diffraction efficiency and signal-to-noise ratio.
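For context, a minimal NumPy sketch of the Gerchberg-Saxton baseline that PHIO is compared against is given below: it alternates between the hologram plane (phase-only constraint) and the image plane (target-amplitude constraint) using FFTs. The pixelwise accept/reject update that defines PHIO is not implemented; the target pattern and iteration count are assumptions.

# Gerchberg-Saxton baseline sketch for a phase-only Fourier CGH.
import numpy as np

rng = np.random.default_rng(8)
N = 128
target = np.zeros((N, N))
target[48:80, 48:80] = 1.0                          # hypothetical binary target image
target_amp = np.sqrt(target / target.sum())         # normalized target amplitude

phase = rng.uniform(0.0, 2.0 * np.pi, (N, N))       # random initial hologram phase
for _ in range(100):
    field = np.fft.fft2(np.exp(1j * phase))         # propagate (Fourier transform)
    image_phase = np.angle(field)
    # Impose the target amplitude in the image plane, keep the phase freedom.
    constrained = target_amp * np.exp(1j * image_phase)
    back = np.fft.ifft2(constrained)                # propagate back
    phase = np.angle(back)                          # phase-only CGH constraint

recon = np.abs(np.fft.fft2(np.exp(1j * phase)))**2
print("diffraction efficiency into target:", recon[target > 0].sum() / recon.sum())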
Mitochondrial functionality in female reproduction.
Gąsior, Łukasz; Daszkiewicz, Regina; Ogórek, Mateusz; Polański, Zbigniew
2017-01-04
In most animal species, female germ cells are the source of the mitochondrial genome for the whole body of the individual. As the source of mitochondrial DNA for future generations, the mitochondria in the female germ line undergo dynamic quantitative and qualitative changes. In addition to maintaining an intact template of the mitochondrial genome from one generation to another, the mitochondrial role in oocytes is much more complex and pleiotropic. The quality of mitochondria determines the capacity for meiotic divisions, fertilization, activation after fertilization, and sustained development of the new embryo. The presence of a normal number of functional mitochondria is also crucial for proper implantation and maintenance of pregnancy. This article addresses issues of mitochondrial role and function in the mammalian oocyte and presents new approaches in studies of mitochondrial function in female germ cells.
NASA Astrophysics Data System (ADS)
Lee, Deuk Yeon; Choi, Jae Hong; Shin, Jung Chul; Jung, Man Ki; Song, Seok Kyun; Suh, Jung Ki; Lee, Chang Young
2018-06-01
Compared with wet processes, dry functionalization using plasma is fast, scalable, solvent-free, and thus presents a promising approach for grafting functional groups to powdery nanomaterials. Previous approaches, however, had difficulties in maintaining an intimate sample-plasma contact and achieving uniform functionalization. Here, we demonstrate a plasma reactor equipped with a porous filter electrode that increases both homogeneity and degree of functionalization by capturing and circulating powdery carbon nanotubes (CNTs) via vacuum and gas blowing. Spectroscopic measurements verify that treatment with O2/air plasma generates oxygen-containing groups on the surface of CNTs, with the degree of functionalization readily controlled by varying the circulation number. Gas sensors fabricated using the plasma-treated CNTs confirm alteration of molecular adsorption on the surface of CNTs. A sequential treatment with NH3 plasma following the oxidation pre-treatment results in the functionalization with nitrogen species of up to 3.2 wt%. Our approach requiring no organic solvents not only is cost-effective and environmentally friendly, but also serves as a versatile tool that applies to other powdery micro or nanoscale materials for controlled modification of their surfaces.
Neoclassic drug discovery: the case for lead generation using phenotypic and functional approaches.
Lee, Jonathan A; Berg, Ellen L
2013-12-01
Innovation and new molecular entity production by the pharmaceutical industry have been below expectations. Surprisingly, more first-in-class small-molecule drugs approved by the U.S. Food and Drug Administration (FDA) between 1999 and 2008 were identified by functional phenotypic lead generation strategies reminiscent of pre-genomics pharmacology than by the contemporary molecular-targeted strategies that encompass the vast majority of lead generation efforts. This observation, in conjunction with the difficulty of validating molecular targets for drug discovery, has diminished the impact of the "genomics revolution" and has led to a growing grassroots movement, and now a broader trend in pharma, to reconsider the use of modern physiology-based or phenotypic drug discovery (PDD) strategies. This "From the Guest Editors" column provides an introduction and overview of the two-part special issue of the Journal of Biomolecular Screening on PDD. Terminology and the business case for the use of PDD are defined. Key issues such as assay performance, chemical optimization, target identification, and challenges to the organization and implementation of PDD are discussed. Possible solutions for these challenges and a new neoclassic vision for PDD that combines phenotypic and functional approaches with technology innovations resulting from the genomics-driven era of target-based drug discovery (TDD) are also described. Finally, an overview of the manuscripts in this special edition is provided.
Gröbner Bases and Generation of Difference Schemes for Partial Differential Equations
NASA Astrophysics Data System (ADS)
Gerdt, Vladimir P.; Blinkov, Yuri A.; Mozzhilkin, Vladimir V.
2006-05-01
In this paper we present an algorithmic approach to the generation of fully conservative difference schemes for linear partial differential equations. The approach is based on enlarging the equations in their integral conservation law form by extra integral relations between the unknown functions and their derivatives, and on discretization of the resulting system. The structure of the discrete system depends on the numerical approximation methods chosen for the integrals occurring in the enlarged system. As a result of the discretization, a system of linear polynomial difference equations is derived for the unknown functions and their partial derivatives. A difference scheme is constructed by eliminating all the partial derivatives. The elimination can be achieved by selecting a proper elimination ranking and computing a Gröbner basis of the linear difference ideal generated by the polynomials in the discrete system. For these purposes we use the difference form of Janet-like Gröbner bases and their implementation in Maple. As an illustration of the described methods and algorithms, we construct a number of difference schemes for the Burgers and Falkowich-Karman equations and discuss their numerical properties.
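The elimination step underlying the approach above can be illustrated, in a heavily simplified algebraic setting, with an ordinary lex Gröbner basis computed in SymPy; this toy example stands in for the Janet-like difference Gröbner bases in Maple used by the authors and is only meant to show how a lex ranking eliminates unwanted quantities.

```python
from sympy import symbols, groebner

x, y, p, s = symbols('x y p s')

# Toy elimination: treat x and y as the quantities to eliminate (analogous to
# grid values of the partial derivatives) and p, s as the quantities to keep.
polys = [p - (x + y), s - x*y, x**2 + y**2 - 5]

# A lex Groebner basis with x, y ranked highest performs the elimination.
G = groebner(polys, x, y, p, s, order='lex')

# Basis elements free of x and y generate the elimination ideal; here one finds
# the relation p**2 - 2*s - 5 = 0 connecting the retained quantities.
print([g for g in G.exprs if not g.has(x) and not g.has(y)])
```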
Kalra, Kunal; Chandrabose, Srijaya Thekkeparambil; Ramasamy, Thamil Selvee; Kasim, Noor Hayaty Binti Abu
2018-06-04
Diabetes mellitus is one of the leading causes of death worldwide. Loss and functional failure of pancreatic β-cells, the parenchymal cells of the islets of Langerhans, drive the onset and progression of diabetes mellitus. The increasing incidence of this metabolic disorder necessitates efficient strategies to produce functional β-cells for treating diabetes mellitus. Human induced pluripotent stem cells (hiPSCs) hold potential for treating diabetes owing to their self-renewal capacity and ability to differentiate into β-cells. iPSC technology also provides unlimited starting material to generate differentiated cells for regenerative applications. Progress has also been made in establishing in-vitro culture protocols that yield definitive endoderm, pancreatic endoderm progenitor cells and β-cells via different reprogramming strategies and growth factor supplementation. However, the β-cells generated so far are still immature, lack functional characteristics and show limited capability to reverse the disease condition. Current methods employed to generate mature and functional β-cells include the use of small and large molecules to enhance reprogramming and differentiation efficiency, and 3D culture systems to improve the functional properties and address the heterogeneity of differentiated cells. This review details recent advancements in the generation of mature β-cells by reprogramming stem cells into iPSCs that are further differentiated into β-cells. It also provides deeper insight into current reprogramming protocols and their efficacy, focusing on the underlying mechanisms of the chemical-based approach to generating iPSCs. Furthermore, we highlight the recent differentiation strategies, both in-vitro and in-vivo, reported to date and the future prospects for the generation of mature β-cells. Copyright © Bentham Science Publishers.
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Moradkhani, Hamid
2015-04-01
Producing reliable and accurate hydrologic ensemble forecasts is subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reducing the total uncertainty in hydrological applications. Currently, Numerical Weather Prediction (NWP) models provide ensemble forecasts over various temporal ranges; however, raw products from NWP models are known to be biased in both mean and spread. There is therefore a need for methods able to generate reliable ensemble forecasts for hydrological applications. One common technique is to apply statistical procedures that generate an ensemble forecast from NWP-generated single-value forecasts. The procedure is based on the bivariate probability distribution between the observation and the single-value precipitation forecast; however, one of the assumptions of the current method is that Gaussian distributions fit the marginal distributions of the observed and modeled climate variables. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions express the multivariate joint distribution in terms of univariate marginal distributions and are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing. Copulas are capable of modeling the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin of the Columbia River Basin in the USA, using monthly precipitation forecasts from the Climate Forecast System (CFS) at 0.5x0.5 deg. spatial resolution to reproduce the observations. The verification is conducted on a separate period, and the superiority of the procedure relative to the Ensemble Pre-Processor approach currently used by the National Weather Service River Forecast Centers in the USA is assessed.
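As a hedged illustration of the copula idea described above, the sketch below post-processes a single-value forecast into an ensemble using a Gaussian copula with gamma marginals; the marginal choice, the conditional-sampling scheme, and all names are assumptions for illustration, not the exact Bayesian formulation of the study.

```python
import numpy as np
from scipy import stats

def copula_ensemble(fcst, obs_hist, fcst_hist, n_members=50, rng=None):
    """Ensemble around a single-value precipitation forecast via a Gaussian
    copula fitted to historical (observation, forecast) pairs (sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    # Gamma marginals (an assumption; any fitted marginal CDF could be used)
    po = stats.gamma.fit(obs_hist, floc=0)
    pf = stats.gamma.fit(fcst_hist, floc=0)
    clip = lambda u: np.clip(u, 1e-6, 1 - 1e-6)
    # transform to standard-normal scores and estimate the copula correlation
    z_o = stats.norm.ppf(clip(stats.gamma.cdf(obs_hist, *po)))
    z_f = stats.norm.ppf(clip(stats.gamma.cdf(fcst_hist, *pf)))
    rho = np.corrcoef(z_o, z_f)[0, 1]
    # conditional law of the observation score given the new forecast score
    z_new = stats.norm.ppf(clip(stats.gamma.cdf(fcst, *pf)))
    draws = rng.normal(rho * z_new, np.sqrt(1.0 - rho**2), size=n_members)
    # back-transform through the observed marginal to precipitation space
    return stats.gamma.ppf(stats.norm.cdf(draws), *po)
```

Called with historical forecast-observation pairs for a sub-basin, this would yield a conditional ensemble for each new single-value forecast.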
Bessel beam CARS of axially structured samples
NASA Astrophysics Data System (ADS)
Heuke, Sandro; Zheng, Juanjuan; Akimov, Denis; Heintzmann, Rainer; Schmitt, Michael; Popp, Jürgen
2015-06-01
We report on a Bessel beam CARS approach for axial profiling of multi-layer structures. This study presents an experimental implementation for the generation of CARS by Bessel beam excitation using only passive optical elements. Furthermore, an analytical expression is provided for the anti-Stokes field generated by a homogeneous sample. Based on the concept of coherent transfer functions, the underlying resolving power for axially structured geometries is investigated. It is found that, through the non-linearity of the CARS process in combination with the folded illumination geometry, continuous phase-matching is achieved starting from homogeneous samples up to spatial sample frequencies at twice that of the pump electric field wave. The experimental and analytical findings are modeled by implementation of the Debye integral and a scalar Green's function approach. Finally, the goal of reconstructing an axially layered sample is demonstrated on the basis of the numerically simulated modulus and phase of the anti-Stokes far-field radiation pattern.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rossi, Tuomas P., E-mail: tuomas.rossi@alumni.aalto.fi; Sakko, Arto; Puska, Martti J.
We present an approach for generating local numerical basis sets of improving accuracy for first-principles nanoplasmonics simulations within time-dependent density functional theory. The method is demonstrated for copper, silver, and gold nanoparticles that are of experimental interest but computationally demanding due to the semi-core d-electrons that affect their plasmonic response. The basis sets are constructed by augmenting numerical atomic orbital basis sets by truncated Gaussian-type orbitals generated by the completeness-optimization scheme, which is applied to the photoabsorption spectra of homoatomic metal atom dimers. We obtain basis sets of improving accuracy up to the complete basis set limit and demonstrate that the performance of the basis sets transfers to simulations of larger nanoparticles and nanoalloys as well as to calculations with various exchange-correlation functionals. This work promotes the use of the local basis set approach of controllable accuracy in first-principles nanoplasmonics simulations and beyond.
Centralized and Decentralized Control for Demand Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Samaan, Nader A.; Diao, Ruisheng
2011-04-29
Demand response has been recognized as an essential element of the smart grid. Frequency response, regulation and contingency reserve functions performed traditionally by generation resources are now starting to involve demand side resources. Additional benefits from demand response include peak reduction and load shifting, which will defer new infrastructure investment and improve generator operation efficiency. Technical approaches designed to realize these functionalities can be categorized into centralized control and decentralized control, depending on where the response decision is made. This paper discusses these two control philosophies and compares their relative advantages and disadvantages in terms of delay time, predictability, complexity, and reliability. A distribution system model with detailed household loads and controls is built to demonstrate the characteristics of the two approaches. The conclusion is that the promptness and reliability of decentralized control should be combined with the predictability and simplicity of centralized control to achieve the best performance of the smart grid.
Kireeva, N; Baskin, I I; Gaspar, H A; Horvath, D; Marcou, G; Varnek, A
2012-04-01
Here, the utility of Generative Topographic Maps (GTM) for data visualization, structure-activity modeling and database comparison is evaluated using subsets of the Directory of Useful Decoys (DUD). Unlike other popular dimensionality reduction approaches such as Principal Component Analysis, Sammon Mapping or Self-Organizing Maps, the great advantage of GTMs is that they provide data probability distribution functions (PDF), both in the high-dimensional space defined by molecular descriptors and in the 2D latent space. PDFs for the molecules of different activity classes were successfully used to build classification models in the framework of the Bayesian approach. Because PDFs are represented by a mixture of Gaussian functions, the Bhattacharyya kernel has been proposed as a measure of the overlap of datasets, which leads to an elegant method for the global comparison of chemical libraries. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
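A small numerical sketch of the Bhattacharyya overlap between two Gaussian-mixture PDFs on a 2D latent-space grid follows; the mixtures, grid, and bandwidth are invented for illustration and do not reproduce the GTM likelihood model itself.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mixture_pdf(points, means, weights, cov=0.05):
    """Evaluate an isotropic Gaussian-mixture PDF on 2D latent-space points."""
    return sum(w * multivariate_normal(mean=m, cov=cov).pdf(points)
               for m, w in zip(means, weights))

# Latent-space grid (the GTM latent space is typically a bounded 2D square)
xs, ys = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))
grid = np.dstack([xs, ys])
cell = (xs[0, 1] - xs[0, 0]) * (ys[1, 0] - ys[0, 0])

# Two hypothetical library PDFs represented as Gaussian mixtures
p = mixture_pdf(grid, means=[(-0.3, 0.0), (0.4, 0.4)], weights=[0.6, 0.4])
q = mixture_pdf(grid, means=[(-0.2, 0.1), (0.5, -0.5)], weights=[0.5, 0.5])

# Bhattacharyya coefficient BC = integral of sqrt(p*q); BC -> 1 for identical libraries
bc = np.sum(np.sqrt(p * q)) * cell
print(f"Bhattacharyya overlap: {bc:.3f}")
```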
NASA Astrophysics Data System (ADS)
Hamza, Karim; Shalaby, Mohamed
2014-09-01
This article presents a framework for the simulation-based design optimization of computationally expensive problems, where economizing the generation of sample designs is highly desirable. One popular approach for such problems is efficient global optimization (EGO), where an initial set of design samples is used to construct a kriging model, which is then used to generate new 'infill' sample designs in regions of the search space with high expectancy of improvement. This article addresses one of the limitations of EGO, namely that the generation of infill samples can become a difficult optimization problem in its own right, and also allows multiple samples to be generated at a time in order to take advantage of parallel computing in the evaluation of the new samples. The proposed approach is tested on analytical functions and then applied to the vehicle crashworthiness design of a full Geo Metro model undergoing frontal crash conditions.
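The infill criterion at the heart of EGO is commonly the expected improvement of the kriging prediction; the sketch below computes it from a Gaussian-process surrogate, with scikit-learn used as a stand-in for the article's kriging model and a naive top-k rule standing in for the article's batch-infill strategy.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X_cand, gp, f_best, xi=0.0):
    """Expected improvement of candidate points under a fitted GP surrogate."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (f_best - mu - xi) / sigma
    return (f_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Toy 1D example: fit a surrogate to a few "expensive" evaluations,
# then pick several infill points with the highest EI (batch infill).
f = lambda x: np.sin(3 * x) + 0.5 * x            # stand-in expensive function
X = np.array([[0.1], [0.9], [1.7], [2.5]])
y = f(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
X_cand = np.linspace(0, 3, 300).reshape(-1, 1)
ei = expected_improvement(X_cand, gp, f_best=y.min())
batch = X_cand[np.argsort(ei)[-3:]]              # three new samples for parallel evaluation
print(batch.ravel())
```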
Asymptotics of bivariate generating functions with algebraic singularities
NASA Astrophysics Data System (ADS)
Greenwood, Torin
Flajolet and Odlyzko (1990) derived asymptotic formulae for the coefficients of a class of univariate generating functions with algebraic singularities. Gao and Richmond (1992) and Hwang (1996, 1998) extended these results to classes of multivariate generating functions, in both cases by reducing to the univariate case. Pemantle and Wilson (2013) outlined new multivariate analytic techniques and used them to analyze the coefficients of rational generating functions. After overviewing these methods, we use them to find asymptotic formulae for the coefficients of a broad class of bivariate generating functions with algebraic singularities. Beginning with the Cauchy integral formula, we explicitly deform the contour of integration so that it hugs a set of critical points. The asymptotic contribution to the integral comes from analyzing the integrand near these points, leading to explicit asymptotic formulae. Next, we use this formula to analyze an example from current research. In the following chapter, we apply multivariate analytic techniques to quantum walks. Bressler and Pemantle (2007) found a (d + 1)-dimensional rational generating function whose coefficients described the amplitude of a particle at a position in the integer lattice after n steps. Here, the minimal critical points form a curve on the (d + 1)-dimensional unit torus. We find asymptotic formulae for the amplitude of a particle at a given position, normalized by the number of steps n, as n approaches infinity. Each critical point contributes to the asymptotics for a specific normalized position. Using Groebner bases in Maple, we compute the explicit locations of peak amplitudes. In a scaling window of size proportional to the square root of n near the peaks, each amplitude is asymptotic to an Airy function.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirlik, G; D’Souza, W; Zhang, H
2016-06-15
Purpose: To present a novel multi-criteria optimization (MCO) solution approach that generates treatment plans with deliverable apertures using column generation. Methods: We retrospectively demonstrate our method on 10 locally advanced head-and-neck cancer cases. In our MCO formulation, we defined an objective function for each structure in the treatment volume. This resulted in 9 objective functions, including 3 distinct objectives for the primary target volume and the high-risk and low-risk target volumes, 5 objectives for the organs-at-risk (OARs) (two parotid glands, spinal cord, brain stem and oral cavity), and one for the non-target, non-OAR normal tissue. Conditional value-at-risk (CVaR) constraints were utilized to ensure that at least a certain fraction of each target volume receives the prescription dose. To directly generate deliverable plans, a column generation algorithm was embedded within our MCO approach for aperture shape generation. Final dose distributions for all plans were generated using a Monte Carlo kernel-superposition dose calculation. We compared the MCO plans with the clinical plans, which were created by clinicians. Results: At least 95% target coverage was achieved by both the MCO plans and the clinical plans. However, the average conformity indices of the clinical plans and the MCO plans were 1.95 and 1.35, respectively (31% reduction, p<0.01). Compared to the conventional clinical plans, the proposed MCO method achieved average reductions in left parotid mean dose of 5% (p=0.06), right parotid mean dose of 18% (p<0.01), oral cavity mean dose of 21% (p=0.03), spinal cord maximum dose of 20% (p<0.01), brain stem maximum dose of 61% (p<0.01), and normal tissue maximum dose of 5% (p<0.01). Conclusion: We demonstrated that the proposed MCO method was able to obtain deliverable IMRT treatment plans while achieving significant improvements in dosimetric plan quality.
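The coverage guarantee provided by a lower-tail CVaR constraint can be sketched as follows; the formulation below is the standard lower-CVaR form for target dose-volume constraints and the voxel doses are synthetic, so it may differ in detail from the authors' implementation.

```python
import numpy as np

def lower_cvar(doses, alpha):
    """Mean dose of the coldest alpha-fraction of target voxels (lower-tail CVaR)."""
    doses = np.sort(np.asarray(doses))
    k = max(1, int(np.ceil(alpha * doses.size)))
    return doses[:k].mean()

rng = np.random.default_rng(0)
target_dose = rng.normal(72.0, 1.5, size=10_000)   # hypothetical PTV voxel doses (Gy)
rx, alpha = 70.0, 0.05

cvar = lower_cvar(target_dose, alpha)
# If the coldest 5% of voxels average at least the prescription dose, then at
# least 95% of the target volume receives the prescription dose.
print(f"CVaR_5% = {cvar:.2f} Gy -> coverage constraint satisfied: {cvar >= rx}")
```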
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kravtsov, V.E., E-mail: kravtsov@ictp.it; Landau Institute for Theoretical Physics, 2 Kosygina st., 117940 Moscow; Yudson, V.I., E-mail: yudson@isan.troitsk.ru
Highlights: Statistics of normalized eigenfunctions in one-dimensional Anderson localization at E = 0 is studied. Moments of the inverse participation ratio are calculated. An equation for the generating function is derived at E = 0. An exact solution for the generating function at E = 0 is obtained. The relation of the generating function to the phase distribution function is established. - Abstract: The one-dimensional (1d) Anderson model (AM), i.e. a tight-binding chain with random uncorrelated on-site energies, has statistical anomalies at any rational point f = 2a/λ_E, where a is the lattice constant and λ_E is the de Broglie wavelength. We develop a regular approach to the anomalous statistics of normalized eigenfunctions ψ(r) at such commensurability points. The approach is based on an exact integral transfer-matrix equation for a generating function Φ_r(u, φ) (u and φ have the meaning of the squared amplitude and phase of the eigenfunctions, and r is the position of the observation point). This generating function can be used to compute local statistics of eigenfunctions of the 1d AM at any disorder and to address the problem of higher-order anomalies at f = p/q with q > 2. The descender of the generating function, P_r(φ) ≡ Φ_r(u = 0, φ), is shown to be the distribution function of phase, which determines the Lyapunov exponent and the local density of states. In the leading order in the small disorder we derived a second-order partial differential equation for the r-independent ('zero-mode') component Φ(u, φ) at the E = 0 (f = 1/2) anomaly. This equation is nonseparable in the variables u and φ. Yet, we show that due to a hidden symmetry it is integrable, and we construct an exact solution for Φ(u, φ) explicitly in quadratures. Using this solution we computed the moments I_m = N⟨|ψ|^{2m}⟩ (m ≥ 1) for a chain of length N → ∞ and found an essential difference between their m-behavior in the center-of-band anomaly and for energies outside this anomaly. Outside the anomaly the 'extrinsic' localization length defined from the Lyapunov exponent coincides with that defined from the inverse participation ratio ('intrinsic' localization length). This is not the case at the E = 0 anomaly, where the extrinsic localization length is smaller than the intrinsic one. At E = 0 one also observes an anomalous enhancement of the large moments, compatible with the existence of yet another, much smaller characteristic length scale.
A data-driven wavelet-based approach for generating jumping loads
NASA Astrophysics Data System (ADS)
Chen, Jun; Li, Guo; Racic, Vitomir
2018-06-01
This paper suggests an approach to generating human jumping loads using the wavelet transform and a database of individual jumping force records. A total of 970 individual jumping force records at various frequencies were first collected in three experiments from 147 test subjects. For each record, every jumping pulse was extracted and decomposed into seven levels by the wavelet transform. All the decomposition coefficients were stored in an information database. Probability distributions of the jumping cycle period, the contact ratio and the energy of the jumping pulse were statistically analyzed. Inspired by the theory of DNA recombination, an approach was developed that interchanges the wavelet coefficients between different jumping pulses. To generate a jumping force time history with N pulses, wavelet coefficients were first selected randomly from the database at each level. They were then used to reconstruct N pulses by the inverse wavelet transform. Jumping cycle periods and contact ratios were then generated randomly from their probability functions. These parameters were assigned to each of the N pulses, which were in turn scaled by amplitude factors βi to account for the energy relationship between successive pulses. The final jumping force time history was obtained by linking all N cycles end to end. This simulation approach preserves the non-stationary features of the jumping load in the time-frequency domain. Applications indicate that the approach can be used to generate jumping force time histories due to a single person jumping and can be extended to stochastic jumping loads due to groups and crowds.
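A minimal PyWavelets sketch of the coefficient-interchange idea follows: two pulses are decomposed, detail coefficients at one level are swapped, and a new pulse is reconstructed. The wavelet family, the level at which coefficients are exchanged, and the synthetic pulses are assumptions; the paper works with a seven-level decomposition of measured pulses and additionally randomizes cycle period, contact ratio and amplitude factors.

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1024)

# Two synthetic single-jump force pulses standing in for database records
pulse_a = np.clip(np.sin(np.pi * t) ** 2 + 0.05 * rng.standard_normal(t.size), 0, None)
pulse_b = np.clip(1.2 * np.sin(np.pi * t) ** 1.5 + 0.05 * rng.standard_normal(t.size), 0, None)

# Seven-level wavelet decomposition of each pulse ('db4' chosen for illustration)
coeffs_a = pywt.wavedec(pulse_a, 'db4', level=7)
coeffs_b = pywt.wavedec(pulse_b, 'db4', level=7)

# "Recombine" the pulses by interchanging detail coefficients at one level,
# mimicking the DNA-recombination-inspired exchange described in the paper
new_coeffs = [c.copy() for c in coeffs_a]
new_coeffs[3] = coeffs_b[3]

# Reconstruct a new, statistically plausible jumping pulse
new_pulse = pywt.waverec(new_coeffs, 'db4')[:t.size]
print(new_pulse.max())
```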
Atom and Bond Fukui Functions and Matrices: A Hirshfeld-I Atoms-in-Molecule Approach.
Oña, Ofelia B; De Clercq, Olivier; Alcoba, Diego R; Torre, Alicia; Lain, Luis; Van Neck, Dimitri; Bultinck, Patrick
2016-09-19
The Fukui function is often used in its atom-condensed form by isolating it from the molecular Fukui function using a chosen weight function for the atom in the molecule. Recently, Fukui functions and matrices for both atoms and bonds separately were introduced for semiempirical and ab initio levels of theory using Hückel and Mulliken atoms-in-molecule models. In this work, a double partitioning method of the Fukui matrix is proposed within the Hirshfeld-I atoms-in-molecule framework. Diagonalizing the resulting atomic and bond matrices gives eigenvalues and eigenvectors (Fukui orbitals) describing the reactivity of atoms and bonds. The Fukui function is the diagonal element of the Fukui matrix and may be resolved in atom and bond contributions. The extra information contained in the atom and bond resolution of the Fukui matrices and functions is highlighted. The effect of the choice of weight function arising from the Hirshfeld-I approach to obtain atom- and bond-condensed Fukui functions is studied. A comparison of the results with those generated by using the Mulliken atoms-in-molecule approach shows low correlation between the two partitioning schemes. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
|Vus| determination from inclusive strange tau decay and lattice HVP
NASA Astrophysics Data System (ADS)
Boyle, Peter; Hudspith, Renwick James; Izubuchi, Taku; Jüttner, Andreas; Lehner, Christoph; Lewis, Randy; Maltman, Kim; Ohki, Hiroshi; Portelli, Antonin; Spraggs, Matthew
2018-03-01
We propose and apply a novel approach to determining |Vus| which uses inclusive strange hadronic tau decay data and hadronic vacuum polarization functions (HVPs) computed on the lattice. The experimental and lattice data are related through dispersion relations which employ a class of weight functions having poles at space-like momentum. Implementing this approach using lattice data generated by the RBC/UKQCD collaboration, we show examples of weight functions which strongly suppress spectral integral contributions from the region where experimental data either have large uncertainties or do not exist while at the same time allowing accurate determinations of relevant lattice HVPs. Our result for |Vus| is in good agreement with determinations from K physics and 3-family CKM unitarity. The advantages of the new approach over the conventional sum rule analysis will be discussed.
A Rigorous Framework for Optimization of Expensive Functions by Surrogates
NASA Technical Reports Server (NTRS)
Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.
1998-01-01
The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
Torres, Edmanuel; DiLabio, Gino A
2013-08-13
Large clusters of noncovalently bonded molecules can only be efficiently modeled by classical mechanics simulations. One prominent challenge associated with this approach is obtaining force-field parameters that accurately describe noncovalent interactions. High-level correlated wave function methods, such as CCSD(T), are capable of correctly predicting noncovalent interactions and are widely used to produce reference data. However, high-level correlated methods are generally too computationally costly to generate the critical reference data required for good force-field parameter development. In this work we present an approach to generating Lennard-Jones force-field parameters that accurately account for noncovalent interactions. We propose the use of a computational step intermediate between CCSD(T) and classical molecular mechanics that can bridge the accuracy and computational efficiency gap between them, and demonstrate the efficacy of our approach with methane clusters. On the basis of CCSD(T)-level binding energy data for a small set of methane clusters, we develop methane-specific, atom-centered, dispersion-correcting potentials (DCPs) for use with the PBE0 density functional and 6-31+G(d,p) basis sets. We then use the PBE0-DCP approach to compute a detailed map of the interaction forces associated with the removal of a single methane molecule from a cluster of eight methane molecules and use this map to optimize the Lennard-Jones parameters for methane. The quality of the binding energies given by the resulting Lennard-Jones parameters is assessed on a set of methane clusters containing from 2 to 40 molecules. Our Lennard-Jones parameters, used in combination with the intramolecular parameters of the CHARMM force field, are found to closely reproduce the results of our dispersion-corrected density-functional calculations. The approach outlined can be used to develop Lennard-Jones parameters for any kind of molecular system.
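The final parameter-fitting step can be sketched as a least-squares fit of 12-6 Lennard-Jones parameters to reference interaction energies; the reference curve below is synthetic and the starting values arbitrary, whereas in the paper the reference data are the dispersion-corrected DFT (PBE0-DCP) interaction maps for the methane octamer.

```python
import numpy as np
from scipy.optimize import curve_fit

def lj_energy(r, epsilon, sigma):
    """12-6 Lennard-Jones pair interaction energy at separation r."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6**2 - sr6)

# Reference separations and interaction energies; in practice these would come
# from the dispersion-corrected density-functional calculations described above.
r_ref = np.linspace(3.2, 8.0, 25)
e_ref = lj_energy(r_ref, 0.29, 3.73) + 0.01 * np.random.default_rng(2).standard_normal(r_ref.size)

# Least-squares fit of epsilon and sigma to the reference curve
(eps_fit, sig_fit), _ = curve_fit(lj_energy, r_ref, e_ref, p0=[0.2, 3.5])
print(f"epsilon = {eps_fit:.3f}, sigma = {sig_fit:.2f}")
```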
Dissecting psychiatric spectrum disorders by generative embedding
Brodersen, Kay H.; Deserno, Lorenz; Schlagenhauf, Florian; Lin, Zhihao; Penny, Will D.; Buhmann, Joachim M.; Stephan, Klaas E.
2013-01-01
This proof-of-concept study examines the feasibility of defining subgroups in psychiatric spectrum disorders by generative embedding, using dynamical system models which infer neuronal circuit mechanisms from neuroimaging data. To this end, we re-analysed an fMRI dataset of 41 patients diagnosed with schizophrenia and 42 healthy controls performing a numerical n-back working-memory task. In our generative-embedding approach, we used parameter estimates from a dynamic causal model (DCM) of a visual–parietal–prefrontal network to define a model-based feature space for the subsequent application of supervised and unsupervised learning techniques. First, using a linear support vector machine for classification, we were able to predict individual diagnostic labels significantly more accurately (78%) from DCM-based effective connectivity estimates than from functional connectivity between (62%) or local activity within the same regions (55%). Second, an unsupervised approach based on variational Bayesian Gaussian mixture modelling provided evidence for two clusters which mapped onto patients and controls with nearly the same accuracy (71%) as the supervised approach. Finally, when restricting the analysis only to the patients, Gaussian mixture modelling suggested the existence of three patient subgroups, each of which was characterised by a different architecture of the visual–parietal–prefrontal working-memory network. Critically, even though this analysis did not have access to information about the patients' clinical symptoms, the three neurophysiologically defined subgroups mapped onto three clinically distinct subgroups, distinguished by significant differences in negative symptom severity, as assessed on the Positive and Negative Syndrome Scale (PANSS). In summary, this study provides a concrete example of how psychiatric spectrum diseases may be split into subgroups that are defined in terms of neurophysiological mechanisms specified by a generative model of network dynamics such as DCM. The results corroborate our previous findings in stroke patients that generative embedding, compared to analyses of more conventional measures such as functional connectivity or regional activity, can significantly enhance both the interpretability and performance of computational approaches to clinical classification. PMID:24363992
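The supervised step of generative embedding reduces, in its simplest form, to cross-validated linear SVM classification on model-based feature vectors; the scikit-learn sketch below uses synthetic features standing in for the DCM effective-connectivity estimates, so the numbers it prints are illustrative only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# Synthetic stand-in for DCM effective-connectivity estimates:
# 83 subjects x 12 connection parameters, with a small group difference
X = rng.standard_normal((83, 12))
y = np.array([0] * 42 + [1] * 41)           # 42 controls, 41 patients
X[y == 1, :3] += 0.8                        # hypothetical connectivity shift in patients

# Cross-validated linear SVM on the model-based feature space
clf = SVC(kernel='linear', C=1.0)
acc = cross_val_score(clf, X, y, cv=10)
print(f"10-fold accuracy: {acc.mean():.2f}")
```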
Phase pupil functions for focal-depth enhancement derived from a Wigner distribution function.
Zalvidea, D; Sicre, E E
1998-06-10
A method for obtaining phase-retardation functions that give rise to an increased image focal depth is proposed. To this end, the Wigner distribution function corresponding to a specific aperture with an associated small depth of focus in image space is conveniently sheared in the phase-space domain to generate a new Wigner distribution function. From this new function a more uniform on-axis image irradiance can be achieved. This approach is illustrated by comparing the imaging performance of the derived phase function with that of a previously reported logarithmic phase distribution.
Essays in market power mitigation and supply function equilibrium
NASA Astrophysics Data System (ADS)
Subramainam, Thiagarajah Natchie
Market power mitigation has been an integral part of wholesale electricity markets since deregulation. In wholesale electricity markets, different regions of the US take different approaches to regulating market power. While the exercise of market power has received considerable attention in the literature, the issue of market power mitigation has attracted scant attention. In the first chapter, I examine the market power mitigation rules used in the New York ISO (Independent System Operator) and the California ISO (CAISO) with respect to day-ahead and real-time energy markets. I test whether markups associated with New York in-city generators would be lower under an alternative approach to mitigation, the CAISO approach. Results indicate that the difference in markups between these two mitigation rules is driven by the shape of the residual demand curves for suppliers. Analysis of the residual demand curves faced by New York in-city suppliers shows similar markups under both mitigation rules when no single supplier is necessary to meet demand (i.e., when no supplier is pivotal). However, when some supplier is crucial for the market to clear, the mitigation rule adopted by the NYISO consistently leads to higher markups than the CAISO rule would. This result suggests that market power episodes in New York are confined to periods when some supplier is pivotal. As a result, I find that applying the CAISO's mitigation rules to the New York market could lower wholesale electricity prices by 18%. The second chapter of my dissertation focuses on supply function equilibrium. In power markets, suppliers submit offer curves in auctions, indicating their willingness to supply at different price levels. Although firms are allowed to submit different offer curves for different time periods, surprisingly many firms stick to a single offer curve for the entire day. This essentially means that firms are submitting a single offer curve for multiple demand realizations. A suitable framework for analyzing such oligopolistic competition between power market suppliers is the supply function equilibrium model. Using detailed bidding data, I develop an equilibrium in supply functions by restricting supplier offers to a class of supply functions. By collating the equilibrium supply functions corresponding to different realizations of demand, I obtain a single optimal supply function for the entire day. I then compare the resulting supply function with actual day-ahead offers in New York. In addition to supply function equilibrium, I also develop a conservative bidding approach in which each firm assumes that rivals bid at marginal cost. Results show that the supply functions derived from the equilibrium bidding model in this paper are not consistent with actual bidding in New York. This result is mainly driven by the class of supply functions used in this study to generate the equilibrium. Further, actual offers do not resemble offers generated by the conservative bidding algorithm.
Ecological and evolutionary genomics of marine photosynthetic organisms.
Coelho, Susana M; Simon, Nathalie; Ahmed, Sophia; Cock, J Mark; Partensky, Frédéric
2013-02-01
Environmental (ecological) genomics aims to understand the genetic basis of relationships between organisms and their abiotic and biotic environments. It is a rapidly progressing field of research largely due to recent advances in the speed and volume of genomic data being produced by next generation sequencing (NGS) technologies. Building on information generated by NGS-based approaches, functional genomic methodologies are being applied to identify and characterize genes and gene systems of both environmental and evolutionary relevance. Marine photosynthetic organisms (MPOs) were poorly represented amongst the early genomic models, but this situation is changing rapidly. Here we provide an overview of the recent advances in the application of ecological genomic approaches to both prokaryotic and eukaryotic MPOs. We describe how these approaches are being used to explore the biology and ecology of marine cyanobacteria and algae, particularly with regard to their functions in a broad range of marine ecosystems. Specifically, we review the ecological and evolutionary insights gained from whole genome and transcriptome sequencing projects applied to MPOs and illustrate how their genomes are yielding information on the specific features of these organisms. © 2012 Blackwell Publishing Ltd.
Efficient 3D porous microstructure reconstruction via Gaussian random field and hybrid optimization.
Jiang, Z; Chen, W; Burkhart, C
2013-11-01
Obtaining an accurate three-dimensional (3D) structure of a porous microstructure is important for assessing material properties based on finite element analysis. Whereas directly obtaining 3D images of the microstructure is impractical under many circumstances, two sets of methods have been developed in the literature to generate (reconstruct) a 3D microstructure from its 2D images: one characterizes the microstructure by certain statistical descriptors, typically the two-point correlation function and the cluster correlation function, and then performs an optimization process to build a 3D structure that matches those statistical descriptors; the other models the microstructure using stochastic models such as a Gaussian random field and generates a 3D structure directly from the function. The former obtains a relatively accurate 3D microstructure, but the optimization process can be computationally very intensive, especially for problems with large image sizes; the latter generates a 3D microstructure quickly but sacrifices accuracy due to issues in the numerical implementation. A hybrid optimization approach to modelling the 3D porous microstructure of random isotropic two-phase materials is proposed in this paper, which combines the two sets of methods and hence maintains the accuracy of the correlation-based method with improved efficiency. The proposed technique is verified for 3D reconstructions based on silica polymer composite images with different volume fractions. A comparison of the reconstructed microstructures and the optimization histories for both the original correlation-based method and our hybrid approach demonstrates the improved efficiency of the approach. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
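The Gaussian-random-field half of such a hybrid scheme can be sketched by smoothing white noise, level-cutting at the quantile that gives the target volume fraction, and checking the two-point correlation via an FFT autocorrelation; the kernel width, grid size and volume fraction below are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)
n, phi = 64, 0.35                      # grid size and target volume fraction

# Gaussian random field: correlated noise obtained by smoothing white noise
field = gaussian_filter(rng.standard_normal((n, n, n)), sigma=3.0)

# Level-cut at the quantile that reproduces the desired volume fraction
phase = (field <= np.quantile(field, phi)).astype(float)
print("volume fraction:", phase.mean())

# Radially averaged two-point correlation S2(r) via FFT autocorrelation,
# the descriptor matched in correlation-based (re)construction
ac = np.fft.ifftn(np.abs(np.fft.fftn(phase)) ** 2).real / phase.size
r = np.sqrt(sum(np.minimum(g, n - g) ** 2
                for g in np.meshgrid(*[np.arange(n)] * 3, indexing='ij')))
s2 = [ac[(r >= b) & (r < b + 1)].mean() for b in range(n // 2)]
print("S2(0) ~ volume fraction:", s2[0])
```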
Advanced Unstructured Grid Generation for Complex Aerodynamic Applications
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2008-01-01
A new approach for the distribution of grid points on the surface and in the volume has been developed and implemented in the NASA unstructured grid generation code VGRID. In addition to the point and line sources of prior work, the new approach utilizes surface and volume sources for automatic curvature-based grid sizing and convenient point distribution in the volume. A new exponential growth function produces smoother and more efficient grids and provides superior control over the distribution of grid points in the field. All types of sources support anisotropic grid stretching, which not only improves grid economy but also provides more accurate solutions for certain aerodynamic applications. The new approach does not require a three-dimensional background grid as in previous methods. Instead, it makes use of an efficient bounding-box auxiliary medium for storing grid parameters defined by the surface sources. The new approach is less memory-intensive and more computationally efficient. The grids generated with the new method either eliminate the need for adaptive grid refinement for a certain class of problems or provide high-quality initial grids that enhance the performance of many adaptation methods.
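The abstract does not give the exact exponential growth function used in VGRID, so the sketch below uses a generic, assumed exponential spacing law purely to illustrate how such a function controls point distribution away from a source.

```python
import numpy as np

def node_spacing(d, s0=0.01, s_max=1.0, growth=1.2):
    """Hypothetical exponential spacing law: spacing grows geometrically with
    distance d from a source and is capped at the far-field spacing s_max."""
    return np.minimum(s0 * growth ** (d / s0), s_max)

# 1D node distribution marching away from a surface source at x = 0
x, nodes = 0.0, [0.0]
while x < 5.0:
    x += node_spacing(x)
    nodes.append(x)
print(f"{len(nodes)} nodes, first spacings: {np.diff(nodes)[:3]}")
```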
Labelling Polymers and Micellar Nanoparticles via Initiation, Propagation and Termination with ROMP
Thompson, Matthew P.; Randolph, Lyndsay M.; James, Carrie R.; Davalos, Ashley N.; Hahn, Michael E.
2014-01-01
In this paper we compare and contrast three approaches for labelling polymers with functional groups via ring-opening metathesis polymerization (ROMP). We explored the incorporation of functionality via initiation, termination and propagation employing an array of novel initiators, termination agents and monomers. The goal was to allow the generation of selectively labelled and well-defined polymers that would in turn lead to the formation of labelled nanomaterials. Norbornene analogues, prepared as functionalized monomers for ROMP, included fluorescent dyes (rhodamine, fluorescein, EDANS, and coumarin), quenchers (DABCYL), conjugatable moieties (NHS esters, pentafluorophenyl esters), and protected amines. In addition, a set of symmetrical olefins for terminally labelling polymers, and for the generation of initiators in situ is described. PMID:24855496
The impact of new-generation physicians on the function of academic anesthesiology departments.
Kapur, Patricia A
2007-12-01
Academic departments of anesthesiology have had to adapt a wide variety of clinical and educational work functions to the viewpoints, values and normative expectations of the newer generations of physicians who now present themselves for training as well as for faculty employment. This commentary will elaborate on key points that academic departments must recognize and incorporate into their functional and organizational imperatives in order to remain successful with regard to physician recruitment and retention. Recognition of differences between newer-generation vs. established physician issues and concerns include differences in: learning style, teaching style, approach to clinical schedules and the concept of life-work balance, academic and personal motivation, desire for control of their work experience, effective productivity incentives, as well as communication style issues and implications thereof. The spectrum of physicians who contribute to the impact of these factors on contemporary academic anesthesiology departments include faculty, nonfaculty staff physicians, residents and medical students. Academic departments of anesthesiology which can successfully incorporate the changes and most importantly the functional and organizational flexibility needed to respond to the newer generations' worldview and so-called balanced goals will be able to best attract high-caliber housestaff and future faculty.
Two-particle correlation function and dihadron correlation approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vechernin, V. V., E-mail: v.vechernin@spbu.ru; Ivanov, K. O.; Neverov, D. I.
It is shown that, in the case of asymmetric nuclear interactions, the application of the traditional dihadron correlation approach to determining a two-particle correlation function C may lead to a form distorted in relation to the canonical pair correlation function C_2. This result was obtained both by means of exact analytic calculations of correlation functions within a simple string model for proton–nucleus and deuteron–nucleus collisions and by means of Monte Carlo simulations based on employing the HIJING event generator. It is also shown that the method based on studying multiplicity correlations in two narrow observation windows separated in rapidity makes it possible to determine correctly the canonical pair correlation function C_2 for all cases, including the case where the rapidity distribution of product particles is not uniform.
Automotive Radar and Lidar Systems for Next Generation Driver Assistance Functions
NASA Astrophysics Data System (ADS)
Rasshofer, R. H.; Gresser, K.
2005-05-01
Automotive radar and lidar sensors represent key components for next generation driver assistance functions (Jones, 2001). Today, their use is limited to comfort applications in premium segment vehicles, although an evolution towards more safety-oriented functions is taking place. Radar sensors available on the market today suffer from low angular resolution and poor target detection at medium ranges (30 to 60 m) over azimuth angles larger than ±30°. In contrast, lidar sensors show high sensitivity to environmental influences (e.g. snow, fog, dirt). Both sensor technologies currently have a rather high cost level, preventing their widespread use in mass markets. A common approach to overcoming individual sensor drawbacks is the employment of data fusion techniques (Bar-Shalom, 2001). Raw data fusion requires a common, standardized data interface to easily integrate a variety of asynchronous sensor data into a fusion network. Moreover, next generation sensors should be able to adapt dynamically to new situations and should have the ability to work in cooperative sensor environments. As vehicular function development today is being shifted more and more towards virtual prototyping, mathematical sensor models should be available. These models should take into account the sensor's functional principle as well as all typical measurement errors generated by the sensor.
A CellML simulation compiler and code generator using ODE solving schemes
2012-01-01
Models written in description languages such as CellML are becoming a popular solution for handling complex cellular physiological models in biological function simulations. However, in order to fully simulate a model, boundary conditions and ordinary differential equation (ODE) solving schemes have to be combined with it. Though boundary conditions can be described in CellML, it is difficult to explicitly specify ODE solving schemes using existing tools. In this study, we define an ODE solving scheme description language based on XML and propose a code generation system for biological function simulations. In the proposed system, biological simulation programs using various ODE solving schemes can be easily generated. We designed a two-stage approach in which the system generates the equation set associating the physiological model variable values at a certain time t with the values at t + Δt in the first stage. The second stage generates the simulation code for the model. This approach enables the flexible construction of code generation modules that can support complex sets of formulas. We evaluate the relationship between models and their calculation accuracies by simulating complex biological models using various ODE solving schemes. For the FHN model simulation, results showed good qualitative and quantitative correspondence with the theoretical predictions. Results for the Luo-Rudy 1991 model showed that only first-order precision was achieved. In addition, running the generated code in parallel on a GPU made it possible to speed up the calculation time by a factor of 50. The CellML Compiler source code is available for download at http://sourceforge.net/projects/cellmlcompiler. PMID:23083065
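To illustrate the kind of program such a generator emits once a model is paired with an ODE solving scheme, the sketch below hand-codes a forward-Euler integration of the FitzHugh-Nagumo (FHN) model; the parameter values are standard textbook choices, not those of the paper, and the code is written by hand rather than generated from CellML.

```python
import numpy as np

def fhn_rhs(v, w, i_ext=0.5, a=0.7, b=0.8, tau=12.5):
    """FitzHugh-Nagumo right-hand side (textbook parameterization)."""
    dv = v - v**3 / 3.0 - w + i_ext
    dw = (v + a - b * w) / tau
    return dv, dw

# Explicit (forward) Euler: the "ODE solving scheme" that the code generator
# would combine with the CellML model description.
dt, n_steps = 0.01, 20000
v, w = -1.0, 1.0
trace = np.empty(n_steps)
for k in range(n_steps):
    dv, dw = fhn_rhs(v, w)
    v, w = v + dt * dv, w + dt * dw
    trace[k] = v
print("membrane variable range:", trace.min(), trace.max())
```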
Programming function into mechanical forms by directed assembly of silk bulk materials
Patel, Nereus; Duggan, Thomas; Perotto, Giovanni; Shirman, Elijah; Li, Chunmei; Kaplan, David L.; Omenetto, Fiorenzo G.
2017-01-01
We report simple, water-based fabrication methods based on protein self-assembly to generate 3D silk fibroin bulk materials that can be easily hybridized with water-soluble molecules to obtain multiple solid formats with predesigned functions. Controlling self-assembly leads to robust, machinable formats that exhibit thermoplastic behavior, permitting material reshaping at the nanoscale, microscale, and macroscale. We illustrate the versatility of the approach by realizing demonstrator devices in which large silk monoliths can be generated, polished, and reshaped into functional mechanical components that can be nanopatterned, embed optical functions, heat on demand in response to infrared light, or visualize mechanical failure through colorimetric chemistries embedded in the assembled (bulk) protein matrix. Finally, we show an enzyme-loaded solid mechanical part, illustrating the ability to incorporate biological function within the bulk material, with possible utility for sustained release in robust, programmably shapeable mechanical formats. PMID:28028213
PDF approach for turbulent scalar field: Some recent developments
NASA Technical Reports Server (NTRS)
Gao, Feng
1993-01-01
The probability density function (PDF) method has proven to be a very useful approach in turbulence research. It has been particularly effective in simulating turbulent reacting flows and in studying some detailed statistical properties generated by a turbulent field. There are, however, some important questions that have yet to be answered in PDF studies. Our efforts in the past year have been focused on two areas. First, a simple mixing model suitable for Monte Carlo simulations has been developed based on the mapping closure. Secondly, the mechanism of turbulent transport has been analyzed in order to understand the recently observed abnormal PDFs of turbulent temperature fields generated by linear heat sources.
Bright, T.J.
2013-01-01
Background: Many informatics studies use content analysis to generate functional requirements for system development. Explication of this translational process from qualitative data to functional requirements can strengthen the understanding and scientific rigor when applying content analysis in informatics studies. Objective: To describe a user-centered approach transforming emergent themes derived from focus group data into functional requirements for informatics solutions and to illustrate these methods through the development of an antibiotic clinical decision support system (CDS). Methods: The approach consisted of five steps: 1) identify unmet therapeutic planning information needs via Focus Group Study-I, 2) develop a coding framework of therapeutic planning themes to refine the domain scope to antibiotic therapeutic planning, 3) identify functional requirements of an antibiotic CDS system via Focus Group Study-II, 4) discover informatics solutions and functional requirements from coded data, and 5) determine the types of information needed to support the antibiotic CDS system and link them with the identified informatics solutions and functional requirements. Results: The coding framework for Focus Group Study-I revealed unmet therapeutic planning needs. Twelve subthemes emerged and were clustered into four themes; the analysis indicated a need for an antibiotic CDS intervention. Focus Group Study-II included five types of information needs. Comments from the Barrier/Challenge to information access and Function/Feature themes produced three informatics solutions and 13 functional requirements of an antibiotic CDS system. Comments from the Patient, Institution, and Domain themes generated required data elements for each informatics solution. Conclusion: This study presents one example explicating content analysis of focus group data and the process of deriving functional requirements from narrative data. This 5-step method was used to develop an antibiotic CDS system, resolving unmet antibiotic prescribing needs. As a reusable approach, these techniques can be refined and applied to resolve unmet information needs with informatics interventions in additional domains. PMID:24454586
Initial Results from the Variable Intensity Sonic Boom Database
NASA Technical Reports Server (NTRS)
Haering, Edward A., Jr.; Cliatt, Larry J., II; Gabrielson, Thomas; Sparrow, Victor W.; Locey, Lance L.; Bunce, Thomas J.
2008-01-01
Forty-three sonic booms were generated (a few were evanescent waves), with overpressures of 0.08 to 2.20 lbf/sq ft and rise times of about 0.7 to 50 ms. Objectives: (a) structural response of a house of modern construction; (b) sonic boom propagation code validation. Approach: (a) measure shockwave directionality; (b) determine the effect of height above ground on acoustic level; (c) generate atmospheric turbulence filter functions.
Rotational control of computer generated holograms.
Preece, Daryl; Rubinsztein-Dunlop, Halina
2017-11-15
We develop a basis for three-dimensional rotation of arbitrary light fields created by computer generated holograms. By adding an extra phase function into the kinoform, any light field or holographic image can be tilted in the focal plane with minimized distortion. We present two different approaches to rotate an arbitrary hologram: the Scheimpflug method and a novel coordinate transformation method. Experimental results are presented to demonstrate the validity of both proposed methods.
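The simplest instance of adding an extra phase function to a kinoform is a linear phase ramp, which, by the Fourier shift theorem, translates the reconstructed image in the focal plane; the sketch below demonstrates only this special case, and the tilt phases of the Scheimpflug and coordinate-transformation methods in the paper are not reproduced.

```python
import numpy as np

n, shift_pixels = 256, 40
rng = np.random.default_rng(5)
xx = np.broadcast_to(np.arange(n), (n, n))

# Target image with a random diffuser phase, and its phase-only kinoform
target = np.zeros((n, n)); target[100:156, 100:156] = 1.0
kinoform = np.angle(np.fft.ifft2(target * np.exp(2j * np.pi * rng.random((n, n)))))

def reconstruct(extra_phase):
    """Image-plane intensity of the kinoform with an extra phase function added."""
    return np.abs(np.fft.fft2(np.exp(1j * (kinoform + extra_phase)))) ** 2

# A linear phase ramp translates the reconstruction in the focal plane;
# the column centroid of the reconstructed intensity moves when it is added.
ramp = 2 * np.pi * shift_pixels * xx / n
for label, phase in [("no extra phase", np.zeros((n, n))), ("with ramp", ramp)]:
    profile = reconstruct(phase).sum(axis=0)
    centroid = np.sum(np.arange(n) * profile) / profile.sum()
    print(f"{label}: column centroid ~ {centroid:.1f}")
```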
Generating series for GUE correlators
NASA Astrophysics Data System (ADS)
Dubrovin, Boris; Yang, Di
2017-11-01
We extend to the Toda lattice hierarchy the approach of Bertola et al. (Phys D Nonlinear Phenom 327:30-57, 2016; IMRN, 2016) to computation of logarithmic derivatives of tau-functions in terms of the so-called matrix resolvents of the corresponding difference Lax operator. As a particular application we obtain explicit generating series for connected GUE correlators. On this basis an efficient recursive procedure for computing the correlators in full genera is developed.
Third-generation intelligent IR focal plane arrays
NASA Astrophysics Data System (ADS)
Caulfield, H. John; Jack, Michael D.; Pettijohn, Kevin L.; Schlesselmann, John D.; Norworth, Joe
1998-03-01
SBRC is at the forefront of industry in developing IR focal plane arrays including multi-spectral technology and '3rd generation' functions that mimic the human eye. 3rd generation devices conduct advanced processing on or near the FPA that serves to reduce bandwidth while performing needed functions such as automatic target recognition, uniformity correction and dynamic range enhancement. These devices represent a solution for processing the exorbitantly high bandwidth coming off large area FPAs without sacrificing system sensitivity. SBRC's two-color approach leverages the company's HgCdTe technology to provide simultaneous multiband coverage, from short through long wave IR, with near theoretical performance. IR systems that are sensitive to different spectral bands achieve enhanced capabilities for target identification and advanced discrimination. This paper will provide a summary of the issues, the technology and the benefits of SBRC's third generation smart and two-color FPAs.
Functional identification of spike-processing neural circuits.
Lazar, Aurel A; Slutskiy, Yevgeniy B
2014-02-01
We introduce a novel approach for a complete functional identification of biophysical spike-processing neural circuits. The circuits considered accept multidimensional spike trains as their input and comprise a multitude of temporal receptive fields and conductance-based models of action potential generation. Each temporal receptive field describes the spatiotemporal contribution of all synapses between any two neurons and incorporates the (passive) processing carried out by the dendritic tree. The aggregate dendritic current produced by a multitude of temporal receptive fields is encoded into a sequence of action potentials by a spike generator modeled as a nonlinear dynamical system. Our approach builds on the observation that during any experiment, an entire neural circuit, including its receptive fields and biophysical spike generators, is projected onto the space of stimuli used to identify the circuit. Employing the reproducing kernel Hilbert space (RKHS) of trigonometric polynomials to describe input stimuli, we quantitatively describe the relationship between underlying circuit parameters and their projections. We also derive experimental conditions under which these projections converge to the true parameters. In doing so, we achieve the mathematical tractability needed to characterize the biophysical spike generator and identify the multitude of receptive fields. The algorithms obviate the need to repeat experiments in order to compute the neurons' rate of response, rendering our methodology of interest to both experimental and theoretical neuroscientists.
Lehmann, Jason S.; Matthias, Michael A.; Vinetz, Joseph M.; Fouts, Derrick E.
2014-01-01
Leptospirosis, caused by pathogenic spirochetes belonging to the genus Leptospira, is a zoonosis with important impacts on human and animal health worldwide. Research on the mechanisms of Leptospira pathogenesis has been hindered by the slow growth of infectious strains, poor transformability, and a paucity of genetic tools. As a result of second-generation sequencing technologies, there has been an acceleration of leptospiral genome sequencing efforts in the past decade, which has enabled a concomitant increase in functional genomics analyses of Leptospira pathogenesis. A pathogenomics approach, coupling pan-genomic analysis of multiple isolates with sequencing of experimentally attenuated, highly pathogenic Leptospira, has resulted in the functional inference of virulence factors. The global Leptospira Genome Project, supported by the U.S. National Institute of Allergy and Infectious Diseases and with key scientific contributions from the international leptospirosis research community, has provided a new roadmap for comprehensive studies of Leptospira and leptospirosis well into the future. This review describes functional genomics approaches that apply the data generated by the Leptospira Genome Project toward deepening our knowledge of Leptospira virulence factors using the emerging discipline of pathogenomics. PMID:25437801
The fruits of a functional approach for psychological science.
Stewart, Ian
2016-02-01
The current paper introduces relational frame theory (RFT) as a functional contextual approach to complex human behaviour and examines how this theory has contributed to our understanding of several key phenomena in psychological science. I will first briefly outline the philosophical foundation of RFT and then examine its conceptual basis and core concepts. Thereafter, I provide an overview of the empirical findings and applications that RFT has stimulated in a number of key domains such as language development, linguistic generativity, rule-following, analogical reasoning, intelligence, theory of mind, psychopathology and implicit cognition. © 2015 International Union of Psychological Science.
Fei, Xiang; Zavorka, Megan E; Malik, Guillaume; Connelly, Christopher M; MacDonald, Richard G; Berkowitz, David B
2017-08-18
A generalized strategy is presented for the rapid assembly of a set of bivalent ligands with a variety of linking functionalities from a common monomer. Herein, an array of phosphatase-inert mannose-6-phosphonate-presenting ligands for the cation-independent mannose 6-phosphate receptor (CI-MPR) is constructed. Receptor binding affinity varies with linking functionality: the simple amide and 1,5-triazole (tetrazole) are preferred over the 1,4-triazole. This approach is expected to find application across chemical biology, particularly in glycoscience, wherein multivalency often governs molecular recognition.
Small molecule-induced cellular fate reprogramming: promising road leading to Rome.
Li, Xiang; Xu, Jun; Deng, Hongkui
2018-05-29
Cellular fate reprogramming holds great promise to generate functional cell types for replenishing new cells and restoring functional loss. Inspired by transcription factor-induced reprogramming, the field of cellular reprogramming has greatly advanced and developed into divergent streams of reprogramming approaches. Remarkably, increasing studies have shown the power and advantages of small molecule-based approaches for cellular fate reprogramming, which could overcome the limitations of conventional transgenic-based reprogramming. In this concise review, we discuss these findings and highlight the future potentiality with particular focus on this new trend of chemical reprogramming. Copyright © 2018 Elsevier Ltd. All rights reserved.
Trait-based approaches for understanding microbial biodiversity and ecosystem functioning
Krause, Sascha; Le Roux, Xavier; Niklaus, Pascal A.; Van Bodegom, Peter M.; Lennon, Jay T.; Bertilsson, Stefan; Grossart, Hans-Peter; Philippot, Laurent; Bodelier, Paul L. E.
2014-01-01
In ecology, biodiversity-ecosystem functioning (BEF) research has seen a shift in perspective from taxonomy to function in the last two decades, with successful application of trait-based approaches. This shift offers opportunities for a deeper mechanistic understanding of the role of biodiversity in maintaining multiple ecosystem processes and services. In this paper, we highlight studies that have focused on the BEF of microbial communities, with an emphasis on integrating trait-based approaches into microbial ecology. In doing so, we explore some of the inherent challenges and opportunities of understanding BEF using microbial systems. For example, microbial biologists characterize communities using gene phylogenies that are often unable to resolve functional traits. Additionally, experimental designs of existing microbial BEF studies are often inadequate to unravel BEF relationships. We argue that combining eco-physiological studies with contemporary molecular tools in a trait-based framework can reinforce our ability to link microbial diversity to ecosystem processes. We conclude that such trait-based approaches are a promising framework for increasing the understanding of microbial BEF relationships and thus for generating systematic principles in microbial ecology and, more generally, in ecology. PMID:24904563
Functional Cellular Mimics for the Spatiotemporal Control of Multiple Enzymatic Cascade Reactions.
Liu, Xiaoling; Formanek, Petr; Voit, Brigitte; Appelhans, Dietmar
2017-12-18
Next-generation therapeutic approaches are expected to rely on the engineering of biomimetic cellular systems that can mimic specific cellular functions. Herein, we demonstrate a highly effective route for constructing structural and functional eukaryotic cell mimics by loading pH-sensitive polymersomes as membrane-associated and free-floating organelle mimics inside the multifunctional cell membrane. Metabolism mimicry has been validated by performing successive enzymatic cascade reactions spatially separated at specific sites of cell mimics in the presence and absence of extracellular organelle mimics. These enzymatic reactions take place in a highly controllable, reproducible, efficient, and successive manner. Our biomimetic approach to material design for establishing functional principles brings considerable enrichment to the fields of biomedicine, biocatalysis, biotechnology, and systems biology. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Renewable Chemicals: Dehydroxylation of Glycerol and Polyols
ten Dam, Jeroen; Hanefeld, Ulf
2011-01-01
The production of renewable chemicals has been gaining attention over the past few years. The natural resources from which they can be derived in a sustainable way are most abundant in sugars, cellulose and hemicellulose. These highly functionalized molecules need to be de-functionalized in order to serve as feedstocks for the chemical industry. A fundamentally different approach to chemistry thus becomes necessary, since the traditionally employed oil-based chemicals normally lack functionality. This new chemical toolbox needs to be designed to meet the demands of future generations at a reasonable price. The surplus of functionality in sugars and glycerol consists of alcohol groups. To yield suitable renewable chemicals, these natural products need to be de-functionalized by means of dehydroxylation. Here we review the possible approaches and evaluate them from a fundamental chemical perspective. PMID:21887771
Dhanyalakshmi, K H; Naika, Mahantesha B N; Sajeevan, R S; Mathew, Oommen K; Shafi, K Mohamed; Sowdhamini, Ramanathan; N Nataraja, Karaba
2016-01-01
Modern sequencing technologies are generating large volumes of information at the transcriptome and genome level. Translation of this information into biological meaning lags far behind, with the result that a significant portion of the proteins discovered remain proteins of unknown function (PUFs). Attempts to uncover the functional significance of PUFs are limited by the lack of easy, high-throughput functional annotation tools. Here, we report an approach to assign putative functions to PUFs identified in the transcriptome of mulberry, a perennial tree commonly cultivated as a host of the silkworm. We utilized the mulberry PUFs generated from leaf tissues exposed to drought stress at the whole-plant level. A sequence- and structure-based computational analysis predicted the probable functions of the PUFs. For rapid and easy annotation of PUFs, we developed an automated pipeline integrating diverse bioinformatics tools, designated PUFs Annotation Server (PUFAS), which also provides a web service API (Application Programming Interface) for large-scale analyses up to a genome. Expression analysis of three selected PUFs annotated by the pipeline revealed abiotic stress responsiveness of the genes, and hence their potential role in stress acclimation pathways. The automated pipeline developed here could be extended to assign functions to PUFs from any organism. The PUFAS web server is available at http://caps.ncbs.res.in/pufas/ and the web service is accessible at http://capservices.ncbs.res.in/help/pufas.
Fujita, Yuki; Ishikawa, Junya; Furuta, Hiroyuki; Ikawa, Yoshiya
2010-08-26
In vitro selection with long random RNA libraries has been used as a powerful method to generate novel functional RNAs, although it often requires laborious structural analysis of isolated RNA molecules. Rational RNA design is an attractive alternative to avoid this laborious step, but rational design of catalytic modules is still a challenging task. A hybrid strategy of in vitro selection and rational design has been proposed. With this strategy termed "design and selection," new ribozymes can be generated through installation of catalytic modules onto RNA scaffolds with defined 3D structures. This approach, the concept of which was inspired by the modular architecture of naturally occurring ribozymes, allows prediction of the overall architectures of the resulting ribozymes, and the structural modularity of the resulting ribozymes allows modification of their structures and functions. In this review, we summarize the design, generation, properties, and engineering of four classes of ligase ribozyme generated by design and selection.
A Response Function Approach for Rapid Far-Field Tsunami Forecasting
NASA Astrophysics Data System (ADS)
Tolkova, Elena; Nicolsky, Dmitry; Wang, Dailin
2017-08-01
Predicting tsunami impacts at remote coasts relies largely on en-route tsunami measurements in the open ocean. In this work, these measurements are used to generate instant tsunami predictions in deep water and near the coast. The predictions are generated as a response, or a combination of responses, to one or more tsunameters, with each response obtained as a convolution of real-time tsunameter measurements with a pre-computed pulse response function (PRF). Practical implementation of this method requires tables of PRFs in a 3D parameter space: earthquake location-tsunameter-forecasted site. Examples of hindcasting the 2010 Chilean and the 2011 Tohoku-Oki tsunamis along the US West Coast and beyond demonstrate the high accuracy of the suggested technique in application to trans-Pacific seismically generated tsunamis.
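As a rough illustration of the convolution step described in the abstract above, the following sketch (Python/NumPy) assembles a site forecast in "real time"; the sampling interval, the synthetic tsunameter record, and the pulse response function are all invented for the example and are not taken from the paper.

```python
import numpy as np

# Hypothetical example of the PRF approach: the forecast at a coastal site is the
# convolution of a real-time open-ocean tsunameter record with a pre-computed
# pulse response function (PRF) for that source-region/tsunameter/site combination.
dt = 60.0                                    # sampling interval [s] (assumed)
t = np.arange(0.0, 6 * 3600.0, dt)           # six hours of open-ocean measurements

# Synthetic tsunameter record (a damped wave packet), a stand-in for real data.
eta_tsunameter = 0.3 * np.exp(-t / 7200.0) * np.sin(2 * np.pi * t / 1800.0)

# Hypothetical pre-computed PRF: site response to a unit pulse at the tsunameter.
lag = np.arange(0.0, 3 * 3600.0, dt)
prf = 0.5 * np.exp(-lag / 3600.0) * np.sin(2 * np.pi * lag / 2400.0)

# Discrete approximation of eta_site(t) = integral of eta(tau) * PRF(t - tau) d(tau).
eta_site = np.convolve(eta_tsunameter, prf)[: t.size] * dt

print("predicted peak amplitude at the site: %.3f m" % eta_site.max())
```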
PATtyFams: Protein families for the microbial genomes in the PATRIC database
Davis, James J.; Gerdes, Svetlana; Olsen, Gary J.; ...
2016-02-08
The ability to build accurate protein families is a fundamental operation in bioinformatics that influences comparative analyses, genome annotation, and metabolic modeling. For several years we have been maintaining protein families for all microbial genomes in the PATRIC database (Pathosystems Resource Integration Center, patricbrc.org) in order to drive many of the comparative analysis tools that are available through the PATRIC website. However, due to the burgeoning number of genomes, traditional approaches for generating protein families are becoming prohibitive. In this report, we describe a new approach for generating protein families, which we call PATtyFams. This method uses the k-mer-based function assignments available through RAST (Rapid Annotation using Subsystem Technology) to rapidly guide family formation, and then differentiates the function-based groups into families using a Markov Cluster algorithm (MCL). In conclusion, this new approach for generating protein families is rapid, scalable and has properties that are consistent with alignment-based methods.
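The abstract does not give implementation details, but the Markov Cluster (MCL) step it names is a standard algorithm. The minimal NumPy sketch below (add self-loops, normalize, alternate expansion and inflation, read clusters off the attractors) is included only to illustrate the idea on a toy similarity matrix; it is not the PATtyFams code, and the matrix and parameters are invented.

```python
import numpy as np

def mcl(similarity, expansion=2, inflation=2.0, iters=50, tol=1e-6):
    """Minimal Markov Cluster algorithm on a symmetric similarity matrix."""
    M = similarity.astype(float) + np.eye(len(similarity))   # add self-loops
    M /= M.sum(axis=0, keepdims=True)                        # column-stochastic
    for _ in range(iters):
        prev = M.copy()
        M = np.linalg.matrix_power(M, expansion)             # expansion
        M = M ** inflation                                    # inflation
        M /= M.sum(axis=0, keepdims=True)
        if np.abs(M - prev).max() < tol:
            break
    # Rows with non-negligible mass (attractors) define the clusters.
    clusters = []
    for row in M:
        members = set(np.nonzero(row > 1e-6)[0])
        if members and members not in clusters:
            clusters.append(members)
    return clusters

# Toy similarity matrix for six proteins already grouped by k-mer function assignment.
S = np.array([[0, 5, 4, 0, 0, 0],
              [5, 0, 6, 0, 0, 0],
              [4, 6, 0, 1, 0, 0],
              [0, 0, 1, 0, 7, 5],
              [0, 0, 0, 7, 0, 6],
              [0, 0, 0, 5, 6, 0]], dtype=float)
print(mcl(S))   # expect roughly [{0, 1, 2}, {3, 4, 5}]
```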
Linear-scaling generation of potential energy surfaces using a double incremental expansion
DOE Office of Scientific and Technical Information (OSTI.GOV)
König, Carolin, E-mail: carolink@kth.se; Christiansen, Ove, E-mail: ove@chem.au.dk
We present a combination of the incremental expansion of potential energy surfaces (PESs), known as n-mode expansion, with the incremental evaluation of the electronic energy in a many-body approach. The application of semi-local coordinates in this context allows the generation of PESs in a very cost-efficient way. For this, we employ the recently introduced flexible adaptation of local coordinates of nuclei (FALCON) coordinates. By introducing an additional transformation step, concerning only a fraction of the vibrational degrees of freedom, we can achieve linear scaling of the accumulated cost of the single-point calculations required in the PES generation. Numerical results for oligo-phenyl examples show fast convergence of these double incremental approaches with respect to the maximum number of simultaneously treated fragments and only a modest error introduced by the additional transformation step. The approach presented here represents a major step towards the applicability of vibrational wave function methods to sizable, covalently bound systems.
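For readers unfamiliar with the n-mode expansion referred to above, its low-order terms can be written schematically as below; the bar notation and the truncation after two-mode couplings follow the generic convention and are not quoted from this report.

```latex
V(q_1,\dots,q_M) \;\approx\; V_0 \;+\; \sum_i \bar V_i(q_i) \;+\; \sum_{i<j} \bar V_{ij}(q_i,q_j) \;+\; \cdots,
\qquad
\bar V_{ij}(q_i,q_j) \;=\; V_{ij}(q_i,q_j) \;-\; \bar V_i(q_i) \;-\; \bar V_j(q_j) \;-\; V_0,
```

where V_{ij} denotes the potential with only modes i and j displaced; the double incremental idea additionally expands each electronic single-point energy in a many-body series over fragments.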
Parsing the Role of the Hippocampus in Approach-Avoidance Conflict.
Loh, Eleanor; Kurth-Nelson, Zeb; Berron, David; Dayan, Peter; Duzel, Emrah; Dolan, Ray; Guitart-Masip, Marc
2017-01-01
The hippocampus plays a central role in the approach-avoidance conflict that lies at the heart of the genesis of anxiety. However, its exact functional contribution has yet to be identified. We designed a novel gambling task that generated approach-avoidance conflict while controlling for spatial processing. We fit subjects' behavior using a model that quantified the subjective values of choice options, and recorded neural signals using functional magnetic resonance imaging (fMRI). Distinct functional signals were observed in the anterior hippocampus, with the inferior hippocampus selectively recruited when subjects rejected a gamble, to a degree that covaried with individual differences in anxiety. The superior anterior hippocampus, in contrast, uniquely demonstrated value signals that were potentiated in the context of approach-avoidance conflict. These results implicate the anterior hippocampus in behavioral avoidance and choice monitoring, in a manner relevant to understanding its role in anxiety. Our findings highlight interactions between subregions of the hippocampus as an important focus for future study. © The Author 2016. Published by Oxford University Press.
Optimizing Monitoring Designs under Alternative Objectives
Gastelum, Jason A.; USA, Richland Washington; Porter, Ellen A.; ...
2014-12-31
This paper describes an approach to identify monitoring designs that optimize detection of CO2 leakage from a carbon capture and sequestration (CCS) reservoir and compares the results generated under two alternative objective functions. The first objective function minimizes the expected time to first detection of CO2 leakage, while the second, more conservative objective function minimizes the maximum time to leakage detection across the set of realizations. The approach applies a simulated annealing algorithm that searches the solution space by iteratively mutating the incumbent monitoring design. The approach takes uncertainty into account by evaluating the performance of potential monitoring designs across a set of simulated leakage realizations. The approach relies on a flexible two-tiered signature to infer that CO2 leakage has occurred. This research is part of the National Risk Assessment Partnership, a U.S. Department of Energy (DOE) project tasked with conducting risk and uncertainty analysis in the areas of reservoir performance, natural leakage pathways, wellbore integrity, groundwater protection, monitoring, and systems-level modeling.
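As a generic illustration of the simulated annealing search described above (not the NRAP implementation), the sketch below mutates a candidate set of monitoring locations and accepts or rejects it based on a user-supplied objective; the detection-time table, mutation operator, and cooling schedule are placeholders invented for the example.

```python
import math
import random

def simulated_annealing(initial, objective, mutate, t0=1.0, cooling=0.995,
                        steps=5000, seed=0):
    """Generic simulated annealing: minimize objective(design)."""
    rng = random.Random(seed)
    current = best = initial
    f_cur = f_best = objective(current)
    temp = t0
    for _ in range(steps):
        candidate = mutate(current, rng)
        f_cand = objective(candidate)
        # Always accept improvements; accept worse designs with Boltzmann probability.
        if f_cand < f_cur or rng.random() < math.exp(-(f_cand - f_cur) / max(temp, 1e-12)):
            current, f_cur = candidate, f_cand
            if f_cur < f_best:
                best, f_best = current, f_cur
        temp *= cooling
    return best, f_best

# Toy problem: choose 3 of 10 candidate monitoring locations; "time to first
# detection" is mocked as a random table over 20 simulated leakage realizations.
random.seed(1)
detect_time = [[random.uniform(1.0, 10.0) for _ in range(10)] for _ in range(20)]

def expected_time_to_detection(design):
    # Mean over realizations of the earliest detection by any chosen location.
    return sum(min(row[i] for i in design) for row in detect_time) / len(detect_time)

def mutate(design, rng):
    new = list(design)
    new[rng.randrange(len(new))] = rng.randrange(10)
    return tuple(sorted(set(new))) if len(set(new)) == len(new) else design

best, value = simulated_annealing((0, 1, 2), expected_time_to_detection, mutate)
print(best, round(value, 2))
```

Swapping the objective for the maximum time to detection across realizations reproduces the more conservative criterion mentioned in the abstract.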
MOCASSIN-prot: A multi-objective clustering approach for protein similarity networks
USDA-ARS?s Scientific Manuscript database
Motivation: Proteins often include multiple conserved domains. Various evolutionary events including duplication and loss of domains, domain shuffling, as well as sequence divergence contribute to generating complexities in protein structures, and consequently, in their functions. The evolutionary h...
Designing and Testing Functional RNA Nanoparticles
Recent advances in nanotechnology have generated excitement that nanomaterials may provide novel approaches for the diagnosis and treatment of deadly diseases, such as cancer. However, the use of synthetic materials to generate nanoparticles can present challenges with endotoxin content, sterility, or biocompatibility. Employing biological materials may overcome these issues with RNA being particularly attractive given the clinical applications of RNA interference and the abundance of functional RNAs, including aptamers and ribozymes. RNA can form stable three-dimensional nanoparticle structures that can be decorated with other nucleic acids, small molecules, or proteins, potentially increasing local concentrations of therapeutic agents and acting synergistically when combined.
Generating functionals and Gaussian approximations for interruptible delay reactions
NASA Astrophysics Data System (ADS)
Brett, Tobias; Galla, Tobias
2015-10-01
We develop a generating functional description of the dynamics of non-Markovian individual-based systems in which delay reactions can be terminated before completion. This generalizes previous work in which a path-integral approach was applied to dynamics in which delay reactions complete with certainty. We construct a more widely applicable theory, and from it we derive Gaussian approximations of the dynamics, valid in the limit of large, but finite, population sizes. As an application of our theory we study predator-prey models with delay dynamics due to gestation or lag periods to reach the reproductive age. In particular, we focus on the effects of delay on noise-induced cycles.
NASA Astrophysics Data System (ADS)
Neves, J. C. S.
2017-06-01
In this work, we deform regular black holes that possess a general mass term described by a function generalizing the Bardeen and Hayward mass functions. Using linear constraints on the energy-momentum tensor to generate metrics, we obtain solutions that are either regular or singular. That is, within this approach, it is possible to generate regular or singular black holes from regular or singular black holes. Moreover, contrary to the Bardeen and Hayward regular solutions, the deformed regular black holes may violate the weak energy condition despite the presence of spherical symmetry. Some comments on accretion onto deformed black holes in cosmological scenarios are made.
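For reference, the Bardeen and Hayward mass functions that the general mass term above is said to generalize are commonly written as follows (standard forms in units G = c = 1, with g and l the respective regularization length scales; this notation is conventional and not quoted from the paper):

```latex
m_{\mathrm{Bardeen}}(r) \;=\; \frac{M\,r^{3}}{\left(r^{2}+g^{2}\right)^{3/2}},
\qquad
m_{\mathrm{Hayward}}(r) \;=\; \frac{M\,r^{3}}{r^{3}+2Ml^{2}},
\qquad
f(r) \;=\; 1-\frac{2\,m(r)}{r}.
```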
Ciric, Milica; Moon, Christina D; Leahy, Sinead C; Creevey, Christopher J; Altermann, Eric; Attwood, Graeme T; Rakonjac, Jasna; Gagic, Dragana
2014-05-12
In silico, secretome proteins can be predicted from completely sequenced genomes using various available algorithms that identify membrane-targeting sequences. For the metasecretome (the collection of surface, secreted and transmembrane proteins of environmental microbial communities) this approach is impractical, considering that metasecretome open reading frames (ORFs) comprise only 10% to 30% of the total metagenome and are poorly represented in the dataset due to the overall low coverage of the metagenomic gene pool, even in large-scale projects. By combining secretome-selective phage display and next-generation sequencing, we focused the sequence analysis of a complex rumen microbial community on the metasecretome component of the metagenome. This approach achieved high enrichment (29-fold) of secreted fibrolytic enzymes from the plant-adherent microbial community of the bovine rumen. In particular, we identified hundreds of heretofore rare modules belonging to cellulosomes, cell-surface complexes specialised for recognition and degradation of plant fibre. As a method, metasecretome phage display combined with next-generation sequencing has the power to sample the diversity of low-abundance surface and secreted proteins that would otherwise require exceptionally large metagenomic sequencing projects. As a resource, the metasecretome display library, backed by the dataset obtained by next-generation sequencing, is ready for (i) affinity selection by standard phage display methodology and (ii) easy purification of displayed proteins as part of the virion for individual functional analysis.
Min, Chang; Sanchawala, Abbas; Seidel, Daniel
2014-05-16
Iminium ions generated in situ via copper(I) bromide catalyzed oxidation of N-aryl amines readily undergo [4 + 2] cycloadditions with a range of dienophiles. This method involves the functionalization of both a C(sp³)–H and a C(sp²)–H bond and enables the rapid construction of polycyclic amines under relatively mild conditions.
Staid, Andrea; Watson, Jean -Paul; Wets, Roger J. -B.; ...
2017-07-11
Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
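A rough sketch of the scenario-generation idea (not the Prescient implementation): fit an error density to historical forecast errors and sample error trajectories around a new point forecast. Here a Gaussian kernel density estimate stands in for the epi-spline basis-function fit described in the abstract, and all data are synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic history: point forecasts and corresponding observed wind power [MW].
forecast_hist = rng.uniform(50.0, 450.0, size=1000)
observed_hist = np.clip(forecast_hist + rng.gumbel(-20.0, 35.0, size=1000), 0.0, 500.0)
errors = observed_hist - forecast_hist          # skewed, non-parametric errors

# Non-parametric error density; a KDE stands in for the epi-spline fit.
error_density = gaussian_kde(errors)

def generate_scenarios(point_forecast, n_scenarios=10):
    """Sample equally weighted wind-power scenarios around a point forecast."""
    horizon = len(point_forecast)
    scenarios = []
    for _ in range(n_scenarios):
        sampled_errors = error_density.resample(horizon)[0]
        scenarios.append(np.clip(point_forecast + sampled_errors, 0.0, 500.0))
    return np.array(scenarios), np.full(n_scenarios, 1.0 / n_scenarios)

day_ahead = 200.0 + 100.0 * np.sin(np.linspace(0.0, np.pi, 24))   # synthetic forecast
scenarios, probabilities = generate_scenarios(day_ahead)
print(scenarios.shape, probabilities.sum())
```

Sampling errors independently at each hour ignores the temporal correlation that a production-grade method would preserve; it is kept here only to show the density-to-scenario step.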
CRISPR-Cas9 and CRISPR-Cpf1 mediated targeting of a stomatal developmental gene EPFL9 in rice.
Yin, Xiaojia; Biswal, Akshaya K; Dionora, Jacqueline; Perdigon, Kristel M; Balahadia, Christian P; Mazumdar, Shamik; Chater, Caspar; Lin, Hsiang-Chun; Coe, Robert A; Kretzschmar, Tobias; Gray, Julie E; Quick, Paul W; Bandyopadhyay, Anindya
2017-05-01
The CRISPR-Cas9/Cpf1 system, with its unique gene-targeting efficiency, could be an important tool for the functional study of early developmental genes through the generation of successful knockout plants. The introduction and utilization of systems biology approaches have identified several genes that are involved in the early development of a plant, and with such knowledge a robust tool is required for the functional validation of the putative candidate genes thus obtained. The development of the CRISPR-Cas9/Cpf1 genome editing system has provided a convenient tool for creating loss-of-function mutants for genes of interest. The present study utilized CRISPR-Cas9 and CRISPR-Cpf1 technology to knock out an early developmental gene, the EPFL9 (Epidermal Patterning Factor like-9, a positive regulator of stomatal development in Arabidopsis) orthologue, in rice. Germ-line mutants that were generated showed edits that were carried forward into the T2 generation, when Cas9-free homozygous mutants were obtained. The homozygous mutant plants showed more than an eightfold reduction in stomatal density on the abaxial leaf surface of the edited rice plants. Potential off-target analysis showed no significant off-target effects. This study also utilized CRISPR-LbCpf1 (Lachnospiraceae bacterium Cpf1) to target the same OsEPFL9 gene to test the activity of this class-2 CRISPR system in rice and found that Cpf1 is also capable of genome editing and that edits are transmitted through generations with phenotypic changes similar to those seen with CRISPR-Cas9. This study demonstrates the application of CRISPR-Cas9/Cpf1 to precisely target genomic locations and develop transgene-free, homozygous, heritable gene edits, and confirms that loss-of-function analysis of candidate genes emerging from different systems biology based approaches can be performed; this system therefore adds value in the validation of gene function studies.
Co-evolution for Problem Simplification
NASA Technical Reports Server (NTRS)
Haith, Gary L.; Lohn, Jason D.; Cplombano, Silvano P.; Stassinopoulos, Dimitris
1999-01-01
This paper explores a co-evolutionary approach applicable to difficult problems with limited failure/success performance feedback. Like familiar "predator-prey" frameworks, this algorithm evolves two populations of individuals: the solutions (predators) and the problems (prey). The approach extends previous work by rewarding only the problems that match their difficulty to the level of solution competence. In complex problem domains with limited feedback, this "tractability constraint" helps provide an adaptive fitness gradient that effectively differentiates the candidate solutions. The algorithm generates selective pressure toward the evolution of increasingly competent solutions by rewarding solution generality and uniqueness and problem tractability and difficulty. Relative (inverse-fitness) and absolute (static objective function) approaches to evaluating problem difficulty are explored and discussed. On a simple control task, this co-evolutionary algorithm was found to have significant advantages over a genetic algorithm with either a static fitness function or a fitness function that changes on a hand-tuned schedule.
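A minimal toy version of such a co-evolutionary loop is sketched below in Python; the bit-string representation, the "solves" criterion, and the fitness weights are all invented for illustration and are not the authors' implementation.

```python
import random

random.seed(0)
L = 20                                   # genome length for the toy problem

def solves(solution, problem):
    # A solution "solves" a problem if it matches it on at least 80% of bits.
    return sum(s == p for s, p in zip(solution, problem)) >= 0.8 * L

def mutate(genome, rate=0.05):
    return tuple(b ^ 1 if random.random() < rate else b for b in genome)

def evolve(population, fitnesses, elite=5):
    # Keep the fittest individuals and refill the population with their mutants.
    ranked = [g for _, g in sorted(zip(fitnesses, population), key=lambda x: -x[0])]
    survivors = ranked[:elite]
    return survivors + [mutate(random.choice(survivors)) for _ in range(len(population) - elite)]

solutions = [tuple(random.randint(0, 1) for _ in range(L)) for _ in range(30)]
problems = [tuple(random.randint(0, 1) for _ in range(L)) for _ in range(30)]

for generation in range(50):
    # Solution fitness: generality, i.e. how many problems each solution solves.
    sol_fit = [sum(solves(s, p) for p in problems) for s in solutions]
    # Problem fitness: the "tractability constraint": reward problems solved by
    # some but not all solutions, so difficulty tracks solver competence.
    prob_fit = []
    for p in problems:
        k = sum(solves(s, p) for s in solutions)
        prob_fit.append(0 if k == 0 or k == len(solutions) else len(solutions) - k)
    solutions = evolve(solutions, sol_fit)
    problems = evolve(problems, prob_fit)

print("best solution generality:", max(sum(solves(s, p) for p in problems) for s in solutions))
```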
Complementary DNA libraries: an overview.
Ying, Shao-Yao
2004-07-01
The generation of complete, full-length cDNA libraries for potential functional assays of specific gene sequences is essential for much of biotechnology and biomedical research. The field of cDNA library generation has changed rapidly in the past 10 years. This review presents an overview of the methods available for generating cDNA libraries, including the definition of a cDNA library, the different kinds of cDNA libraries, the differences between conventional approaches and a novel strategy for cDNA library generation, and the quality of cDNA libraries. It is anticipated that the high-quality cDNA libraries so generated will facilitate studies involving gene chips and microarrays, differential display, subtractive hybridization, gene cloning, and peptide library generation.
Nonlinear optical interactions in silicon waveguides
NASA Astrophysics Data System (ADS)
Kuyken, B.; Leo, F.; Clemmen, S.; Dave, U.; Van Laer, R.; Ideguchi, T.; Zhao, H.; Liu, X.; Safioui, J.; Coen, S.; Gorza, S. P.; Selvaraja, S. K.; Massar, S.; Osgood, R. M.; Verheyen, P.; Van Campenhout, J.; Baets, R.; Green, W. M. J.; Roelkens, G.
2017-03-01
The strong nonlinear response of silicon photonic nanowire waveguides allows for the integration of nonlinear optical functions on a chip. However, the detrimental nonlinear optical absorption in silicon at telecom wavelengths limits the efficiency of many such experiments. In this review, several approaches are proposed and demonstrated to overcome this fundamental issue. By using the proposed methods, we demonstrate amongst others supercontinuum generation, frequency comb generation, a parametric optical amplifier, and a parametric optical oscillator.
Active cell mechanics: Measurement and theory.
Ahmed, Wylie W; Fodor, Étienne; Betz, Timo
2015-11-01
Living cells are active mechanical systems that are able to generate forces. Their structure and shape are primarily determined by biopolymer filaments and molecular motors that form the cytoskeleton. Active force generation requires constant consumption of energy to maintain the nonequilibrium activity to drive organization and transport processes necessary for their function. To understand this activity it is necessary to develop new approaches to probe the underlying physical processes. Active cell mechanics incorporates active molecular-scale force generation into the traditional framework of mechanics of materials. This review highlights recent experimental and theoretical developments towards understanding active cell mechanics. We focus primarily on intracellular mechanical measurements and theoretical advances utilizing the Langevin framework. These developing approaches allow a quantitative understanding of nonequilibrium mechanical activity in living cells. This article is part of a Special Issue entitled: Mechanobiology. Copyright © 2015. Published by Elsevier B.V.
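A common minimal form of the Langevin description alluded to above models an intracellular probe in an effective harmonic trap driven by both thermal noise and an active force; this generic form is given only for orientation and is not a formula from the review.

```latex
\gamma\,\dot{x}(t) \;=\; -\kappa\,x(t) \;+\; \xi(t) \;+\; f_{\mathrm{a}}(t),
\qquad
\langle \xi(t)\,\xi(t') \rangle \;=\; 2\,\gamma\,k_{B}T\,\delta(t-t'),
```

where f_a(t) is a nonequilibrium active force with its own amplitude and correlation time; the departure of the measured fluctuations from the equilibrium fluctuation-dissipation prediction quantifies the activity.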
Kaneko-Ishino, Tomoko; Ishino, Fumitoshi
2015-01-01
Mammals, including human beings, have evolved a unique viviparous reproductive system and a highly developed central nervous system. How did these unique characteristics emerge in mammalian evolution, and what kinds of changes occurred in mammalian genomes as evolution proceeded? A key conceptual term in approaching these issues is "mammalian-specific genomic functions", a concept covering both mammalian-specific epigenetics and genetics. Genomic imprinting and LTR retrotransposon-derived genes are reviewed as representative mammalian-specific genomic functions that are essential not only for the current mammalian developmental system, but also for mammalian evolution itself. First, the essential roles of genomic imprinting in mammalian development, especially in relation to viviparous reproduction via placental function, as well as the emergence of genomic imprinting in mammalian evolution, are discussed. Second, we introduce the novel concept of "mammalian-specific traits generated by mammalian-specific genes from LTR retrotransposons", based on the finding that LTR retrotransposons served as a critical driving force in mammalian evolution by generating mammalian-specific genes.
Murphy, Andrew J; Macdonald, Lynn E; Stevens, Sean; Karow, Margaret; Dore, Anthony T; Pobursky, Kevin; Huang, Tammy T; Poueymirou, William T; Esau, Lakeisha; Meola, Melissa; Mikulka, Warren; Krueger, Pamela; Fairhurst, Jeanette; Valenzuela, David M; Papadopoulos, Nicholas; Yancopoulos, George D
2014-04-08
Mice genetically engineered to be humanized for their Ig genes allow for human antibody responses within a mouse background (HumAb mice), providing a valuable platform for the generation of fully human therapeutic antibodies. Unfortunately, existing HumAb mice do not have fully functional immune systems, perhaps because of the manner in which their genetic humanization was carried out. Heretofore, HumAb mice have been generated by disrupting the endogenous mouse Ig genes and simultaneously introducing human Ig transgenes at a different and random location; KO-plus-transgenic humanization. As we describe in the companion paper, we attempted to make mice that more efficiently use human variable region segments in their humoral responses by precisely replacing 6 Mb of mouse Ig heavy and kappa light variable region germ-line gene segments with their human counterparts while leaving the mouse constant regions intact, using a unique in situ humanization approach. We reasoned the introduced human variable region gene segments would function indistinguishably in their new genetic location, whereas the retained mouse constant regions would allow for optimal interactions and selection of the resulting antibodies within the mouse environment. We show that these mice, termed VelocImmune mice because they were generated using VelociGene technology, efficiently produce human:mouse hybrid antibodies (that are rapidly convertible to fully human antibodies) and have fully functional humoral immune systems indistinguishable from those of WT mice. The efficiency of the VelocImmune approach is confirmed by the rapid progression of 10 different fully human antibodies into human clinical trials.
Nonequilibrium Tuning of the Thermal Casimir Effect.
Dean, David S; Lu, Bing-Sui; Maggs, A C; Podgornik, Rudolf
2016-06-17
In net-neutral systems, correlations between charge fluctuations generate strong attractive thermal Casimir forces, and engineering these forces to optimize nanodevice performance is an important challenge. We show how the normal and lateral thermal Casimir forces between two plates containing Brownian charges can be modulated by decorrelating the system through the application of an electric field, which generates a nonequilibrium steady state with a constant current in one or both plates, reducing the ensuing fluctuation-generated normal force while at the same time generating a lateral drag force. This hypothesis is confirmed by detailed numerical simulations as well as by an analytical approach based on stochastic density functional theory.
NASA Astrophysics Data System (ADS)
Kang, Angray S.; Barbas, Carlos F.; Janda, Kim D.; Benkovic, Stephen J.; Lerner, Richard A.
1991-05-01
We describe a method based on a phagemid vector with helper phage rescue for the construction and rapid analysis of combinatorial antibody Fab libraries. This approach should allow the generation and selection of many monoclonal antibodies. Antibody genes are expressed in concert with phage morphogenesis, thereby allowing incorporation of functional Fab molecules along the surface of filamentous phage. The power of the method depends upon the linkage of recognition and replication functions and is not limited to antibody molecules.
An investigation into the probabilistic combination of quasi-static and random accelerations
NASA Technical Reports Server (NTRS)
Schock, R. W.; Tuell, L. P.
1984-01-01
The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to simultaneous vehicle dynamic (quasi-static) vibration and acoustically generated random vibration, requires the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the randomly generated responses separately, and to arithmetically add or root-sum-square them to get combined accelerations. Since the combination of a probabilistic and a deterministic function yields a probabilistic function, a viable alternative approach is to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The following paper develops this mechanism and provides graphical data for selecting combined accelerations at the most popular percentile levels.
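A minimal numerical illustration of the combination idea (assuming, for the sketch only, a deterministic quasi-static level and a zero-mean Gaussian random component, with the values below invented): the combined-acceleration PDF is then the random PDF shifted by the quasi-static value, and the design load is read off at a chosen percentile.

```python
from scipy.stats import norm

a_qs = 5.0        # quasi-static acceleration [g], deterministic (assumed value)
sigma_rand = 3.0  # RMS of the acoustically generated random response [g] (assumed)

# Combined acceleration = deterministic shift + zero-mean Gaussian random part,
# so its PDF is N(a_qs, sigma_rand**2).  Read off a chosen percentile, e.g. 97.7%.
percentile = 0.977
a_combined = norm.ppf(percentile, loc=a_qs, scale=sigma_rand)

# Compare with the two traditional combinations mentioned in the abstract.
a_arithmetic = a_qs + 3.0 * sigma_rand             # arithmetic add of a 3-sigma random level
a_rss = (a_qs**2 + (3.0 * sigma_rand)**2) ** 0.5   # root-sum-square combination

print(f"percentile-based: {a_combined:.2f} g, arithmetic: {a_arithmetic:.2f} g, RSS: {a_rss:.2f} g")
```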
Application of Particle Swarm Optimization in Computer Aided Setup Planning
NASA Astrophysics Data System (ADS)
Kafashi, Sajad; Shakeri, Mohsen; Abedini, Vahid
2011-01-01
Recent research aims to integrate computer aided design (CAD) and computer aided manufacturing (CAM) environments. The role of process planning is to convert the design specification into manufacturing instructions. Setup planning has a basic role in computer aided process planning (CAPP) and significantly affects the overall cost and quality of the machined part. This research focuses on the automatic generation of setups and on finding the best setup plan under feasible conditions. In order to computerize the setup planning process, three major steps are performed in the proposed system: (a) extraction of the machining data of the part; (b) analysis and generation of all possible setups; (c) optimization to reach the best setup plan based on cost functions. Considering workshop resources such as machine tools, cutters and fixtures, all feasible setups can be generated. Technological constraints such as TAD (tool approach direction), tolerance relationships and feature precedence relationships are then incorporated to keep the approach realistic and practical. The optimal setup plan is obtained by applying the PSO (particle swarm optimization) algorithm to the system using cost functions. A real sample part is illustrated to demonstrate the performance and productivity of the system.
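The cost model in the paper is specific to setup planning, but the PSO step itself is generic; the sketch below shows a minimal particle swarm optimizer minimizing a placeholder cost function over a continuous box, with all parameters chosen arbitrarily for illustration.

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimization minimizing cost(x) over a box."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))           # particle positions
    v = np.zeros_like(x)                                        # particle velocities
    pbest = x.copy()                                            # personal bests
    pbest_val = np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()                    # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# Placeholder cost standing in for a setup-plan cost model (machining, fixturing, ...).
best, value = pso(lambda p: float(np.sum((p - 1.2) ** 2)), dim=4)
print(best.round(3), round(value, 6))
```

In the setup-planning context each particle would encode a candidate assignment of features to setups, which requires a discrete encoding and a repair step on top of this continuous core.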
NASA Astrophysics Data System (ADS)
Guarnieri, Vittorio; Francini, Franco
1997-12-01
The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactic environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce holograms of more complex objects. Many examples of generated CGHs are presented.
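A compact illustration of the ray-tracing idea (object points emitting spherical waves, interfered with an off-axis plane reference wave and binarized for a 1-bit transparency) is sketched below; the wavelength, pixel pitch, object points, and reference angle are assumptions for the example, not values from the paper.

```python
import numpy as np

# Hologram-plane sampling (values chosen only for illustration).
wavelength = 633e-9                      # HeNe wavelength [m] (assumed)
k = 2.0 * np.pi / wavelength
N, pitch = 1024, 10e-6                   # grid size and printer-limited pixel pitch (assumed)
xs = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(xs, xs)

# Object: a few point sources emitting spherical waves (the ray-tracing view).
points = [(0.0, 0.0, 0.05), (1e-3, 0.5e-3, 0.05)]           # (x, y, z) in metres

field = np.zeros((N, N), dtype=complex)
for (x0, y0, z0) in points:
    r = np.sqrt((X - x0) ** 2 + (Y - y0) ** 2 + z0 ** 2)
    field += np.exp(1j * k * r) / r                           # spherical wave term
field /= np.abs(field).max()             # normalize so object and reference interfere visibly

# Off-axis plane reference wave; binarize the interference pattern.
reference = np.exp(1j * k * np.sin(np.deg2rad(1.0)) * X)
intensity = np.abs(field + reference) ** 2
binary_cgh = (intensity > np.median(intensity)).astype(np.uint8)   # 1-bit transparency

print(binary_cgh.shape, float(binary_cgh.mean()))   # roughly half the pixels transparent
```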
NASA Astrophysics Data System (ADS)
Schildgen, T. F.; Robinson, R. A. J.; Savi, S.; Bookhagen, B.; Tofelde, S.; Strecker, M. R.
2014-12-01
Numerical modelling informs risk assessment of tsunamis generated by submarine slides; however, for large-scale slides the modelling can be complex and computationally challenging. Many previous numerical studies have approximated slides as rigid blocks moving according to prescribed motion. However, wave characteristics are strongly dependent on the motion of the slide, and previous work has recommended that a more accurate representation of slide dynamics is needed. We have used the finite-element, adaptive-mesh CFD model Fluidity to perform multi-material simulations of deformable submarine slide-generated waves at real-world scales for a 2D scenario in the Gulf of Mexico. Our high-resolution approach represents slide dynamics with good accuracy compared to other numerical simulations of this scenario, but precludes tracking of wave propagation over large distances. To enable efficient modelling of further propagation of the waves, we investigate an approach that extracts information about the slide evolution from our multi-material simulations in order to drive a single-layer wave propagation model, also using Fluidity, which is much less computationally expensive. The extracted submarine slide geometry and position as functions of time are parameterised using simple polynomial functions. The polynomial functions are used to inform a prescribed velocity boundary condition in a single-layer simulation, mimicking the effect the submarine slide motion has on the water column. The approach is verified by successful comparison of wave generation in the single-layer model with that recorded in the multi-material, multi-layer simulations. We then extend this approach to 3D for further validation of the methodology (using the Gulf of Mexico scenario proposed by Horrillo et al., 2013) and to consider the effect of lateral spreading. This methodology is then used to simulate a series of hypothetical submarine slide events in the Arctic Ocean (based on evidence of historic slides) and to examine the hazard posed to the UK coast.
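The polynomial parameterization step described above can be illustrated in a few lines of NumPy; the slide-front positions below are synthetic stand-ins for values extracted from a multi-material run, and the polynomial degree is an arbitrary choice for the example.

```python
import numpy as np

# Synthetic slide-front positions "extracted" from a multi-material simulation.
t = np.linspace(0.0, 300.0, 31)                       # time [s]
position = 0.5 * 0.02 * t**2 / (1.0 + t / 200.0)      # decelerating run-out [m] (invented)

# Parameterize position(t) with a low-order polynomial, as described in the abstract.
coefficients = np.polyfit(t, position, deg=3)
position_poly = np.poly1d(coefficients)
velocity_poly = position_poly.deriv()                 # drives the prescribed-velocity condition

def slide_velocity(time_s):
    """Prescribed-velocity boundary condition for the single-layer wave model."""
    return float(velocity_poly(time_s))

print("fitted coefficients:", np.round(coefficients, 8))
print("slide velocity at t = 150 s: %.3f m/s" % slide_velocity(150.0))
```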
NASA Astrophysics Data System (ADS)
Restrepo, R. L.; Kasapoglu, E.; Sakiroglu, S.; Ungan, F.; Morales, A. L.; Duque, C. A.
2017-09-01
The effects of electric and magnetic fields on the second and third harmonic generation coefficients in a Morse potential quantum well are theoretically studied. The energy levels and corresponding wave functions are obtained by solving the Schrödinger equation for the electron within the parabolic band scheme, the effective mass approximation, and the envelope function approach. The results show that both the electric and the magnetic field have a significant influence on the magnitudes and resonant peak energy positions of the second and third harmonic generation responses. In general, the Morse potential profile becomes wider and shallower as the γ-parameter increases, and so the energies of the bound states are functions of this parameter. Therefore, we can conclude that the effects of electric and magnetic fields can be used to tune and control the optical properties of interest in the infrared range of the electromagnetic spectrum.
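For orientation, the Morse confinement profile referred to above is usually written in a form similar to the one below; the exact parameterization of the well (and how the γ-parameter enters it) is specific to the paper and is not reproduced here.

```latex
V(z) \;=\; D\left(1 - e^{-\beta\,(z - z_0)}\right)^{2},
```

where D sets the well depth and β the steepness of the confining walls; the harmonic generation coefficients then follow from the dipole matrix elements between the bound states of this asymmetric well.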
Moin, Mazahar; Bakshi, Achala; Saha, Anusree; Dutta, Mouboni; Kirti, P B
2017-07-01
The ultimate goal of any genome research is to identify all the existing genes in a genome and investigate their roles. Various techniques have been applied to unveil gene functions, either by silencing or by over-expressing genes through targeted expression or random mutagenesis. Rice is the most appropriate model crop for generating a mutant resource for functional genomic studies because of the availability of a high-quality genome sequence and a relatively small genome size. Rice also has syntenic relationships with members of other cereals. Hence, characterization of functionally unknown genes in rice will likely provide key genetic insights and can lead to comparative genomics involving other cereals. The current review discusses the available gain-of-function mutagenesis techniques for functional genomics, emphasizing the contemporary approach of activation tagging and its adaptations for enhancing the yield and productivity of rice. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Quantitative myocardial perfusion from static cardiac and dynamic arterial CT
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Branch, Kelley R.; Alessio, Adam M.
2018-05-01
Quantitative myocardial blood flow (MBF) estimation by dynamic contrast-enhanced cardiac computed tomography (CT) requires multi-frame acquisition of contrast transit through the blood pool and myocardium to inform the arterial input and tissue response functions. Both the input and the tissue response functions for the entire myocardium are sampled with each acquisition. However, the long breath holds and frequent sampling can result in significant motion artifacts and relatively high radiation dose. To address these limitations, we propose and evaluate a new static cardiac and dynamic arterial (SCDA) quantitative MBF approach in which (1) the input function is well sampled, using either a prediction from pre-scan timing-bolus data or measurements from dynamic thin-slice 'bolus tracking' acquisitions, and (2) the whole-heart tissue response data are limited to one contrast-enhanced CT acquisition. A perfusion model uses the dynamic arterial input function to generate a family of possible myocardial contrast enhancement curves corresponding to a range of MBF values. Combined with the timing of the single whole-heart acquisition, these curves generate a lookup table relating myocardial contrast enhancement to quantitative MBF. We tested the SCDA approach in 28 patients who underwent a full dynamic CT protocol under both rest and vasodilator stress conditions. Using the measured input function plus a single (enhanced CT only) or a double (enhanced plus contrast-free baseline CT) myocardial acquisition yielded MBF estimates with root mean square (RMS) errors of 1.2 ml/min/g and 0.35 ml/min/g, and radiation dose reductions of 90% and 83%, respectively. The prediction of the input function based on timing-bolus data and the static acquisition had an RMS error of 26.0% compared to the measured input function, which led to MBF estimation errors more than three times higher than those obtained with the measured input function. SCDA presents a new, simplified approach to quantitative perfusion imaging with an acquisition strategy offering substantial radiation dose and computational complexity savings over dynamic CT.
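A schematic of the lookup-table construction (not the authors' perfusion model): convolve the arterial input function with a simple retention model for a grid of MBF values, record the predicted enhancement at the static acquisition time, and invert by interpolation. The AIF shape, acquisition time, and one-compartment model below are assumptions made for the sketch.

```python
import numpy as np

# Time base and a synthetic arterial input function (AIF), e.g. a gamma-variate shape.
dt = 0.5                                              # [s]
t = np.arange(0.0, 60.0, dt)
aif = 400.0 * (t / 8.0) ** 3 * np.exp(-t / 4.0)       # enhancement [HU], invented shape

def tissue_curve(mbf_ml_min_g, t, aif, tissue_density=1.05):
    """One-compartment model: tissue enhancement = flow * (AIF convolved with retention)."""
    flow = mbf_ml_min_g * tissue_density / 60.0       # convert to [mL/s per mL tissue]
    residue = np.ones_like(t)                         # ideal retention (no washout), a simplification
    return flow * np.convolve(aif, residue)[: t.size] * dt

# Lookup table: myocardial enhancement at the single whole-heart acquisition -> MBF.
t_static = 30.0                                       # static acquisition time [s] (assumed)
idx = int(t_static / dt)
mbf_grid = np.linspace(0.2, 5.0, 200)                 # [mL/min/g]
enhancement_grid = np.array([tissue_curve(m, t, aif)[idx] for m in mbf_grid])

def mbf_from_enhancement(hu):
    """Invert the monotonic enhancement-vs-MBF relation by interpolation."""
    return float(np.interp(hu, enhancement_grid, mbf_grid))

print("enhancement of 60 HU ->", round(mbf_from_enhancement(60.0), 2), "mL/min/g")
```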
A trait-based approach for examining microbial community assembly
NASA Astrophysics Data System (ADS)
Prest, T. L.; Nemergut, D.
2015-12-01
Microorganisms regulate all of Earth's major biogeochemical cycles, and an understanding of how microbial communities assemble is a key part of evaluating controls over many types of ecosystem processes. Rapid advances in technology and bioinformatics have led to a better appreciation of the variation in microbial community structure in time and space. Yet advances in theory are necessary to make sense of these data and allow us to generate unifying hypotheses about the causes and consequences of patterns in microbial biodiversity and what they mean for ecosystem function. Here, I will present a meta-analysis of microbial community assembly from a variety of successional and post-disturbance systems. Our analysis shows various distinct patterns in community assembly, and the potential importance of nutrients and dispersal in shaping microbial community beta diversity in these systems. We also used a trait-based approach to generate hypotheses about the mechanisms driving patterns of microbial community assembly and the implications for function. Our work reveals the importance of rRNA operon copy number as a community-aggregated trait in helping to reconcile differences in community dynamics between distinct types of successional and disturbed systems. Specifically, our results demonstrate that decreases in average copy number can be a common feature of communities across various drivers of ecological succession, supporting a transition from an r-selected to a K-selected community. Importantly, our work supports the scaling of the copy number trait over multiple levels of biological organization, from cells to populations and communities, and has implications for both ecology and evolution. Trait-based approaches are an important next step to generate and test hypotheses about the forces structuring microbial communities and the subsequent consequences for ecosystem function.
Reduced Specificity of Personal Goals and Explanations for Goal Attainment in Major Depression
Dickson, Joanne M.; Moberly, Nicholas J.
2013-01-01
Objectives Overgeneralization has been investigated across many domains of cognitive functioning in major depression, including the imagination of future events. However, it is unknown whether this phenomenon extends to representations of personal goals, which are important in structuring long-term behaviour and providing meaning in life. Furthermore, it is not clear whether depressed individuals provide less specific explanations for and against goal attainment. Method Clinically depressed individuals and controls generated personally important approach and avoidance goals, and then generated explanations why they would and would not achieve these goals. Goals and causal explanations were subsequently coded as either specific or general. Results Compared to controls, depressed individuals did not generate significantly fewer goals or causal explanations for or against goal attainment. However, compared to controls, depressed individuals generated less specific goals, less specific explanations for approach (but not avoidance) goal attainment, and less specific explanations for goal nonattainment. Significance Our results suggest that motivational deficits in depression may stem partly from a reduction in the specificity of personal goal representations and related cognitions that support goal-directed behaviour. Importantly, the findings have the potential to inform the ongoing development of psychotherapeutic approaches in the treatment of depression. PMID:23691238
Engineered kinesin motor proteins amenable to small-molecule inhibition
Engelke, Martin F.; Winding, Michael; Yue, Yang; Shastry, Shankar; Teloni, Federico; Reddy, Sanjay; Blasius, T. Lynne; Soppina, Pushpanjali; Hancock, William O.; Gelfand, Vladimir I.; Verhey, Kristen J.
2016-01-01
The human genome encodes 45 kinesin motor proteins that drive cell division, cell motility, intracellular trafficking and ciliary function. Determining the cellular function of each kinesin would benefit from specific small-molecule inhibitors. However, screens have yielded only a few specific inhibitors. Here we present a novel chemical-genetic approach to engineer kinesin motors that can carry out the function of the wild-type motor yet can also be efficiently inhibited by small, cell-permeable molecules. Using kinesin-1 as a prototype, we develop two independent strategies to generate inhibitable motors, and characterize the resulting inhibition in single-molecule assays and in cells. We further apply these two strategies to create analogously inhibitable kinesin-3 motors. These inhibitable motors will be of great utility to study the functions of specific kinesins in a dynamic manner in cells and animals. Furthermore, these strategies can be used to generate inhibitable versions of any motor protein of interest. PMID:27045608
Sablok, Gaurav; Pérez-Pulido, Antonio J.; Do, Thac; Seong, Tan Y.; Casimiro-Soriguer, Carlos S.; La Porta, Nicola; Ralph, Peter J.; Squartini, Andrea; Muñoz-Merida, Antonio; Harikrishna, Jennifer A.
2016-01-01
Analysis of repetitive DNA sequence content and divergence among repetitive functional classes is a well-accepted approach for estimating inter- and intra-generic differences in plant genomes. Among these elements, microsatellites, or Simple Sequence Repeats (SSRs), have been widely demonstrated to be powerful genetic markers for species and variety discrimination. We present the PlantFuncSSRs platform, covering more than 364 plant species and more than 2 million functional SSRs. These are provided with detailed annotations for easy functional browsing of SSRs and with information on primer pairs and associated functional domains. PlantFuncSSRs can be leveraged to identify function-based genic variability among species of interest, which might be of particular interest for developing functional markers in plants. This comprehensive on-line portal unifies mining of SSRs from first- and next-generation sequencing datasets, corresponding primer pairs, and associated in-depth functional annotation such as gene ontology annotation, gene interactions and identification against reference protein databases. PlantFuncSSRs is freely accessible at: http://www.bioinfocabd.upo.es/plantssr. PMID:27446111
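The sketch below shows the kind of naive perfect-SSR scan that underlies such resources, using a regular expression over motif lengths 1-6 bp; the repeat-count thresholds and the example sequence are arbitrary, and this is an illustration rather than the PlantFuncSSRs mining pipeline.

```python
import re

# Minimum repeat counts per motif length (arbitrary thresholds for the example).
MIN_REPEATS = {1: 10, 2: 6, 3: 5, 4: 5, 5: 5, 6: 5}

def find_ssrs(sequence):
    """Naively scan a DNA sequence for perfect SSRs with motif lengths 1-6 bp."""
    sequence = sequence.upper()
    hits = []
    for motif_len, min_rep in MIN_REPEATS.items():
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_rep - 1))
        for match in pattern.finditer(sequence):
            hits.append({"motif": match.group(2),
                         "start": match.start() + 1,      # 1-based coordinates
                         "end": match.end(),
                         "length": len(match.group(1))})
    return hits

example = "ATGCTTAGAGAGAGAGAGAGCCGTTTTTTTTTTTTACGATCATCATCATCATCATCGGA"
for ssr in find_ssrs(example):
    print(ssr)
```

A production pipeline would additionally merge overlapping calls (the poly-T run above is also reported as a TT dimer), handle ambiguity codes, and attach the functional annotation described in the abstract.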
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Joshua M.
Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. There are also other factors such as cost, ergonomics, maintenance, and efficiency that affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine whether and what feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. For concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language, or functional basis, for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated on task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements and constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and made available for computer use. Thus, once the higher-level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function-structure updates/decomposition. In the process, criteria guide the function allocation of components/operators and help ensure compatibility and feasibility. Through multiple function assignment options and varied function structures, multiple design concepts are created. All of the generated designs are then evaluated based on a number of relevant evaluation criteria: cost, dose, ergonomics, hazards, efficiency, etc. These criteria are computed using physical properties/parameters of each system based on the qualities an engineer would use to make evaluations. Nuclear processes such as oxide conversion and electrorefining are utilized to aid algorithm development and provide test cases for the completed program. Through our approach, we capture design knowledge related to manufacturing and other operations in hazardous environments to enable a computational program to automatically generate and evaluate system design concepts.
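A toy version of the allocate-and-evaluate step is sketched below; the functions, component options, scores, and weights are all invented for illustration and are not drawn from the report's component database or criteria.

```python
from itertools import product

# Toy component database keyed by the function each option can perform.
COMPONENT_DB = {
    "transfer material": [
        {"name": "articulated robot", "cost": 250, "dose": 0.0, "throughput": 8},
        {"name": "master-slave manipulator", "cost": 90, "dose": 2.5, "throughput": 5},
    ],
    "inspect part": [
        {"name": "vision system", "cost": 60, "dose": 0.0, "throughput": 9},
        {"name": "glovebox operator", "cost": 20, "dose": 4.0, "throughput": 6},
    ],
    "package product": [
        {"name": "robotic bagger", "cost": 120, "dose": 0.0, "throughput": 7},
        {"name": "manual bagging station", "cost": 30, "dose": 3.0, "throughput": 4},
    ],
}

def generate_concepts(db):
    """Enumerate design concepts by allocating one option to every function."""
    functions = list(db)
    for combo in product(*(db[f] for f in functions)):
        yield dict(zip(functions, combo))

def evaluate(concept, dose_limit=5.0, w_cost=1.0, w_dose=50.0, w_throughput=-10.0):
    """Weighted-sum score (lower is better); None if the worker dose limit is exceeded."""
    cost = sum(c["cost"] for c in concept.values())
    dose = sum(c["dose"] for c in concept.values())
    throughput = min(c["throughput"] for c in concept.values())
    if dose > dose_limit:
        return None
    return w_cost * cost + w_dose * dose + w_throughput * throughput

scored = [(evaluate(c), c) for c in generate_concepts(COMPONENT_DB)]
feasible = [(s, {f: o["name"] for f, o in c.items()}) for s, c in scored if s is not None]
for score, concept in sorted(feasible, key=lambda item: item[0])[:3]:
    print(round(score, 1), concept)
```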
ERIC Educational Resources Information Center
Gelfand, Stanley A.; Gelfand, Jessica T.
2012-01-01
Method: Complete psychometric functions for phoneme and word recognition scores at 8 signal-to-noise ratios from -15 dB to 20 dB were generated for the first 10, 20, and 25, as well as all 50, three-word presentations of the Tri-Word or Computer Assisted Speech Recognition Assessment (CASRA) Test (Gelfand, 1998) based on the results of 12…
McGraw, P; Mathews, V P; Wang, Y; Phillips, M D
2001-05-01
Functional MR imaging (fMRI) has been a useful tool in the evaluation of language both in normal individuals and patient populations. The purpose of this article is to use various models of language as a framework to review fMRI studies. Specifically, fMRI language studies are subdivided into the following categories: word generation or fluency, passive listening, orthography, phonology, semantics, and syntax.
Functional myogenic engraftment from mouse iPS cells.
Darabi, Radbod; Pan, Weihong; Bosnakovski, Darko; Baik, June; Kyba, Michael; Perlingeiro, Rita C R
2011-11-01
Direct reprogramming of adult fibroblasts to a pluripotent state has opened new possibilities for the generation of patient- and disease-specific stem cells. However, the ability of induced pluripotent stem (iPS) cells to generate tissue that mediates functional repair has been demonstrated in very few animal models of disease to date. Here we present proof of principle that iPS cells may be used effectively for the treatment of muscle disorders. We combine the generation of iPS cells with conditional expression of Pax7, a robust approach to derive myogenic progenitors. Transplantation of Pax7-induced iPS-derived myogenic progenitors into dystrophic mice results in extensive engraftment, which is accompanied by improved contractility of treated muscles. These findings demonstrate the myogenic regenerative potential of iPS cells and provide a rationale for their future therapeutic application in muscular dystrophies.
HEPMath 1.4: A mathematica package for semi-automatic computations in high energy physics
NASA Astrophysics Data System (ADS)
Wiebusch, Martin
2015-10-01
This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.
Mesenchymal Stem Cell-Mediated Functional Tooth Regeneration in Swine
Fang, Dianji; Yamaza, Takayoshi; Seo, Byoung-Moo; Zhang, Chunmei; Liu, He; Gronthos, Stan; Wang, Cun-Yu; Shi, Songtao; Wang, Songlin
2006-01-01
Mesenchymal stem cell-mediated tissue regeneration is a promising approach for regenerative medicine for a wide range of applications. Here we report a new population of stem cells isolated from the root apical papilla of human teeth (SCAP, stem cells from apical papilla). Using a minipig model, we transplanted both human SCAP and periodontal ligament stem cells (PDLSCs) to generate a root/periodontal complex capable of supporting a porcelain crown, resulting in normal tooth function. This work integrates a stem cell-mediated tissue regeneration strategy, engineered materials for structure, and current dental crown technologies. This hybridized tissue engineering approach led to recovery of tooth strength and appearance. PMID:17183711
Green Approach to Nanomaterials: Sustainable Utility of Nano-Catalysts
The presentation summarizes our synthetic activity for the preparation of nanoparticles involving benign alternatives that reduce or eliminate the use and generation of hazardous substances. Vitamins B1, B2, C, and tea and wine polyphenols, which function both as reducing and c...
Bioengineering strategies to generate artificial protein complexes.
Kim, Heejae; Siu, Ka-Hei; Raeeszadeh-Sarmazdeh, Maryam; Sun, Qing; Chen, Qi; Chen, Wilfred
2015-08-01
For many applications, increasing synergy between distinct proteins through organization is important for specificity, regulation, and overall reaction efficiency. Although there are many examples of protein complexes in nature, a generalized method to create these complexes remains elusive. Many conventional techniques such as random chemical conjugation, physical adsorption onto surfaces, and encapsulation within matrices are imprecise approaches and can lead to deactivation of native protein functionalities. More "bio-friendly" approaches such as genetically fused proteins and biological scaffolds often result in low yields and low complex stability. Alternatively, site-specific protein conjugation or ligation can generate artificial protein complexes that preserve the native functionalities of protein domains and maintain stability through covalent bonds. In this review, we describe three distinct methods to synthesize artificial protein complexes (genetic incorporation of unnatural amino acids to introduce bio-orthogonal azide and alkyne groups to proteins, split-intein-based expressed protein ligation, and sortase-mediated ligation) and highlight interesting applications for each technique.
SCOS 2: A distributed architecture for ground system control
NASA Astrophysics Data System (ADS)
Keyte, Karl P.
The current generation of spacecraft ground control systems in use at the European Space Agency/European Space Operations Centre (ESA/ESOC) is based on SCOS 1. Such systems have become difficult to manage in both functional and financial terms. The next generation of spacecraft demands more flexibility in the use, configuration and distribution of control facilities, as well as functional requirements capable of matching those being planned for future missions. SCOS 2 is more than a successor to SCOS 1. Many of the shortcomings of the existing system were carefully analyzed by the user and technical communities, and a complete redesign was made. Different technologies were adopted in many areas, including the hardware platform, network architecture, user interfaces and implementation techniques, methodologies and language. Wherever possible, a flexible design approach based on popular industry standards was taken to provide vendor independence in both hardware and software. This paper describes many of the new approaches made in the architectural design of SCOS 2.
The hypothalamic slice approach to neuroendocrinology.
Hatton, G I
1983-07-01
The magnocellular peptidergic cells of the supraoptic and paraventricular nuclei comprise much of what is known as the hypothalamo-neurohypophysial system, which is involved in several functions, including body fluid balance, parturition and lactation. While we have learned much from experiments in vivo, they have not produced a clear understanding of some of the crucial features associated with the functioning of this system. In particular, questions relating to the osmosensitivity of magnocellular neurones and the mechanism(s) by which their characteristic firing patterns are generated have not been answered using the older approaches. Electrophysiological studies with brain slices present direct evidence for osmosensitivity, and perhaps even osmoreceptivity, of magnocellular neurones. Other evidence indicates that the phasic bursting patterns of activity associated with vasopressin-releasing neurones (a) occur in the absence of patterned chemical synaptic input, (b) may be modulated by electrotonic conduction across gap junctions connecting magnocellular neurones and (c) are likely to be generated by endogenous membrane currents. These results make untenable the formerly held idea that phasic bursting activity depends upon recurrent synaptic inhibition.
Generation of Mouse Lung Epithelial Cells.
Kasinski, Andrea L; Slack, Frank J
2013-08-05
Although in vivo models are excellent for assessing various facets of whole-organism physiology, pathology, and overall response to treatments, evaluating basic cellular functions and molecular events in mammalian model systems is challenging. It is therefore advantageous to perform these studies in a refined and less costly setting. One approach involves utilizing cells derived from the model under evaluation. The approach to generate such cells varies based on the cell of origin and often the genetics of the cell. Here we describe the steps involved in generating epithelial cells from the lungs of Kras LSL-G12D/+; p53 LSL-R172/+ mice (Kasinski and Slack, 2012). These mice develop aggressive lung adenocarcinoma following cre-recombinase-dependent removal of a stop cassette in the transgenes and subsequent expression of Kras G12D and p53 R172. While this protocol may be useful for the generation of epithelial lines from other genetic backgrounds, it should be noted that the Kras; p53 cell line generated here is capable of proliferating in culture without any additional genetic manipulation that is often needed for less aggressive backgrounds.
Zhu, Xiaoxiao; Xu, Yajie; Yu, Shanshan; Lu, Lu; Ding, Mingqin; Cheng, Jing; Song, Guoxu; Gao, Xing; Yao, Liangming; Fan, Dongdong; Meng, Shu; Zhang, Xuewen; Hu, Shengdi; Tian, Yong
2014-09-19
The rapid generation of various species and strains of laboratory animals using CRISPR/Cas9 technology has dramatically accelerated the interrogation of gene function in vivo. So far, the dominant approach for genotyping of genome-modified animals has been the T7E1 endonuclease cleavage assay. Here, we present a polyacrylamide gel electrophoresis-based (PAGE) method to genotype mice harboring different types of indel mutations. We developed 6 strains of genome-modified mice using the CRISPR/Cas9 system, and utilized this approach to genotype mice from the F0 to F2 generations, which included single and multiplexed genome-modified mice. We also determined the maximal sensitivity of the PAGE-based assay for detecting mosaic DNA to be 0.5%. We further applied the PAGE-based genotyping approach to detect CRISPR/Cas9-mediated on- and off-target effects in human 293T and induced pluripotent stem cells (iPSCs). Thus, the PAGE-based genotyping approach meets the rapidly increasing demand for genotyping of the fast-growing number of genome-modified animals and human cell lines created using the CRISPR/Cas9 system or other nuclease systems such as TALEN or ZFN.
Tight-binding calculation of single-band and generalized Wannier functions of graphene
NASA Astrophysics Data System (ADS)
Ribeiro, Allan Victor; Bruno-Alfonso, Alexys
Recent work has shown that a tight-binding approach associated with Wannier functions (WFs) provides an intuitive physical picture of the electronic structure of graphene. For the case of graphene, Marzari et al. displayed the calculated WFs and presented a comparison between the Wannier-interpolated bands and the bands generated by the density-functional code. Jung and MacDonald provided a tight-binding model for the π-bands of graphene that involves maximally localized Wannier functions (MLWFs). The mixing of the bands yields better localized WFs. In the present work, the MLWFs of graphene are calculated by combining the Quantum-ESPRESSO code and a tight-binding approach. The MLWFs of graphene are calculated from the Bloch functions obtained through a tight-binding approach that includes interactions and overlaps obtained by partially fitting the DFT bands. The phase of the Bloch functions of each band is appropriately chosen to produce MLWFs, and the same applies to the coefficients of their linear combination in the generalized case. The method allows for an intuitive understanding of the maximally localized WFs of graphene and shows excellent agreement with the literature. Moreover, it provides accurate results at reduced computational cost.
Guerra, Jorge; Uddin, Jasim; Nilsen, Dawn; McInerney, James; Fadoo, Ammarah; Omofuma, Isirame B.; Hughes, Shatif; Agrawal, Sunil; Allen, Peter; Schambra, Heidi M.
2017-01-01
There currently exist no practical tools to identify functional movements in the upper extremities (UEs). This absence has limited the precise therapeutic dosing of patients recovering from stroke. In this proof-of-principle study, we aimed to develop an accurate approach for classifying UE functional movement primitives, which comprise functional movements. Data were generated from inertial measurement units (IMUs) placed on upper body segments of older healthy individuals and chronic stroke patients. Subjects performed activities commonly trained during rehabilitation after stroke. Data processing involved the use of a sliding window to obtain statistical descriptors, and resulting features were processed by a Hidden Markov Model (HMM). The likelihoods of the states, resulting from the HMM, were segmented by a second sliding window and their averages were calculated. The final predictions were mapped to human functional movement primitives using a Logistic Regression algorithm. Algorithm performance was assessed with a leave-one-out analysis, which determined its sensitivity, specificity, and positive and negative predictive values for all classified primitives. In healthy control and stroke participants, our approach identified functional movement primitives embedded in training activities with, on average, 80% precision. This approach may support functional movement dosing in stroke rehabilitation. PMID:28813877
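A sketch of the classification pipeline outlined above, using synthetic stand-in data; the HMM stage of the original approach is omitted and a logistic regression is trained directly on sliding-window statistical descriptors, so this is only a simplified illustration, not the authors' algorithm:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def window_features(signal, labels, win=50, step=25):
    """Slide a window over multichannel IMU data (n_samples x n_channels)
    and compute simple statistical descriptors per window."""
    X, y = [], []
    for start in range(0, len(signal) - win, step):
        seg = signal[start:start + win]
        X.append(np.concatenate([seg.mean(0), seg.std(0), seg.min(0), seg.max(0)]))
        # label each window by the majority primitive within it
        vals, counts = np.unique(labels[start:start + win], return_counts=True)
        y.append(vals[counts.argmax()])
    return np.array(X), np.array(y)

# Synthetic stand-in for IMU recordings and primitive labels from several subjects.
rng = np.random.default_rng(0)
signal = rng.normal(size=(5000, 6))          # e.g. 6 IMU channels
labels = rng.integers(0, 4, size=5000)       # 4 hypothetical primitives
X, y = window_features(signal, labels)
groups = np.repeat(np.arange(10), len(X) // 10 + 1)[:len(X)]  # pseudo-subjects

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
print("leave-one-subject-out accuracy:", scores.mean())
```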
A generalized locomotion CPG architecture based on oscillatory building blocks.
Yang, Zhijun; França, Felipe M G
2003-07-01
Neural oscillation is one of the most extensively investigated topics in artificial neural networks. Scientific approaches to the functionalities of both natural and artificial intelligence are strongly related to the mechanisms underlying oscillatory activities. This paper assumes the existence of central pattern generators (CPGs), plausible neural architectures with oscillatory capabilities, and presents a discrete and generalized approach to the functionality of locomotor CPGs of legged animals. Based on scheduling by multiple edge reversal (SMER), a primitive and deterministic distributed algorithm, it is shown how oscillatory building block (OBB) modules can be created and, hence, how OBB-based networks can be formulated as asymmetric Hopfield-like neural networks for the generation of the complex coordinated rhythmic patterns observed among pairs of biological motor neurons working during different gait patterns. It is also shown that the resulting Hopfield-like network possesses the property of reproducing the whole spectrum of different gaits intrinsic to the target locomotor CPGs. Although the new approach is not restricted to the understanding of the neurolocomotor system of any particular animal, hexapodal and quadrupedal gait patterns are chosen as illustrations given the wide interest expressed by ongoing research in the area.
El Garah, Mohamed; Marets, Nicolas; Mauro, Matteo; Aliprandi, Alessandro; Bonacchi, Sara; De Cola, Luisa; Ciesielski, Artur; Bulach, Véronique; Hosseini, Mir Wais; Samorì, Paolo
2015-07-08
The self-assembly of multiple molecular components into complex supramolecular architectures is ubiquitous in nature and constitutes one of the most powerful strategies to fabricate multifunctional nanomaterials making use of the bottom-up approach. When spatial confinement in two dimensions on a solid substrate is employed, this approach can be exploited to generate periodically ordered structures from suitably designed molecular tectons. In this study we demonstrate that physisorbed directional periodic arrays of monometallic or heterobimetallic coordination polymers can be generated on a highly oriented pyrolytic graphite surface by combining CoCl2 with a suitably designed directional organic tecton or metallatecton based on a porphyrin or nickel(II) metalloporphyrin backbone bearing both a pyridyl unit and a terpyridyl unit acting as coordinating sites. The periodic architectures were visualized at the solid/liquid interface with submolecular resolution by scanning tunneling microscopy and corroborated by combined density functional and time-dependent density functional theory calculations. The capacity to nanopattern the surface, for the first time, with two distinct metallic centers exhibiting different electronic and optical properties is a key step toward the bottom-up construction of robust multicomponent and, thus, multifunctional molecular nanostructures and nanodevices.
New Approaches to the Computer Simulation of Amorphous Alloys: A Review.
Valladares, Ariel A; Díaz-Celaya, Juan A; Galván-Colín, Jonathan; Mejía-Mendoza, Luis M; Reyes-Retana, José A; Valladares, Renela M; Valladares, Alexander; Alvarez-Ramirez, Fernando; Qu, Dongdong; Shen, Jun
2011-04-13
In this work we review our new methods to computer-generate amorphous atomic topologies of several binary alloys: SiH, SiN, CN; binary systems based on group IV elements like SiC; the GeSe2 chalcogenide; aluminum-based systems: AlN and AlSi; and the CuZr amorphous alloy. We use an ab initio approach based on density functionals and computationally thermally-randomized periodically-continued cells with at least 108 atoms. The computational thermal process used to generate the amorphous alloys is the undermelt-quench approach, or one of its variants, which consists in linearly heating the samples to just below their melting (or liquidus) temperatures and then linearly cooling them afterwards. These processes are carried out from initial crystalline conditions using short and long time steps. We find that a time step four times the default is adequate for most of the simulations. Radial distribution functions (partial and total) are calculated and compared whenever possible with experimental results, and the agreement is very good. For some materials we report studies of the effect of the topological disorder on their electronic and vibrational densities of states and on their optical properties.
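As a small illustration of the radial-distribution-function comparison mentioned above, the following sketch computes a total g(r) from atomic positions in a periodically continued cubic cell; the cell size and random positions are placeholders, not data from the review:

```python
import numpy as np

def radial_distribution(positions, box_length, n_bins=200, r_max=None):
    """Total g(r) for N atoms in a cubic cell with periodic boundary conditions."""
    n = len(positions)
    r_max = r_max or box_length / 2.0
    # minimum-image pair distances
    diff = positions[:, None, :] - positions[None, :, :]
    diff -= box_length * np.round(diff / box_length)
    dist = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(dist, bins=n_bins, range=(0.0, r_max))
    r = 0.5 * (edges[1:] + edges[:-1])
    shell_vol = 4.0 * np.pi * r ** 2 * np.diff(edges)
    density = n / box_length ** 3
    # normalize pair counts by the ideal-gas expectation in each shell
    g = hist / (shell_vol * density * n / 2.0)
    return r, g

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pos = rng.uniform(0.0, 10.0, size=(108, 3))   # 108 atoms, as in the reviewed cells
    r, g = radial_distribution(pos, box_length=10.0)
    print(r[:5], g[:5])
```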
An update on gain-of-function mutations in primary immunodeficiency diseases.
Jhamnani, Rekha D; Rosenzweig, Sergio D
2017-12-01
Most primary immunodeficiencies described since 1952 were associated with loss-of-function defects. With the advent and popularization of unbiased next-generation sequencing diagnostic approaches followed by functional validation techniques, many gain-of-function mutations leading to immunodeficiency have also been identified. This review highlights updates on pathophysiology mechanisms and new therapeutic approaches involving primary immunodeficiencies caused by gain-of-function mutations. The more recent developments related to gain-of-function primary immunodeficiencies, mostly involving increased infection susceptibility but also immune dysregulation and autoimmunity, are reviewed. Updates regarding pathophysiology mechanisms, different mutation types, clinical features, laboratory markers, and current and potential new treatments are reviewed for each disease in patients with caspase recruitment domain family member 11, signal transducer and activator of transcription 1, signal transducer and activator of transcription 3, phosphatidylinositol-4,5-bisphosphate 3-kinase catalytic 110, phosphatidylinositol-4,5-bisphosphate 3-kinase regulatory subunit 1, chemokine C-X-C motif receptor 4, sterile α motif domain containing 9-like, and nuclear factor κ-B subunit 2 gain-of-function mutations. With the identification of gain-of-function mutations as a cause of immunodeficiency, new genetic pathophysiology mechanisms have been unveiled and new targeted therapeutic approaches can be explored as potential rescue treatments for these diseases.
Lanza, Amanda M.; Blazeck, John J.; Crook, Nathan C.; Alper, Hal S.
2012-01-01
Establishing causative links between protein functional domains and global gene regulation is critical for advancements in genetics, biotechnology, disease treatment, and systems biology. This task is challenging for multifunctional proteins when relying on traditional approaches such as gene deletions since they remove all domains simultaneously. Here, we describe a novel approach to extract quantitative, causative links by modulating the expression of a dominant mutant allele to create a function-specific competitive inhibition. Using the yeast histone acetyltransferase Gcn5p as a case study, we demonstrate the utility of this approach and (1) find evidence that Gcn5p is more involved in cell-wide gene repression, instead of the accepted gene activation associated with HATs, (2) identify previously unknown gene targets and interactions for Gcn5p-based acetylation, (3) quantify the strength of some Gcn5p-DNA associations, (4) demonstrate that this approach can be used to correctly identify canonical chromatin modifications, (5) establish the role of acetyltransferase activity on synthetic lethal interactions, and (6) identify new functional classes of genes regulated by Gcn5p acetyltransferase activity—all six of these major conclusions were unattainable by using standard gene knockout studies alone. We recommend that a graded dominant mutant approach be utilized in conjunction with a traditional knockout to study multifunctional proteins and generate higher-resolution data that more accurately probes protein domain function and influence. PMID:22558379
NASA Astrophysics Data System (ADS)
Khajehei, S.; Madadgar, S.; Moradkhani, H.
2014-12-01
The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters and model structure. To reduce the total uncertainty in hydrological applications, one approach is to reduce the uncertainty in meteorological forcing by using statistical methods based on conditional probability density functions (pdf). However, one of the requirements of current methods is the assumption of a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on copula functions to develop the conditional distribution of precipitation forecasts needed to drive a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach to capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions of univariate marginal distributions, capable of modeling the joint behavior of variables with any level of correlation and dependency. The method is applied to the monthly CPC forecast at 0.25x0.25 degree resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing observed climatology during a ten-year verification period (2000-2010).
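The abstract does not specify the copula family; the sketch below uses a Gaussian copula purely to illustrate how a conditional "observation-given-forecast" ensemble could be generated (all data here are synthetic):

```python
import numpy as np
from scipy import stats

def fit_gaussian_copula(forecast, observed):
    """Estimate the Gaussian-copula correlation between forecast and observed
    monthly precipitation after transforming each margin to normal scores."""
    u = stats.rankdata(forecast) / (len(forecast) + 1.0)
    v = stats.rankdata(observed) / (len(observed) + 1.0)
    z_f, z_o = stats.norm.ppf(u), stats.norm.ppf(v)
    return np.corrcoef(z_f, z_o)[0, 1]

def conditional_ensemble(new_forecast, forecast, observed, rho, size=100, seed=0):
    """Sample 'observed-like' values conditioned on a new forecast value,
    mapping back through the empirical margin of the observations."""
    rng = np.random.default_rng(seed)
    u_new = (np.searchsorted(np.sort(forecast), new_forecast) + 0.5) / (len(forecast) + 1.0)
    z_new = stats.norm.ppf(u_new)
    z_cond = rng.normal(rho * z_new, np.sqrt(1.0 - rho ** 2), size=size)
    return np.quantile(observed, stats.norm.cdf(z_cond))

# Synthetic forecast/observation pairs standing in for CPC and PRISM monthly values.
rng = np.random.default_rng(2)
forecast = rng.gamma(2.0, 30.0, size=360)
observed = 0.7 * forecast + rng.gamma(2.0, 10.0, size=360)
rho = fit_gaussian_copula(forecast, observed)
print(conditional_ensemble(80.0, forecast, observed, rho, size=5))
```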
Analysis of the Temperature and Strain-Rate Dependences of Strain Hardening
NASA Astrophysics Data System (ADS)
Kreyca, Johannes; Kozeschnik, Ernst
2018-01-01
A classical constitutive-modeling-based Ansatz for the impact of thermal activation on the stress-strain response of metallic materials is compared with the state-parameter-based Kocks-Mecking model. The predicted functional dependencies suggest that, in the first approach, only the dislocation storage mechanism is a thermally activated process, whereas, in the second approach, only the mechanism of dynamic recovery is. In contradiction to each of these individual approaches, our analysis and comparison with experimental evidence show that thermal activation contributes both to dislocation generation and to annihilation.
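For reference, the commonly quoted single-parameter Kocks-Mecking evolution law and the associated Taylor-type flow stress are (generic notation, not necessarily that of this paper):

```latex
% Kocks-Mecking dislocation density evolution (storage minus dynamic recovery)
% and Taylor-type flow stress; notation is generic.
\begin{align}
  \frac{\mathrm{d}\rho}{\mathrm{d}\varepsilon}
    &= k_1 \sqrt{\rho} \;-\; k_2(\dot{\varepsilon}, T)\,\rho , \\
  \sigma &= \sigma_0 + \alpha M G b \sqrt{\rho} ,
\end{align}
```

where k1 governs dislocation storage and k2, which depends on strain rate and temperature, governs dynamic recovery.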
Huang, Wenwen; Ebrahimi, Davoud; Dinjaski, Nina; Tarakanova, Anna; Buehler, Markus J; Wong, Joyce Y; Kaplan, David L
2017-04-18
Tailored biomaterials with tunable functional properties are crucial for a variety of task-specific applications ranging from healthcare to sustainable, novel bio-nanodevices. To generate polymeric materials with predictive functional outcomes, exploiting designs from nature while morphing them toward non-natural systems offers an important strategy. Silks are Nature's building blocks and are produced by arthropods for a variety of uses that are essential for their survival. Due to the genetic control of encoded protein sequence, mechanical properties, biocompatibility, and biodegradability, silk proteins have been selected as prototype models to emulate for the tunable designs of biomaterial systems. The bottom-up strategy of material design opens important opportunities to create predictive functional outcomes, following the exquisite polymeric templates inspired by silks. Recombinant DNA technology provides a systematic approach to recapitulate, vary, and evaluate the core structure peptide motifs in silks and then biosynthesize silk-based polymers by design. Post-biosynthesis processing allows for another dimension of material design by controlled or assisted assembly. Multiscale modeling, from the theoretical perspective, provides strategies to explore interactions at different length scales, leading to selective material properties. Synergy between experimental and modeling approaches can provide new and more rapid insights into the most appropriate structure-function relationships to pursue while also furthering our understanding in terms of the range of silk-based systems that can be generated. This approach utilizes nature as a blueprint for initial polymer designs with useful functions (e.g., silk fibers) but also employs modeling-guided experiments to expand the initial polymer designs into new domains of functional materials that do not exist in nature. The overall path to these new functional outcomes is greatly accelerated via the integration of modeling with experiment. In this Account, we summarize recent advances in understanding and functionalization of silk-based protein systems, with a focus on the integration of simulation and experiment for biopolymer design. Spider silk was selected as an exemplary protein to address the fundamental challenges in polymer designs, including specific insights into the role of molecular weight, hydrophobic/hydrophilic partitioning, and shear stress for silk fiber formation. To expand current silk designs toward biointerfaces and stimuli-responsive materials, peptide modules from other natural proteins were added to silk designs to introduce new functions, exploiting the modular nature of silk proteins and fibrous proteins in general. The integrated approaches explored suggest that protein folding, silk volume fraction, and protein amino acid sequence changes (e.g., mutations) are critical factors for functional biomaterial designs. In summary, the integrated modeling-experimental approach described in this Account suggests a more rationally directed and more rapid method for the design of polymeric materials. It is expected that this combined use of experimental and computational approaches has a broad applicability not only for silk-based systems, but also for other polymer and composite materials.
Novel Functional Genomics Approaches: A Promising Future in the Combat Against Plant Viruses.
Fondong, Vincent N; Nagalakshmi, Ugrappa; Dinesh-Kumar, Savithramma P
2016-10-01
Advances in functional genomics and genome editing approaches have provided new opportunities and potential to accelerate plant virus control efforts through modification of host and viral genomes in a precise and predictable manner. Here, we discuss application of RNA-based technologies, including artificial microRNA, trans-acting small interfering RNA, and Cas9 (clustered regularly interspaced short palindromic repeat-associated protein 9), which are currently being successfully deployed in generating virus-resistant plants. We further discuss the reverse genetics approach, targeting induced local lesions in genomes (TILLING), and its variant, known as EcoTILLING, which are used in the identification of plant virus recessive resistance gene alleles. In addition to describing specific applications of these technologies in plant virus control, this review discusses their advantages and limitations.
Refractive laser beam shaping by means of a functional differential equation based design approach.
Duerr, Fabian; Thienpont, Hugo
2014-04-07
Many laser applications require specific irradiance distributions to ensure optimal performance. Geometric optical design methods based on numerical calculation of two plano-aspheric lenses have been thoroughly studied in the past. In this work, we present an alternative new design approach based on functional differential equations that allows direct calculation of the rotational symmetric lens profiles described by two-point Taylor polynomials. The formalism is used to design a Gaussian to flat-top irradiance beam shaping system but also to generate a more complex dark-hollow Gaussian (donut-like) irradiance distribution with zero intensity in the on-axis region. The presented ray tracing results confirm the high accuracy of both calculated solutions and emphasize the potential of this design approach for refractive beam shaping applications.
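For background, the rotationally symmetric ray mapping r → R(r) underlying geometric beam shaping follows from energy conservation between the input and output irradiance profiles; the standard condition is shown below for orientation and is not the paper's functional-differential-equation formulation:

```latex
% Energy conservation for a rotationally symmetric mapping r -> R(r):
\begin{equation}
  \int_0^{r} I_{\mathrm{in}}(r')\, r'\,\mathrm{d}r'
  \;=\;
  \int_0^{R(r)} I_{\mathrm{out}}(R')\, R'\,\mathrm{d}R' .
\end{equation}
```

For a Gaussian input and a uniform flat-top target of finite radius, solving this relation for R(r) fixes the required ray deflection and hence, together with Snell's law and a prescribed optical path length, the two aspheric lens profiles.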
Crowdy, Darren G.; Brzezicki, Samuel J.
2017-01-01
An analytical method to find the flow generated by the basic singularities of Stokes flow in a wedge of arbitrary angle is presented. Specifically, we solve a biharmonic equation for the stream function of the flow generated by a point stresslet singularity and satisfying no-slip boundary conditions on the two walls of the wedge. The method, which is readily adapted to any other singularity type, takes full account of any transcendental singularities arising at the corner of the wedge. The approach is also applicable to problems of plane strain/stress of an elastic solid where the biharmonic equation also governs the Airy stress function. PMID:28690412
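In generic notation, the problem described above amounts to solving the biharmonic equation for the stream function ψ in the wedge 0 ≤ θ ≤ α with no-slip walls (the point-stresslet forcing is omitted in this sketch):

```latex
% Stream-function formulation in polar coordinates (r, theta);
% no-slip conditions on the two walls of the wedge.
\begin{equation}
  \nabla^{4}\psi = 0, \qquad
  u_r = \frac{1}{r}\frac{\partial \psi}{\partial \theta}, \quad
  u_\theta = -\frac{\partial \psi}{\partial r}, \qquad
  \psi = \frac{\partial \psi}{\partial \theta} = 0
  \quad \text{on } \theta = 0,\ \alpha .
\end{equation}
```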
Multimission image processing and science data visualization
NASA Technical Reports Server (NTRS)
Green, William B.
1993-01-01
The Operational Science Analysis (OSA) functional area supports science instrument data display, analysis, visualization and photo processing in support of flight operations of planetary spacecraft managed by the Jet Propulsion Laboratory (JPL). This paper describes the data products generated by the OSA functional area and the current computer system used to generate these data products. The objectives of a system upgrade now in progress are described. The design approach to development of the new system is reviewed, including the use of the Unix operating system and X-Window display standards to provide platform independence, portability, and modularity within the new system. The new system should provide a modular and scalable capability supporting a variety of future missions at JPL.
Hidden order and flux attachment in symmetry-protected topological phases: A Laughlin-like approach
NASA Astrophysics Data System (ADS)
Ringel, Zohar; Simon, Steven H.
2015-05-01
Topological phases of matter are distinguished from conventional ones by their lack of a local order parameter. Still, in the quantum Hall effect, hidden order parameters exist and constitute the basis for the celebrated composite-particle approach. Whether similar hidden orders exist in 2D and 3D symmetry-protected topological phases (SPTs) is a largely open question. Here, we introduce a new approach for generating SPT ground states based on a generalization of the Laughlin wave function. This approach gives a simple and unifying picture of some classes of SPTs in 1D and 2D, and reveals their hidden order and flux attachment structures. For the 1D case, we derive exact relations between the wave functions obtained in this manner and group cohomology wave functions, as well as the matrix product state classification. For the 2D Ising SPT, strong analytical and numerical evidence is given to show that the wave function obtained indeed describes the desired SPT. The Ising SPT then appears as a state with quasi-long-range order in composite degrees of freedom consisting of Ising-symmetry charges attached to Ising-symmetry fluxes.
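For orientation, the standard Laughlin wave function that this construction generalizes is (filling factor 1/m, magnetic length ℓ_B):

```latex
\begin{equation}
  \Psi_m(z_1,\dots,z_N) \;=\; \prod_{i<j} (z_i - z_j)^m
  \exp\!\Big(-\sum_{i} \frac{|z_i|^2}{4\ell_B^{2}}\Big) .
\end{equation}
```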
Stratway: A Modular Approach to Strategic Conflict Resolution
NASA Technical Reports Server (NTRS)
Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.
2011-01-01
In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications for the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules, much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function. This allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available under an open-source license. Additionally, there is a visualization application that is helpful for analyzing and quickly creating conflict scenarios.
Competitive Genomic Screens of Barcoded Yeast Libraries
Urbanus, Malene; Proctor, Michael; Heisler, Lawrence E.; Giaever, Guri; Nislow, Corey
2011-01-01
By virtue of advances in next-generation sequencing technologies, we have access to new genome sequences almost daily. The tempo of these advances is accelerating, promising greater depth and breadth. In light of these extraordinary advances, the need for fast, parallel methods to define gene function becomes ever more important. Collections of genome-wide deletion mutants in yeasts and E. coli have served as workhorses for the characterization of gene function, but this approach is not scalable: current gene-deletion approaches require each of the thousands of genes that comprise a genome to be deleted and verified. Only after this work is complete can we pursue high-throughput phenotyping. Over the past decade, our laboratory has refined a portfolio of competitive, miniaturized, high-throughput genome-wide assays that can be performed in parallel. This parallelization is possible because of the inclusion of DNA 'tags', or 'barcodes', in each mutant, with the barcode serving as a proxy for the mutation, so that barcode abundance can be measured to assess mutant fitness. In this study, we seek to fill the gap between DNA sequence and barcoded mutant collections. To accomplish this we introduce a combined transposon disruption-barcoding approach that opens up parallel barcode assays to newly sequenced, but poorly characterized, microbes. To illustrate this approach we present a new Candida albicans barcoded disruption collection and describe how both microarray-based and next-generation sequencing-based platforms can be used to collect 10,000 - 1,000,000 gene-gene and drug-gene interactions in a single experiment. PMID:21860376
Topology optimized design of functionally graded piezoelectric ultrasonic transducers
NASA Astrophysics Data System (ADS)
Rubio, Wilfredo Montealegre; Buiochi, Flávio; Adamowski, Julio Cezar; Silva, Emílio C. N.
2010-01-01
This work presents a new approach to systematically design piezoelectric ultrasonic transducers based on the Topology Optimization Method (TOM) and Functionally Graded Material (FGM) concepts. The main goal is to find the optimal material distribution of Functionally Graded Piezoelectric Ultrasonic Transducers that achieves the following requirements: (i) the transducer must be designed to have a multi-modal or uni-modal frequency response, which defines the kind of generated acoustic wave, either a short pulse or a continuous wave, respectively; (ii) the transducer is required to oscillate in a thickness-extensional or piston-like mode, aiming at acoustic wave generation applications. Two kinds of piezoelectric materials are mixed to produce the FGM transducer: material type 1 represents a PZT-5A piezoelectric ceramic and material type 2 represents a PZT-5H piezoelectric ceramic. To illustrate the proposed method, two Functionally Graded Piezoelectric Ultrasonic Transducers are designed. The TOM has been shown to be a useful tool for designing Functionally Graded Piezoelectric Ultrasonic Transducers with uni-modal or multi-modal dynamic behavior.
Xu, Chuanhui; Cao, Liming; Lin, Baofeng; Liang, Xingquan; Chen, Yukun
2016-07-13
Introducing ionic associations is one of the most effective approaches to realizing self-healing behavior in rubbers. However, most commercial rubbers are nonpolar and lack functional groups that can readily be converted into ionic groups. In this paper, our strategy was based on a controlled peroxide-induced vulcanization to generate a large number of ionic cross-links via polymerization of zinc dimethacrylate (ZDMA) in natural rubber (NR), which we exploited as a potential self-healable material. We controlled the vulcanization process to retard the formation of a covalent cross-link network and successfully generated a reversible supramolecular network constructed mainly of ionic cross-links. Without the restriction of covalent cross-links, the NR chains in the ionic supramolecular network had good flexibility and mobility. Because the ionic cross-links are easily reconstructed and rearranged, the self-healing behavior is facilitated, enabling a fully cut sample to rejoin and regain its original properties after a suitable self-healing process at ambient temperature. This study thus demonstrates a feasible approach to impart an ionic-association-induced self-healing function to commercial rubbers lacking ionic functional groups.
A multiple scales approach to sound generation by vibrating bodies
NASA Technical Reports Server (NTRS)
Geer, James F.; Pope, Dennis S.
1992-01-01
The problem of determining the acoustic field in an inviscid, isentropic fluid generated by a solid body whose surface executes prescribed vibrations is formulated and solved as a multiple scales perturbation problem, using the Mach number M based on the maximum surface velocity as the perturbation parameter. Following the idea of multiple scales, new 'slow' spatial scales are introduced, which are defined as the usual physical spatial scale multiplied by powers of M. The governing nonlinear differential equations lead to a sequence of linear problems for the perturbation coefficient functions. However, it is shown that the higher order perturbation functions obtained in this manner will dominate the lower order solutions unless their dependence on the slow spatial scales is chosen in a certain manner. In particular, it is shown that the perturbation functions must satisfy an equation similar to Burgers' equation, with a slow spatial scale playing the role of the time-like variable. The method is illustrated by a simple one-dimensional example, as well as by three different cases of a vibrating sphere. The results are compared with solutions obtained by purely numerical methods, and some insights provided by the perturbation approach are discussed.
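For reference, the standard viscous Burgers equation has the form below; in the multiple-scales result described above, a slow spatial scale plays the role of the time-like variable t:

```latex
\begin{equation}
  \frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x}
  = \nu\,\frac{\partial^{2} u}{\partial x^{2}} .
\end{equation}
```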
Automation of steam generator services at public service electric & gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cruickshank, H.; Wray, J.; Scull, D.
1995-03-01
Public Service Electric & Gas takes an aggressive approach to pursuing new exposure reduction techniques. Evaluation of historic outage exposure shows that over the last eight refueling outages, primary steam generator work has averaged sixty-six (66) person-rem, or approximately twenty-five percent (25%) of the general outage exposure at Salem Station. This maintenance evolution represents the largest percentage of exposure for any single activity. Because of this, primary steam generator work represents an excellent opportunity for the development of significant exposure reduction techniques. A study of primary steam generator maintenance activities demonstrated that seventy-five percent (75%) of radiation exposure was due to work activities on the primary steam generator platform, and that development of automated methods for performing these activities was worth pursuing. Existing robotics systems were examined and it was found that a new approach would have to be developed. This resulted in a joint research and development project between Westinghouse and Public Service Electric & Gas to develop an automated system for accomplishing the Health Physics functions on the primary steam generator platform. R.O.M.M.R.S. (Remotely Operated Managed Maintenance Robotics System) was the result of this venture.
NASA Astrophysics Data System (ADS)
Sun, Changchun; Chen, Zhongtang; Xu, Qicheng
2017-12-01
An original three-dimensional (3D) smooth continuous chaotic system and its mirror-image system, sharing eight common parameters, are constructed, and a pair of symmetric chaotic attractors can be generated simultaneously. The basic dynamical behaviors of the two 3D chaotic systems are investigated. A double-scroll chaotic attractor is generated by connecting the pair of mutual mirror-image attractors via a novel planar switching control approach. Chaos can also be controlled to a fixed point, a periodic orbit or a divergent orbit by switching between the two chaotic systems. Finally, an equivalent 3D chaotic system is designed by combining the two 3D chaotic systems with a switching law based on a sign function. Two circuit diagrams for realizing the double-scroll attractor are presented, employing an improved module-based design approach.
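The paper's specific vector field is not reproduced here; as a hedged illustration of the switching idea only, the sketch below pairs a stand-in chaotic system (the Rössler system) with its point-reflected mirror image and combines the two fields with a sign function:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative only: the Rossler system stands in for a chaotic vector field f;
# its mirror-image field g(x) = -f(-x) generates the point-reflected attractor,
# and a sign-function switching law combines the two fields into one system.
A, B, C = 0.2, 0.2, 5.7

def f(x):
    return np.array([-x[1] - x[2], x[0] + A * x[1], B + x[2] * (x[0] - C)])

def g(x):
    # if x(t) solves x' = f(x), then -x(t) solves x' = g(x)
    return -f(-x)

def switched(t, x):
    # sign-function switching law selecting one of the two mirror-image fields
    s = 0.5 * (1.0 + np.sign(x[0]))
    return s * f(x) + (1.0 - s) * g(x)

# One attractor from f; the mirror-image attractor is the reflected trajectory.
sol = solve_ivp(lambda t, x: f(x), (0.0, 100.0), [1.0, 1.0, 0.0],
                t_eval=np.linspace(0.0, 100.0, 10000), rtol=1e-8)
mirror = -sol.y

# Short integration of the switched (combined) system as a sketch.
combined = solve_ivp(switched, (0.0, 50.0), [1.0, 1.0, 0.0], rtol=1e-6)
print(sol.y[:, -1], mirror[:, -1], combined.y[:, -1])
```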
NASA Astrophysics Data System (ADS)
May, Matthias M.; Lewerenz, Hans-Joachim; Lackner, David; Dimroth, Frank; Hannappel, Thomas
2015-09-01
Photosynthesis is nature's route to convert intermittent solar irradiation into storable energy, while its use for an industrial energy supply is impaired by low efficiency. Artificial photosynthesis provides a promising alternative for efficient robust carbon-neutral renewable energy generation. The approach of direct hydrogen generation by photoelectrochemical water splitting utilizes customized tandem absorber structures to mimic the Z-scheme of natural photosynthesis. Here a combined chemical surface transformation of a tandem structure and catalyst deposition at ambient temperature yields photocurrents approaching the theoretical limit of the absorber and results in a solar-to-hydrogen efficiency of 14%. The potentiostatically assisted photoelectrode efficiency is 17%. Present benchmarks for integrated systems are clearly exceeded. Details of the in situ interface transformation, the electronic improvement and chemical passivation are presented. The surface functionalization procedure is widely applicable and can be precisely controlled, allowing further developments of high-efficiency robust hydrogen generators.
Dynamic Average-Value Modeling of Doubly-Fed Induction Generator Wind Energy Conversion Systems
NASA Astrophysics Data System (ADS)
Shahab, Azin
In a Doubly-Fed Induction Generator (DFIG) wind energy conversion system, the rotor of a wound-rotor induction generator is connected to the grid via a partial-scale ac/ac power electronic converter which controls the rotor frequency and speed. In this research, detailed models of the DFIG wind energy conversion system with a Sinusoidal Pulse-Width Modulation (SPWM) scheme and an Optimal Pulse-Width Modulation (OPWM) scheme for the power electronic converter are developed in PSCAD/EMTDC. As computer simulation using the detailed models tends to be computationally intensive, time-consuming and sometimes impractical in terms of speed, two modified approaches (switching-function modeling and average-value modeling) are proposed to reduce the simulation execution time. The results demonstrate that the two proposed approaches reduce the simulation execution time while the simulation results remain close to those obtained using the detailed model simulation.
Khambhati, Ankit N.; Davis, Kathryn A.; Oommen, Brian S.; Chen, Stephanie H.; Lucas, Timothy H.; Litt, Brian; Bassett, Danielle S.
2015-01-01
The epileptic network is characterized by pathologic, seizure-generating ‘foci’ embedded in a web of structural and functional connections. Clinically, seizure foci are considered optimal targets for surgery. However, poor surgical outcome suggests a complex relationship between foci and the surrounding network that drives seizure dynamics. We developed a novel technique to objectively track seizure states from dynamic functional networks constructed from intracranial recordings. Each dynamical state captures unique patterns of network connections that indicate synchronized and desynchronized hubs of neural populations. Our approach suggests that seizures are generated when synchronous relationships near foci work in tandem with rapidly changing desynchronous relationships from the surrounding epileptic network. As seizures progress, topographical and geometrical changes in network connectivity strengthen and tighten synchronous connectivity near foci—a mechanism that may aid seizure termination. Collectively, our observations implicate distributed cortical structures in seizure generation, propagation and termination, and may have practical significance in determining which circuits to modulate with implantable devices. PMID:26680762
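A minimal sketch of how dynamic functional networks of the kind described above can be built from multichannel recordings via sliding-window correlations (synthetic data, illustrative window sizes):

```python
import numpy as np

def dynamic_networks(recordings, win=500, step=250):
    """Sliding-window functional connectivity: one correlation matrix per window
    from multichannel intracranial data (n_samples x n_channels)."""
    nets = []
    for start in range(0, recordings.shape[0] - win + 1, step):
        window = recordings[start:start + win]
        nets.append(np.corrcoef(window, rowvar=False))
    return np.array(nets)          # shape: (n_windows, n_channels, n_channels)

def node_strength(networks):
    """Per-window hub strength: sum of absolute off-diagonal weights per channel."""
    n_ch = networks.shape[1]
    off_diag = networks * (1 - np.eye(n_ch))
    return np.abs(off_diag).sum(axis=2)

rng = np.random.default_rng(42)
ecog = rng.normal(size=(10000, 16))       # synthetic 16-channel recording
nets = dynamic_networks(ecog)
print(nets.shape, node_strength(nets).shape)
```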
An Application of Probability to Combinatorics: A Proof of Vandermonde Identity
ERIC Educational Resources Information Center
Paolillo, Bonaventura; Rizzo, Piermichele; Vincenzi, Giovanni
2017-01-01
In this paper, we give possible suggestions for a classroom lesson about an application of probability using basic mathematical notions. We approach some combinatoric results without using "induction", "polynomial identities" or "generating functions", and give a proof of the "Vandermonde…
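The identity in question, together with its standard probabilistic reading as the normalization of the hypergeometric distribution (shown for orientation; this is not the paper's classroom derivation):

```latex
% Drawing p balls from an urn with m red and n blue balls and summing over the
% number k of red balls drawn gives Vandermonde's identity.
\begin{equation}
  \sum_{k=0}^{p} \binom{m}{k}\binom{n}{p-k} = \binom{m+n}{p},
  \qquad
  \sum_{k=0}^{p} \frac{\binom{m}{k}\binom{n}{p-k}}{\binom{m+n}{p}} = 1 .
\end{equation}
```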
Robust stochastic optimization for reservoir operation
NASA Astrophysics Data System (ADS)
Pan, Limeng; Housh, Mashor; Liu, Pan; Cai, Ximing; Chen, Xin
2015-01-01
Optimal reservoir operation under uncertainty is a challenging engineering problem. Application of classic stochastic optimization methods to large-scale problems is limited due to computational difficulty. Moreover, classic stochastic methods assume that the estimated distribution function or the sample inflow data accurately represents the true probability distribution, which may be invalid and the performance of the algorithms may be undermined. In this study, we introduce a robust optimization (RO) approach, Iterative Linear Decision Rule (ILDR), so as to provide a tractable approximation for a multiperiod hydropower generation problem. The proposed approach extends the existing LDR method by accommodating nonlinear objective functions. It also provides users with the flexibility of choosing the accuracy of ILDR approximations by assigning a desired number of piecewise linear segments to each uncertainty. The performance of the ILDR is compared with benchmark policies including the sampling stochastic dynamic programming (SSDP) policy derived from historical data. The ILDR solves both the single and multireservoir systems efficiently. The single reservoir case study results show that the RO method is as good as SSDP when implemented on the original historical inflows and it outperforms SSDP policy when tested on generated inflows with the same mean and covariance matrix as those in history. For the multireservoir case study, which considers water supply in addition to power generation, numerical results show that the proposed approach performs as well as in the single reservoir case study in terms of optimal value and distributional robustness.
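For orientation, a generic (affine) linear decision rule expresses the period-t release as an affine function of the inflows observed up to t; the ILDR of the paper builds on this form iteratively and extends it to nonlinear objectives (generic notation, not the paper's exact formulation):

```latex
% Affine linear decision rule for the release r_t given observed inflows q_s.
\begin{equation}
  r_t(\tilde{q}_1,\dots,\tilde{q}_t) \;=\; r_t^{0} + \sum_{s=1}^{t} R_{t,s}\,\tilde{q}_s ,
\end{equation}
```

where the coefficients r_t^0 and R_{t,s} become the decision variables of the robust counterpart.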
Kobayashi, Toshihiro; Kato-Itoh, Megumi; Nakauchi, Hiromitsu
2015-01-15
Generation of functional organs from patients' own cells is one of the ultimate goals of regenerative medicine. As a novel approach to creation of organs from pluripotent stem cells (PSCs), we employed blastocyst complementation in organogenesis-disabled animals and successfully generated PSC-derived pancreas and kidneys. Blastocyst complementation, which exploits the capacity of PSCs to participate in forming chimeras, does not, however, exclude contribution of PSCs to the development of tissues-including neural cells or germ cells-other than those specifically targeted by disabling of organogenesis. This fact provokes ethical controversy if human PSCs are to be used. In this study, we demonstrated that forced expression of Mix-like protein 1 (encoded by Mixl1) can be used to guide contribution of mouse embryonic stem cells to endodermal organs after blastocyst injection. We then succeeded in applying this method to generate functional pancreas in pancreatogenesis-disabled Pdx1 knockout mice using a newly developed tetraploid-based organ-complementation method. These findings hold promise for targeted organ generation from patients' own PSCs in livestock animals.
Design automation techniques for custom LSI arrays
NASA Technical Reports Server (NTRS)
Feller, A.
1975-01-01
The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.
Next generation initiation techniques
NASA Technical Reports Server (NTRS)
Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans
1993-01-01
Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The third kind of next-generation technique involves strategies to initialize convective scale (non-hydrostatic) models.
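For reference, the cost function minimized by the variational approaches mentioned above is typically of the 3D-Var form (generic notation):

```latex
\begin{equation}
  J(\mathbf{x}) =
  \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
  + \tfrac{1}{2}\,\big(H(\mathbf{x})-\mathbf{y}\big)^{\mathsf T}\mathbf{R}^{-1}\big(H(\mathbf{x})-\mathbf{y}\big) ,
\end{equation}
```

where x_b is the background state, y the observations, H the observation operator, and B and R the background and observation error covariances.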
Rapid Assembly of Customized TALENs into Multiple Delivery Systems
Zhang, Zhengxing; Zhang, Siliang; Huang, Xin; Orwig, Kyle E.; Sheng, Yi
2013-01-01
Transcriptional activator-like effector nucleases (TALENs) have become a powerful tool for genome editing. Here we present an efficient TALEN assembly approach in which TALENs are assembled by direct Golden Gate ligation into Gateway® Entry vectors from a repeat variable di-residue (RVD) plasmid array. We constructed TALEN pairs targeted to mouse Ddx3 subfamily genes, and demonstrated that our modified TALEN assembly approach efficiently generates accurate TALEN moieties that effectively introduce mutations into target genes. We generated “user friendly” TALEN Entry vectors containing TALEN expression cassettes with fluorescent reporter genes that can be efficiently transferred via Gateway (LR) recombination into different delivery systems. We demonstrated that the TALEN Entry vectors can be easily transferred to an adenoviral delivery system to expand application to cells that are difficult to transfect. Since TALENs work in pairs, we also generated a TALEN Entry vector set that combines a TALEN pair into one PiggyBac transposon-based destination vector. The approach described here can also be modified for construction of TALE transcriptional activators, repressors or other functional domains. PMID:24244669
A versatile strategy for gene trapping and trap conversion in emerging model organisms.
Kontarakis, Zacharias; Pavlopoulos, Anastasios; Kiupakis, Alexandros; Konstantinides, Nikolaos; Douris, Vassilis; Averof, Michalis
2011-06-01
Genetic model organisms such as Drosophila, C. elegans and the mouse provide formidable tools for studying mechanisms of development, physiology and behaviour. Established models alone, however, allow us to survey only a tiny fraction of the morphological and functional diversity present in the animal kingdom. Here, we present iTRAC, a versatile gene-trapping approach that combines the implementation of unbiased genetic screens with the generation of sophisticated genetic tools both in established and emerging model organisms. The approach utilises an exon-trapping transposon vector that carries an integrase docking site, allowing the targeted integration of new constructs into trapped loci. We provide proof of principle for iTRAC in the emerging model crustacean Parhyale hawaiensis: we generate traps that allow specific developmental and physiological processes to be visualised in unparalleled detail, we show that trapped genes can be easily cloned from an unsequenced genome, and we demonstrate targeting of new constructs into a trapped locus. Using this approach, gene traps can serve as platforms for generating diverse reporters, drivers for tissue-specific expression, gene knockdown and other genetic tools not yet imagined.
Continuous-flow synthesis of functionalized phenols by aerobic oxidation of Grignard reagents.
He, Zhi; Jamison, Timothy F
2014-03-24
Phenols are important compounds in the chemical industry. An economical and green approach to phenol preparation by the direct oxidation of aryl Grignard reagents using compressed air in continuous gas-liquid segmented flow systems is described. The process tolerates a broad range of functional groups, including oxidation-sensitive functionalities such as alkenes, amines, and thioethers. By integrating a benzyne-mediated in-line generation of arylmagnesium intermediates with the aerobic oxidation, a facile three-step, one-flow process, capable of preparing 2-functionalized phenols in a modular fashion, is established. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Reverse-phase protein arrays (RPPA) represent a powerful functional proteomic approach to elucidate cancer-related molecular mechanisms and to develop novel cancer therapies. To facilitate community-based investigation of the large-scale protein expression data generated by this platform, we have developed a user-friendly, open-access bioinformatic resource, The Cancer Proteome Atlas (TCPA, http://tcpaportal.org), which contains two separate web applications.
Debien, Laurent; Zard, Samir Z
2013-03-13
A new radical addition/C-C bond fragmentation process is reported. Vinyl carbinols derived from 2-methyl-2-phenylpropanal react with radicals generated from xanthates to give the corresponding ketones. The radical cleavage reaction proceeds under mild conditions, in good to high yield, and in the presence of the unprotected carbinol. Highly functionalized 1,5-diketones and pyridines are readily available using this approach.
Efficient fractal-based mutation in evolutionary algorithms from iterated function systems
NASA Astrophysics Data System (ADS)
Salcedo-Sanz, S.; Aybar-Ruíz, A.; Camacho-Gómez, C.; Pereira, E.
2018-03-01
In this paper we present a new mutation procedure for Evolutionary Programming (EP) approaches, based on Iterated Function Systems (IFSs). The proposed mutation procedure consists of considering a set of IFSs that are able to generate fractal structures in a two-dimensional phase space, and using them to modify a current individual of the EP algorithm, instead of using random numbers drawn from different probability density functions. We test this new proposal on a set of benchmark functions for continuous optimization problems, comparing the proposed mutation against classical Evolutionary Programming approaches with mutations based on Gaussian, Cauchy and chaotic maps. We also include a discussion of the IFS-based mutation in a real application of Tuned Mass Damper (TMD) location and optimization for vibration cancellation in buildings. In both practical cases, the proposed EP with the IFS-based mutation obtained extremely competitive results compared to alternative classical mutation operators.
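A minimal Python sketch of the idea described above, not the authors' implementation: instead of perturbing an individual with Gaussian or Cauchy noise, the perturbation is drawn from a two-dimensional fractal attractor produced by an IFS via the chaos game. The three affine contraction maps, the mutation scale, and the way attractor points are mapped onto the individual's coordinates are illustrative assumptions.

    import numpy as np

    def ifs_attractor(n_points, rng):
        # Points on a 2-D fractal attractor generated by the "chaos game" for a
        # random IFS of three affine contractions w(x) = A x + b (illustrative
        # maps, not those used in the paper).
        maps = [(0.4 * rng.uniform(-1.0, 1.0, (2, 2)), rng.uniform(-1.0, 1.0, 2))
                for _ in range(3)]
        x = np.zeros(2)
        pts = np.empty((n_points, 2))
        for i in range(n_points):
            A, b = maps[rng.integers(len(maps))]
            x = A @ x + b
            pts[i] = x
        return pts

    def ifs_mutation(individual, scale=0.1, rng=None):
        # Mutate a real-coded individual by adding attractor samples (taken
        # pairwise over its coordinates) instead of Gaussian or Cauchy noise.
        rng = np.random.default_rng() if rng is None else rng
        d = len(individual)
        pts = ifs_attractor((d + 1) // 2, rng)
        return individual + scale * pts.ravel()[:d]

    # Toy usage on a 10-dimensional candidate solution
    rng = np.random.default_rng(0)
    x = rng.uniform(-5.0, 5.0, 10)
    print(ifs_mutation(x, rng=rng))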
Unlocking Triticeae genomics to sustainably feed the future
Mochida, Keiichi; Shinozaki, Kazuo
2013-01-01
The tribe Triticeae includes the major crops wheat and barley. Within the last few years, the whole genomes of four Triticeae species—barley, wheat, Tausch’s goatgrass (Aegilops tauschii) and wild einkorn wheat (Triticum urartu)—have been sequenced. The availability of these genomic resources for Triticeae plants and innovative analytical applications using next-generation sequencing technologies are helping to revitalize our approaches in genetic work and to accelerate improvement of the Triticeae crops. Comparative genomics and integration of genomic resources from Triticeae plants and the model grass Brachypodium distachyon are aiding the discovery of new genes and functional analyses of genes in Triticeae crops. Innovative approaches and tools such as analysis of next-generation populations, evolutionary genomics and systems approaches with mathematical modeling are new strategies that will help us discover alleles for adaptive traits to future agronomic environments. In this review, we provide an update on genomic tools for use with Triticeae plants and Brachypodium and describe emerging approaches toward crop improvements in Triticeae. PMID:24204022
Genome-wide protein-protein interactions and protein function exploration in cyanobacteria
Lv, Qi; Ma, Weimin; Liu, Hui; Li, Jiang; Wang, Huan; Lu, Fang; Zhao, Chen; Shi, Tieliu
2015-01-01
Genome-wide network analysis is well suited to studying proteins of unknown function. Here, we explored protein functions and biological mechanisms based on an inferred high-confidence protein-protein interaction (PPI) network in cyanobacteria. We integrated data from seven different sources and predicted 1,997 PPIs, which were evaluated by experimental evidence of molecular mechanism, by text mining of the literature for direct/indirect evidence, and by conservation of "interologs". Combining the predicted PPIs with known PPIs, we obtained 4,715 non-redundant PPIs (involving 3,231 proteins and covering over 90% of the genome) to generate the PPI network. Based on the PPI network, Gene Ontology (GO) terms were assigned to function-unknown proteins. Functional modules were identified by dissecting the PPI network into sub-networks and analyzing pathway enrichment, with which we investigated novel functions of the underlying proteins in protein complexes and pathways. Examples from photosynthesis and DNA repair indicate that the network approach is a powerful tool for protein function analysis. Overall, this systems biology approach provides new insight into the posterior functional analysis of PPIs in cyanobacteria. PMID:26490033
SSME fault monitoring and diagnosis expert system
NASA Technical Reports Server (NTRS)
Ali, Moonis; Norman, Arnold M.; Gupta, U. K.
1989-01-01
An expert system, called LEADER, has been designed and implemented for automatic learning, detection, identification, verification, and correction of anomalous propulsion system operations in real time. LEADER employs a set of sensors to monitor engine component performance and to detect, identify, and validate abnormalities with respect to varying engine dynamics and behavior. Two diagnostic approaches are adopted in the architecture of LEADER. In the first approach, fault diagnosis is performed through learning and identifying engine behavior patterns. Utilizing this approach, LEADER generates a few hypotheses about the possible abnormalities. These hypotheses are then validated based on the SSME design and functional knowledge. The second approach directs the processing of engine sensory data and performs reasoning based on the SSME design, functional knowledge, and deep-level knowledge, i.e., the first principles (physics and mechanics) of SSME subsystems and components. This paper describes LEADER's architecture, which integrates a design-based reasoning approach with neural network-based fault pattern matching techniques. The fault diagnosis results obtained through the analyses of SSME ground test data are presented and discussed.
Structural frequency functions for an impulsive, distributed forcing function
NASA Technical Reports Server (NTRS)
Bateman, Vesta I.
1987-01-01
The response of a penetrator structure to a spatially distributed mechanical impulse with a magnitude approaching field test force levels (1-2 Mlb) was measured. The frequency response function calculated from the response to this unique forcing function is compared to frequency response functions calculated from the response to point forces of about 2000 pounds. The results show that the strain gages installed on the penetrator case respond similarly to a point axial force and to a spatially distributed axial force. This result suggests that the distributed axial force generated in a penetration event may be reconstructed as a point axial force when the penetrator behaves in a linear manner.
Nicola, Wilten; Tripp, Bryan; Scott, Matthew
2016-01-01
A fundamental question in computational neuroscience is how to connect a network of spiking neurons to produce desired macroscopic or mean field dynamics. One possible approach is through the Neural Engineering Framework (NEF). The NEF approach requires quantities called decoders which are solved through an optimization problem requiring large matrix inversion. Here, we show how a decoder can be obtained analytically for type I and certain type II firing rates as a function of the heterogeneity of its associated neuron. These decoders generate approximants for functions that converge to the desired function in mean-squared error like 1/N, where N is the number of neurons in the network. We refer to these decoders as scale-invariant decoders due to their structure. These decoders generate weights for a network of neurons through the NEF formula for weights. These weights force the spiking network to have arbitrary and prescribed mean field dynamics. The weights generated with scale-invariant decoders all lie on low dimensional hypersurfaces asymptotically. We demonstrate the applicability of these scale-invariant decoders and weight surfaces by constructing networks of spiking theta neurons that replicate the dynamics of various well known dynamical systems such as the neural integrator, Van der Pol system and the Lorenz system. As these decoders are analytically determined and non-unique, the weights are also analytically determined and non-unique. We discuss the implications for measured weights of neuronal networks. PMID:26973503
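For context, a minimal Python sketch of the standard NEF decoder computation that the abstract refers to, namely solving a regularized least-squares problem (the "large matrix inversion") for decoders of a target function. The rectified-linear tuning curves, the gain/bias ranges and the regularization constant are stand-in assumptions; the analytical scale-invariant decoders of the paper are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 200                                  # neurons
    x = np.linspace(-1.0, 1.0, 500)          # represented variable

    # Heterogeneous rectified-linear tuning curves (a stand-in for type I rates)
    gains = rng.uniform(0.5, 2.0, N)
    biases = rng.uniform(-1.0, 1.0, N)
    encoders = rng.choice([-1.0, 1.0], N)
    A = np.maximum(0.0, gains * (encoders * x[:, None]) + biases)   # (500, N)

    # Decoders for a target function f(x) = x**2 via regularized least squares
    target = x**2
    reg = 0.1 * N                            # regularization constant (assumed)
    G = A.T @ A + reg * np.eye(N)
    d = np.linalg.solve(G, A.T @ target)

    estimate = A @ d
    print("decoding RMSE:", np.sqrt(np.mean((estimate - target)**2)))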
Beigh, Mohammad Muzafar
2016-01-01
Humans have long suspected a relationship between heredity and disease. Only at the beginning of the last century did scientists begin to discover the connections between different genes and disease phenotypes. Recent trends in next-generation sequencing (NGS) technologies have brought great momentum to biomedical research, which in turn has remarkably augmented our basic understanding of human biology and its associated diseases. State-of-the-art next-generation biotechnologies have started making huge strides in our current understanding of the mechanisms of various chronic illnesses such as cancers, metabolic disorders, and neurodegenerative anomalies. We are experiencing a renaissance in biomedical research primarily driven by next-generation biotechnologies such as genomics, transcriptomics, proteomics, metabolomics, and lipidomics. Although genomic discoveries are at the forefront of next-generation omics technologies, their implementation in the clinical arena has been painstakingly slow, mainly because of high reaction costs and the unavailability of requisite computational tools for large-scale data analysis. However, rapid innovations and the steadily lowering cost of sequence-based chemistries, along with the development of advanced bioinformatics tools, have lately prompted the launch and implementation of large-scale massively parallel genome sequencing programs in fields ranging from medical genetics and infectious disease biology to agricultural sciences. Recent advances in large-scale omics technologies are bringing healthcare research beyond the traditional "bench to bedside" approach to more of a continuum that will include improvements in public healthcare and will be primarily based on a predictive, preventive, personalized, and participatory (P4) medicine approach. Recent large-scale research projects in genetic and infectious disease biology have indicated that massively parallel whole-genome/whole-exome sequencing, transcriptome analysis, and other functional genomic tools can reveal a large number of unique functional elements and/or markers that would otherwise go undetected by traditional sequencing methodologies. Therefore, the latest trends in biomedical research are giving birth to a new branch of medicine commonly referred to as personalized and/or precision medicine. Developments in the post-genomic era are expected to completely restructure the present clinical pattern of disease prevention and treatment, as well as methods of diagnosis and prognosis. The next important step toward the precision/personalized medicine approach should be its early adoption in clinics for future medical interventions. Consequently, in the coming years next-generation biotechnologies will reorient medical practice more toward disease prediction and prevention rather than curing diseases at later stages of their development and progression, even at wider population levels for the general public healthcare system. PMID:28930123
A new solution-adaptive grid generation method for transonic airfoil flow calculations
NASA Technical Reports Server (NTRS)
Nakamura, S.; Holst, T. L.
1981-01-01
The clustering algorithm is controlled by a second-order, ordinary differential equation which uses the airfoil surface density gradient as a forcing function. The solution to this differential equation produces a surface grid distribution which is automatically clustered in regions with large gradients. The interior grid points are established from this surface distribution by using an interpolation scheme which is fast and retains the desirable properties of the original grid generated from the standard elliptic equation approach.
Second-harmonic generation from a positive-negative index material heterostructure.
Mattiucci, Nadia; D'Aguanno, Giuseppe; Bloemer, Mark J; Scalora, Michael
2005-12-01
Resonant cavities have been widely used in the past to enhance a material's nonlinear response. Traditional mirrors include metallic films and distributed Bragg reflectors. In this paper we propose negative index material mirrors as a third alternative. With the help of a rigorous Green function approach, we investigate second harmonic generation from single and coupled cavities, and theoretically show that negative index material mirrors can raise the nonlinear conversion efficiency by at least four orders of magnitude compared to a bulk medium.
Enhanced protective role in materials with gradient structural orientations: Lessons from Nature.
Liu, Zengqian; Zhu, Yankun; Jiao, Da; Weng, Zhaoyong; Zhang, Zhefeng; Ritchie, Robert O
2016-10-15
Living organisms are adept at resisting contact deformation and damage by assembling protective surfaces with spatially varied mechanical properties, i.e., by creating functionally graded materials. Such gradients, together with multiple length-scale hierarchical structures, represent the two prime characteristics of many biological materials to be translated into engineering design. Here, we examine one design motif from a variety of biological tissues and materials where site-specific mechanical properties are generated for enhanced protection by adopting gradients in structural orientation over multiple length-scales, without manipulation of composition or microstructural dimension. Quantitative correlations are established between the structural orientations and local mechanical properties, such as stiffness, strength and fracture resistance; based on such gradients, the underlying mechanisms for the enhanced protective role of these materials are clarified. Theoretical analysis is presented and corroborated through numerical simulations of the indentation behavior of composites with distinct orientations. The design strategy of such bioinspired gradients is outlined in terms of the geometry of constituents. This study may offer a feasible approach towards generating functionally graded mechanical properties in synthetic materials for improved contact damage resistance. Published by Elsevier Ltd.
Assessing the genetic overlap between BMI and cognitive function
Marioni, R E; Yang, J; Dykiert, D; Mõttus, R; Campbell, A; Ibrahim-Verbaas, Carla A; Bressler, Jan; Debette, Stephanie; Schuur, Maaike; Smith, Albert V; Davies, Gail; Bennett, David A; Deary, Ian J; Ikram, M Arfan; Launer, Lenore J; Fitzpatrick, Annette L; Seshadri, Sudha; van Duijn, Cornelia M; Mosely Jr, Thomas H; Davies, G; Hayward, C; Porteous, D J; Visscher, P M; Deary, I J
2016-01-01
Obesity and low cognitive function are associated with multiple adverse health outcomes across the life course. They have a small phenotypic correlation (r=−0.11; high body mass index (BMI)−low cognitive function), but whether they have a shared genetic aetiology is unknown. We investigated the phenotypic and genetic correlations between the traits using data from 6815 unrelated, genotyped members of Generation Scotland, an ethnically homogeneous cohort from five sites across Scotland. Genetic correlations were estimated using the following: same-sample bivariate genome-wide complex trait analysis (GCTA)–GREML; independent samples bivariate GCTA–GREML using Generation Scotland for cognitive data and four other samples (n=20 806) for BMI; and bivariate LDSC analysis using the largest genome-wide association study (GWAS) summary data on cognitive function (n=48 462) and BMI (n=339 224) to date. The GWAS summary data were also used to create polygenic scores for the two traits, with within- and cross-trait prediction taking place in the independent Generation Scotland cohort. A large genetic correlation of −0.51 (s.e. 0.15) was observed using the same-sample GCTA–GREML approach compared with −0.10 (s.e. 0.08) from the independent-samples GCTA–GREML approach and −0.22 (s.e. 0.03) from the bivariate LDSC analysis. A genetic profile score using cognition-specific genetic variants accounts for 0.08% (P=0.020) of the variance in BMI and a genetic profile score using BMI-specific variants accounts for 0.42% (P=1.9 × 10−7) of the variance in cognitive function. Seven common genetic variants are significantly associated with both traits at P<5 × 10−5, which is significantly more than expected by chance (P=0.007). All these results suggest there are shared genetic contributions to BMI and cognitive function. PMID:26857597
NASA Technical Reports Server (NTRS)
Nguyen, Huy H.; Martin, Michael A.
2004-01-01
The two most common approaches used to formulate thermodynamic properties of pure substances are fundamental (or characteristic) equations of state (Helmholtz and Gibbs functions) and a piecemeal approach that is described in Adebiyi and Russell (1992). This paper neither presents a different method to formulate thermodynamic properties of pure substances nor validates the aforementioned approaches. Rather its purpose is to present a method to generate property tables from existing property packages and a method to facilitate the accurate interpretation of fluid thermodynamic property data from those tables. There are two parts to this paper. The first part of the paper shows how efficient and usable property tables were generated, with the minimum number of data points, using an aerospace industry standard property package. The second part describes an innovative interpolation technique that has been developed to properly obtain thermodynamic properties near the saturated liquid and saturated vapor lines.
Numerical integration of discontinuous functions: moment fitting and smart octree
NASA Astrophysics Data System (ADS)
Hubrich, Simeon; Di Stolfo, Paolo; Kudela, László; Kollmannsberger, Stefan; Rank, Ernst; Schröder, Andreas; Düster, Alexander
2017-11-01
A fast and simple grid generation can be achieved by non-standard discretization methods where the mesh does not conform to the boundary or the internal interfaces of the problem. However, this simplification leads to discontinuous integrands for intersected elements and, therefore, standard quadrature rules do not perform well anymore. Consequently, special methods are required for the numerical integration. To this end, we present two approaches to obtain quadrature rules for arbitrary domains. The first approach is based on an extension of the moment fitting method combined with an optimization strategy for the position and weights of the quadrature points. In the second approach, we apply the smart octree, which generates curved sub-cells for the integration mesh. To demonstrate the performance of the proposed methods, we consider several numerical examples, showing that the methods lead to efficient quadrature rules, resulting in less integration points and in high accuracy.
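A small Python sketch of the moment-fitting idea for a cut (discontinuous) element: fix the quadrature-point locations, compute reference moments of a polynomial basis over the intersected domain, and solve a least-squares system for the weights. The circular cut geometry, the degree-2 basis and the Monte Carlo evaluation of the reference moments are illustrative assumptions; the point-position optimization and smart-octree variants described above are not shown.

    import numpy as np
    from numpy.polynomial.legendre import leggauss

    # Intersected domain: the part of the cell [-1,1]^2 inside a circle of radius 0.8
    inside = lambda x, y: x**2 + y**2 <= 0.8**2

    # Polynomial basis up to degree 2 on the reference element
    basis = [lambda x, y: np.ones_like(x),
             lambda x, y: x, lambda x, y: y,
             lambda x, y: x * y, lambda x, y: x**2, lambda x, y: y**2]

    # Reference moments over the cut domain (Monte Carlo, for illustration only)
    rng = np.random.default_rng(0)
    xs, ys = rng.uniform(-1.0, 1.0, (2, 200_000))
    mask = inside(xs, ys)
    area_cell = 4.0
    moments = np.array([area_cell * np.mean(b(xs, ys) * mask) for b in basis])

    # Fixed quadrature points: tensor-product Gauss points of the (uncut) element
    g, _ = leggauss(3)
    X, Y = np.meshgrid(g, g)
    xq, yq = X.ravel(), Y.ravel()

    # Moment fitting: solve B w = moments for the weights in a least-squares sense
    B = np.array([b(xq, yq) for b in basis])          # (n_basis, n_points)
    w, *_ = np.linalg.lstsq(B, moments, rcond=None)

    # Check: integrate f(x, y) = x**2 + y**2 over the cut domain
    f = xq**2 + yq**2
    print("moment-fitted integral:", w @ f)
    print("reference (MC):        ", area_cell * np.mean((xs**2 + ys**2) * mask))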
Acetone-Linked Peptides: A Convergent Approach for Peptide Macrocyclization and Labeling.
Assem, Naila; Ferreira, David J; Wolan, Dennis W; Dawson, Philip E
2015-07-20
Macrocyclization is a broadly applied approach for overcoming the intrinsically disordered nature of linear peptides. Herein, it is shown that dichloroacetone (DCA) enhances helical secondary structures when introduced between peptide nucleophiles, such as thiols, to yield an acetone-linked bridge (ACE). Aside from stabilizing helical structures, the ketone moiety embedded in the linker can be modified with diverse molecular tags by oxime ligation. Insights into the structure of the tether were obtained through co-crystallization of a constrained S-peptide in complex with RNAse S. The scope of the acetone-linked peptides was further explored through the generation of N-terminus to side chain macrocycles and a new approach for generating fused macrocycles (bicycles). Together, these studies suggest that acetone linking is generally applicable to peptide macrocycles with a specific utility in the synthesis of stabilized helices that incorporate functional tags. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dimension reduction techniques for the integrative analysis of multi-omics data
Zeleznik, Oana A.; Thallinger, Gerhard G.; Kuster, Bernhard; Gholami, Amin M.
2016-01-01
State-of-the-art next-generation sequencing, transcriptomics, proteomics and other high-throughput ‘omics' technologies enable the efficient generation of large experimental data sets. These data may yield unprecedented knowledge about molecular pathways in cells and their role in disease. Dimension reduction approaches have been widely used in exploratory analysis of single omics data sets. This review will focus on dimension reduction approaches for simultaneous exploratory analyses of multiple data sets. These methods extract the linear relationships that best explain the correlated structure across data sets, the variability both within and between variables (or observations) and may highlight data issues such as batch effects or outliers. We explore dimension reduction techniques as one of the emerging approaches for data integration, and how these can be applied to increase our understanding of biological systems in normal physiological function and disease. PMID:26969681
Guo, Hao; Liu, Lei; Chen, Junjie; Xu, Yong; Jie, Xiang
2017-01-01
Functional magnetic resonance imaging (fMRI) is one of the most useful methods to generate functional connectivity networks of the brain. However, conventional network generation methods ignore dynamic changes of functional connectivity between brain regions. Previous studies proposed constructing high-order functional connectivity networks that consider the time-varying characteristics of functional connectivity, and a clustering method was performed to decrease computational cost. However, random selection of the initial clustering centers and the number of clusters negatively affected classification accuracy, and the network lost neurological interpretability. Here we propose a novel method that introduces the minimum spanning tree method to high-order functional connectivity networks. As an unbiased method, the minimum spanning tree simplifies high-order network structure while preserving its core framework. The dynamic characteristics of time series are not lost with this approach, and the neurological interpretation of the network is guaranteed. Simultaneously, we propose a multi-parameter optimization framework that involves extracting discriminative features from the minimum spanning tree high-order functional connectivity networks. Compared with the conventional methods, our resting-state fMRI classification method based on minimum spanning tree high-order functional connectivity networks greatly improved the diagnostic accuracy for Alzheimer's disease. PMID:29249926
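A minimal Python sketch of the minimum-spanning-tree step only, under the assumption that connectivity is summarized by Pearson correlation between regional time series; the high-order (sliding-window) network construction and the multi-parameter feature extraction described above are not reproduced.

    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree

    rng = np.random.default_rng(0)
    n_regions, n_timepoints = 30, 200
    ts = rng.standard_normal((n_regions, n_timepoints))   # stand-in regional fMRI series

    corr = np.corrcoef(ts)                                # functional connectivity
    dist = 1.0 - np.abs(corr)                             # stronger connection -> shorter edge
    np.fill_diagonal(dist, 0.0)

    mst = minimum_spanning_tree(dist)                     # sparse (n_regions x n_regions)
    edges = np.transpose(mst.nonzero())
    print("MST edges:", len(edges))                       # n_regions - 1
    print("total tree length:", mst.sum())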
NASA Astrophysics Data System (ADS)
Genxu, W.
2017-12-01
There is a lack of knowledge about how to quantify runoff generation and the hydrological processes operating in permafrost-dominated catchments. To understand the mechanisms of runoff generation in permafrost catchments, a typical headwater catchment with continuous permafrost on the Tibetan Plateau was monitored. A new approach is presented in this study to account for runoff processes during the spring thawing period and the autumn freezing period, when runoff generation clearly differs from that of non-permafrost catchments. This approach introduces a soil temperature-based water saturation function and modifies the soil water storage curve with a soil temperature threshold. The results show that surface soil thawing-induced saturation excess runoff and subsurface interflow account for approximately 66-86% and 14-34% of total spring runoff, respectively, and that soil temperature significantly affects the runoff generation pattern, the runoff composition and the runoff coefficient as the active layer enlarges. The suprapermafrost groundwater discharge decreases exponentially as the active layer freezes during the autumn runoff recession, whereas the ratio of groundwater discharge to total runoff and the direct surface runoff coefficient simultaneously increase. The bidirectional freezing of the active layer controls and changes the autumn runoff processes and runoff composition. The new approach could be used to further develop hydrological models of cold regions dominated by permafrost.
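A toy Python illustration of what a soil temperature-based water saturation function modifying the storage curve might look like; the logistic form, the threshold and the parameter values are assumptions made for illustration, not the formulation used in the study.

    import numpy as np

    def unfrozen_fraction(soil_temp_c, t_threshold=0.0, k=2.0):
        # Smooth 0-1 factor for the fraction of soil water storage that is thawed;
        # the logistic form, threshold and steepness k are assumptions.
        return 1.0 / (1.0 + np.exp(-k * (soil_temp_c - t_threshold)))

    def effective_storage(max_storage_mm, soil_temp_c):
        # Soil water storage curve scaled by the thaw state of the active layer.
        return max_storage_mm * unfrozen_fraction(soil_temp_c)

    for t in (-3.0, -0.5, 0.0, 1.0, 4.0):
        print(t, "degC ->", round(effective_storage(120.0, t), 1), "mm")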
NASA Astrophysics Data System (ADS)
Satyaramesh, P. V.
2014-01-01
This paper presents an application of finite n-person non-cooperative game theory for analyzing the bidding strategies of generators in a deregulated energy marketplace with pool-bilateral contracts so as to maximize their net profits. A new methodology for building bidding strategies for generators participating in an oligopoly electricity market is proposed. It is assumed that each generator bids a supply function; the methodology finds the coefficients in the generators' supply functions that maximize benefits in an environment of competing rival bidders. A natural choice for developing strategies is a Nash equilibrium (NE) model incorporating mixed strategies for solving the bidding problem of the electricity market. Associated optimal profits are evaluated for combinations of the generators' pure bidding strategies, and a payoff matrix is constructed. The optimal payoff is calculated using the NE. An attempt is also made to minimize the gap between the optimal payoff and the payoff obtained by a feasible mixed-strategy combination. The algorithm is coded in MATLAB. A numerical example is used to illustrate the essential features of the approach, and the results are shown to be optimal.
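As a toy illustration of the payoff-matrix and mixed-strategy Nash equilibrium step, the Python sketch below computes the interior mixed equilibrium of a 2x2 bimatrix game via the usual indifference conditions. The two generators, their two pure bidding strategies and the profit numbers are invented for illustration and are unrelated to the paper's supply-function model or MATLAB code.

    import numpy as np

    # Toy profit (payoff) matrices for two generators, each with two pure bidding
    # strategies (e.g. "low markup" vs "high markup" on the supply-function
    # coefficients). The numbers are invented for illustration.
    A = np.array([[30.0, 50.0],    # generator 1's profit
                  [40.0, 20.0]])
    B = np.array([[30.0, 40.0],    # generator 2's profit
                  [50.0, 20.0]])

    def mixed_ne_2x2(A, B):
        # Interior mixed-strategy Nash equilibrium of a 2x2 bimatrix game via the
        # indifference conditions (assumes such an equilibrium exists).
        p = (B[1, 1] - B[1, 0]) / (B[0, 0] - B[0, 1] - B[1, 0] + B[1, 1])  # P(gen 1 plays 0)
        q = (A[1, 1] - A[0, 1]) / (A[0, 0] - A[1, 0] - A[0, 1] + A[1, 1])  # P(gen 2 plays 0)
        return p, q

    p, q = mixed_ne_2x2(A, B)
    x, y = np.array([p, 1.0 - p]), np.array([q, 1.0 - q])
    print("mixed strategies:", x, y)
    print("expected profits:", x @ A @ y, x @ B @ y)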
Multiple exciton generation in chiral carbon nanotubes: Density functional theory based computation
NASA Astrophysics Data System (ADS)
Kryjevski, Andrei; Mihaylov, Deyan; Kilina, Svetlana; Kilin, Dmitri
2017-10-01
We use a Boltzmann transport equation (BE) to study time evolution of a photo-excited state in a nanoparticle including phonon-mediated exciton relaxation and the multiple exciton generation (MEG) processes, such as exciton-to-biexciton multiplication and biexciton-to-exciton recombination. BE collision integrals are computed using Kadanoff-Baym-Keldysh many-body perturbation theory based on density functional theory simulations, including exciton effects. We compute internal quantum efficiency (QE), which is the number of excitons generated from an absorbed photon in the course of the relaxation. We apply this approach to chiral single-wall carbon nanotubes (SWCNTs), such as (6,2) and (6,5). We predict efficient MEG in the (6,2) and (6,5) SWCNTs within the solar spectrum range starting at the 2Eg energy threshold and with QE reaching ˜1.6 at about 3Eg, where Eg is the electronic gap.
Trajectory generation for an on-road autonomous vehicle
NASA Astrophysics Data System (ADS)
Horst, John; Barbera, Anthony
2006-05-01
We describe an algorithm that generates a smooth trajectory (position, velocity, and acceleration at uniformly sampled instants of time) for a car-like vehicle autonomously navigating within the constraints of lanes in a road. The technique models both vehicle paths and lane segments as straight line segments and circular arcs for mathematical simplicity and elegance, which we contrast with cubic spline approaches. We develop the path in an idealized space, warp the path into real space and compute path length, generate a one-dimensional trajectory along the path length that achieves target speeds and positions, and finally, warp, translate, and rotate the one-dimensional trajectory points onto the path in real space. The algorithm moves a vehicle in lane safely and efficiently within speed and acceleration maximums. The algorithm functions in the context of other autonomous driving functions within a carefully designed vehicle control hierarchy.
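A simplified Python sketch in the spirit of the description above: a one-dimensional trajectory along the path length that respects speed and acceleration limits (a trapezoidal profile), then mapped onto a circular-arc lane segment. The profile shape, the sampling step and the arc parameters are assumptions, not the authors' algorithm.

    import numpy as np

    def trapezoidal_profile(path_length, v_max, a_max, dt=0.1):
        # 1-D trajectory s(t) along the path length with a trapezoidal speed
        # profile: accelerate at a_max, cruise at v_peak, decelerate at a_max.
        v_peak = min(v_max, np.sqrt(a_max * path_length))   # may never reach v_max
        t_acc = v_peak / a_max
        s_acc = 0.5 * a_max * t_acc**2
        t_cruise = max(0.0, (path_length - 2.0 * s_acc) / v_peak)
        t_total = 2.0 * t_acc + t_cruise
        t = np.linspace(0.0, t_total, int(np.ceil(t_total / dt)) + 1)
        s = np.where(t < t_acc, 0.5 * a_max * t**2,
            np.where(t < t_acc + t_cruise, s_acc + v_peak * (t - t_acc),
                     path_length - 0.5 * a_max * (t_total - t)**2))
        return t, s

    def arc_pose(s, radius, center=(0.0, 0.0)):
        # Map arc length s on a circular (counter-clockwise) lane segment to
        # position and heading in the plane.
        theta = s / radius
        x = center[0] + radius * np.cos(theta)
        y = center[1] + radius * np.sin(theta)
        heading = theta + np.pi / 2.0
        return x, y, heading

    t, s = trapezoidal_profile(path_length=40.0, v_max=8.0, a_max=2.0)
    x, y, psi = arc_pose(s, radius=25.0)
    print(len(t), "samples, final arc length:", s[-1])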
Joint sparse learning for 3-D facial expression generation.
Song, Mingli; Tao, Dacheng; Sun, Shengpeng; Chen, Chun; Bu, Jiajun
2013-08-01
3-D facial expression generation, including synthesis and retargeting, has received intensive attention in recent years, because it is important to produce realistic 3-D faces with specific expressions in modern film production and computer games. In this paper, we present joint sparse learning (JSL) to learn mapping functions and their respective inverses to model the relationship between the high-dimensional 3-D faces (of different expressions and identities) and their corresponding low-dimensional representations. Based on JSL, we can effectively and efficiently generate various expressions of a 3-D face by either synthesizing or retargeting. Furthermore, JSL is able to restore 3-D faces with holes by learning a mapping function between incomplete and intact data. Experimental results on a wide range of 3-D faces demonstrate the effectiveness of the proposed approach by comparing with representative ones in terms of quality, time cost, and robustness.
Creation of structured documentation templates using Natural Language Processing techniques.
Kashyap, Vipul; Turchin, Alexander; Morin, Laura; Chang, Frank; Li, Qi; Hongsermeier, Tonya
2006-01-01
Structured Clinical Documentation is a fundamental component of the healthcare enterprise, linking both clinical (e.g., electronic health record, clinical decision support) and administrative functions (e.g., evaluation and management coding, billing). One of the challenges in creating good quality documentation templates has been the inability to address specialized clinical disciplines and adapt to local clinical practices. A one-size-fits-all approach leads to poor adoption and inefficiencies in the documentation process. On the other hand, the cost associated with manual generation of documentation templates is significant. Consequently there is a need for at least partial automation of the template generation process. We propose an approach and methodology for the creation of structured documentation templates for diabetes using Natural Language Processing (NLP).
Figure-ground segmentation based on class-independent shape priors
NASA Astrophysics Data System (ADS)
Li, Yang; Liu, Yang; Liu, Guojun; Guo, Maozu
2018-01-01
We propose a method to generate figure-ground segmentation by incorporating shape priors into the graph-cuts algorithm. Given an image, we first obtain a linear representation of an image and then apply directional chamfer matching to generate class-independent, nonparametric shape priors, which provide shape clues for the graph-cuts algorithm. We then enforce shape priors in a graph-cuts energy function to produce object segmentation. In contrast to previous segmentation methods, the proposed method shares shape knowledge for different semantic classes and does not require class-specific model training. Therefore, the approach obtains high-quality segmentation for objects. We experimentally validate that the proposed method outperforms previous approaches using the challenging PASCAL VOC 2010/2012 and Berkeley (BSD300) segmentation datasets.
NASA Technical Reports Server (NTRS)
Sharma, Naveen
1992-01-01
In this paper we briefly describe a combined symbolic and numeric approach for solving mathematical models on parallel computers. An experimental software system, PIER, is being developed in Common Lisp to synthesize computationally intensive and domain formulation dependent phases of finite element analysis (FEA) solution methods. Quantities for domain formulation like shape functions, element stiffness matrices, etc., are automatically derived using symbolic mathematical computations. The problem specific information and derived formulae are then used to generate (parallel) numerical code for FEA solution steps. A constructive approach to specify a numerical program design is taken. The code generator compiles application oriented input specifications into (parallel) FORTRAN77 routines with the help of built-in knowledge of the particular problem, numerical solution methods and the target computer.
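A small illustration of the combined symbolic/numeric idea using SymPy as a stand-in (PIER itself is written in Common Lisp and emits FORTRAN77): shape functions and the element stiffness matrix of a 1-D linear bar element are derived symbolically and then turned into a numerical routine.

    import sympy as sp

    xi, L, E, A = sp.symbols('xi L E A', positive=True)

    # Linear shape functions on the reference element xi in [0, 1]
    N = sp.Matrix([1 - xi, xi])
    B = N.diff(xi) / L                 # strain-displacement matrix dN/dx

    # Element stiffness K = integral of B^T E A B dx over the element, dx = L dxi
    K = (E * A * L * (B * B.T)).applyfunc(lambda e: sp.integrate(e, (xi, 0, 1)))
    print(K)                           # -> E*A/L * [[1, -1], [-1, 1]]

    # "Code generation" step: turn the symbolic matrix into a numerical routine
    k_func = sp.lambdify((E, A, L), K, 'numpy')
    print(k_func(210e9, 1e-4, 2.0))    # steel bar, 1 cm^2 cross-section, 2 m long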
ERIC Educational Resources Information Center
Hotrum, Michael
2005-01-01
The traditional packaging of electronic learning--the learning management system (LMS)--is progressively being regarded as a hindrance to effective online learning. Its design, functionality, complexity, price, and value are being questioned. A new generation of Web-based tools and approaches is evolving that are better suited to meet the need for…
Greener Biomimetic Approach to the Synthesis of Nanomaterials and Nanocomposite
A brief account of greener production of nanoparticles which reduces or eliminates the use and generation of hazardous substances is presented. The utility of vitamins B1 and B2, which can function both as reducing and capping agents, provides an extremely simple, one-pot, greene...
The Impact of Large, Multi-Function/Multi-Site Competitions
2003-08-01
this approach generates larger savings and improved service quality, and is less expensive to implement. Moreover, it is a way to meet the President's ... of the study is to assess the degree to which large-scale competitions completed have resulted in increased savings and service quality and decreased
Genome sequencing efforts in the past decade were aimed at generating draft sequences of many prokaryotic and eukaryotic model organisms. Successful completion of unicellular eukaryotes, worm, fly and human genome have opened up the new field of molecular biology and function...
The presentation summarizes our sustainable synthetic activity for the preparation of nanoparticles involving benign alternatives which reduces or eliminates the use and generation of hazardous substances. Vitamins B1, B2, C, and tea and wine polyphenols which function both as r...
Janssen-Müller, Daniel; Singha, Santanu; Olyschläger, Theresa; Daniliuc, Constantin G; Glorius, Frank
2016-09-02
The activation of 2-(bromomethyl)benzaldehydes using N-heterocyclic carbenes represents a novel approach to the generation of o-quinodimethane (o-QDM) intermediates. Coupling with ketones such as phenylglyoxylates, isatins, or trifluoromethyl ketones via [4 + 2] annulation gives access to functionalized 1-isochromanones.
Contemporary approaches to modulating the nitric oxide-cGMP pathway in cardiovascular disease
Kraehling, Jan R.; Sessa, William C.
2017-01-01
Endothelial cells lining the vessel wall control important aspects of vascular homeostasis. In particular, the production of endothelium-derived nitric oxide and activation of soluble guanylate cyclase promotes endothelial quiescence and governs vasomotor function and proportional remodeling of blood vessels. Here, we discuss novel approaches to improve endothelial nitric oxide generation and preserve its bioavailability. We also discuss therapeutic opportunities aimed at activation of soluble guanylate cyclase for multiple cardiovascular indications. PMID:28360348
NASA Astrophysics Data System (ADS)
Oz, Alon; Hershkovitz, Shany; Tsur, Yoed
2014-11-01
In this contribution we present a novel approach to analyze impedance spectroscopy measurements of supercapacitors. Transforming the impedance data into frequency-dependent capacitance allows us to use Impedance Spectroscopy Genetic Programming (ISGP) in order to find the distribution function of relaxation times (DFRT) of the processes taking place in the tested device. Synthetic data was generated in order to demonstrate this technique and a model for supercapacitor ageing process has been obtained.
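A minimal Python sketch of the first step mentioned above, transforming impedance data into a frequency-dependent (complex) capacitance via C(ω) = 1/(jωZ(ω)); the synthetic series R-C data are an assumption used only to exercise the function, and the ISGP/DFRT analysis itself is not reproduced.

    import numpy as np

    def complex_capacitance(freq_hz, z_complex):
        # Frequency-dependent complex capacitance C(w) = 1 / (j w Z(w)).
        omega = 2.0 * np.pi * np.asarray(freq_hz)
        return 1.0 / (1j * omega * np.asarray(z_complex))

    # Synthetic stand-in data: ideal series R-C with R = 50 mOhm and C = 10 F
    f = np.logspace(-2, 3, 50)
    w = 2.0 * np.pi * f
    Z = 0.05 + 1.0 / (1j * w * 10.0)

    C = complex_capacitance(f, Z)
    print("low-frequency real capacitance [F]:", C.real[0])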
NASA Technical Reports Server (NTRS)
Litvin, F.; Chen, J.; Seol, I.; Kim, D.; Lu, J.; Zhao, X.; Handschuh, R.
1996-01-01
A general approach developed for the computerized simulation of loaded gear drives is presented. In this paper the methodology used to localize the bearing contact, provide a parabolic function of transmission errors, and simulate meshing and contact of unloaded gear drives is developed. The approach developed is applied to spur and helical gears, spiral bevel gears, face-gear drives, and worm-gear drives with cylindrical worms.
Generating Dynamic Persistence in the Time Domain
NASA Astrophysics Data System (ADS)
Guerrero, A.; Smith, L. A.; Smith, L. A.; Kaplan, D. T.
2001-12-01
Many dynamical systems exhibit long-range correlations. Physically, these systems range from biological to economic, and include geological and urban systems. Important geophysical candidates for this type of behaviour include weather (or climate) and earthquake sequences. Persistence is characterised by a slowly decaying correlation function that, in theory, never dies out. The persistence exponent reflects the degree of memory in the system, and much effort has been expended creating and analysing methods that successfully estimate this parameter and model data that exhibit persistence. The most widely used methods for generating long-correlated time series are not dynamical systems in the time domain, but are instead derived from a given spectral density. Little attention has been paid to modelling persistence in the time domain. The time-domain approach has the advantage that an observation at a certain time can be calculated from previous observations, which is particularly suitable when investigating the predictability of a long-memory process. We describe two of these methods in the time domain. One is a traditional approach using fractional ARIMA (autoregressive and moving average) models; the second uses a novel approach to extending a given series using random Fourier basis functions. The statistical quality of the two methods is compared, and they are contrasted with weather data which reportedly show persistence. The suitability of this approach both for estimating predictability and for making predictions is discussed.
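A brief Python sketch of the first time-domain method mentioned, generating a long-memory FARIMA(0, d, 0) series from the truncated MA(∞) expansion of the fractional differencing operator; the truncation length, the burn-in and the value of d are illustrative assumptions.

    import numpy as np

    def farima_0d0(n, d, rng=None, burn=500):
        # Long-memory FARIMA(0, d, 0) series generated directly in the time domain
        # from the truncated MA(infinity) expansion of (1 - B)**(-d):
        #   psi_0 = 1,  psi_k = psi_{k-1} * (k - 1 + d) / k
        rng = np.random.default_rng() if rng is None else rng
        m = n + burn
        psi = np.empty(m)
        psi[0] = 1.0
        for k in range(1, m):
            psi[k] = psi[k - 1] * (k - 1 + d) / k
        eps = rng.standard_normal(m)
        x = np.convolve(eps, psi)[:m]          # x_t = sum_k psi_k * eps_{t-k}
        return x[burn:]

    series = farima_0d0(4096, d=0.3, rng=np.random.default_rng(0))
    print(series[:5])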
Modular architectures for quantum networks
NASA Astrophysics Data System (ADS)
Pirker, A.; Wallnöfer, J.; Dür, W.
2018-05-01
We consider the problem of generating multipartite entangled states in a quantum network upon request. We follow a top-down approach, where the required entanglement is initially present in the network in form of network states shared between network devices, and then manipulated in such a way that the desired target state is generated. This minimizes generation times, and allows for network structures that are in principle independent of physical links. We present a modular and flexible architecture, where a multi-layer network consists of devices of varying complexity, including quantum network routers, switches and clients, that share certain resource states. We concentrate on the generation of graph states among clients, which are resources for numerous distributed quantum tasks. We assume minimal functionality for clients, i.e. they do not participate in the complex and distributed generation process of the target state. We present architectures based on shared multipartite entangled Greenberger–Horne–Zeilinger states of different size, and fully connected decorated graph states, respectively. We compare the features of these architectures to an approach that is based on bipartite entanglement, and identify advantages of the multipartite approach in terms of memory requirements and complexity of state manipulation. The architectures can handle parallel requests, and are designed in such a way that the network state can be dynamically extended if new clients or devices join the network. For generation or dynamical extension of the network states, we propose a quantum network configuration protocol, where entanglement purification is used to establish high fidelity states. The latter also allows one to show that the entanglement generated among clients is private, i.e. the network is secure.
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive material detection. Compared with commonly adopted detection methods based on statistical theory, the sequential Bayesian approach offers the advantage of shorter verification time during the analysis of spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator based on Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
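A toy Python stand-in for the sequential Bayesian processor: the posterior probability that a source is present is updated event by event from γ-ray inter-arrival times, assuming Poisson arrivals with a background-only rate under H0 and a background-plus-source rate under H1. The rates, prior and simulated event sequence are assumptions, not Candy's model or the LaBr3(Ce) detector parameters.

    import numpy as np

    def sequential_detection(interarrival_s, bkg_rate, src_rate, prior=0.5):
        # Event-by-event Bayesian update of P(source present), treating arrivals
        # as a Poisson process with rate bkg_rate (H0) or bkg_rate + src_rate (H1).
        def loglike(dt, rate):
            return np.log(rate) - rate * dt    # exponential inter-arrival density
        log_odds = np.log(prior / (1.0 - prior))
        posteriors = []
        for dt in interarrival_s:
            log_odds += loglike(dt, bkg_rate + src_rate) - loglike(dt, bkg_rate)
            posteriors.append(1.0 / (1.0 + np.exp(-log_odds)))
        return np.array(posteriors)

    rng = np.random.default_rng(0)
    dts = rng.exponential(1.0 / 12.0, 200)     # simulated sequence, true rate 12 /s
    post = sequential_detection(dts, bkg_rate=5.0, src_rate=6.0)
    print("posterior after 200 events:", post[-1])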
NASA Astrophysics Data System (ADS)
Garrison Kinney, R.; Tjutrins, Jevgenijs; Torres, Gerardo M.; Liu, Nina Jiabao; Kulkarni, Omkar; Arndtsen, Bruce A.
2018-02-01
The development of metal-catalysed methods to functionalize inert C-H bonds has become a dominant research theme in the past decade as an approach to efficient synthesis. However, the incorporation of carbon monoxide into such reactions to form valuable ketones has to date proved a challenge, despite its potential as a straightforward and green alternative to Friedel-Crafts reactions. Here we describe a new approach to palladium-catalysed C-H bond functionalization in which carbon monoxide is used to drive the generation of high-energy electrophiles. This offers a method to couple the useful features of metal-catalysed C-H functionalization (stable and available reagents) and electrophilic acylations (broad scope and selectivity), and synthesize ketones simply from aryl iodides, CO and arenes. Notably, the reaction proceeds in an intermolecular fashion, without directing groups and at very low palladium-catalyst loadings. Mechanistic studies show that the reaction proceeds through the catalytic build-up of potent aroyl triflate electrophiles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witt, Adam M; Smith, Brennan T
Small hydropower plants supply reliable renewable energy to the grid, though few new plants have been developed in the United States over the past few decades due to complex environmental challenges and poor project economics. This paper describes the current landscape of small hydropower development, and introduces a new approach to facility design that co-optimizes the extraction of hydroelectric power from a stream with other important environmental functions such as fish, sediment, and recreational passage. The approach considers hydropower facilities as an integrated system of standardized interlocking modules, designed to sustain stream functions, generate power, and interface with the streambed. It is hypothesized that this modular eco-design approach, when guided by input from the broader small hydropower stakeholder community, can lead to cost savings across the facility, reduced licensing and approval timelines, and ultimately, to enhanced resiliency through improved environmental performance over the lifetime of the project.
Ocklenburg, Sebastian; Hugdahl, Kenneth; Westerhausen, René
2013-12-01
Functional hemispheric asymmetries of speech production and perception are a key feature of the human language system, but their neurophysiological basis is still poorly understood. Using a combined fMRI and tract-based spatial statistics approach, we investigated the relation of microstructural asymmetries in language-relevant white matter pathways and functional activation asymmetries during silent verb generation and passive listening to spoken words. Tract-based spatial statistics revealed several leftward asymmetric clusters in the arcuate fasciculus and uncinate fasciculus that were differentially related to activation asymmetries in the two functional tasks. Frontal and temporal activation asymmetries during silent verb generation were positively related to the strength of specific microstructural white matter asymmetries in the arcuate fasciculus. In contrast, microstructural uncinate fasciculus asymmetries were related to temporal activation asymmetries during passive listening. These findings suggest that white matter asymmetries may indeed be one of the factors underlying functional hemispheric asymmetries. Moreover, they also show that specific localized white matter asymmetries might be of greater relevance for functional activation asymmetries than microstructural features of whole pathways. © 2013.
Combining Approach in Stages with Least Squares for fits of data in hyperelasticity
NASA Astrophysics Data System (ADS)
Beda, Tibi
2006-10-01
The present work concerns a method of piecewise (block-by-block) continuous approximation of a continuous function: an approximation method combining the Approach in Stages with finite-domain Least Squares. The identification procedure works by sub-domains: basic generating functions are determined step by step, permitting their weighting effects to be assessed. This procedure allows one to remain in control of the signs, and to some extent of the optimal values, of the estimated parameters, and consequently it provides a unique set of solutions that should represent the real physical parameters. Illustrations and comparisons are developed for rubber hyperelastic modeling. To cite this article: T. Beda, C. R. Mecanique 334 (2006).
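A rough Python sketch of a stage-wise, sub-domain least-squares identification in the spirit described above (not the authors' exact scheme): each generating function's weight is fitted in turn on the current residual, restricted to its own sub-domain. The basis functions, sub-domain boundaries and toy stress-stretch data are assumptions.

    import numpy as np

    def staged_least_squares(x, y, basis_funcs, subdomains):
        # Stage-wise identification: each generating function's weight is fitted
        # by least squares on the current residual, restricted to its sub-domain.
        coeffs = np.zeros(len(basis_funcs))
        residual = y.astype(float).copy()
        for i, (phi, (lo, hi)) in enumerate(zip(basis_funcs, subdomains)):
            mask = (x >= lo) & (x <= hi)
            col = phi(x[mask])
            coeffs[i] = (col @ residual[mask]) / (col @ col)   # 1-D least squares
            residual = residual - coeffs[i] * phi(x)
        return coeffs

    # Toy data loosely resembling a hyperelastic stress-stretch response
    stretch = np.linspace(1.01, 3.0, 200)
    stress = 0.6 * (stretch - 1.0 / stretch**2) + 0.05 * (stretch**3 - 1.0)
    basis = [lambda s: s - 1.0 / s**2, lambda s: s**3 - 1.0]
    print(staged_least_squares(stretch, stress, basis,
                               subdomains=[(1.01, 1.8), (1.8, 3.0)]))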
Curchod, Basile F E; Penfold, Thomas J; Rothlisberger, Ursula; Tavernelli, Ivano
2013-01-01
The implementation of local control theory using nonadiabatic molecular dynamics within the framework of linear-response time-dependent density functional theory is discussed. The method is applied to study the photoexcitation of lithium fluoride, for which we demonstrate that this approach can efficiently generate a pulse, on-the-fly, able to control the population transfer between two selected electronic states. Analysis of the computed control pulse yields insights into the photophysics of the process identifying the relevant frequencies associated to the curvature of the initial and final state potential energy curves and their energy differences. The limitations inherent to the use of the trajectory surface hopping approach are also discussed.
Energy absorption by a magnetic nanoparticle suspension in a rotating field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raikher, Yu. L.; Stepanov, V. I., E-mail: stepanov@icmm.ru
Heat generation by viscous dissipation in a dilute suspension of single-domain ferromagnetic particles in a rotating magnetic field is analyzed by assuming that the suspended particles have a high magnetic rigidity. The problem is solved by using a kinetic approach based on a rotational diffusion equation. Behavior of specific loss power (SLP) as a function of field strength H and frequency ω is examined at constant temperature. SLP increases as either of these parameters squared when the other is constant, eventually approaching a saturation value. The function SLP(H, ω) can be used to determine optimal and admissible ranges of magnetically induced heating.
Engine Data Interpretation System (EDIS)
NASA Technical Reports Server (NTRS)
Cost, Thomas L.; Hofmann, Martin O.
1990-01-01
A prototype of an expert system was developed which applies qualitative or model-based reasoning to the task of post-test analysis and diagnosis of data resulting from a rocket engine firing. A combined component-based and process theory approach is adopted as the basis for system modeling. Such an approach provides a framework for explaining both normal and deviant system behavior in terms of individual component functionality. The diagnosis function is applied to digitized sensor time-histories generated during engine firings. The generic system is applicable to any liquid rocket engine but was adapted specifically in this work to the Space Shuttle Main Engine (SSME). The system is applied to idealized data resulting from turbomachinery malfunction in the SSME.
On parameters identification of computational models of vibrations during quiet standing of humans
NASA Astrophysics Data System (ADS)
Barauskas, R.; Krušinskienė, R.
2007-12-01
Vibration of the center of pressure (COP) of the human body on the base of support during quiet standing is a very popular clinical investigation, which provides useful information about the physical and health condition of an individual. In this work, vibrations of the COP of a human body in the forward-backward direction during quiet standing are generated using a controlled inverted pendulum (CIP) model with a single degree of freedom (dof), supplied with a proportional, integral and differential (PID) controller that represents the behavior of the central nervous system, and excited by a cumulative disturbance vibration generated within the body due to breathing or any other physical condition. The identification of the model and disturbance parameters is an important stage in creating a close-to-reality computational model able to evaluate features of the disturbance. The aim of this study is to present a CIP model parameter identification approach based on the information captured by the time series of the COP signal. The identification procedure is based on the minimization of an error function. The error function is formulated in terms of the time laws of the computed and experimentally measured COP vibrations. As an alternative, the error function is formulated in terms of the stabilogram diffusion function (SDF). The minimization of the error functions is carried out by employing methods based on sensitivity functions of the error with respect to the model and excitation parameters. The sensitivity functions are obtained by using variational techniques. The inverse dynamics approach has been employed in order to establish the properties of the disturbance time laws ensuring satisfactory coincidence of the measured and computed COP vibration laws. The main difficulty of the investigated problem is encountered during the model validation stage: generally, neither the PID controller parameter set nor the disturbance time law is known in advance. In this work, an error function formulated in terms of the time derivative of the disturbance torque has been proposed in order to obtain the PID controller parameters, as well as the reference time law of the disturbance. The disturbance torque is calculated from experimental data using the inverse dynamics approach. Experiments presented in this study revealed that the disturbance torque vibrations and PID controller parameters identified by the method may be qualified as feasible in humans. The presented approach may be easily extended to structural models with any number of dofs or higher structural complexity.
Durmaz, Arda; Henderson, Tim A D; Brubaker, Douglas; Bebek, Gurkan
2017-01-01
Large scale genomics studies have generated comprehensive molecular characterizations of numerous cancer types. Subtypes for many tumor types have been established; however, these classifications are based on the molecular characteristics of small gene sets with limited power to detect dysregulation at the patient level. We hypothesize that frequent graph mining of pathways to gather pathways functionally relevant to tumors can characterize tumor types and provide opportunities for personalized therapies. In this study we present an integrative omics approach to group patients based on their altered pathway characteristics and show prognostic differences within breast cancer (p < 9.57E-10) and glioblastoma multiforme (p < 0.05) patients. We were able to validate this approach in secondary RNA-Seq datasets with p < 0.05 and p < 0.01, respectively. We also performed pathway enrichment analysis to further investigate the biological relevance of dysregulated pathways. We compared our approach with network-based classifier algorithms and showed that our unsupervised approach generates more robust and biologically relevant clustering, whereas previous approaches failed to report specific functions for similar patient groups or to classify patients into prognostic groups. These results could serve as a means to improve prognosis for future cancer patients and to provide opportunities for improved treatment options and personalized interventions. The proposed novel graph mining approach is able to integrate PPI networks with gene expression in a biologically sound manner and to cluster patients into clinically distinct groups. We have utilized breast cancer and glioblastoma multiforme datasets from microarray and RNA-Seq platforms and identified disease mechanisms differentiating samples. Supplementary methods, figures, tables and code are available at https://github.com/bebeklab/dysprog.
Wang, Shige; Zhu, Jingyi; Shen, Mingwu; Zhu, Meifang; Shi, Xiangyang
2014-02-12
We report a facile and general approach to using generation 2 (G2) poly(amidoamine) (PAMAM) dendrimers for simultaneous stabilization and functionalization of electrospun poly(γ-glutamic acid) nanofibers (γ-PGA NFs). In this study, uniform γ-PGA NFs with a smooth morphology were generated using electrospinning technology. In order to endow the NFs with good water stability, amine-terminated G2.NH2 PAMAM dendrimers were utilized to crosslink the γ-PGA NFs via 1-ethyl-3-(3-dimethylaminopropyl) carbodiimide coupling chemistry. Under the optimized crosslinking conditions, G2.NH2 dendrimers partially modified with fluorescein isothiocyanate (FI) or folic acid (FA) were used to crosslink γ-PGA NFs. Our results reveal that G2.NH2-FI is able to simultaneously render the NFs with good water stability and fluorescence property, while G2.NH2-FA is able to simultaneously endow the NFs with water stability and the ability to capture FA receptor-overexpressing cancer cells in vitro via ligand-receptor interaction. With the tunable dendrimer surface chemistry, multifunctional water-stable γ-PGA-based NFs may be generated via a dendrimer crosslinking approach, thereby providing diverse applications in the areas of biosensing, tissue engineering, drug delivery, and environmental sciences.
Random field assessment of nanoscopic inhomogeneity of bone
Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu
2010-01-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of the elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, the generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic detail. PMID:20817128
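For orientation, the following short Python sketch generates one realization of a Gaussian random field with an exponential covariance function on a small grid, in the spirit of the simulated modulus maps described above; the grid size, mean modulus, standard deviation and correlation length are hypothetical values, not the ones used in the study.

import numpy as np

n, L = 30, 1000.0                    # grid points per side, patch size [nm]
mean_E, sd_E, corr_len = 20.0, 4.0, 500.0   # mean modulus [GPa], SD [GPa], correlation length [nm]

x = np.linspace(0.0, L, n)
X, Y = np.meshgrid(x, x)
pts = np.column_stack([X.ravel(), Y.ravel()])

# Exponential covariance: C(r) = sd^2 * exp(-r / corr_len)
r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
C = sd_E ** 2 * np.exp(-r / corr_len)

# Sample one realization of the field (Cholesky factorization with a small jitter for stability).
Lchol = np.linalg.cholesky(C + 1e-8 * np.eye(n * n))
E_map = mean_E + Lchol @ np.random.default_rng(1).standard_normal(n * n)
E_map = E_map.reshape(n, n)
print("simulated modulus range [GPa]:", E_map.min().round(2), "-", E_map.max().round(2))

Varying corr_len in such a sketch changes how quickly the modulus decorrelates across the lamella, which is the quantity the study estimates from the experimental maps.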
NASA Astrophysics Data System (ADS)
Gardner, Robin P.; Xu, Libai
2009-10-01
The Center for Engineering Applications of Radioisotopes (CEAR) has been working for over a decade on the Monte Carlo library least-squares (MCLLS) approach for treating non-linear radiation analyzer problems including: (1) prompt gamma-ray neutron activation analysis (PGNAA) for bulk analysis, (2) energy-dispersive X-ray fluorescence (EDXRF) analyzers, and (3) carbon/oxygen tool analysis in oil well logging. This approach essentially consists of using Monte Carlo simulation to generate the libraries of all the elements to be analyzed plus any other required background libraries. These libraries are then used in the linear library least-squares (LLS) approach with unknown sample spectra to analyze for all elements in the sample. Iterations of this are used until the LLS values agree with the composition used to generate the libraries. The current status of the methods (and topics) necessary to implement the MCLLS approach is reported. This includes: (1) the Monte Carlo codes such as CEARXRF, CEARCPG, and CEARCO for forward generation of the necessary elemental library spectra for the LLS calculation for X-ray fluorescence, neutron capture prompt gamma-ray analyzers, and carbon/oxygen tools; (2) the correction of spectral pulse pile-up (PPU) distortion by Monte Carlo simulation with the code CEARIPPU; (3) generation of detector response functions (DRF) for detectors with linear and non-linear responses for Monte Carlo simulation of pulse-height spectra; and (4) the use of the differential operator (DO) technique to make the necessary iterations for non-linear responses practical. In addition to commonly analyzed single spectra, coincidence spectra or even two-dimensional (2-D) coincidence spectra can also be used in the MCLLS approach and may provide more accurate results.
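The linear library least-squares step at the core of the MCLLS scheme can be illustrated with a toy Python example; the Gaussian "library" shapes and elemental amounts below are synthetic stand-ins, not output of the CEAR Monte Carlo codes.

import numpy as np

rng = np.random.default_rng(2)
channels, elements = 256, 4
centers = [40, 90, 150, 210]
x = np.arange(channels)
libraries = np.array([np.exp(-0.5 * ((x - c) / 8.0) ** 2) for c in centers])  # element response shapes

true_amounts = np.array([3.0, 1.5, 0.0, 2.2])
spectrum = true_amounts @ libraries + rng.normal(0.0, 0.05, channels)         # "measured" spectrum

# Linear library least squares: amounts = argmin || spectrum - amounts @ libraries ||^2
amounts, *_ = np.linalg.lstsq(libraries.T, spectrum, rcond=None)
print("estimated elemental amounts:", np.round(amounts, 2))
# In the full MCLLS scheme these estimates would be fed back into the Monte Carlo model,
# new libraries generated, and the fit repeated until the composition converges.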
HaloTag technology for specific and covalent labeling of fusion proteins.
Benink, Hélène A; Urh, Marjeta
2015-01-01
Appending proteins of interest to fluorescent protein tags such as GFP has revolutionized how proteins are studied in the cellular environment. Over the last few decades many varieties of fluorescent proteins have been generated, each bringing new capability to research. However, taking full advantage of standard fluorescent proteins with advanced and differential features requires significant effort on the part of the researcher. This approach necessitates that many genetic fusions be generated and confirmed to function properly in cells with the same protein of interest. To lessen this burden, a newer category of protein fusion tags termed "self-labeling protein tags" has been developed. This approach utilizes a single protein tag, the function of which can be altered by attaching various chemical moieties (fluorescent labels, affinity handles, etc.). In this way a single genetically encoded protein fusion can easily be given functional diversity and adaptability as supplied by synthetic chemistry. Here we present protein labeling methods using HaloTag technology, which comprises the HaloTag protein and a collection of small molecules designed to bind it specifically and provide it with varied functionalities. For imaging purposes these small molecules, termed HaloTag ligands, contain distinct fluorophores. Due to covalent and rapid binding between HaloTag protein and its ligands, labeling is permanent and efficient. Many of these ligands have been optimized for permeability across cellular membranes allowing for live cell labeling and imaging analysis. Nonpermeable ligands have also been developed for specific labeling of surface proteins. Overall, HaloTag is a versatile technology that empowers the end user to label a protein of interest with the choice of different fluorophores while alleviating the need for generation of multiple genetic fusions.
NASA Astrophysics Data System (ADS)
Teichmann, Juliane; Nitschke, Mirko; Pette, Dagmar; Valtink, Monika; Gramm, Stefan; Härtel, Frauke V.; Noll, Thomas; Funk, Richard H. W.; Engelmann, Katrin; Werner, Carsten
2015-08-01
Two established material systems for thermally stimulated detachment of adherent cells were combined in a cross-linked polymer blend to merge favorable properties. Through this approach poly(N-isopropylacrylamide) (PNiPAAm) with its superior switching characteristic was paired with a poly(vinyl methyl ether)-based composition that allows adjusting physico-chemical and biomolecular properties in a wide range. Beyond pure PNiPAAm, the proposed thermo-responsive coating provides thickness, stiffness and swelling behavior, as well as an apposite density of reactive sites for biomolecular functionalization, as effective tuning parameters to meet specific requirements of a particular cell type regarding initial adhesion and ease of detachment. To illustrate the strength of this approach, the novel cell culture carrier was applied to generate transplantable sheets of human corneal endothelial cells (HCEC). Sheets were grown, detached, and transferred onto planar targets. Cell morphology, viability and functionality were analyzed by immunocytochemistry and determination of transepithelial electrical resistance (TEER) before and after sheet detachment and transfer. HCEC layers showed regular morphology with appropriate TEER. Cells were positive for function-associated marker proteins ZO-1, Na+/K+-ATPase, and paxillin, and extracellular matrix proteins fibronectin, laminin and collagen type IV before and after transfer. Sheet detachment and transfer did not impair cell viability. Subsequently, a potential application in ophthalmology was demonstrated by transplantation onto de-endothelialized porcine corneas in vitro. The novel thermo-responsive cell culture carrier facilitates the generation and transfer of functional HCEC sheets. This paves the way to generate tissue engineered human corneal endothelium as an alternative transplant source for endothelial keratoplasty.
Andrade, Xavier; Aspuru-Guzik, Alán
2013-10-08
We discuss the application of graphical processing units (GPUs) to accelerate real-space density functional theory (DFT) calculations. To make our implementation efficient, we have developed a scheme to expose the data parallelism available in the DFT approach; this is applied to the different procedures required for a real-space DFT calculation. We present results for current-generation GPUs from AMD and Nvidia, which show that our scheme, implemented in the free code Octopus, can reach a sustained performance of up to 90 GFlops for a single GPU, representing a significant speed-up when compared to the CPU version of the code. Moreover, for some systems, our implementation can outperform a GPU Gaussian basis set code, showing that the real-space approach is a competitive alternative for DFT simulations on GPUs.
Reverse Genetics and High Throughput Sequencing Methodologies for Plant Functional Genomics
Ben-Amar, Anis; Daldoul, Samia; Reustle, Götz M.; Krczal, Gabriele; Mliki, Ahmed
2016-01-01
In the post-genomic era, increasingly sophisticated genetic tools are being developed with the long-term goal of understanding how the coordinated activity of genes gives rise to a complex organism. With the advent of next-generation sequencing combined with effective computational approaches, a wide variety of plant species have been fully sequenced, yielding a wealth of sequence information on the structure and organization of plant genomes. Since thousands of gene sequences are already known, recently developed functional genomics approaches provide powerful tools to analyze plant gene functions through various gene manipulation technologies. Integration of different omics platforms along with gene annotation and computational analysis may elucidate a complete view at the systems biology level. Extensive investigations of reverse genetics methodologies have been deployed for assigning biological function to a specific gene or gene product. We provide here an updated overview of these high-throughput strategies, highlighting recent advances in the knowledge of functional genomics in plants. PMID:28217003
Worklist handling in workflow-enabled radiological application systems
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens
2000-05-01
For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
Li, Haiqing; Song, Sing I; Song, Ga Young; Kim, Il
2014-02-01
Carbon nanostructures (CNSs) such as carbon nanotubes, graphene sheets, and nanodiamonds provide an important type of substrate for constructing a variety of hybrid nanomaterials. However, their intrinsically inert surface chemistry makes it indispensable to pre-functionalize them prior to immobilizing additional components onto their surfaces. Currently developed strategies for functionalizing CNSs include covalent and non-covalent approaches. Conventional covalent treatments often damage the structural integrity of carbon surfaces and adversely affect their physical properties. In contrast, the non-covalent approach offers a non-destructive way to modify CNSs with desired functional surfaces while preserving their intrinsic properties. Thus far, a number of surface modifiers including aromatic compounds, small-molecule surfactants, amphiphilic polymers, and biomacromolecules have been developed to non-covalently functionalize CNS surfaces. Mediated by these surface modifiers, various functional components such as organic species and inorganic nanoparticles have been further decorated onto their surfaces, resulting in versatile carbon-based hybrid nanomaterials with broad applications in chemical engineering and biomedical areas. Here, the recent advances in the generation of such hybrid nanostructures based on non-covalently functionalized CNSs are reviewed.
Cirera, S; Clop, A; Jacobsen, M J; Guerin, M; Lesnik, P; Jørgensen, C B; Fredholm, M; Karlskov-Mortensen, P
2018-04-01
Taste receptors (TASRs) and appetite and reward (AR) mechanisms influence eating behaviour, which in turn affects food intake and the risk of obesity. In a previous study, we used next generation sequencing to identify potentially functional mutations in TASR and AR genes and found indications of genetic associations between identified variants and growth and fat deposition in a subgroup of animals (n = 38) from the UNIK resource pig population. This population was created for studying obesity and obesity-related diseases. In the present study we validated results from our previous study by investigating genetic associations between 24 selected single nucleotide variants in TASR and AR genes and 35 phenotypes describing obesity and metabolism in the entire UNIK population (n = 564). Fifteen variants showed significant association with specific obesity-related phenotypes after Bonferroni correction. Six of the 15 genes, namely SIM1, FOS, TAS2R4, TAS2R9, MCHR2 and LEPR, showed good correlation between known biological function and associated phenotype. We verified a genetic association between potentially functional variants in TASR/AR genes and growth/obesity and conclude that the combination of identification of potentially functional variants by next generation sequencing followed by targeted genotyping and association studies is a powerful and cost-effective approach for increasing the power of genetic association studies. © 2018 Stichting International Foundation for Animal Genetics.
KANEKO-ISHINO, Tomoko; ISHINO, Fumitoshi
2015-01-01
Mammals, including human beings, have evolved a unique viviparous reproductive system and a highly developed central nervous system. How did these unique characteristics emerge in mammalian evolution, and what kinds of changes occurred in mammalian genomes as evolution proceeded? A key conceptual term in approaching these issues is “mammalian-specific genomic functions”, a concept covering both mammalian-specific epigenetics and genetics. Genomic imprinting and LTR retrotransposon-derived genes are reviewed as representative mammalian-specific genomic functions that are essential not only for the current mammalian developmental system, but also for mammalian evolution itself. First, the essential roles of genomic imprinting in mammalian development, especially in relation to viviparous reproduction via placental function, as well as the emergence of genomic imprinting in mammalian evolution, are discussed. Second, we introduce the novel concept of “mammalian-specific traits generated by mammalian-specific genes from LTR retrotransposons”, based on the finding that LTR retrotransposons served as a critical driving force in mammalian evolution by generating mammalian-specific genes. PMID:26666304
NASA Astrophysics Data System (ADS)
Martínez-Lucas, G.; Pérez-Díaz, J. I.; Sarasúa, J. I.; Cavazzini, G.; Pavesi, G.; Ardizzon, G.
2017-04-01
This paper presents a dynamic simulation model of a laboratory-scale pumped-storage power plant (PSPP) operating in pumping mode with variable speed. The model considers the dynamic behavior of the conduits by means of an elastic water column approach, and synthetically generates both pressure and torque pulsations that reproduce the operation of the hydraulic machine in its instability region. The pressure and torque pulsations are generated each from a different set of sinusoidal functions. These functions were calibrated from the results of a CFD model, which was in turn validated from experimental data. Simulation model results match the numerical results of the CFD model with reasonable accuracy. The pump-turbine model (the functions used to generate pressure and torque pulsations inclusive) was up-scaled by hydraulic similarity according to the design parameters of a real PSPP and included in a dynamic simulation model of the said PSPP. Preliminary conclusions on the impact of unstable operation conditions on the penstock fatigue were obtained by means of a Monte Carlo simulation-based fatigue analysis.
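A minimal sketch of the signal-synthesis idea is given below, assuming the pulsations are written as a sum of sinusoids at multiples of the runner rotation frequency; the frequencies, amplitudes and phases are placeholders, whereas in the paper the corresponding coefficients are calibrated from CFD results.

import numpy as np

f_runner = 6.25                      # runner rotation frequency [Hz] (hypothetical)
harmonics = [1, 3, 9]                # multiples of the rotation frequency carrying the pulsation
amp_p = [0.04, 0.02, 0.01]           # pressure pulsation amplitudes [per-unit head]
amp_t = [0.03, 0.015, 0.008]         # torque pulsation amplitudes [per-unit torque]
phases = [0.0, 0.7, 1.9]             # relative phases [rad]

def pulsation(t, amps):
    """Synthetic pulsation signal: a sum of sinusoids around the mean operating point."""
    return sum(a * np.sin(2 * np.pi * k * f_runner * t + ph)
               for k, a, ph in zip(harmonics, amps, phases))

t = np.linspace(0.0, 2.0, 4000)
p = 1.0 + pulsation(t, amp_p)        # per-unit pressure at the pump-turbine inlet
tq = 1.0 + pulsation(t, amp_t)       # per-unit shaft torque
print("peak-to-peak pressure pulsation [p.u.]:", round(p.max() - p.min(), 3))

In the paper's workflow such synthetic pressure and torque series are what feed the elastic-water-column conduit model and, after up-scaling, the penstock fatigue analysis.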
Tidball, Andrew M; Dang, Louis T; Glenn, Trevor W; Kilbane, Emma G; Klarr, Daniel J; Margolis, Joshua L; Uhler, Michael D; Parent, Jack M
2017-09-12
Specifically ablating genes in human induced pluripotent stem cells (iPSCs) allows for studies of gene function as well as disease mechanisms in disorders caused by loss-of-function (LOF) mutations. While techniques exist for engineering such lines, we have developed and rigorously validated a method of simultaneous iPSC reprogramming while generating CRISPR/Cas9-dependent insertions/deletions (indels). This approach allows for the efficient and rapid formation of genetic LOF human disease cell models with isogenic controls. The rate of mutagenized lines was strikingly consistent across experiments targeting four different human epileptic encephalopathy genes and a metabolic enzyme-encoding gene, and was more efficient and consistent than using CRISPR gene editing of established iPSC lines. The ability of our streamlined method to reproducibly generate heterozygous and homozygous LOF iPSC lines with passage-matched isogenic controls in a single step provides for the rapid development of LOF disease models with ideal control lines, even in the absence of patient tissue. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
2007-02-01
[Fragmentary record; only partial text was recovered. It mentions saponin (Calbiochem, San Diego, CA) in PBS, the generation of several different fluorescent probes to be conjugated to a function-blocking beta-1 integrin antibody (P4C10, Life Technologies) for live-cell work, and the characterization of the prostate-specific membrane antigen (PSMA) in tissue extracts and body fluids (Int. J. Cancer 62:552-558, 1995).]
In vitro activation of the neuro-transduction mechanism in sensitive organotypic human skin model.
Martorina, Francesca; Casale, Costantino; Urciuolo, Francesco; Netti, Paolo A; Imparato, Giorgia
2017-01-01
Recent advances in tissue engineering have encouraged researchers to pursue the production of fully functional three-dimensional (3D) thick human tissues in vitro. Here, we report the fabrication of a fully innervated human skin tissue in vitro that recapitulates and replicates skin sensory function. Previous attempts to innervate in vitro 3D skin models did not demonstrate an effective functionality of the nerve network. In our approach, we initially engineer functional human skin tissue based on fibroblast-generated dermis and differentiated epidermis; then, we promote rat dorsal root ganglion (DRG) neuron axon ingrowth in the de novo developed tissue. The neurofilament network infiltrates the entire native dermis extracellular matrix (ECM), as demonstrated by immunofluorescence and second harmonic generation (SHG) imaging. To prove the sensing functionality of the tissue, we use topical applications of capsaicin, an agonist of the transient receptor potential vanilloid 1 (TRPV1) channel, and quantify the calcium currents resulting from variations of Ca++ concentration in DRG neurons innervating our model. Calcium current generation demonstrates functional cross-talk between the dermis and epidermis compartments. Moreover, through a computational fluid dynamics (CFD) analysis, we set fluid dynamic conditions for non-planar skin equivalent growth, as proof of potential application in creating skin grafts tailored on demand to the in vivo wound shape. Copyright © 2016 Elsevier Ltd. All rights reserved.
Beam Conditioning and Harmonic Generation in Free ElectronLasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charman, A.E.; Penn, G.; Wolski, A.
2004-07-05
The next generation of large-scale free-electron lasers (FELs) such as Euro-XFEL and LCLS are to be devices which produce coherent X-rays using Self-Amplified Spontaneous Emission (SASE). The performance of these devices is limited by the spread in longitudinal velocities of the beam. In the case where this spread arises primarily from large transverse oscillation amplitudes, beam conditioning can significantly enhance FEL performance. Future X-ray sources may also exploit harmonic generation starting from laser-seeded modulation. Preliminary analysis of such devices is discussed, based on a novel trial-function/variational-principle approach, which shows good agreement with more lengthy numerical simulations.
On estimating the effects of clock instability with flicker noise characteristics
NASA Technical Reports Server (NTRS)
Wu, S. C.
1981-01-01
Two approaches are presented. In the first, a scheme for flicker noise generation is given. The second approach is that of successive segmentation: a clock fluctuation is represented by 2N piecewise linear segments and then converted into a summation of N+1 triangular pulse train functions. The statistics of the clock instability are then formulated in terms of two-sample variances at N+1 specified averaging times. The summation converges so rapidly that a value of N = 6 is seldom necessary. An application to radio interferometric geodesy shows excellent agreement between the two approaches. Limitations to and the relative merits of the two approaches are discussed.
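For context, the two-sample (Allan) variance that the segmentation approach works with can be computed directly from sampled frequency data; the following Python sketch is the standard textbook estimator applied to synthetic white frequency noise, not the paper's segmentation scheme.

import numpy as np

def allan_variance(y, m):
    """Two-sample (Allan) variance of fractional-frequency data y at averaging time m*tau0."""
    # Average the data over blocks of m samples, then take half the mean squared successive difference.
    n_blocks = len(y) // m
    block_means = y[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(block_means) ** 2)

# Toy data: white frequency noise, for which the Allan variance should fall off roughly as 1/tau.
rng = np.random.default_rng(3)
y = rng.normal(0.0, 1e-12, 100_000)   # fractional-frequency samples
tau0 = 1.0                            # sampling interval [s]
for m in (1, 10, 100, 1000):
    adev = np.sqrt(allan_variance(y, m))
    print(f"tau = {m * tau0:6.0f} s   Allan deviation = {adev:.2e}")

For flicker frequency noise the Allan deviation would instead level off to a plateau, which is the signature the paper's triangular-pulse-train representation is designed to reproduce.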
Optical Imaging and Radiometric Modeling and Simulation
NASA Technical Reports Server (NTRS)
Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.
2010-01-01
OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented in MATLAB with a Fourier-optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digitally sampled versions of functions ranging from Zernike polynomials, to combinations of sinusoidal wave functions in 2D, to functions generated from a spatial frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes the target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge diffusion modulation transfer function (MTF).
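As an illustration of the Fourier-optics PSF pipeline described above (pupil, Zernike-based WFE map, PSF), a stripped-down Python sketch might look as follows; the aberration coefficients and grid sizes are arbitrary and this is not OPTOOL code.

import numpy as np

n = 256
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r, th = np.hypot(x, y), np.arctan2(y, x)
pupil = (r <= 1.0).astype(float)                       # circular aperture

# Wavefront-error map from two low-order Zernike-like terms (coefficients in waves, hypothetical).
wfe = (0.05 * (2 * r ** 2 - 1) + 0.03 * r ** 2 * np.cos(2 * th)) * pupil   # defocus + astigmatism

def psf(wfe_waves, pad=4):
    """PSF as the squared modulus of the Fourier transform of the complex pupil function."""
    field = np.zeros((pad * n, pad * n), dtype=complex)
    field[:n, :n] = pupil * np.exp(2j * np.pi * wfe_waves)
    return np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2

strehl = psf(wfe).max() / psf(np.zeros_like(wfe)).max()
print("approximate Strehl ratio:", round(strehl, 3))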
NASA Astrophysics Data System (ADS)
Flores-Marquez, Leticia Elsa; Ramirez Rojaz, Alejandro; Telesca, Luciano
2015-04-01
Two statistical approaches are analyzed for two different types of data sets: the seismicity generated by the subduction processes that occurred at the south Pacific coast of Mexico between 2005 and 2012, and synthetic seismic data generated by a stick-slip experimental model. The statistical methods used in the present study are the visibility graph, to investigate the time dynamics of the series, and the scaled probability density function in the natural time domain, to investigate the critical order of the system. The purpose of this comparison is to show the similarities between the dynamical behaviors of both types of data sets from the point of view of critical systems. The observed behaviors allow us to conclude that the experimental setup globally reproduces the behavior observed when the same statistical approaches are applied to the seismicity of the subduction zone. The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
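For readers unfamiliar with the first of the two methods, the natural visibility graph can be built from a time series with a few lines of Python; the event series below is synthetic and the implementation is a generic O(n^3) version, not the one used in the study.

import numpy as np

def visibility_graph(series):
    """Natural visibility graph: nodes are samples; i and j are linked if the straight line
    between (i, y_i) and (j, y_j) passes above every intermediate sample."""
    edges = set()
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

# Toy "event magnitude" series standing in for an earthquake or stick-slip sequence.
rng = np.random.default_rng(4)
magnitudes = rng.exponential(1.0, 60)
g = visibility_graph(magnitudes)
degrees = np.bincount([v for e in g for v in e], minlength=len(magnitudes))
print("mean degree:", degrees.mean().round(2), " max degree:", degrees.max())

The degree distribution of such a graph is the quantity typically compared between the field seismicity and the laboratory stick-slip sequences.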
Kim, Su Kyoung; Kirchner, Elsa Andrea; Stefes, Arne; Kirchner, Frank
2017-12-14
Reinforcement learning (RL) enables robots to learn their optimal behavioral strategy in dynamic environments based on feedback. Explicit human feedback during robot RL is advantageous, since an explicit reward function can be easily adapted. However, it is very demanding and tiresome for a human to continuously and explicitly generate feedback. Therefore, the development of implicit approaches is of high relevance. In this paper, we used an error-related potential (ErrP), an event-related activity in the human electroencephalogram (EEG), as an intrinsically generated implicit feedback (reward) for RL. Initially we validated our approach with seven subjects in a simulated robot learning scenario. ErrPs were detected online in single trials with a balanced accuracy (bACC) of 91%, which was sufficient to learn to recognize gestures and the correct mapping between human gestures and robot actions in parallel. Finally, we validated our approach in a real robot scenario, in which seven subjects freely chose gestures and the real robot correctly learned the mapping between gestures and actions (ErrP detection: 90% bACC). In this paper, we demonstrated that intrinsically generated EEG-based human feedback in RL can successfully be used to implicitly improve gesture-based robot control during human-robot interaction. We call our approach intrinsic interactive RL.
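The learning loop can be pictured with a toy Python sketch in which a simulated ErrP classifier with 90% balanced accuracy supplies a binary implicit reward for a bandit-style update of the gesture-to-action mapping; the state/action sizes, learning rate and noise model are illustrative assumptions, not the authors' setup.

import numpy as np

rng = np.random.default_rng(5)
n_gestures, n_actions = 4, 4
true_mapping = rng.permutation(n_actions)     # the mapping the robot is supposed to discover
Q = np.zeros((n_gestures, n_actions))
alpha, epsilon = 0.3, 0.2

def errp_feedback(correct, accuracy=0.9):
    """Implicit reward: +1 if no error potential is detected, -1 otherwise (noisy at 1 - accuracy)."""
    observed_correct = correct if rng.random() < accuracy else not correct
    return 1.0 if observed_correct else -1.0

for step in range(2000):
    g = rng.integers(n_gestures)
    a = rng.integers(n_actions) if rng.random() < epsilon else int(Q[g].argmax())
    r = errp_feedback(a == true_mapping[g])
    Q[g, a] += alpha * (r - Q[g, a])          # single-step (bandit-style) value update

learned = Q.argmax(axis=1)
print("true mapping:   ", true_mapping)
print("learned mapping:", learned, " correct:", np.array_equal(learned, true_mapping))

Because the ErrP-derived reward is only noisily related to correctness, the expected reward still favors the correct action, which is why learning succeeds despite imperfect single-trial detection.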
Investigation of MM-PBSA rescoring of docking poses.
Thompson, David C; Humblet, Christine; Joseph-McCarthy, Diane
2008-05-01
Target-based virtual screening is increasingly used to generate leads for targets for which high quality three-dimensional (3D) structures are available. To allow large molecular databases to be screened rapidly, a tiered scoring scheme is often employed whereby a simple scoring function is used as a fast filter of the entire database and a more rigorous and time-consuming scoring function is used to rescore the top hits to produce the final list of ranked compounds. Molecular mechanics Poisson-Boltzmann surface area (MM-PBSA) approaches are currently thought to be quite effective at incorporating implicit solvation into the estimation of ligand binding free energies. In this paper, the ability of a high-throughput MM-PBSA rescoring function to discriminate between correct and incorrect docking poses is investigated in detail. Various initial scoring functions are used to generate docked poses for a subset of the CCDC/Astex test set and to dock one set of actives/inactives from the DUD data set. The effectiveness of each of these initial scoring functions is discussed. Overall, the ability of the MM-PBSA rescoring function to (i) regenerate the set of X-ray complexes when docking the bound conformation of the ligand, (ii) regenerate the X-ray complexes when docking conformationally expanded databases for each ligand which include "conformation decoys" of the ligand, and (iii) enrich known actives in a virtual screen for the mineralocorticoid receptor in the presence of "ligand decoys" is assessed. While a pharmacophore-based molecular docking approach, PhDock, is used to carry out the docking, the results are expected to be general to use with any docking method.
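For orientation, the MM-PBSA estimate referred to above is usually written in the following generic form (a textbook decomposition, not the specific protocol or parameterization used in the paper):

\[
\Delta G_{\mathrm{bind}} \;\approx\; \langle G_{\mathrm{complex}}\rangle \;-\; \langle G_{\mathrm{receptor}}\rangle \;-\; \langle G_{\mathrm{ligand}}\rangle,
\qquad
G \;=\; E_{\mathrm{MM}} \;+\; G_{\mathrm{PB}} \;+\; \gamma\,\mathrm{SASA} \;-\; T S_{\mathrm{conf}},
\]

where \(E_{\mathrm{MM}}\) is the molecular-mechanics energy, \(G_{\mathrm{PB}}\) the polar solvation free energy obtained from the Poisson-Boltzmann equation, \(\gamma\,\mathrm{SASA}\) the nonpolar solvation term proportional to the solvent-accessible surface area, and \(T S_{\mathrm{conf}}\) an optional configurational-entropy correction; the angle brackets denote averages over the docked or simulated conformations being rescored.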
NASA Astrophysics Data System (ADS)
Johnson, D. J.; Needham, J.; Xu, C.; Davies, S. J.; Bunyavejchewin, S.; Giardina, C. P.; Condit, R.; Cordell, S.; Litton, C. M.; Hubbell, S.; Kassim, A. R. B.; Shawn, L. K. Y.; Nasardin, M. B.; Ong, P.; Ostertag, R.; Sack, L.; Tan, S. K. S.; Yap, S.; McDowell, N. G.; McMahon, S.
2016-12-01
Terrestrial carbon cycling is a function of the growth and survival of trees. Current model representations of tree growth and survival at a global scale rely on coarse plant functional traits that are parameterized very generally. In view of the large biodiversity in the tropical forests, it is important that we account for the functional diversity in order to better predict tropical forest responses to future climate changes. Several next generation Earth System Models are moving towards a size-structured, trait-based approach to modelling vegetation globally, but the challenge of which and how many traits are necessary to capture forest complexity remains. Additionally, the challenge of collecting sufficient trait data to describe the vast species richness of tropical forests is enormous. We propose a more fundamental approach to these problems by characterizing forests by their patterns of survival. We expect our approach to distill real-world tree survival into a reasonable number of functional types. Using 10 large-area tropical forest plots that span geographic, edaphic and climatic gradients, we model tree survival as a function of tree size for hundreds of species. We found that surprisingly few categories of size-survival functions emerge. This indicates that some fundamental strategies are at play across diverse forests, constraining the range of possible size-survival functions. Initial cluster analysis indicates that four to eight functional forms are necessary to describe variation in size-survival relations. Temporal variation in size-survival functions can be related to local environmental variation, allowing us to parameterize how demographically similar groups of species respond to perturbations in the ecosystem. We believe this methodology will yield a synthetic approach to classifying forest systems that will greatly reduce uncertainty and complexity in global vegetation models.
Zhang, Nan; Membreno, Edward; Raj, Susan; Zhang, Hongjie; Khan, Liakot A; Gobel, Verena
2017-10-03
The four C. elegans excretory canals are narrow tubes extended through the length of the animal from a single cell, with almost equally far extended intracellular endotubes that build and stabilize the lumen with a membrane and submembraneous cytoskeleton of apical character. The excretory cell expands its length approximately 2,000 times to generate these canals, making this model unique for the in vivo assessment of de novo polarized membrane biogenesis, intracellular lumen morphogenesis and unicellular tubulogenesis. The protocol presented here shows how to combine standard labeling, gain- and loss-of-function genetic or RNA interference (RNAi)-, and microscopic approaches to use this model to visually dissect and functionally analyze these processes on a molecular level. As an example of a labeling approach, the protocol outlines the generation of transgenic animals with fluorescent fusion proteins for live analysis of tubulogenesis. As an example of a genetic approach, it highlights key points of a visual RNAi-based interaction screen designed to modify a gain-of-function cystic canal phenotype. The specific methods described are how to: label and visualize the canals by expressing fluorescent proteins; construct a targeted RNAi library and strategize RNAi screening for the molecular analysis of canal morphogenesis; visually assess modifications of canal phenotypes; score them by dissecting fluorescence microscopy; characterize subcellular canal components at higher resolution by confocal microscopy; and quantify visual parameters. The approach is useful for the investigator who is interested in taking advantage of the C. elegans excretory canal for identifying and characterizing genes involved in the phylogenetically conserved processes of intracellular lumen and unicellular tube morphogenesis.
Multi-paradigm simulation at nanoscale: Methodology and application to functional carbon material
NASA Astrophysics Data System (ADS)
Su, Haibin
2012-12-01
Multiparadigm methods that span the scales from quantum mechanics to practical issues of functional nanoassembly and nanofabrication are enabling first-principles predictions to guide and complement experimental developments, by computationally designing and optimizing material compositions and structures to assemble nanoscale systems with the requisite properties. In this talk, we employ multi-paradigm approaches to investigate functional carbon materials with versatile character, including fullerene, carbon nanotubes (CNTs), graphene, and related hybrid structures, which have already created an enormous impact on next-generation nanodevices. The topics will cover the reaction dynamics of C60 dimerization and the more challenging complex tubular fullerene formation process in peapod structures; the computational design of a new generation of peapod nano-oscillators; the predicted magnetic state in nanobuds; opto-electronic properties of graphene nanoribbons; and disorder/vibronic effects on transport in carbon-rich materials.
RNA circularization strategies in vivo and in vitro
Petkovic, Sonja; Müller, Sabine
2015-01-01
Among the plenitude of naturally occurring RNAs, circular RNAs (circRNAs) and their biological roles were underestimated for years. However, circRNAs are ubiquitous in all domains of life, including eukaryotes, archaea, bacteria and viruses, where they can fulfill diverse biological functions. Some of those functions, for example roles in the life cycle of viral and viroid genomes or in the maturation of tRNA genes, have been elucidated; other putative functions still remain elusive. Due to their resistance to exonucleases, circRNAs are promising tools for in vivo application as aptamers, trans-cleaving ribozymes or siRNAs. How are circRNAs generated in vivo, and what approaches exist to produce ring-shaped RNAs in vitro? In this review we illustrate the occurrence and mechanisms of RNA circularization in vivo, survey methods for the generation of circRNAs in vitro and provide appropriate protocols. PMID:25662225
Generation of high-yield insulin producing cells from human bone marrow mesenchymal stem cells.
Jafarian, Arefeh; Taghikhani, Mohammad; Abroun, Saeid; Pourpak, Zahra; Allahverdi, Amir; Soleimani, Masoud
2014-07-01
Allogeneic islet transplantation is a highly efficient approach for the treatment of diabetes mellitus. However, the scarcity of islets and the long-term need for immunosuppressants limit its application. Recently, cell replacement therapies that generate unlimited sources of β cells have been developed to overcome these limitations. In this study we describe a stage-specific differentiation protocol for the generation of insulin-producing islet-like clusters from human bone marrow mesenchymal stem cells (hBM-MSCs). This stepwise protocol induced differentiation of hMSCs into definitive endoderm, pancreatic endoderm and pancreatic endocrine cells that expressed sox17, foxa2, pdx1, ngn3, nkx2.2, insulin, glucagon, somatostatin, pancreatic polypeptide, and glut2 transcripts, respectively. In addition, immunocytochemical analysis confirmed protein expression of the above-mentioned genes. Western blot analysis discriminated insulin from proinsulin in the final differentiated cells. In the derived insulin-producing cells (IPCs), insulin and C-peptide were secreted in a glucose-dependent manner. We have developed a protocol that generates effective high-yield human IPCs from hBM-MSCs in vitro. These findings suggest that functional IPCs generated by this procedure can be used as a cell-based approach for insulin-dependent diabetes mellitus.
Effect of Display Color on Pilot Performance and Describing Functions
NASA Technical Reports Server (NTRS)
Chase, Wendell D.
1997-01-01
A study has been conducted with the full-spectrum, calligraphic, computer-generated display system to determine the effect of the chromatic content of the visual display upon pilot performance during the landing approach maneuver. The study utilizes a new digital chromatic display system, which has previously been shown to improve the perceived fidelity of out-the-window display scenes, and presents the results of an experiment designed to determine the effects of display color content by measuring both vertical approach performance and pilot describing functions. This method was selected to more fully explore the effects of the visual color cues used by the pilot. Two types of landing approaches were made, dynamic and frozen range, with either a landing approach scene or a perspective array display. The landing approach scene was presented with either red runway lights and blue taxiway lights or with the colors reversed, and the perspective array with red lights, blue lights, or red and blue lights combined. The vertical performance measures obtained in this experiment indicated that the pilots performed best with the blue and red/blue displays, and worst with the red displays. The describing-function system analysis showed more variation with the red displays. The crossover frequencies were lowest with the red displays and highest with the combined red/blue displays, which provided the best overall tracking performance. Describing-function performance measures, vertical performance measures, and pilot opinion support the hypothesis that specific colors in displays can influence the pilots' control characteristics during the final approach.
Identification of Flood Reactivity Regions via the Functional Clustering of Hydrographs
NASA Astrophysics Data System (ADS)
Brunner, Manuela I.; Viviroli, Daniel; Furrer, Reinhard; Seibert, Jan; Favre, Anne-Catherine
2018-03-01
Flood hydrograph shapes contain valuable information on the flood-generation mechanisms of a catchment. To make good use of this information, we express flood hydrograph shapes as continuous functions using a functional data approach. We propose a clustering approach based on functional data for flood hydrograph shapes to identify a set of representative hydrograph shapes on a catchment scale and use these catchment-specific sets of representative hydrographs to establish regions of catchments with similar flood reactivity on a regional scale. We applied this approach to flood samples of 163 medium-size Swiss catchments. The results indicate that three representative hydrograph shapes sufficiently describe the hydrograph shape variability within a catchment and therefore can be used as a proxy for the flood behavior of a catchment. These catchment-specific sets of three hydrographs were used to group the catchments into three reactivity regions of similar flood behavior. These regions were not only characterized by similar hydrograph shapes and reactivity but also by event magnitudes and triggering event conditions. We envision these regions to be useful in regionalization studies, regional flood frequency analyses, and to allow for the construction of synthetic design hydrographs in ungauged catchments. The clustering approach based on functional data which establish these regions is very flexible and has the potential to be extended to other geographical regions or toward the use in climate impact studies.
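The functional-clustering idea can be sketched in Python by discretizing each normalized hydrograph on a common time grid and clustering the resulting curves with k-means under an L2 distance; the synthetic hydrograph shapes and the choice of three clusters below are illustrative assumptions, not the Swiss data or the exact algorithm of the study.

import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 100)

def synthetic_hydrograph(peak_time, width):
    """Single-peak hydrograph shape, normalized to unit volume."""
    q = np.exp(-0.5 * ((t - peak_time) / width) ** 2)
    return q / (q.sum() * (t[1] - t[0]))

# 60 flood events drawn from three qualitative types: flashy, intermediate, slow.
types = [(0.15, 0.05), (0.35, 0.10), (0.55, 0.18)]
events = np.array([synthetic_hydrograph(*types[i % 3]) + rng.normal(0.0, 0.05, t.size)
                   for i in range(60)])

# k-means in the discretized function space (an L2 distance between curves).
k = 3
centers = events[rng.choice(len(events), k, replace=False)]
for _ in range(30):
    labels = ((events[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
    centers = np.array([events[labels == j].mean(0) if np.any(labels == j) else centers[j]
                        for j in range(k)])
print("events per representative hydrograph shape:", np.bincount(labels, minlength=k))

The cluster centers play the role of the catchment-specific representative hydrograph shapes, which can then themselves be compared across catchments to delineate reactivity regions.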
Wang, Tiancai; He, Xing; Huang, Tingwen; Li, Chuandong; Zhang, Wei
2017-09-01
The economic emission dispatch (EED) problem aims to control generation cost and reduce the impact of waste gas on the environment. It has multiple constraints and nonconvex objectives. To solve it, the collective neurodynamic optimization (CNO) method, which combines a heuristic approach and a projection neural network (PNN), is applied to optimize the scheduling of an electrical microgrid with ten thermal generators and to minimize the sum of generation and emission costs. As the objective function has non-differentiable points when the valve-point effect (VPE) is considered, a differential inclusion approach is employed in the PNN model to deal with them. Under certain conditions, the local optimality and convergence of the dynamic model for the optimization problem are analyzed. The capability of the algorithm is verified in a complicated situation where transmission losses and prohibited operating zones are considered. In addition, the dynamic variation of load power at the demand side is considered and the optimal scheduling of generators within 24 h is described. Copyright © 2017 Elsevier Ltd. All rights reserved.
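The non-smooth objective alluded to above is conventionally written as a quadratic fuel cost plus a rectified-sine valve-point term, combined with a quadratic emission term; the following Python sketch evaluates such an objective for a single hypothetical unit (all coefficients are made up), simply to show where the non-differentiable points come from.

import numpy as np

a, b, c = 100.0, 2.45, 0.0012            # fuel cost coefficients (hypothetical)
e, f = 160.0, 0.038                      # valve-point effect coefficients (hypothetical)
alpha, beta, gamma = 60.0, -0.55, 0.004  # emission coefficients (hypothetical)
P_min, P_max = 100.0, 500.0              # generation limits [MW]
w = 0.5                                  # weight combining cost and emission objectives

def generation_cost(P):
    # Quadratic fuel cost plus a rectified-sine valve-point term; the absolute value makes the
    # objective non-differentiable at the zeros of the sine.
    return a + b * P + c * P ** 2 + abs(e * np.sin(f * (P_min - P)))

def emission(P):
    return alpha + beta * P + gamma * P ** 2

def combined_objective(P):
    return w * generation_cost(P) + (1.0 - w) * emission(P)

P = np.linspace(P_min, P_max, 2001)
best = P[np.argmin(combined_objective(P))]
print("best single-unit output [MW]:", round(best, 1),
      " objective:", round(combined_objective(best), 1))

It is these kinks in the cost curve that motivate the differential-inclusion treatment inside the projection neural network.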
On the ``Matrix Approach'' to Interacting Particle Systems
NASA Astrophysics Data System (ADS)
de Sanctis, L.; Isopi, M.
2004-04-01
Derrida et al. and Schütz and Stinchcombe gave algebraic formulas for the correlation functions of the partially asymmetric simple exclusion process. Here we give a fairly general recipe of how to get these formulas and extend them to the whole time evolution (starting from the generator of the process), for a certain class of interacting systems. We then analyze the algebraic relations obtained to show that the matrix approach does not work with some models such as the voter and the contact processes.
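For context, the matrix-product ansatz in question takes the following form in the totally asymmetric special case with open boundaries (injection rate \(\alpha\), extraction rate \(\beta\)); this is the standard algebra of Derrida et al., quoted here only for orientation:

\[
P(\tau_1,\dots,\tau_L) \;=\; \frac{1}{Z_L}\,\langle W \,\big|\, \prod_{i=1}^{L}\bigl[\tau_i D + (1-\tau_i)E\bigr] \,\big|\, V\rangle ,
\qquad
DE = D + E, \quad D\,|V\rangle = \frac{1}{\beta}\,|V\rangle, \quad \langle W|\,E = \frac{1}{\alpha}\,\langle W| ,
\]

where \(\tau_i = 1\) (0) indicates an occupied (empty) site and \(Z_L\) normalizes the weights; equal-time correlation functions then follow as ratios of matrix elements of products of D and E.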
An integrate-over-temperature approach for enhanced sampling.
Gao, Yi Qin
2008-02-14
A simple method is introduced to achieve efficient random walking in the energy space in molecular dynamics simulations which thus enhances the sampling over a large energy range. The approach is closely related to multicanonical and replica exchange simulation methods in that it allows configurations of the system to be sampled in a wide energy range by making use of Boltzmann distribution functions at multiple temperatures. A biased potential is quickly generated using this method and is then used in accelerated molecular dynamics simulations.
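A schematic Python illustration of a multiple-temperature bias of this kind is given below; the weights n_k, which in the actual method are determined iteratively, are simply set to one here, and the double-well potential and temperature ladder are arbitrary choices.

import numpy as np

kB = 0.0019872                        # Boltzmann constant [kcal/(mol K)]
T0 = 300.0                            # simulation temperature [K]
temperatures = np.linspace(300.0, 600.0, 7)
betas = 1.0 / (kB * temperatures)
n_k = np.ones_like(betas)             # placeholder weights; the method determines these iteratively

def U(x):
    """Toy 1D double-well potential [kcal/mol]."""
    return 5.0 * (x ** 2 - 1.0) ** 2

def U_biased(x):
    # Effective potential built from Boltzmann factors at several temperatures:
    #   U_eff(x) = -(1/beta0) * ln( sum_k n_k * exp(-beta_k * U(x)) )
    beta0 = 1.0 / (kB * T0)
    return -np.log(np.sum(n_k[:, None] * np.exp(-betas[:, None] * U(x)), axis=0)) / beta0

x = np.linspace(-1.8, 1.8, 401)
u, u_eff = U(x), U_biased(x)
i_top = np.argmin(np.abs(x))          # barrier top at x = 0
print("original barrier  [kcal/mol]:", round(u[i_top] - u.min(), 2))
print("effective barrier [kcal/mol]:", round(u_eff[i_top] - u_eff.min(), 2))

The reduced effective barrier is what allows the dynamics run on the biased potential to walk through a much wider energy range than a single-temperature simulation would.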
Flexible substrata for the detection of cellular traction forces
NASA Technical Reports Server (NTRS)
Beningo, Karen A.; Wang, Yu-Li
2002-01-01
By modulating adhesion signaling and cytoskeletal organization, mechanical forces play an important role in various cellular functions, from propelling cell migration to mediating communication between cells. Recent developments have resulted in several new approaches for the detection, analysis and visualization of mechanical forces generated by cultured cells. Combining these methods with other approaches, such as green-fluorescent protein (GFP) imaging and gene manipulation, proves to be particularly powerful for analyzing the interplay between extracellular physical forces and intracellular chemical events.
Contemporary Approaches to Modulating the Nitric Oxide-cGMP Pathway in Cardiovascular Disease.
Kraehling, Jan R; Sessa, William C
2017-03-31
Endothelial cells lining the vessel wall control important aspects of vascular homeostasis. In particular, the production of endothelium-derived nitric oxide and activation of soluble guanylate cyclase promotes endothelial quiescence and governs vasomotor function and proportional remodeling of blood vessels. Here, we discuss novel approaches to improve endothelial nitric oxide generation and preserve its bioavailability. We also discuss therapeutic opportunities aimed at activation of soluble guanylate cyclase for multiple cardiovascular indications. © 2017 American Heart Association, Inc.
Mingo, Janire; Erramuzpe, Asier; Luna, Sandra; Aurtenetxe, Olaia; Amo, Laura; Diez, Ibai; Schepens, Jan T. G.; Hendriks, Wiljan J. A. J.; Cortés, Jesús M.; Pulido, Rafael
2016-01-01
Site-directed mutagenesis (SDM) is a powerful tool to create defined collections of protein variants for experimental and clinical purposes, but effectiveness is compromised when a large number of mutations is required. We present here a one-tube-only standardized SDM approach that generates comprehensive collections of amino acid substitution variants, including scanning- and single site-multiple mutations. The approach combines unified mutagenic primer design with the mixing of multiple distinct primer pairs and/or plasmid templates to increase the yield of a single inverse-PCR mutagenesis reaction. Also, a user-friendly program for automatic design of standardized primers for Ala-scanning mutagenesis is made available. Experimental results were compared with a modeling approach together with stochastic simulation data. For single site-multiple mutagenesis purposes and for simultaneous mutagenesis in different plasmid backgrounds, combination of primer sets and/or plasmid templates in a single reaction tube yielded the distinct mutations in a stochastic fashion. For scanning mutagenesis, we found that a combination of overlapping primer sets in a single PCR reaction allowed the yield of different individual mutations, although this yield did not necessarily follow a stochastic trend. Double mutants were generated when the overlap of primer pairs was below 60%. Our results illustrate that one-tube-only SDM effectively reduces the number of reactions required in large-scale mutagenesis strategies, facilitating the generation of comprehensive collections of protein variants suitable for functional analysis. PMID:27548698
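As a toy illustration of automated mutagenic primer design for Ala scanning with back-to-back (inverse-PCR) primers, consider the Python sketch below; the flank length, codon choice and primer layout are illustrative assumptions and do not reproduce the rules of the published design program.

# The forward primer carries the GCT (Ala) codon at its 5' end followed by downstream sequence,
# and the reverse primer is the reverse complement of the sequence immediately upstream.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.translate(COMPLEMENT)[::-1]

def ala_scan_primers(cds, flank=18, ala_codon="GCT"):
    primers = []
    for i in range(0, len(cds) - 2, 3):
        codon = cds[i:i + 3]
        if codon == ala_codon or codon in ("TAA", "TAG", "TGA"):
            continue                                    # skip codons that are already Ala, and stops
        fwd = ala_codon + cds[i + 3:i + 3 + flank]      # mutation + downstream annealing region
        rev = revcomp(cds[max(0, i - flank):i])         # anneals immediately upstream of the codon
        primers.append((i // 3 + 1, fwd, rev))
    return primers

cds = "ATGGCTAAAGAATTCCTGGTGAACGGTCGTACCCTGAAAGGTGAAACCTTCGATCTG"   # toy coding sequence
for pos, fwd, rev in ala_scan_primers(cds)[:3]:
    print(f"residue {pos:2d}  F: 5'-{fwd}-3'  R: 5'-{rev}-3'")

Pooling several such primer pairs, or several plasmid templates, in one inverse-PCR reaction is the step that the one-tube strategy described above exploits to obtain multiple variants stochastically.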
Regenerating the human heart: direct reprogramming strategies and their current limitations.
Ghiroldi, Andrea; Piccoli, Marco; Ciconte, Giuseppe; Pappone, Carlo; Anastasia, Luigi
2017-10-27
Cardiovascular diseases are the leading cause of death in the Western world. Unfortunately, current therapies are often only palliative, consequently making heart transplantation necessary for many patients. However, several novel therapeutic approaches in the past two decades have yielded quite encouraging results. The generation of induced pluripotent stem cells through the forced expression of stem cell-specific transcription factors has inspired the most promising strategies for heart regeneration by direct reprogramming of cardiac fibroblasts into functional cardiomyocytes. Initial attempts at this reprogramming used an approach similar to the one used with transcription factors, but over the years novel strategies have been tested, e.g., miRNAs, recombinant proteins and chemical molecules. Although preliminary results in animal models are promising, the low reprogramming efficiency, as well as the incomplete maturation of the cardiomyocytes, still represents an important obstacle. This review covers the direct transdifferentiation strategies that have been proposed and developed and illustrates the pros and cons of each approach. Indeed, as described in the manuscript, there are still many unanswered questions and drawbacks that require a better understanding of the basic signaling pathways and transcription factor networks before functional cells, suitable for cardiac regeneration and safe for patients, can be generated and used for human therapies.
Wang, Hao; Liu, Kan; Chen, Kuan-Ju; Lu, Yujie; Wang, Shutao; Lin, Wei-Yu; Guo, Feng; Kamei, Ken-ichiro; Chen, Yi-Chun; Ohashi, Minori; Wang, Mingwei; Garcia, Mitch André; Zhao, Xing-Zhong; Shen, Clifton K.-F.; Tseng, Hsian-Rong
2010-01-01
Nanoparticles are regarded as promising transfection reagents for effective and safe delivery of nucleic acids into specific types of cells or tissues, providing an alternative manipulation/therapy strategy to viral gene delivery. However, the current process of searching for novel delivery materials is limited by conventional low-throughput and time-consuming multistep synthetic approaches. Additionally, conventional approaches are frequently accompanied by unpredictability and continual optimization refinements, impeding the flexible generation of material diversity and creating a major obstacle to achieving high transfection performance. Here we demonstrate a rapid developmental pathway toward highly efficient gene delivery systems by leveraging the powers of a supramolecular synthetic approach and a custom-designed digital microreactor. Using the digital microreactor, broad structural/functional diversity can be programmed into a library of DNA-encapsulated supramolecular nanoparticles (DNA⊂SNPs) by systematically altering the mixing ratios of molecular building blocks and a DNA plasmid. In vitro transfection studies with the DNA⊂SNP library identified the DNA⊂SNPs with the highest gene transfection efficiency, which can be attributed to cooperative effects of the structures and surface chemistry of the DNA⊂SNPs. We envision that such a rapid developmental pathway can be adopted for generating nanoparticle-based vectors for the delivery of a variety of loads. PMID:20925389
Gajos, Katarzyna; Kamińska, Agnieszka; Awsiuk, Kamil; Bajor, Adrianna; Gruszczyński, Krzysztof; Pawlak, Anna; Żądło, Andrzej; Kowalik, Artur; Budkowski, Andrzej; Stępień, Ewa
2017-02-01
Among the various biomarkers that are used to diagnose or monitor disease, extracellular vesicles (EVs) represent one of the most promising targets in the development of new therapeutic strategies and the application of new diagnostic methods. The detection of circulating platelet-derived microvesicles (PMVs) is a considerable challenge for laboratory diagnostics, especially in the preliminary phase of a disease. In this study, we present a multistep approach to immobilizing and detecting PMVs in biological samples (microvesicles generated from activated platelets and human platelet-poor plasma) on a functionalized silicon substrate. We describe the application of time-of-flight secondary ion mass spectrometry (TOF-SIMS) and spectroscopic ellipsometry methods to the detection of immobilized PMVs in the context of a novel imaging flow cytometry (ISX) technique and atomic force microscopy (AFM). This novel approach allowed us to confirm the presence of the abundant microvesicle phospholipids phosphatidylserine (PS) and phosphatidylethanolamine (PE) on a surface with immobilized PMVs. Phosphatidylcholine groups (C5H12N+; C5H15PNO4+) were also detected. Moreover, we were able to show that ellipsometry permitted the immobilization of PMVs on a functionalized surface to be evaluated. The sensitivity of the ISX technique depends on the size and refractive index of the analyzed microvesicles. Graphical abstract: Human platelets activated with thrombin (at a concentration of 1 IU/mL) generate a population of PMVs (platelet-derived microvesicles), which can be detected and enumerated with a fluorescent-label method (imaging cytometry). Alternatively, PMVs can be immobilized on a modified silicon substrate functionalized with a specific IgM murine monoclonal antibody against the human glycoprotein IIb/IIIa complex (PAC-1). Immobilized PMVs can be subjected to label-free analyses by means of ellipsometry, atomic force microscopy (AFM) and time-of-flight secondary ion mass spectrometry (TOF-SIMS).
NASA Astrophysics Data System (ADS)
Saint-Drenan, Yves-Marie; Wald, Lucien; Ranchin, Thierry; Dubus, Laurent; Troccoli, Alberto
2018-05-01
Classical approaches to the calculation of the photovoltaic (PV) power generated in a region from meteorological data require the knowledge of the detailed characteristics of the plants, which are most often not publicly available. An approach is proposed with the objective to obtain the best possible assessment of power generated in any region without having to collect detailed information on PV plants. The proposed approach is based on a model of PV plant coupled with a statistical distribution of the prominent characteristics of the configuration of the plant and is tested over Europe. The generated PV power is first calculated for each of the plant configurations frequently found in a given region and then aggregated taking into account the probability of occurrence of each configuration. A statistical distribution has been constructed from detailed information obtained for several thousand PV plants, representing approximately 2 % of the total number of PV plants in Germany, and was then adapted to other European countries by taking into account changes in the optimal PV tilt angle as a function of the latitude and meteorological conditions. The model has been run with bias-adjusted ERA-Interim data as meteorological inputs. The results have been compared to estimates of the total PV power generated in two countries, France and Germany, as provided by the corresponding transmission system operators. Relative RMSEs of 4.2 % and 3.8 % and relative biases of -2.4 % and 0.1 % were found with three-hourly data for France and Germany, respectively. A validation against estimates of the country-wide PV power generation provided by the ENTSO-E for 16 European countries has also been conducted. This evaluation is made difficult by the uncertainty in the installed capacity corresponding to the ENTSO-E data, but it nevertheless demonstrates that the model output and TSO data are highly correlated in most countries. Given the simplicity of the proposed approach, these results are very encouraging. The approach is particularly suited to climatic timescales, both historical and future climates, as demonstrated here.
Crustal Properties Across the Mid-Continent Rift via Transfer Function Analysis
NASA Astrophysics Data System (ADS)
Frederiksen, A. W.; Tyomkin, Y.; Campbell, R.; van der Lee, S.; Zhang, H.
2015-12-01
The Mid-Continent Rift (MCR), a failed Proterozoic rift structure in central North America, is a dominant feature of North American gravity maps. The rift underwent a combination of extension, magmatism, and later compression, and it is difficult to predict how these events affected the overall crustal thickness and bulk composition in the vicinity of the rift axis, though the associated gravity high indicates that large-volume mafic magmatism took place. The Superior Province Rifting Earthscope Experiment (SPREE) project instrumented the MCR with Flexible Array broadband seismographs from 2011 through 2013 in Minnesota and Wisconsin, along two lines crossing the rift axis as well as a line following the axis. We examine teleseismic P-coda data from SPREE and nearby Transportable Array instruments using a new technique: transfer-function analysis. In this approach, possible models of crustal structure are used to generate a predicted transfer function relating the radial and vertical components of the P coda at a particular site. The transfer function then allows generation of a misfit (between the true radial component and a synthetic radial component predicted from the vertical trace) without the need to perform receiver-function deconvolution, thus avoiding the deconvolution problems encountered with receiver functions in sedimentary basins. We use the transfer-function approach to perform a grid search over three crustal properties: crustal thickness, crustal P/S velocity ratio, and the thickness of an overlying sedimentary basin. Results for our SPREE/TA data set indicate that the crust is significantly thickened along the rift axis, with maximum thicknesses approaching 50 km; the crust is thinner (ca. 40 km) outside of the rift zone. The crustal thickness structure is particularly complex beneath southeastern Minnesota, where very strong Moho topography is present, as well as up to 2 km of sediment; further north, the Moho is smoother and the basin is not present. P/S ratio varies along the rift axis, suggesting a higher mafic component (higher ratio) in southern Minnesota. The complexity we see along the MCR is consistent with the results obtained by Zhang et al. (this conference) using receiver function analysis.
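To make the grid-search idea concrete, the sketch below mimics the procedure described above. It is an illustrative stand-in, not the study's code: the forward calculation that predicts a radial trace from the vertical trace for a trial crust is passed in as `predict_radial`, and the search ranges are assumed values.

```python
import numpy as np

# Hypothetical sketch of the transfer-function grid search: for each trial crust,
# a synthetic radial trace is predicted from the observed vertical trace and
# compared with the observed radial trace; the lowest-misfit model wins.
def transfer_function_grid_search(obs_radial, obs_vertical, predict_radial):
    best_misfit, best_model = np.inf, None
    for thickness in np.arange(35.0, 55.0, 0.5):        # crustal thickness, km
        for vp_vs in np.arange(1.6, 2.0, 0.01):         # crustal P/S velocity ratio
            for sediment in np.arange(0.0, 2.5, 0.1):   # sedimentary basin, km
                pred = predict_radial(obs_vertical, thickness, vp_vs, sediment)
                misfit = np.sum((obs_radial - pred) ** 2)
                if misfit < best_misfit:
                    best_misfit, best_model = misfit, (thickness, vp_vs, sediment)
    return best_model, best_misfit
```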
NASA Technical Reports Server (NTRS)
Litvin, Faydor L.; Lee, Hong-Tao
1989-01-01
A new approach for determination of machine-tool settings for spiral bevel gears is proposed. The proposed settings provide a predesigned parabolic function of transmission errors and the desired location and orientation of the bearing contact. The predesigned parabolic function of transmission errors is able to absorb the piecewise linear functions of transmission errors that are caused by gear misalignment, thereby reducing gear noise. The gears are face-milled by head cutters with conical surfaces or surfaces of revolution. A computer program for simulation of meshing, bearing contact, and determination of transmission errors for misaligned gears has been developed.
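The algebra behind this absorption property is worth recalling (generic coefficients, not values from the report): adding a linear error to a predesigned parabolic function of transmission errors yields another parabola with the same curvature,

$$\Delta\phi_2(\phi_1) = -a\,\phi_1^2, \qquad -a\,\phi_1^2 + b\,\phi_1 = -a\left(\phi_1 - \frac{b}{2a}\right)^2 + \frac{b^2}{4a},$$

so a misalignment-induced linear term only shifts the parabola; the magnitude of the transmission-error variation over a mesh cycle remains governed by the predesigned coefficient a.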
Criteria for scaling heat exchangers to miniature size
NASA Technical Reports Server (NTRS)
Rudolfvonrohr, P. B.; Smith, J. L., Jr.
1985-01-01
The purpose of this work is to highlight the particular aspects of miniature heat exchanger performance and to determine an appropriate design approach. A thermodynamic analysis is performed to express the generated entropy as a function of the material and geometric characteristics of the heat exchangers. This expression is then used to size miniature heat exchangers.
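For orientation only, a standard textbook decomposition of the entropy generated in a two-stream heat exchanger (given as general background, not necessarily the exact expression derived in the report) separates the heat-transfer and pressure-drop contributions,

$$\dot S_{\mathrm{gen}} \;\approx\; \frac{\dot Q\,\Delta T}{T_h\,T_c} \;+\; \sum_{\mathrm{streams}} \frac{\dot m\,\Delta p}{\rho\,T},$$

and both terms depend on the geometry (flow area, hydraulic diameter, length) and material properties, which is why miniaturization becomes a trade-off between the two irreversibilities.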
Technical Change in the North American Forestry Sector: A Review
Jeffery C. Stier; David N. Bengston
1992-01-01
Economists have examined the impact of technical change on the forest products sector using the historical, index number, and econometric approaches. This paper reviews econometric analyses of the rate and bias of technical change, examining functional form, factors included, and empirical results. Studies are classified as first-, second-, or third-generation...
Pacific Yew: A Facultative Riparian Conifer with an Uncertain Future
Stanley Scher; Bert Schwarzschild
1989-01-01
Increasing demands for Pacific yew bark, a source of an anticancer agent, have generated interest in defining the yew resource and in exploring strategies to conserve this species. The distribution, riparian requirements and ecosystem functions of yew populations in coastal and inland forests of northern California are outlined and alternative approaches to conserving...
Open Source Solutions for Libraries: ABCD vs Koha
ERIC Educational Resources Information Center
Macan, Bojan; Fernandez, Gladys Vanesa; Stojanovski, Jadranka
2013-01-01
Purpose: The purpose of this study is to present an overview of the two open source (OS) integrated library systems (ILS)--Koha and ABCD (ISIS family), to compare their "next-generation library catalog" functionalities, and to give comparison of other important features available through ILS modules. Design/methodology/approach: Two open source…
A brief account of greener production of nanoparticles which reduces or eliminates the use and generation of hazardous substances is presented. The utility of vitamins B1 and B2, which can function both as reducing and capping agents, provides an extremely s...
ERIC Educational Resources Information Center
Lee, Jun-Ki; Kwon, Yong-Ju
2011-01-01
Using functional magnetic resonance imaging (fMRI), this study investigates and discusses neurological explanations for, and the educational implications of, the neural network activations involved in hypothesis-generating and hypothesis-understanding for biology education. Two sets of task paradigms about biological phenomena were designed:…
An application of probability to combinatorics: a proof of Vandermonde identity
NASA Astrophysics Data System (ADS)
Paolillo, Bonaventura; Rizzo, Piermichele; Vincenzi, Giovanni
2017-08-01
In this paper, we give suggestions for a classroom lesson about an application of probability using basic mathematical notions. We approach some combinatorial results without using 'induction', 'polynomial identities' or 'generating functions', and give a proof of the 'Vandermonde identity' using elementary notions of probability.
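For reference, the identity in question and one standard probabilistic reading of it (the classroom proof in the paper may use a different but equivalent formulation):

$$\binom{m+n}{r} \;=\; \sum_{k=0}^{r}\binom{m}{k}\binom{n}{r-k},$$

which is equivalent to the statement that, when $r$ balls are drawn without replacement from an urn containing $m$ red and $n$ blue balls, the probabilities of drawing exactly $k$ red balls must sum to one:

$$\sum_{k=0}^{r}\frac{\binom{m}{k}\binom{n}{r-k}}{\binom{m+n}{r}} \;=\; 1.$$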
ERIC Educational Resources Information Center
de Koning, Bjorn B.; Tabbers, Huib K.; Rikers, Remy M. J. P.; Paas, Fred
2010-01-01
This study investigated whether learners construct more accurate mental representations from animations when instructional explanations are provided via narration than when learners attempt to infer functional relations from the animation through self-explaining. The effects of attention guidance by means of cueing are also investigated. Psychology…
Kroonblawd, Matthew P; Pietrucci, Fabio; Saitta, Antonino Marco; Goldman, Nir
2018-04-10
We demonstrate the capability of creating robust density functional tight binding (DFTB) models for chemical reactivity in prebiotic mixtures through force matching to short time scale quantum free energy estimates. Molecular dynamics using density functional theory (DFT) is a highly accurate approach to generate free energy surfaces for chemical reactions, but the extreme computational cost often limits the time scales and range of thermodynamic states that can feasibly be studied. In contrast, DFTB is a semiempirical quantum method that affords up to a thousandfold reduction in cost and can recover DFT-level accuracy. Here, we show that a force-matched DFTB model for aqueous glycine condensation reactions yields free energy surfaces that are consistent with experimental observations of reaction energetics. Convergence analysis reveals that multiple nanoseconds of combined trajectory are needed to reach a steady-fluctuating free energy estimate for glycine condensation. Predictive accuracy of force-matched DFTB is demonstrated by direct comparison to DFT, with the two approaches yielding surfaces with large regions that differ by only a few kcal mol-1.
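As a reminder of what force matching means here (a generic objective, not the authors' exact functional), the adjustable DFTB parameters $\{p\}$, typically those of the two-body repulsive terms, are fit by minimizing the deviation from the DFT reference forces over the short trajectory:

$$\chi^2(\{p\}) \;=\; \frac{1}{3\,N_{\mathrm{frames}}\,N_{\mathrm{atoms}}}\sum_{t=1}^{N_{\mathrm{frames}}}\sum_{i=1}^{N_{\mathrm{atoms}}} \left\|\,\mathbf F_i^{\mathrm{DFTB}}(t;\{p\}) - \mathbf F_i^{\mathrm{DFT}}(t)\,\right\|^2 .$$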
Pareto-Optimal Multi-objective Inversion of Geophysical Data
NASA Astrophysics Data System (ADS)
Schnaidt, Sebastian; Conway, Dennis; Krieger, Lars; Heinson, Graham
2018-01-01
In the process of modelling geophysical properties, jointly inverting different data sets can greatly improve model results, provided that the data sets are compatible, i.e., sensitive to similar features. Such a joint inversion requires a relationship between the different data sets, which can either be analytic or structural. Classically, the joint problem is expressed as a scalar objective function that combines the misfit functions of multiple data sets and a joint term which accounts for the assumed connection between the data sets. This approach suffers from two major disadvantages: first, it can be difficult to assess the compatibility of the data sets and second, the aggregation of misfit terms introduces a weighting of the data sets. We present a pareto-optimal multi-objective joint inversion approach based on an existing genetic algorithm. The algorithm treats each data set as a separate objective, avoiding forced weighting and generating curves of the trade-off between the different objectives. These curves are analysed by their shape and evolution to evaluate data set compatibility. Furthermore, the statistical analysis of the generated solution population provides valuable estimates of model uncertainty.
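The non-dominance criterion at the heart of such a multi-objective inversion is simple to state in code; the sketch below is a minimal illustration (not the genetic algorithm itself): a candidate model is discarded only if another model is at least as good in every objective and strictly better in at least one.

```python
import numpy as np

def pareto_front(misfits):
    """misfits: (n_models, n_objectives) array of misfit values (lower is better).
    Returns a boolean mask flagging the non-dominated (Pareto-optimal) models."""
    misfits = np.asarray(misfits, dtype=float)
    keep = np.ones(len(misfits), dtype=bool)
    for i in range(len(misfits)):
        dominates_i = (np.all(misfits <= misfits[i], axis=1)
                       & np.any(misfits < misfits[i], axis=1))
        keep[i] = not dominates_i.any()
    return keep

# Example: two objectives (e.g. misfits to two geophysical data sets) for 5 models
m = np.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0], [2.5, 2.5], [1.5, 2.8]])
print(pareto_front(m))  # -> [ True  True  True False  True ]
```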
MOCASSIN-prot: a multi-objective clustering approach for protein similarity networks.
Keel, Brittney N; Deng, Bo; Moriyama, Etsuko N
2018-04-15
Proteins often include multiple conserved domains. Various evolutionary events including duplication and loss of domains, domain shuffling, as well as sequence divergence contribute to generating complexities in protein structures, and consequently, in their functions. The evolutionary history of proteins is hence best modeled through networks that incorporate information both from the sequence divergence and the domain content. Here, a game-theoretic approach proposed for protein network construction is adapted into the framework of multi-objective optimization, and extended to incorporate clustering refinement procedure. The new method, MOCASSIN-prot, was applied to cluster multi-domain proteins from ten genomes. The performance of MOCASSIN-prot was compared against two protein clustering methods, Markov clustering (TRIBE-MCL) and spectral clustering (SCPS). We showed that compared to these two methods, MOCASSIN-prot, which uses both domain composition and quantitative sequence similarity information, generates fewer false positives. It achieves more functionally coherent protein clusters and better differentiates protein families. MOCASSIN-prot, implemented in Perl and Matlab, is freely available at http://bioinfolab.unl.edu/emlab/MOCASSINprot. emoriyama2@unl.edu. Supplementary data are available at Bioinformatics online.
Statistical methods for convergence detection of multi-objective evolutionary algorithms.
Trautmann, H; Wagner, T; Naujoks, B; Preuss, M; Mehnen, J
2009-01-01
In this paper, two approaches for estimating the generation in which a multi-objective evolutionary algorithm (MOEA) shows statistically significant signs of convergence are introduced. A set-based perspective is taken where convergence is measured by performance indicators. The proposed techniques fulfill the requirements of proper statistical assessment on the one hand and efficient optimisation for real-world problems on the other hand. The first approach accounts for the stochastic nature of the MOEA by repeating the optimisation runs for increasing generation numbers and analysing the performance indicators using statistical tools. This technique results in a very robust offline procedure. Moreover, an online convergence detection method is introduced as well. This method automatically stops the MOEA when either the variance of the performance indicators falls below a specified threshold or a stagnation of their overall trend is detected. Both methods are analysed and compared for two MOEAs and on different classes of benchmark functions. It is shown that the methods successfully operate on all stated problems, needing fewer function evaluations while preserving good approximation quality at the same time.
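A minimal sketch of the online stopping rule described above (window length and threshold are illustrative placeholders): the run is halted once the per-generation performance indicator neither varies nor shows an overall trend any more.

```python
import numpy as np

def should_stop(indicator_history, window=10, threshold=1e-4):
    """indicator_history: one performance-indicator value per generation."""
    if len(indicator_history) < window:
        return False
    recent = np.asarray(indicator_history[-window:], dtype=float)
    low_variance = recent.var() < threshold              # indicator has settled
    stagnated = abs(recent[-1] - recent[0]) < threshold  # no overall trend left
    return low_variance or stagnated
```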
Chen, Rui-Pin; Chen, Zhaozhong; Chew, Khian-Hooi; Li, Pei-Gang; Yu, Zhongliang; Ding, Jianping; He, Sailing
2015-05-29
A caustic vector vortex optical field is experimentally generated and demonstrated by a caustic-based approach. The desired caustic with arbitrary acceleration trajectories, as well as the structured states of polarization (SoP) and vortex orders located in different positions in the field cross-section, is generated by imposing the corresponding spatial phase function in a vector vortex optical field. Our study reveals that different spin and orbital angular momentum flux distributions (including opposite directions) in different positions in the cross-section of a caustic vector vortex optical field can be dynamically managed during propagation by intentionally choosing the initial polarization and vortex topological charges, as a result of the modulation of the caustic phase. We find that the SoP in the field cross-section rotates during propagation due to the existence of the vortex. The unique structured feature of the caustic vector vortex optical field opens the possibility of multi-manipulation of optical angular momentum fluxes and SoP, leading to more complex manipulation of the optical field scenarios. Thus this approach further expands the functionality of an optical system.
Use of Phage Display to Generate Conformation-Sensor Recombinant Antibodies
Haque, Aftabul; Tonks, Nicholas K.
2013-01-01
We describe a phage display approach that we have previously used to generate conformation-sensor antibodies that recognize specifically and stabilize the oxidized, inactive conformation of protein tyrosine phosphatase 1B (PTP1B). We use a solution-based panning and screening strategy conducted in the presence of reduced active PTP1B, which enriches antibodies to epitopes unique to the oxidized form, while excluding antibodies that recognize epitopes common to oxidized and reduced forms of PTP1B. This strategy avoids conventional solid-phase immobilization, with its inherent potential for denaturation of the antigen. In addition, a functional screening strategy selects scFvs directly for their capacity for both specific binding and stabilization of the target enzyme in its inactive conformation. These conformation-specific scFvs illustrate that stabilization of oxidized PTP1B is an effective strategy to inhibit PTP1B function; it is possible that this approach may be applicable to the PTP family as a whole. Using this protocol, isolation and characterization of specific scFvs from immune responsive animals should take ~6 weeks. PMID:23154784
Akhter, Nasrin; Shehu, Amarda
2018-01-19
Due to the essential role that the three-dimensional conformation of a protein plays in regulating interactions with molecular partners, wet and dry laboratories seek biologically-active conformations of a protein to decode its function. Computational approaches are gaining prominence due to the labor and cost demands of wet laboratory investigations. Template-free methods can now compute thousands of conformations known as decoys, but selecting native conformations from the generated decoys remains challenging. Repeatedly, research has shown that the protein energy functions whose minima are sought in the generation of decoys are unreliable indicators of nativeness. The prevalent approach ignores energy altogether and clusters decoys by conformational similarity. Complementary recent efforts design protein-specific scoring functions or train machine learning models on labeled decoys. In this paper, we show that an informative consideration of energy can be carried out under the energy landscape view. Specifically, we leverage local structures known as basins in the energy landscape probed by a template-free method. We propose and compare various strategies of basin-based decoy selection that we demonstrate are superior to clustering-based strategies. The presented results point to further directions of research for improving decoy selection, including the ability to properly consider the multiplicity of native conformations of proteins.
Vatansever, Fatma; Hamblin, Michael R
2017-02-01
New methods are needed for covalent functionalization of nanoparticle surfaces with organic polymer coronas to generate polymeric nanocomposites in a controlled manner. Here we report the use of a surface-initiated polymerization approach, mediated by titanium (IV) catalysis, to grow poly(n-hexylisocyanate) chains from a silica surface. Two pathways were used to generate the interfacing in these nano-hybrids. In the first one, the nanoparticles were "seeded" with SiCl4, followed by reaction with 1,6-hexanediol to form hydroxyl groups attached directly to the surface via O-Si-O bonding. In the second pathway, the nanoparticles were initially exposed to a 9:1 mixture of trimethyl silyl chloride and chlorodimethyl octenyl silane, which was then followed by hydroboration of the double bonds, to afford hydroxyl groups with a spatially controlled density and surface attachment via O-Si-C bonding. These functionalized surfaces were then activated with the titanium tetrachloride catalyst. In this approach, the surface-tethered catalyst provided the sites for n-hexyl isocyanate monomer insertion, to "build up" the surface-grown polymer layers from the "bottom-up". A final end-capping, to seal off the chain ends, was done via acetyl chloride. Compounds were characterized by FT-IR, 1H-NMR, GC-MS, GPC, and thermogravimetric analyses.
NASA Astrophysics Data System (ADS)
Haber, Jonah; Refaely-Abramson, Sivan; da Jornada, Felipe H.; Louie, Steven G.; Neaton, Jeffrey B.
Multi-exciton generation processes, in which multiple charge carriers are generated from a single photon, are mechanisms of significant interest for achieving efficiencies beyond the Shockley-Queisser limit of conventional p-n junction solar cells. One well-studied multiexciton process is singlet fission, whereby a singlet decays into two spin-correlated triplet excitons. Here, we use a newly developed computational approach to calculate singlet-fission coupling terms and rates with an ab initio Green's function formalism based on many-body perturbation theory (MBPT) within the GW approximation and the Bethe-Salpeter equation approach. We compare results for crystalline pentacene and TIPS-pentacene and explore the effect of molecular packing on the singlet fission mechanism. This work is supported by the Department of Energy.
Software for Simulation of Hyperspectral Images
NASA Technical Reports Server (NTRS)
Richtsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.
2002-01-01
A package of software generates simulated hyperspectral images for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport as well as surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, 'ground truth' is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces and the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for and a supplement to field validation data.
Simulation of Hyperspectral Images
NASA Technical Reports Server (NTRS)
Richtsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.
2004-01-01
A software package generates simulated hyperspectral imagery for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport, as well as reflections from surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, "ground truth" is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces, as well as the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for, and a supplement to, field validation data.
Bowerman, Bruce
2011-10-01
Molecular genetic investigation of the early Caenorhabditis elegans embryo has contributed substantially to the discovery and general understanding of the genes, pathways, and mechanisms that regulate and execute developmental and cell biological processes. Initially, worm geneticists relied exclusively on a classical genetics approach, isolating mutants with interesting phenotypes after mutagenesis and then determining the identity of the affected genes. Subsequently, the discovery of RNA interference (RNAi) led to a much greater reliance on a reverse genetics approach: reducing the function of known genes with RNAi and then observing the phenotypic consequences. Now the advent of next-generation DNA sequencing technologies and the ensuing ease and affordability of whole-genome sequencing are reviving the use of classical genetics to investigate early C. elegans embryogenesis.
Swarm formation control utilizing elliptical surfaces and limiting functions.
Barnes, Laura E; Fields, Mary Anne; Valavanis, Kimon P
2009-12-01
In this paper, we present a strategy for organizing swarms of unmanned vehicles into a formation by utilizing artificial potential fields that were generated from normal and sigmoid functions. These functions construct the surface on which swarm members travel, controlling the overall swarm geometry and the individual member spacing. Nonlinear limiting functions are defined to provide tighter swarm control by modifying and adjusting a set of control variables that force the swarm to behave according to set constraints, formation, and member spacing. The artificial potential functions and limiting functions are combined to control swarm formation, orientation, and swarm movement as a whole. Parameters are chosen based on desired formation and user-defined constraints. This approach is computationally efficient and scales well to different swarm sizes, to heterogeneous systems, and to both centralized and decentralized swarm models. Simulation results are presented for a swarm of 10 and 40 robots that follow circle, ellipse, and wedge formations. Experimental results are included to demonstrate the applicability of the approach on a swarm of four custom-built unmanned ground vehicles (UGVs).
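The sketch below illustrates the flavour of this construction for a single swarm member (the potential, its parameters and the tanh saturation are illustrative stand-ins, not the paper's exact normal/sigmoid surfaces or limiting functions): a Gaussian well centred on an elliptical contour attracts members toward the formation, and a saturating limiting function caps the commanded speed.

```python
import numpy as np

def potential(p, a=4.0, b=2.0, w=0.5):
    """Well whose minimum lies on the ellipse (x/a)^2 + (y/b)^2 = 1."""
    r = np.sqrt((p[0] / a) ** 2 + (p[1] / b) ** 2)
    return 1.0 - np.exp(-((r - 1.0) / w) ** 2)

def gradient(p, eps=1e-5):
    """Central-difference gradient of the potential surface."""
    g = np.zeros(2)
    for i in range(2):
        dp = np.zeros(2); dp[i] = eps
        g[i] = (potential(p + dp) - potential(p - dp)) / (2 * eps)
    return g

def limited_velocity(p, v_max=1.0):
    """Descend the potential, with a tanh limiting function capping the speed."""
    g = gradient(p)
    speed = np.linalg.norm(g)
    if speed == 0.0:
        return np.zeros(2)
    return -v_max * np.tanh(speed) * g / speed

print(limited_velocity(np.array([5.0, 1.0])))  # velocity command for one member
```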
NASA Astrophysics Data System (ADS)
Petržala, Jaromír
2018-07-01
The knowledge of the emission function of a city is crucial for the simulation of sky glow in its vicinity. Indirect methods to retrieve this function from radiances measured over a part of the sky have recently been developed. In principle, such methods represent an ill-posed inverse problem. This paper presents a theoretical feasibility study of various approaches to solving this inverse problem. In particular, it tests the fitness of various stabilizing functionals within Tikhonov regularization. Further, the L-curve and generalized cross-validation methods were investigated as indicators of an optimal regularization parameter. First, we created a theoretical model for the calculation of the sky spectral radiance in the form of a functional of the emission spectral radiance. Subsequently, all the mentioned approaches were examined in numerical experiments with synthetic data generated for a fictitious city and perturbed by random errors. The results demonstrate that second-order Tikhonov regularization, together with the choice of the regularization parameter by the L-curve maximum-curvature criterion, provides solutions that are in good agreement with the assumed model emission functions.
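A compact sketch of the kind of second-order Tikhonov scheme discussed above (the forward operator A, which maps the discretized emission function to the measured sky radiances, is assumed to be supplied as a matrix; it is not constructed here):

```python
import numpy as np

def second_difference(n):
    """Second-order finite-difference operator used as the stabilizing functional."""
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    return L

def tikhonov_solve(A, y, lam):
    """Minimize ||A x - y||^2 + lam^2 ||L x||^2 via the normal equations."""
    L = second_difference(A.shape[1])
    return np.linalg.solve(A.T @ A + lam ** 2 * (L.T @ L), A.T @ y)

def l_curve_points(A, y, lambdas):
    """Residual norm vs. solution seminorm for each lambda; the corner of the
    log-log curve (maximum curvature) indicates the preferred regularization."""
    L = second_difference(A.shape[1])
    pts = []
    for lam in lambdas:
        x = tikhonov_solve(A, y, lam)
        pts.append((np.linalg.norm(A @ x - y), np.linalg.norm(L @ x)))
    return np.array(pts)
```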
Davidsson, Marcus; Diaz-Fernandez, Paula; Schwich, Oliver D.; Torroba, Marcos; Wang, Gang; Björklund, Tomas
2016-01-01
Detailed characterization and mapping of oligonucleotide function in vivo is generally a very time consuming effort that only allows for hypothesis driven subsampling of the full sequence to be analysed. Recent advances in deep sequencing together with highly efficient parallel oligonucleotide synthesis and cloning techniques have, however, opened up for entirely new ways to map genetic function in vivo. Here we present a novel, optimized protocol for the generation of universally applicable, barcode labelled, plasmid libraries. The libraries are designed to enable the production of viral vector preparations assessing coding or non-coding RNA function in vivo. When generating high diversity libraries, it is a challenge to achieve efficient cloning, unambiguous barcoding and detailed characterization using low-cost sequencing technologies. With the presented protocol, diversity of above 3 million uniquely barcoded adeno-associated viral (AAV) plasmids can be achieved in a single reaction through a process achievable in any molecular biology laboratory. This approach opens up for a multitude of in vivo assessments from the evaluation of enhancer and promoter regions to the optimization of genome editing. The generated plasmid libraries are also useful for validation of sequencing clustering algorithms and we here validate the newly presented message passing clustering process named Starcode. PMID:27874090
Global optimization framework for solar building design
NASA Astrophysics Data System (ADS)
Silva, N.; Alves, N.; Pascoal-Faria, P.
2017-07-01
The generative modeling paradigm is a shift from static models to flexible models. It describes a modeling process using functions, methods and operators. The result is an algorithmic description of the construction process. Each evaluation of such an algorithm creates a model instance, which depends on its input parameters (width, height, volume, roof angle, orientation, location). These values are normally chosen according to aesthetic aspects and style. In this study, the model's parameters are automatically generated according to an objective function. A generative model can be optimized over its parameters; in this way, the best solution for a constrained problem is determined. Besides the establishment of an overall framework design, this work consists of the identification of different building shapes and their main parameters, the creation of an algorithmic description for these main shapes, and the formulation of an objective function reflecting a building's energy consumption (solar energy, heating and insulation). Additionally, the conception of an optimization pipeline, combining an energy calculation tool with a geometric scripting engine, is presented. The methods developed lead to automated and optimized 3D shape generation for the projected building (based on the desired conditions and according to specific constraints). The proposed approach will help in the construction of real buildings that consume less energy and contribute to a more sustainable world.
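A toy version of the pipeline may clarify the structure (every function and coefficient below is an invented placeholder; the actual framework couples a geometric scripting engine with an energy calculation tool): a parametric generative model produces geometric quantities, and an objective function over those quantities is minimized.

```python
import numpy as np
from scipy.optimize import minimize

def generate_building(params):
    """Generative model: parameters -> geometric quantities of a simple box building."""
    width, depth, height, roof_angle = params
    envelope = 2.0 * height * (width + depth) + width * depth / np.cos(np.radians(roof_angle))
    south_glazing = 0.3 * width * height          # assumed fixed glazing ratio
    return envelope, south_glazing

def energy_objective(params):
    """Placeholder objective: envelope heat losses minus passive solar gains."""
    envelope, south_glazing = generate_building(params)
    return 1.2 * envelope - 0.8 * south_glazing   # invented coefficients

result = minimize(energy_objective, x0=[10.0, 8.0, 3.0, 30.0],
                  bounds=[(6, 20), (6, 20), (2.5, 6), (10, 45)])
print(result.x)  # optimized width, depth, height and roof angle
```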
Phase dilemma in natural orbital functional theory from the N-representability perspective
NASA Astrophysics Data System (ADS)
Mitxelena, Ion; Rodriguez-Mayorga, Mauricio; Piris, Mario
2018-06-01
Any rigorous approach to first-order reduced density matrix (Γ) functional theory faces the phase dilemma, that is, having to deal with a large number of possible sign combinations in the electron-electron interaction-energy terms. This problem was discovered by reducing a ground-state energy generated from an approximate N-particle wavefunction into a functional of Γ, known as the top-down method. Here, we show that the phase dilemma also appears in the bottom-up method, in which the functional E[Γ] is generated by progressive inclusion of N-representability conditions on the reconstructed two-particle reduced density matrix. It is shown that an adequate choice of signs is essential to accurately describe model systems with strong non-dynamic (static) electron correlation, specifically, the one-dimensional Hubbard model with periodic boundary conditions and hydrogen rings. For the latter, the Piris natural orbital functional 7 (PNOF7), with phases equal to -1 for the inter-pair energy terms containing the exchange-time-inversion integrals, agrees with exact diagonalization results.
Adaptive skin segmentation via feature-based face detection
NASA Astrophysics Data System (ADS)
Taylor, Michael J.; Morris, Tim
2014-05-01
Variations in illumination can have significant effects on the apparent colour of skin, which can be damaging to the efficacy of any colour-based segmentation approach. We attempt to overcome this issue by presenting a new adaptive approach, capable of generating skin colour models at run-time. Our approach adopts a Viola-Jones feature-based face detector, in a moderate-recall, high-precision configuration, to sample faces within an image, with an emphasis on avoiding potentially detrimental false positives. From these samples, we extract a set of pixels that are likely to be from skin regions, filter them according to their relative luma values in an attempt to eliminate typical non-skin facial features (eyes, mouths, nostrils, etc.), and hence establish a set of pixels that we can be confident represent skin. Using this representative set, we train a unimodal Gaussian function to model the skin colour in the given image in the normalised rg colour space - a combination of modelling approach and colour space that benefits us in a number of ways. A generated function can subsequently be applied to every pixel in the given image, and, hence, the probability that any given pixel represents skin can be determined. Segmentation of the skin, therefore, can be as simple as applying a binary threshold to the calculated probabilities. In this paper, we touch upon a number of existing approaches, describe the methods behind our new system, present the results of its application to arbitrary images of people with detectable faces, which we have found to be extremely encouraging, and investigate its potential to be used as part of real-time systems.
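A minimal sketch of the run-time colour model (face detection and the luma-based filtering of eyes, mouths and nostrils are assumed to have already produced `skin_pixels`, an N x 3 array of RGB samples; names and the example threshold are illustrative):

```python
import numpy as np

def to_rg(rgb):
    """Normalised rg chromaticity: r = R/(R+G+B), g = G/(R+G+B)."""
    rgb = np.asarray(rgb, dtype=float)
    return (rgb / (rgb.sum(axis=-1, keepdims=True) + 1e-9))[..., :2]

def fit_skin_model(skin_pixels):
    """Unimodal Gaussian over the sampled skin pixels in rg space."""
    rg = to_rg(skin_pixels)
    mean = rg.mean(axis=0)
    cov = np.cov(rg, rowvar=False) + 1e-6 * np.eye(2)   # regularized covariance
    return mean, np.linalg.inv(cov), np.linalg.det(cov)

def skin_probability(image, model):
    """Per-pixel Gaussian likelihood; threshold it to obtain a binary skin mask."""
    mean, cov_inv, cov_det = model
    d = to_rg(image.reshape(-1, 3)) - mean
    log_p = -0.5 * np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return (np.exp(log_p) / (2.0 * np.pi * np.sqrt(cov_det))).reshape(image.shape[:2])

# mask = skin_probability(frame, fit_skin_model(skin_pixels)) > 0.5  # example threshold
```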
Automated and model-based assembly of an anamorphic telescope
NASA Astrophysics Data System (ADS)
Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter
2018-02-01
Since the first use of optical glasses there has been an increasing demand for optical systems which are highly customized for a wide range of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray-tracing is presented. This process runs autonomously and accounts for a wide range of functionality. It first identifies the sequence for an optimized assembly and then generates and matches intermediate figures of merit to predict the overall optical functionality of the system. The process also generates a digital twin of the optical system by mapping key performance indicators, such as the first and second moments of intensity, into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring and mapping the key performance indicators into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.
Translation-aware semantic segmentation via conditional least-square generative adversarial networks
NASA Astrophysics Data System (ADS)
Zhang, Mi; Hu, Xiangyun; Zhao, Like; Pang, Shiyan; Gong, Jinqi; Luo, Min
2017-10-01
Semantic segmentation has recently made rapid progress in the fields of remote sensing and computer vision. However, many leading approaches cannot simultaneously translate label maps to possible source images when only a limited number of training images is available. The core issues are insufficient adversarial information to capture the inverse mapping and the lack of a proper objective loss function to overcome the vanishing-gradient problem. We propose the use of conditional least-squares generative adversarial networks (CLS-GAN) to delineate visual objects and solve these problems. We trained the CLS-GAN network for semantic segmentation to discriminate dense prediction information coming either from training images or from generative networks. We show that the optimal objective function of CLS-GAN belongs to a special class of f-divergences and yields a generator that lies on the decision boundary of the discriminator, which mitigates the vanishing-gradient problem. We also demonstrate the effectiveness of the proposed architecture at translating images from label maps in the learning process. Experiments on a limited number of high-resolution images, including close-range and remote sensing datasets, indicate that the proposed method leads to improved semantic segmentation accuracy and can simultaneously generate high-quality images from label maps.
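The least-squares adversarial objective underlying CLS-GAN can be written framework-agnostically as plain functions of the discriminator outputs; the 0/1 target labels below follow the common least-squares GAN convention and may differ from the paper's exact coding.

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    """Push outputs on real samples toward 1 and on generated samples toward 0."""
    return 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake ** 2)

def generator_loss(d_fake):
    """Push the discriminator's output on generated samples toward 1."""
    return 0.5 * np.mean((d_fake - 1.0) ** 2)
```

Because the penalty is quadratic rather than a saturating cross-entropy, generated samples that are classified correctly but lie far from the decision boundary still produce gradients, which is the property exploited above to limit vanishing gradients.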
Discrimination Power of Polynomial-Based Descriptors for Graphs by Using Functional Matrices.
Dehmer, Matthias; Emmert-Streib, Frank; Shi, Yongtang; Stefu, Monica; Tripathi, Shailesh
2015-01-01
In this paper, we study the discrimination power of graph measures that are based on graph-theoretical matrices. The paper generalizes the work of [M. Dehmer, M. Moosbrugger. Y. Shi, Encoding structural information uniquely with polynomial-based descriptors by employing the Randić matrix, Applied Mathematics and Computation, 268(2015), 164-168]. We demonstrate that by using the new functional matrix approach, exhaustively generated graphs can be discriminated more uniquely than shown in the mentioned previous work.
Assessment of cockpit interface concepts for data link retrofit
NASA Technical Reports Server (NTRS)
Mccauley, Hugh W.; Miles, William L.; Dwyer, John P.; Erickson, Jeffery B.
1992-01-01
The problem is examined of retrofitting older generation aircraft with data link capability. The approach taken analyzes requirements for the cockpit interface, based on review of prior research and opinions obtained from subject matter experts. With this background, essential functions and constraints for a retrofit installation are defined. After an assessment of the technology available to meet the functions and constraints, candidate design concepts are developed. The most promising design concept is described in detail. Finally, needs for further research and development are identified.
Probability density cloud as a geometrical tool to describe statistics of scattered light.
Yaitskova, Natalia
2017-04-01
First-order statistics of scattered light is described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in a closed form through these parameters. An example of exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.
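As general background for the example cited above (standard textbook form for an independent sum, not necessarily the paper's parametrization): if $X = G + E$ with $G\sim\mathcal N(\mu,\sigma^2)$ and $E\sim\mathrm{Exp}(\lambda)$, the moment-generating function of the exponentially modified normal variable factorizes as

$$M_X(t) \;=\; \exp\!\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right)\frac{\lambda}{\lambda - t}, \qquad t < \lambda.$$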
Using Evolutionary Theory to Guide Mental Health Research.
Durisko, Zachary; Mulsant, Benoit H; McKenzie, Kwame; Andrews, Paul W
2016-03-01
Evolutionary approaches to medicine can shed light on the origins and etiology of disease. Such an approach may be especially useful in psychiatry, which frequently addresses conditions with heterogeneous presentation and unknown causes. We review several previous applications of evolutionary theory that highlight the ways in which psychiatric conditions may persist despite and because of natural selection. One lesson from the evolutionary approach is that some conditions currently classified as disorders (because they cause distress and impairment) may actually be caused by functioning adaptations operating "normally" (as designed by natural selection). Such conditions suggest an alternative illness model that may generate alternative intervention strategies. Thus, the evolutionary approach suggests that psychiatry should sometimes think differently about distress and impairment. The complexity of the human brain, including normal functioning and potential for dysfunctions, has developed over evolutionary time and has been shaped by natural selection. Understanding the evolutionary origins of psychiatric conditions is therefore a crucial component to a complete understanding of etiology. © The Author(s) 2016.
Xi, Jianing; Wang, Minghui; Li, Ao
2017-09-26
The accumulating availability of next-generation sequencing data offers an opportunity to pinpoint driver genes that are causally implicated in oncogenesis through computational models. Despite previous efforts made regarding this challenging problem, there is still room for improvement in driver gene identification accuracy. In this paper, we propose a novel integrated approach called IntDriver for prioritizing driver genes. Based on a matrix factorization framework, IntDriver can effectively incorporate functional information from both the interaction network and Gene Ontology similarity, and detect driver genes mutated in different sets of patients at the same time. When evaluated against known benchmark driver genes, the top-ranked genes in our results show highly significant enrichment for the known genes. Meanwhile, IntDriver also detects some known driver genes that are not found by the other competing approaches. When measured by precision, recall and F1 score, the performance of our approach is comparable to or better than that of the competing approaches.
Lanczos algorithm with matrix product states for dynamical correlation functions
NASA Astrophysics Data System (ADS)
Dargel, P. E.; Wöllert, A.; Honecker, A.; McCulloch, I. P.; Schollwöck, U.; Pruschke, T.
2012-05-01
The density-matrix renormalization group (DMRG) algorithm can be adapted to the calculation of dynamical correlation functions in various ways which all represent compromises between computational efficiency and physical accuracy. In this paper we reconsider the oldest approach based on a suitable Lanczos-generated approximate basis and implement it using matrix product states (MPS) for the representation of the basis states. The direct use of matrix product states combined with an ex post reorthogonalization method allows us to avoid several shortcomings of the original approach, namely the multitargeting and the approximate representation of the Hamiltonian inherent in earlier Lanczos-method implementations in the DMRG framework, and to deal with the ghost problem of Lanczos methods, leading to a much better convergence of the spectral weights and poles. We present results for the dynamic spin structure factor of the spin-1/2 antiferromagnetic Heisenberg chain. A comparison to Bethe ansatz results in the thermodynamic limit reveals that the MPS-based Lanczos approach is much more accurate than earlier approaches at minor additional numerical cost.
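The underlying recursion is the ordinary Lanczos tridiagonalization followed by a continued-fraction evaluation of the Green's function; the dense-matrix toy below (not the MPS implementation, and without the reorthogonalization discussed above) shows the structure.

```python
import numpy as np

def lanczos(H, phi0, n_steps):
    """Tridiagonal coefficients a_n, b_n from |phi0> = O|gs> (assumes no early breakdown)."""
    a, b = [], []
    q_prev, beta = np.zeros_like(phi0), 0.0
    q = phi0 / np.linalg.norm(phi0)
    for _ in range(n_steps):
        w = H @ q - beta * q_prev
        alpha = np.vdot(q, w).real
        w -= alpha * q
        beta = np.linalg.norm(w)
        a.append(alpha); b.append(beta)
        q_prev, q = q, w / beta
    return np.array(a), np.array(b[:-1])

def spectral_function(omega, eta, a, b, norm2):
    """Continued fraction for A(w) = -Im <phi0|(w + i*eta - H)^{-1}|phi0> / pi."""
    z = omega + 1j * eta
    g = 0.0
    for alpha, beta in zip(a[:0:-1], b[::-1]):   # innermost level first
        g = beta ** 2 / (z - alpha - g)
    return -norm2 * (1.0 / (z - a[0] - g)).imag / np.pi
```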
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh
1998-01-01
An integrated probabilistic approach has been developed to assess composites for high temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. This approach quantified the influences of uncertainties inherent in constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 F. The quantitative information is presented in the form of Cumulative Distribution Functions (CDFs), Probability Density Functions (PDFs), and primitive-variable sensitivities of the response. Results indicate that the scatters in response variables were reduced by 30-50% when the uncertainties in the primitive variables that showed the most influence were reduced by 50%.
Ketelaar, Sarah M; Nieuwenhuijsen, Karen; Bolier, Linda; Smeets, Odile; Sluiter, Judith K
2014-12-01
Mental health complaints are quite common in health care employees and can have adverse effects on work functioning. The aim of this study was to evaluate an e-mental health (EMH) approach to workers' health surveillance (WHS) for nurses and allied health professionals. Using the waiting-list group of a previous randomized controlled trial with high dropout and low compliance to the intervention, we studied the pre- and posteffects of the EMH approach in a larger group of participants. We applied a pretest-posttest study design. The WHS consisted of online screening on impaired work functioning and mental health followed by online automatically generated personalized feedback, online tailored advice, and access to self-help EMH interventions. The effects on work functioning, stress, and work-related fatigue after 3 months were analyzed using paired t tests and effect sizes. One hundred and twenty-eight nurses and allied health professionals participated at pretest as well as posttest. Significant improvements were found on work functioning (p = 0.01) and work-related fatigue (p < 0.01). Work functioning had relevantly improved in 30% of participants. A small meaningful effect on stress was found (Cohen d = .23) in the participants who had logged onto an EMH intervention (20%, n = 26). The EMH approach to WHS improves the work functioning and mental health of nurses and allied health professionals. However, because we found small effects and participation in the offered EMH interventions was low, there is ample room for improvement.
Convergent close-coupling approach to positron scattering on He+
NASA Astrophysics Data System (ADS)
Rawlins, Charlie M.; Kadyrov, Alisher S.; Bray, Igor
2018-05-01
A close-coupling method is used to generate electron-loss and total scattering cross sections for the first three partial waves with both a single-centre and a two-centre expansion of the scattering wave function for positron scattering on He+. The two expansions are consistent with each other above the ionisation threshold, verifying the newly developed positronium-formation matrix elements. Below the positronium-formation threshold both the single- and two-centre results agree with the elastic-scattering cross sections generated from the phase shifts reported in previous calculations.
Generation of stable human cell lines with Tetracycline-inducible (Tet-on) shRNA or cDNA expression.
Gomez-Martinez, Marta; Schmitz, Debora; Hergovich, Alexander
2013-03-05
A major approach in the field of mammalian cell biology is the manipulation of the expression of genes of interest in selected cell lines, with the aim to reveal one or several of the gene's function(s) using transient/stable overexpression or knockdown of the gene of interest. Unfortunately, for various cell biological investigations this approach is unsuitable when manipulations of gene expression result in cell growth/proliferation defects or unwanted cell differentiation. Therefore, researchers have adapted the Tetracycline repressor protein (TetR), taken from the E. coli tetracycline resistance operon(1), to generate very efficient and tight regulatory systems to express cDNAs in mammalian cells(2,3). In short, TetR has been modified to either (1) block initiation of transcription by binding to the Tet-operator (TO) in the promoter region upon addition of tetracycline (termed Tet-off system) or (2) bind to the TO in the absence of tetracycline (termed Tet-on system) (Figure 1). Given the inconvenience that the Tet-off system requires the continuous presence of tetracycline (which has a half-life of about 24 hr in tissue cell culture medium) the Tet-on system has been more extensively optimized, resulting in the development of very tight and efficient vector systems for cDNA expression as used here. Shortly after establishment of RNA interference (RNAi) for gene knockdown in mammalian cells(4), vectors expressing short-hairpin RNAs (shRNAs) were described that function very similar to siRNAs(5-11). However, these shRNA-mediated knockdown approaches have the same limitation as conventional knockout strategies, since stable depletion is not feasible when gene targets are essential for cellular survival. To overcome this limitation, van de Wetering et al.(12) modified the shRNA expression vector pSUPER(5) by inserting a TO in the promoter region, which enabled them to generate stable cell lines with tetracycline-inducible depletion of their target genes of interest. Here, we describe a method to efficiently generate stable human Tet-on cell lines that reliably drive either inducible overexpression or depletion of the gene of interest. Using this method, we have successfully generated Tet-on cell lines which significantly facilitated the analysis of the MST/hMOB/NDR cascade in centrosome(13,14) and apoptosis signaling(15,16). In this report, we describe our vectors of choice, in addition to describing the two consecutive manipulation steps that are necessary to efficiently generate human Tet-on cell lines (Figure 2). Moreover, besides outlining a protocol for the generation of human Tet-on cell lines, we will discuss critical aspects regarding the technical procedures and the characterization of Tet-on cells.
An Alternative to the Gauge Theoretic Setting
NASA Astrophysics Data System (ADS)
Schroer, Bert
2011-10-01
The standard formulation of quantum gauge theories results from the Lagrangian (functional integral) quantization of classical gauge theories. A more intrinsic quantum-theoretical approach in the spirit of Wigner's representation theory shows that there is a fundamental clash between the pointlike localization of zero-mass (vector, tensor) potentials and the Hilbert space (positivity, unitarity) structure of quantum theory. The quantization approach has no other way than to stay with pointlike localization and sacrifice the Hilbert space, whereas the approach built on the intrinsic quantum concept of modular localization keeps the Hilbert space and trades the conflict-creating pointlike generation for the tightest consistent localization: semi-infinite spacelike string localization. Whereas these potentials in the presence of interactions stay quite close to associated pointlike field strengths, the interacting matter fields to which they are coupled bear the brunt of the nonlocal aspect in that they are string-generated in a way which cannot be undone by any differentiation. The new stringlike approach to gauge theory also revives the idea of a Schwinger-Higgs screening mechanism as a deeper and less metaphoric description of the Higgs spontaneous symmetry breaking and its accompanying tale about "God's particle" and its mass generation for all the other particles.
A multi-criteria approach to camera motion design for volume data animation.
Hsu, Wei-Hsien; Zhang, Yubo; Ma, Kwan-Liu
2013-12-01
We present an integrated camera motion design and path generation system for building volume data animations. Creating animations is an essential task in presenting complex scientific visualizations. Existing visualization systems use an established animation function based on keyframes selected by the user. This approach is limited in providing the optimal in-between views of the data. Alternatively, camera motion planning in computer graphics and virtual reality is frequently focused on collision-free movement in a virtual walkthrough. For semi-transparent, fuzzy, or blobby volume data the collision-free objective becomes insufficient. Here, we provide a set of essential criteria focused on computing camera paths to establish effective animations of volume data. Our dynamic multi-criteria solver coupled with a force-directed routing algorithm enables rapid generation of camera paths. Once users review the resulting animation and evaluate the camera motion, they are able to determine how each criterion impacts path generation. In this paper, we demonstrate how incorporating this animation approach with an interactive volume visualization system reduces the effort in creating context-aware and coherent animations. This frees the user to focus on visualization tasks with the objective of gaining additional insight from the volume data.
Deng, Zhimin; Tian, Tianhai
2014-07-29
Advances in systems biology have produced a large number of sophisticated mathematical models for describing the dynamic properties of complex biological systems. One of the major steps in developing mathematical models is to estimate unknown parameters of the model based on experimentally measured quantities. However, experimental conditions limit the amount of data that is available for mathematical modelling. The number of unknown parameters in mathematical models may be larger than the number of observations. The imbalance between the number of experimental observations and the number of unknown parameters makes reverse-engineering problems particularly challenging. To address the issue of inadequate experimental data, we propose a continuous optimization approach for making reliable inference of model parameters. This approach first uses spline interpolation to generate continuous functions of the system dynamics as well as the first- and second-order derivatives of these continuous functions. The expanded dataset is the basis for inferring unknown model parameters using various continuous optimization criteria, based on the error of the simulation only, the error of both the simulation and the first derivative, or the error of the simulation together with the first and second derivatives. We use three case studies to demonstrate the accuracy and reliability of the proposed new approach. Compared with the corresponding discrete criteria using experimental data at the measurement time points only, numerical results of the ERK kinase activation module show that the continuous absolute-error criteria using both the function and higher-order derivatives generate estimates with better accuracy. This result is also supported by the second and third case studies for the G1/S transition network and the MAP kinase pathway, respectively. This suggests that the continuous absolute-error criteria lead to more accurate estimates than the corresponding discrete criteria. We also study the robustness of these three models to examine the reliability of the estimates. Simulation results show that the models with parameters estimated using continuous fitness functions have better robustness properties than those using the corresponding discrete fitness functions. The inference studies and robustness analysis suggest that the proposed continuous optimization criteria are effective and robust for estimating unknown parameters in mathematical models.
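As an illustration of the continuous-criterion idea described above, the following sketch (not the authors' code) spline-interpolates sparse measurements of a hypothetical one-state model, evaluates the interpolant and its first derivative on a dense grid, and fits the unknown rate constants by minimizing an absolute-error objective that combines the simulation error and the first-derivative error. The model, data and parameter names are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the continuous-criterion idea:
# spline-interpolate sparse measurements, then fit model parameters against
# the interpolated trajectory and its first derivative.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.integrate import odeint
from scipy.optimize import minimize

# Hypothetical one-state model dx/dt = k1*(1 - x) - k2*x with unknown k1, k2.
def model_rhs(x, t, k1, k2):
    return k1 * (1.0 - x) - k2 * x

t_meas = np.array([0.0, 2.0, 5.0, 10.0, 20.0])        # sparse measurement times
x_meas = np.array([0.0, 0.35, 0.60, 0.75, 0.80])       # measured state (illustrative)

spline = CubicSpline(t_meas, x_meas)
t_dense = np.linspace(t_meas[0], t_meas[-1], 200)      # expanded dataset
x_cont, dx_cont = spline(t_dense), spline(t_dense, 1)  # function and first derivative

def continuous_abs_error(params):
    k1, k2 = params
    x_sim = odeint(model_rhs, x_meas[0], t_dense, args=(k1, k2)).ravel()
    dx_sim = model_rhs(x_sim, t_dense, k1, k2)
    # absolute error of the simulation plus error of the first derivative
    return np.mean(np.abs(x_sim - x_cont)) + np.mean(np.abs(dx_sim - dx_cont))

fit = minimize(continuous_abs_error, x0=[0.1, 0.1], method="Nelder-Mead")
print("estimated (k1, k2):", fit.x)
```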
Test aspects of the JPL Viterbi decoder
NASA Technical Reports Server (NTRS)
Breuer, M. A.
1989-01-01
The generation of test vectors and design-for-test aspects of the Jet Propulsion Laboratory (JPL) Very Large Scale Integration (VLSI) Viterbi decoder chip is discussed. Each processor integrated circuit (IC) contains over 20,000 gates. To achieve a high degree of testability, a scan architecture is employed. The logic has been partitioned so that very few test vectors are required to test the entire chip. In addition, since several blocks of logic are replicated numerous times on this chip, test vectors need only be generated for each block, rather than for the entire circuit. These unique blocks of logic have been identified and test sets generated for them. The approach employed for testing was to use pseudo-exhaustive test vectors whenever feasible. That is, each cone of logic is tested exhaustively. Using this approach, no detailed logic design or fault model is required. All faults which modify the function of a block of combinational logic are detected, such as all irredundant single and multiple stuck-at faults.
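The following sketch illustrates the pseudo-exhaustive principle on a hypothetical 4-input cone of combinational logic: every input pattern is applied, so any fault that alters the cone's truth table is detected without a structural fault model. The cone function and the injected stuck-at fault are illustrative and are not taken from the JPL design.

```python
# Illustrative sketch of pseudo-exhaustive testing: each cone of combinational
# logic is exercised with all 2^n input patterns, so any fault that alters the
# cone's truth table is detected without needing a structural fault model.
from itertools import product

def cone_reference(a, b, c, d):
    # Hypothetical 4-input cone: (a AND b) XOR (c OR d)
    return (a & b) ^ (c | d)

def test_cone_exhaustively(cone_under_test, n_inputs=4):
    failures = []
    for pattern in product((0, 1), repeat=n_inputs):
        expected = cone_reference(*pattern)
        observed = cone_under_test(*pattern)
        if observed != expected:
            failures.append(pattern)
    return failures

# Example: a faulty cone with input 'd' stuck-at-0.
faulty = lambda a, b, c, d: cone_reference(a, b, c, 0)
print("patterns that detect the fault:", test_cone_exhaustively(faulty))
```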
Designer nanoparticle: nanobiotechnology tool for cell biology
NASA Astrophysics Data System (ADS)
Thimiri Govinda Raj, Deepak B.; Khan, Niamat Ali
2016-09-01
This article discusses the use of nanotechnology for subcellular compartment isolation and its application towards subcellular omics. This technology review significantly contributes to our understanding of the use of nanotechnology for subcellular systems biology. Here we elaborate on a nanobiotechnology approach that uses superparamagnetic nanoparticles (SPMNPs) optimized with different surface coatings for subcellular organelle isolation. Using a pulse-chase approach, we review how SPMNPs interact differently with the cell depending on their surface functionalization. The article focuses on the use of functionalized SPMNPs as a nanobiotechnology tool to isolate high-quality (both purity and yield) plasma membranes and endosomes or lysosomes. Such a nanobiotechnology tool can be applied to generate subcellular compartment inventories. As a future perspective, this strategy could be applied in areas such as immunology, cancer and stem cell research.
Sultan, Iyad; Senkal, Can E.; Ponnusamy, Suriyan; Bielawski, Jacek; Szulc, Zdzislaw; Bielawska, Alicja; Hannun, Yusuf A.; Ogretmen, Besim
2005-01-01
In the present study, the regulation of the sphingosine-recycling pathway in A549 human lung adenocarcinoma cells by oxidative stress was investigated. The generation of endogenous long-chain ceramide in response to exogenous C6-cer (C6-ceramide), which is FB1 (fumonisin B1)-sensitive, was employed to probe the sphingosine-recycling pathway. The data showed that ceramide formation via this pathway was significantly blocked by GSH and NAC (N-acetylcysteine) whereas it was enhanced by H2O2, as detected by both palmitate labelling and HPLC/MS. Similar data were also obtained using a novel approach that measures the incorporation of 17Sph (sphingosine containing 17 carbons) of 17C6-cer (C6-cer containing a 17Sph backbone) into long-chain 17C16-cer in cells by HPLC/MS, which was significantly decreased and increased in response to GSH and H2O2 respectively. TNF (tumour necrosis factor)-α, which decreases the levels of endogenous GSH, increased the generation of C16-cer in response to C6-cer, and this was blocked by exogenous GSH or NAC, or by the overexpression of TPx I (thioredoxin peroxidase I), an enzyme that reduces the generation of intracellular ROS (reactive oxygen species). Additional data showed that ROS regulated both the deacylation and reacylation steps of C6-cer. At a functional level, C6-cer inhibited the DNA-binding function of the c-Myc/Max oncogene. Inhibition of the generation of long-chain ceramide in response to C6-cer by FB1 or NAC significantly blocked the modulation of the c-Myc/Max function. These data demonstrate that the sphingosine-recycling pathway for the generation of endogenous long-chain ceramide in response to exogenous C6-cer is regulated by ROS, and plays an important biological role in controlling c-Myc function. PMID:16201965
NASA Astrophysics Data System (ADS)
Shabani, H.; Doblas, A.; Saavedra, G.; Preza, C.
2018-02-01
The restored images in structured illumination microscopy (SIM) can be affected by residual fringes due to a mismatch between the illumination pattern and the sinusoidal model assumed by the restoration method. When a Fresnel biprism is used to generate a structured pattern, this pattern cannot be described by a pure sinusoidal function since it is distorted by an envelope due to the biprism's edge. In this contribution, we have investigated the effect of the envelope on the restored SIM images and propose a computational method in order to address it. The proposed approach to reduce the effect of the envelope consists of two parts. First, the envelope of the structured pattern, determined through calibration data, is removed from the raw SIM data via a preprocessing step. In the second step, a notch filter is applied to the images, which are restored using the well-known generalized Wiener filter, to filter any residual undesired fringes. The performance of our approach has been evaluated numerically by simulating the effect of the envelope on synthetic forward images of a 6-μm spherical bead generated using the real pattern and then restored using the SIM approach that is based on an ideal pure sinusoidal function before and after our proposed correction method. The simulation result shows 74% reduction in the contrast of the residual pattern when the proposed method is applied. Experimental results from a pollen grain sample also validate the proposed approach.
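A minimal sketch of the two-step correction, under the assumption of a multiplicative envelope and a single residual fringe frequency (both hypothetical here): the raw frame is divided by a calibration envelope, and residual fringes are suppressed with a Gaussian notch filter in the Fourier domain. This is not the authors' implementation; the fringe frequency and filter width are placeholders.

```python
# Hedged sketch of the two-step correction described above: (1) divide the raw
# SIM frame by the calibration envelope, (2) apply a notch filter in the Fourier
# domain at the (hypothetical) residual fringe frequency.  Not the authors' code.
import numpy as np

def remove_envelope(raw_frame, envelope, eps=1e-6):
    # Step 1: undo the biprism envelope estimated from calibration data.
    return raw_frame / np.maximum(envelope, eps)

def notch_filter(image, fx0, fy0, sigma=3.0):
    # Step 2: suppress residual fringes at spatial frequency (+/-fx0, +/-fy0).
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None] * ny
    fx = np.fft.fftfreq(nx)[None, :] * nx
    notch = np.ones((ny, nx))
    for sx, sy in [(+1, +1), (-1, -1), (+1, -1), (-1, +1)]:
        notch *= 1.0 - np.exp(-((fx - sx * fx0) ** 2 + (fy - sy * fy0) ** 2)
                              / (2.0 * sigma ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * notch))

# Usage on a synthetic frame with a trivial (unit) envelope and a fringe at fx0=12:
rng = np.random.default_rng(0)
frame = rng.random((128, 128))
corrected = notch_filter(remove_envelope(frame, np.ones((128, 128))), fx0=12, fy0=0)
```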
Tirone, Felice; Farioli-Vecchioli, Stefano; Micheli, Laura; Ceccarelli, Manuela; Leonardi, Luca
2013-01-01
Within the hippocampal circuitry, the basic function of the dentate gyrus is to transform the memory input coming from the entorhinal cortex into sparse and categorized outputs to CA3, in this way separating related memory information. New neurons generated in the dentate gyrus during adulthood appear to facilitate this process, allowing a better separation between closely spaced memories (pattern separation). The evidence underlying this model has been gathered essentially by ablating the newly adult-generated neurons. This approach, however, does not allow monitoring of the integration of new neurons into memory circuits and is likely to set in motion compensatory circuits, possibly leading to an underestimation of the role of new neurons. Here we review the background of the basic function of the hippocampus and of the known properties of new adult-generated neurons. In this context, we analyze the cognitive performance in mouse models generated by us and others, with modified expression of the genes Btg2 (PC3/Tis21), Btg1, Pten, BMP4, etc., where new neurons underwent a change in their differentiation rate or a partial decrease of their proliferation or survival rate rather than ablation. The effects of these modifications are equal to or greater than those of full ablation, suggesting that the architecture of circuits, as it unfolds from the interaction between existing and new neurons, can have a greater functional impact than the sheer number of new neurons. We propose a model which attempts to measure and correlate the set of cellular changes in the process of neurogenesis with the memory function. PMID:23734097
An optimal strategy for functional mapping of dynamic trait loci.
Jin, Tianbo; Li, Jiahan; Guo, Ying; Zhou, Xiaojing; Yang, Runqing; Wu, Rongling
2010-02-01
As an emerging powerful approach for mapping quantitative trait loci (QTLs) responsible for dynamic traits, functional mapping models the time-dependent mean vector with biologically meaningful equations and is likely to generate biologically relevant and interpretable results. Given the autocorrelated nature of a dynamic trait, functional mapping requires models for the structure of the covariance matrix. In this article, we have provided a comprehensive set of approaches for modelling the covariance structure and incorporated each of these approaches into the framework of functional mapping. The Bayesian information criterion (BIC) values are used as a model selection criterion to choose the optimal combination of the submodels for the mean vector and covariance structure. In an example of leaf age growth from a rice molecular genetic project, the best submodel combination was found to be the Gaussian model for the correlation structure, the power equation of order 1 for the variance, and the power curve for the mean vector. Under this combination, several significant QTLs for leaf age growth trajectories were detected on different chromosomes. Our model is well suited to study the genetic architecture of dynamic traits of agricultural value.
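A generic sketch of the model-selection step only, not of functional mapping itself: two candidate covariance structures (a Gaussian correlation model and an AR(1) alternative, both with illustrative parameterizations) are fitted by maximum likelihood to placeholder longitudinal data and compared by BIC.

```python
# Generic sketch of the model-selection step: fit each candidate covariance
# structure by maximum likelihood and keep the one with the smallest BIC.
# The candidate structures and data here are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

def neg_loglik(params, data, cov_builder):
    n_times = data.shape[1]
    cov = cov_builder(params, n_times)
    mean = np.zeros(n_times)                      # mean model omitted for brevity
    return -np.sum(multivariate_normal.logpdf(data, mean=mean, cov=cov))

def gaussian_cov(params, n):                      # "Gaussian" correlation structure
    sigma2, rho = np.exp(params)                  # positivity via log-parameters
    lags = np.subtract.outer(np.arange(n), np.arange(n))
    return sigma2 * np.exp(-(lags ** 2) / (2.0 * rho ** 2)) + 1e-6 * np.eye(n)

def ar1_cov(params, n):                           # AR(1) alternative
    sigma2, phi = np.exp(params[0]), np.tanh(params[1])
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return sigma2 * phi ** lags + 1e-6 * np.eye(n)

def bic(data, cov_builder, x0):
    fit = minimize(neg_loglik, x0, args=(data, cov_builder), method="Nelder-Mead")
    k = len(x0)
    return 2.0 * fit.fun + k * np.log(data.size)  # BIC = 2*NLL + k*ln(N)

rng = np.random.default_rng(1)
traj = rng.standard_normal((50, 8))               # 50 individuals, 8 time points
scores = {name: bic(traj, builder, np.zeros(2))
          for name, builder in [("gaussian", gaussian_cov), ("ar1", ar1_cov)]}
print("selected structure:", min(scores, key=scores.get))
```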
Advanced Unstructured Grid Generation for Complex Aerodynamic Applications
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar
2010-01-01
A new approach for distribution of grid points on the surface and in the volume has been developed. In addition to the point and line sources of prior work, the new approach utilizes surface and volume sources for automatic curvature-based grid sizing and convenient point distribution in the volume. A new exponential growth function produces smoother and more efficient grids and provides superior control over the distribution of grid points in the field. All types of sources support anisotropic grid stretching, which not only improves the grid economy but also provides more accurate solutions for certain aerodynamic applications. The new approach does not require a three-dimensional background grid as in the previous methods. Instead, it makes use of an efficient bounding-box auxiliary medium for storing grid parameters defined by surface sources. The new approach is less memory-intensive and more efficient computationally. The grids generated with the new method either eliminate the need for adaptive grid refinement for a certain class of problems or provide high-quality initial grids that would enhance the performance of many adaptation methods.
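As a hedged illustration of how an exponential growth function can control point distribution away from a source, the sketch below grows the spacing geometrically from an initial wall spacing; it is a generic growth law with made-up parameters, not the specific source formulation of the paper.

```python
# Illustrative point-distribution sketch: grow the spacing geometrically away
# from a surface source, starting from a wall spacing ds0 and growth rate r.
# This is a generic exponential-growth law, not the paper's specific function.
import numpy as np

def exponential_distribution(ds0, growth_rate, total_length):
    points, s, ds = [0.0], 0.0, ds0
    while s + ds < total_length:
        s += ds
        points.append(s)
        ds *= growth_rate          # spacing grows by a constant factor per layer
    points.append(total_length)
    return np.array(points)

pts = exponential_distribution(ds0=0.01, growth_rate=1.2, total_length=1.0)
print(f"{len(pts)} points, first spacings: {np.diff(pts)[:3]}")
```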
Rapid and Programmable Protein Mutagenesis Using Plasmid Recombineering.
Higgins, Sean A; Ouonkap, Sorel V Y; Savage, David F
2017-10-20
Comprehensive and programmable protein mutagenesis is critical for understanding structure-function relationships and improving protein function. There is thus a need for robust and unbiased molecular biological approaches for the construction of the requisite comprehensive protein libraries. Here we demonstrate that plasmid recombineering is a simple and robust in vivo method for the generation of protein mutants for both comprehensive library generation as well as programmable targeting of sequence space. Using the fluorescent protein iLOV as a model target, we build a complete mutagenesis library and find it to be specific and comprehensive, detecting 99.8% of our intended mutations. We then develop a thermostability screen and utilize our comprehensive mutation data to rapidly construct a targeted and multiplexed library that identifies significantly improved variants, thus demonstrating rapid protein engineering in a simple protocol.
Metagenomics of Thermophiles with a Focus on Discovery of Novel Thermozymes
DeCastro, María-Eugenia; Rodríguez-Belmonte, Esther; González-Siso, María-Isabel
2016-01-01
Microbial populations living in environments with temperatures above 50°C (thermophiles) have been widely studied, increasing our knowledge of the composition and function of these ecological communities. Since these populations express a broad number of heat-resistant enzymes (thermozymes), they also represent an important source of novel biocatalysts that can potentially be used in industrial processes. The integrated study of the whole-community DNA from an environment, known as metagenomics, coupled with the development of next-generation sequencing (NGS) technologies, has allowed the generation of large amounts of data from thermophiles. In this review, we summarize the main approaches commonly utilized for assessing the taxonomic and functional diversity of thermophiles through metagenomics, including several bioinformatics tools and some metagenome-derived methods to isolate their thermozymes. PMID:27729905
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Yun, E-mail: ygao@yorku.ca; Hu, Naihong, E-mail: nhhu@math.ecnu.edu.cn; Zhang, Honglian, E-mail: hlzhangmath@shu.edu.cn
In this paper, we define the two-parameter quantum affine algebra for type G_2^{(1)} and give the (r, s)-Drinfeld realization of U_{r,s}(G_2^{(1)}), as well as establish and prove its Drinfeld isomorphism. We construct and verify explicitly the level-one vertex representation of the two-parameter quantum affine algebra U_{r,s}(G_2^{(1)}), which also provides evidence in nontwisted type G_2^{(1)} for the uniform defining approach via the two-parameter τ-invariant generating functions proposed in Hu and Zhang [Generating functions with τ-invariance and vertex representations of two-parameter quantum affine algebras U_{r,s}(ĝ): Simply laced cases, e-print http://arxiv.org/abs/1401.4925].
Discovery of cancer drug targets by CRISPR-Cas9 screening of protein domains.
Shi, Junwei; Wang, Eric; Milazzo, Joseph P; Wang, Zihua; Kinney, Justin B; Vakoc, Christopher R
2015-06-01
CRISPR-Cas9 genome editing technology holds great promise for discovering therapeutic targets in cancer and other diseases. Current screening strategies target CRISPR-Cas9-induced mutations to the 5' exons of candidate genes, but this approach often produces in-frame variants that retain functionality, which can obscure even strong genetic dependencies. Here we overcome this limitation by targeting CRISPR-Cas9 mutagenesis to exons encoding functional protein domains. This generates a higher proportion of null mutations and substantially increases the potency of negative selection. We also show that the magnitude of negative selection can be used to infer the functional importance of individual protein domains of interest. A screen of 192 chromatin regulatory domains in murine acute myeloid leukemia cells identifies six known drug targets and 19 additional dependencies. A broader application of this approach may allow comprehensive identification of protein domains that sustain cancer cells and are suitable for drug targeting.
Investigation of another approach in topology optimization
NASA Astrophysics Data System (ADS)
Krotkikh, A. A.; Maximov, P. V.
2018-05-01
The paper presents an investigation of another approach to topology optimization. The authors implemented a topology optimization method using ideas of the SIMP method created by Martin P. Bendsøe. There are many ways to formulate the objective function of topology optimization methods. In terms of elasticity theory, the objective function of the SIMP method is the compliance of an object, which should be minimized. The main idea of this paper is to avoid the filtering procedure of the SIMP method. Reformulating the problem as plain function minimization allows it to be solved by a wide variety of methods. The authors use the interior point method as implemented in Wolfram Mathematica. This approach can introduce side effects, which should be investigated to prevent their appearance in the future. A comparison of the results of the SIMP method and the suggested method is presented in the paper and analyzed.
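To make the "topology optimization as plain function minimization" idea concrete, the toy sketch below minimizes the compliance of a chain of springs in series with SIMP-style penalized stiffness k_i = x_i^p under a volume constraint. SLSQP is used here as a stand-in for the interior point method mentioned above, and all numbers (N, p, load, volume fraction) are illustrative assumptions rather than the authors' setup.

```python
# Toy sketch: minimize the compliance of N springs in series with SIMP-penalized
# stiffness k_i = x_i**p, subject to a volume constraint on the densities x_i.
# SLSQP stands in for the interior point method; all values are illustrative.
import numpy as np
from scipy.optimize import minimize

N, p, F, volume_fraction = 20, 3.0, 1.0, 0.5

def compliance(x):
    # springs in series under load F: u = F * sum(1/k_i), compliance C = F * u
    return F ** 2 * np.sum(x ** (-p))

constraints = [{"type": "ineq", "fun": lambda x: volume_fraction * N - np.sum(x)}]
bounds = [(1e-3, 1.0)] * N
x0 = np.full(N, volume_fraction)

result = minimize(compliance, x0, method="SLSQP", bounds=bounds,
                  constraints=constraints)
print("optimal densities (uniform by symmetry here):", np.round(result.x, 3))
```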
Nikdel, Ali; Braatz, Richard D; Budman, Hector M
2018-05-01
Dynamic flux balance analysis (DFBA) has become an instrumental modeling tool for describing the dynamic behavior of bioprocesses. DFBA involves the maximization of a biologically meaningful objective subject to kinetic constraints on the rate of consumption/production of metabolites. In this paper, we propose a systematic data-based approach for finding both the biological objective function and a minimum set of active constraints necessary for matching the model predictions to the experimental data. The proposed algorithm accounts for the errors in the experiments and eliminates the need for ad hoc choices of objective function and constraints as done in previous studies. The method is illustrated for two cases: (1) for in silico (simulated) data generated by a mathematical model for Escherichia coli and (2) for actual experimental data collected from the batch fermentation of Bordetella Pertussis (whooping cough).
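For context, the sketch below shows only the static flux-balance core that DFBA extends: maximize a biomass-like flux subject to steady-state mass balance S v = 0 and flux bounds. The two-metabolite, three-reaction network is purely illustrative and is not the authors' E. coli or B. pertussis model, nor their objective/constraint selection algorithm.

```python
# Minimal flux-balance sketch (the static core that DFBA extends): maximize a
# biomass-like flux subject to steady-state mass balance S v = 0 and flux
# bounds.  The 2-metabolite, 3-reaction network below is purely illustrative.
import numpy as np
from scipy.optimize import linprog

# Reactions: v1 (uptake -> A), v2 (A -> B), v3 (B -> biomass export)
S = np.array([[1.0, -1.0,  0.0],    # metabolite A balance
              [0.0,  1.0, -1.0]])   # metabolite B balance

c = np.array([0.0, 0.0, -1.0])      # linprog minimizes, so negate the objective
bounds = [(0.0, 10.0)] * 3          # uptake limit and non-negativity

sol = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes (v1, v2, v3):", sol.x)   # expect [10, 10, 10]
```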
Rotation of a Single Acetylene Molecule on Cu(001) by Tunneling Electrons in STM
NASA Astrophysics Data System (ADS)
Shchadilova, Yulia E.; Tikhodeev, Sergei G.; Paulsson, Magnus; Ueba, Hiromu
2013-11-01
We study the elementary processes behind one of the pioneering works on scanning tunneling microscope controlled reactions of single molecules [Stipe et al., Phys. Rev. Lett. 81, 1263 (1998)]. Using the Keldysh-Green function approach for the vibrational generation rate in combination with density functional theory calculations to obtain realistic parameters we reproduce the experimental rotation rate of an acetylene molecule on a Cu(100) surface as a function of bias voltage and tunneling current. This combined approach allows us to identify the reaction coordinate mode of the acetylene rotation and its anharmonic coupling with the C-H stretch mode. We show that three different elementary processes, the excitation of C-H stretch, the overtone ladder climbing of the hindered rotational mode, and the combination band excitation together explain the rotation of the acetylene molecule on Cu(100).
Molecular Screening Tools to Study Arabidopsis Transcription Factors
Wehner, Nora; Weiste, Christoph; Dröge-Laser, Wolfgang
2011-01-01
In the model plant Arabidopsis thaliana, more than 2000 genes are estimated to encode transcription factors (TFs), which clearly emphasizes the importance of transcriptional control. Although genomic approaches have generated large TF open reading frame (ORF) collections, only a limited number of these genes has been functionally characterized yet. This review evaluates strategies and methods to identify TF functions. In particular, we focus on two recently developed TF screening platforms, which make use of publicly available GATEWAY®-compatible ORF collections. (1) The Arabidopsis thaliana TF ORF over-Expression (AtTORF-Ex) library provides pooled collections of transgenic lines over-expressing HA-tagged TF genes, which are suited for screening approaches to define TF functions in stress defense and development. (2) A high-throughput, microtiter-plate-based protoplast trans-activation (PTA) system has been established to screen for TFs which regulate a given promoter:Luciferase construct in planta. PMID:22645547
Gene Function Hypotheses for the Campylobacter jejuni Glycome Generated by a Logic-Based Approach
Sternberg, Michael J.E.; Tamaddoni-Nezhad, Alireza; Lesk, Victor I.; Kay, Emily; Hitchen, Paul G.; Cootes, Adrian; van Alphen, Lieke B.; Lamoureux, Marc P.; Jarrell, Harold C.; Rawlings, Christopher J.; Soo, Evelyn C.; Szymanski, Christine M.; Dell, Anne; Wren, Brendan W.; Muggleton, Stephen H.
2013-01-01
Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning—the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. PMID:23103756
Pai, Priyadarshini P; Mondal, Sukanta
2017-01-01
Enzymes are biological catalysts that play an important role in determining the patterns of chemical transformations pertaining to life. Many milestones have been achieved in unraveling the mechanisms by which enzymes orchestrate various cellular processes using experimental and computational approaches. Experimental studies generating nearly all possible mutations of target enzymes have been aided by rapid computational approaches aimed at enzyme functional classification, understanding of domain organization, and functional site identification. The functional architecture, essentially, is involved in binding or interaction with ligands including substrates, products, cofactors and inhibitors, providing for their function, such as in catalysis, ligand-mediated cell signaling, allosteric regulation and post-translational modifications. With the increasing availability of enzyme information and advances in algorithm development, computational approaches have now become more capable of providing precise inputs for enzyme engineering, and in the process also making it more efficient. This has led to interesting findings, especially in aberrant enzyme interactions, such as host-pathogen interactions in infection, neurodegenerative diseases, cancer and diabetes. This review aims to summarize, in retrospect, the mined knowledge, vivid perspectives and challenging strides in using available experimentally validated enzyme information for characterization. An analytical outlook is presented on the scope of exploring future directions. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
Integration of fMRI, NIROT and ERP for studies of human brain function.
Gore, John C; Horovitz, Silvina G; Cannistraci, Christopher J; Skudlarski, Pavel
2006-05-01
Different methods of assessing human brain function possess specific advantages and disadvantages compared to others, but it is believed that combining different approaches will provide greater information than can be obtained from each alone. For example, functional magnetic resonance imaging (fMRI) has good spatial resolution but poor temporal resolution, whereas the converse is true for electrophysiological recordings (event-related potentials or ERPs). In this review of recent work, we highlight a novel approach to combining these modalities in a manner designed to increase information on the origins and locations of the generators of specific ERPs and the relationship between fMRI and ERP signals. Near infrared imaging techniques have also been studied as alternatives to fMRI and can be readily integrated with simultaneous electrophysiological recordings. Each of these modalities may in principle be also used in so-called steady-state acquisitions in which the correlational structure of signals from the brain may be analyzed to provide new insights into brain function.
Vij, Manika; Grover, Ritika; Gotherwal, Vishvabandhu; Wani, Naiem Ahmad; Joshi, Prashant; Gautam, Hemlata; Sharma, Kanupriya; Chandna, Sudhir; Gokhale, Rajesh S; Rai, Rajkishor; Ganguli, Munia; Natarajan, Vivek T
2016-09-12
Melanin and related polydopamine hold great promise; however, restricted fine-tunability limits their usefulness in biocompatible applications. In the present study, by taking a biomimetic approach, we synthesize peptide-derived melanin with a range of physicochemical properties. Characterization of these melanin polymers indicates that they exist as nanorange materials with distinct size distributions, shapes, and surface charges. These variants demonstrate similar absorption spectra but have different optical properties that correlate with particle size. Our approach enables incorporation of chemical groups to create functionalized polyvalent organic nanomaterials and enables customization of melanin. Further, we establish that these synthetic variants are efficiently taken up by skin keratinocytes, display appreciable photoprotection with minimal cytotoxicity, and thereby function as effective color-matched photoprotective agents. In effect, we demonstrate that an array of functionalized melanins with distinct properties could be synthesized using bioinspired green chemistry, and these are of immense utility in generating customized melanin/polydopamine-like materials.
Rosenthal, Gideon; Váša, František; Griffa, Alessandra; Hagmann, Patric; Amico, Enrico; Goñi, Joaquín; Avidan, Galia; Sporns, Olaf
2018-06-05
Connectomics generates comprehensive maps of brain networks, represented as nodes and their pairwise connections. The functional roles of nodes are defined by their direct and indirect connectivity with the rest of the network. However, the network context is not directly accessible at the level of individual nodes. Similar problems in language processing have been addressed with algorithms such as word2vec that create embeddings of words and their relations in a meaningful low-dimensional vector space. Here we apply this approach to create embedded vector representations of brain networks or connectome embeddings (CE). CE can characterize correspondence relations among brain regions, and can be used to infer links that are lacking from the original structural diffusion imaging, e.g., inter-hemispheric homotopic connections. Moreover, we construct predictive deep models of functional and structural connectivity, and simulate network-wide lesion effects using the face processing system as our application domain. We suggest that CE offers a novel approach to revealing relations between connectome structure and function.
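A hedged sketch of the embedding idea: random walks over a connectome adjacency matrix are treated as "sentences" of node labels and fed to a word2vec-style model. The random connectome, walk parameters and embedding dimension are illustrative, and the example assumes gensim ≥ 4 for the Word2Vec call; it is not the authors' pipeline.

```python
# Hedged sketch of the connectome-embedding idea: random walks over a weighted
# adjacency matrix act as "sentences" of node labels for a word2vec-style model.
# Assumes gensim >= 4; the random connectome and parameters are illustrative.
import numpy as np
from gensim.models import Word2Vec

rng = np.random.default_rng(42)
n_nodes = 30
A = rng.random((n_nodes, n_nodes)); A = (A + A.T) / 2; np.fill_diagonal(A, 0)

def random_walks(adj, n_walks=50, walk_len=20):
    walks = []
    prob = adj / adj.sum(axis=1, keepdims=True)    # transition probabilities
    for start in range(adj.shape[0]):
        for _ in range(n_walks):
            walk, node = [start], start
            for _ in range(walk_len - 1):
                node = rng.choice(adj.shape[0], p=prob[node])
                walk.append(node)
            walks.append([str(v) for v in walk])   # node labels as "words"
    return walks

model = Word2Vec(random_walks(A), vector_size=16, window=5, min_count=1, sg=1)
embedding = np.vstack([model.wv[str(v)] for v in range(n_nodes)])
print("embedding shape:", embedding.shape)         # (30, 16)
```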
NASA Technical Reports Server (NTRS)
Sorenson, R. L.; Steger, J. L.
1983-01-01
An algorithm for generating computational grids about arbitrary three-dimensional bodies is developed. The elliptic partial differential equation (PDE) approach developed by Steger and Sorenson and used in the NASA computer program GRAPE is extended from two to three dimensions. Forcing functions which are found automatically by the algorithm give the user the ability to control mesh cell size and skewness at boundary surfaces. This algorithm, as is typical of PDE grid generators, gives smooth grid lines and spacing in the interior of the grid. The method is applied to a rectilinear wind-tunnel case and to two body shapes in spherical coordinates.
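As a simplified stand-in for the elliptic (Poisson) grid equations with forcing functions used by GRAPE-type generators, the sketch below relaxes the interior points of an algebraic starting grid by Laplacian averaging while keeping the boundary point distribution fixed; the geometry, curved-wall bump and iteration count are illustrative, and wall-clustering forcing terms are omitted.

```python
# Simplified 2D sketch of elliptic-style grid smoothing: Jacobi relaxation of
# interior grid points (a stand-in for the full Poisson system with forcing
# functions).  Boundary point distributions are kept fixed.
import numpy as np

def laplace_smooth_grid(x, y, n_iter=2000):
    for _ in range(n_iter):
        x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])
    return x, y

# Algebraic starting grid on the unit square with a curved (bumped) lower wall.
ni, nj = 41, 21
u, v = np.meshgrid(np.linspace(0, 1, nj), np.linspace(0, 1, ni))
x, y = u.copy(), v.copy()
y[0, :] = 0.15 * np.sin(np.pi * u[0, :])                    # bump in the bottom wall
y[1:-1, :] = y[0, :] + (y[-1, :] - y[0, :]) * v[1:-1, :]    # linear initial interior
x_s, y_s = laplace_smooth_grid(x.copy(), y.copy())
print("max grid-point movement during smoothing:", np.max(np.hypot(x_s - x, y_s - y)))
```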
Full counting statistics of conductance for disordered systems
NASA Astrophysics Data System (ADS)
Fu, Bin; Zhang, Lei; Wei, Yadong; Wang, Jian
2017-09-01
Quantum transport is a stochastic process in nature. As a result, the conductance is fully characterized by its average value and fluctuations, i.e., characterized by full counting statistics (FCS). Since disorder is inevitable in nanoelectronic devices, it is important to understand how FCS behaves in disordered systems. The traditional approach to dealing with fluctuations or cumulants of conductance uses a diagrammatic perturbation expansion of the Green's function within the coherent potential approximation (CPA), which is extremely complicated, especially for high-order cumulants. In this paper, we develop a theoretical formalism based on the nonequilibrium Green's function by directly taking the disorder average of the generating function of the FCS of conductance within the CPA. This is done by mapping the problem into higher dimensions so that the functional dependence of the generating function on the Green's function becomes linear and the diagrammatic perturbation expansion is no longer needed. Our theory is very simple and allows us to calculate cumulants of conductance at any desired order efficiently. As an application of our theory, we calculate the cumulants of conductance up to fifth order for disordered systems in the presence of Anderson and binary disorder. Our numerical results for the cumulants of conductance show remarkable agreement with those obtained by brute-force calculation.
Method and apparatus for measuring response time
Johanson, Edward W.; August, Charles
1985-01-01
A method of measuring the response time of an electrical instrument which generates an output signal in response to the application of a specified input, wherein the output signal varies as a function of time and when subjected to a step input approaches a steady-state value, comprises the steps of: (a) applying a step input of predetermined value to the electrical instrument to generate an output signal; (b) simultaneously starting a timer; (c) comparing the output signal to a reference signal to generate a stop signal when the output signal is substantially equal to the reference signal, the reference signal being a specified percentage of the steady-state value of the output signal corresponding to the predetermined value of the step input; and (d) applying the stop signal when generated to stop the timer.
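A numerical analogue of steps (a)-(d), useful for seeing the logic of the measurement: apply a step, "start the timer" at t = 0, and stop when the output first reaches a specified percentage of its steady-state value. The first-order instrument model and the 63.2% threshold are illustrative choices, not part of the patent.

```python
# Numerical analogue of steps (a)-(d) above: apply a step, start a timer at
# t = 0, and stop when the output first reaches a specified percentage of its
# steady-state value.  First-order model and 63.2% threshold are illustrative.
import numpy as np

tau_true = 0.8                                         # instrument time constant (s)
t = np.arange(0.0, 5.0, 1e-4)                          # sampling grid
step_value = 2.0
output = step_value * (1.0 - np.exp(-t / tau_true))    # response to the step input

threshold = 0.632 * step_value                         # reference = 63.2% of steady state
stop_index = np.argmax(output >= threshold)            # first sample crossing the reference
print(f"measured response time: {t[stop_index]:.3f} s (expected ~{tau_true} s)")
```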
Generative models for network neuroscience: prospects and promise
Betzel, Richard F.
2017-01-01
Network neuroscience is the emerging discipline concerned with investigating the complex patterns of interconnections found in neural systems, and identifying principles with which to understand them. Within this discipline, one particularly powerful approach is network generative modelling, in which wiring rules are algorithmically implemented to produce synthetic network architectures with the same properties as observed in empirical network data. Successful models can highlight the principles by which a network is organized and potentially uncover the mechanisms by which it grows and develops. Here, we review the prospects and promise of generative models for network neuroscience. We begin with a primer on network generative models, with a discussion of compressibility and predictability, and utility in intuiting mechanisms, followed by a short history on their use in network science, broadly. We then discuss generative models in practice and application, paying particular attention to the critical need for cross-validation. Next, we review generative models of biological neural networks, both at the cellular and large-scale level, and across a variety of species including Caenorhabditis elegans, Drosophila, mouse, rat, cat, macaque and human. We offer a careful treatment of a few relevant distinctions, including differences between generative models and null models, sufficiency and redundancy, inferring and claiming mechanism, and functional and structural connectivity. We close with a discussion of future directions, outlining exciting frontiers both in empirical data collection efforts as well as in method and theory development that, together, further the utility of the generative network modelling approach for network neuroscience. PMID:29187640
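A toy example of the kind of wiring rule discussed above: node pairs are connected with a probability that decays exponentially with spatial distance, and a summary statistic of the synthetic graph is then compared against an empirical target. Node coordinates, the decay length and the target statistic are illustrative assumptions, not a model from the literature reviewed here.

```python
# Toy generative wiring rule: connect node pairs with probability decaying
# exponentially with spatial distance, then compare a summary statistic of the
# synthetic graph against an empirical target.  All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n_nodes, decay_length = 60, 0.25
coords = rng.random((n_nodes, 2))                            # node positions in a unit square
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

p_connect = np.exp(-dist / decay_length)                     # distance-dependent wiring rule
adjacency = (rng.random(dist.shape) < p_connect).astype(int)
adjacency = np.triu(adjacency, 1); adjacency += adjacency.T  # undirected, no self-loops

density = adjacency.sum() / (n_nodes * (n_nodes - 1))
print(f"synthetic network density: {density:.2f} (compare with the empirical value)")
```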
Shear Elasticity and Shear Viscosity Imaging in Soft Tissue
NASA Astrophysics Data System (ADS)
Yang, Yiqun
In this thesis, a new approach is introduced that provides estimates of shear elasticity and shear viscosity using time-domain measurements of shear waves in viscoelastic media. Simulations of shear wave particle displacements induced by an acoustic radiation force are accelerated significantly by a GPU. The acoustic radiation force is first calculated using the fast near field method (FNM) and the angular spectrum approach (ASA). The shear waves induced by the acoustic radiation force are then simulated in elastic and viscoelastic media using Green's functions. A parallel algorithm is developed to perform these calculations on a GPU, where the shear wave particle displacements at different observation points are calculated in parallel. The resulting speed increase enables rapid evaluation of shear waves at discrete points, in 2D planes, and for push beams with different spatial samplings and for different values of the f-number (f/#). The results of these simulations show that push beams with smaller f/# require a higher spatial sampling rate. The significant amount of acceleration achieved by this approach suggests that shear wave simulations with the Green's function approach are ideally suited for high-performance GPUs. Shear wave elasticity imaging determines the mechanical parameters of soft tissue by analyzing measured shear waves induced by an acoustic radiation force. To estimate the shear elasticity value, the widely used time-of-flight method calculates the correlation between shear wave particle velocities at adjacent lateral observation points. Although this method provides accurate estimates of the shear elasticity in purely elastic media, our experience suggests that the time-of-flight (TOF) method consistently overestimates the shear elasticity values in viscoelastic media because the combined effects of diffraction, attenuation, and dispersion are not considered. To address this problem, we have developed an approach that directly accounts for all of these effects when estimating the shear elasticity. This new approach simulates shear wave particle velocities using a Green's function-based approach for the Voigt model, where the shear elasticity and viscosity values are estimated using an optimization-based approach that compares measured shear wave particle velocities with simulated shear wave particle velocities in the time-domain. The results are evaluated on a point-by-point basis to generate images. There is good agreement between the simulated and measured shear wave particle velocities, where the new approach yields much better images of the shear elasticity and shear viscosity than the TOF method. The new estimation approach is accelerated with an approximate viscoelastic Green's function model that is evaluated with shear wave data obtained from in vivo human livers. Instead of calculating shear waves with combinations of different shear elasticities and shear viscosities, shear waves are calculated with different shear elasticities on the GPU and then convolved with a viscous loss model, which accelerates the calculation dramatically. The shear elasticity and shear viscosity values are then estimated using an optimization-based approach by minimizing the difference between measured and simulated shear wave particle velocities. Shear elasticity and shear viscosity images are generated at every spatial point in a two-dimensional (2D) field-of-view (FOV). 
The new approach is applied to measured shear wave data obtained from in vivo human livers, and the results show that this new approach successfully generates shear elasticity and shear viscosity images from this data. The results also indicate that the shear elasticity values estimated with this approach are significantly smaller than the values estimated with the conventional TOF method and that the new approach demonstrates more consistent values for these estimates compared with the TOF method. This experience suggests that the new method is an effective approach for estimating the shear elasticity and the shear viscosity in liver and in other soft tissue.
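A highly simplified sketch of the optimization-based estimation described in this work: a placeholder forward model (an attenuated Gaussian pulse whose arrival time is set by the shear elasticity and whose amplitude decay is set by the shear viscosity) stands in for the Green's-function simulation, and (mu, eta) are fitted by least squares against measured particle-velocity traces. The forward model and all material values are illustrative assumptions, not the thesis's Voigt-model simulator.

```python
# Simplified sketch of the optimization-based estimation: a placeholder forward
# model stands in for the Green's-function simulation of shear-wave particle
# velocity, and (mu, eta) are fitted by least squares.  Values are illustrative.
import numpy as np
from scipy.optimize import least_squares

rho = 1000.0                                        # tissue density (kg/m^3)
t = np.linspace(0.0, 20e-3, 400)                    # time axis (s)
x_obs = np.array([2e-3, 4e-3, 6e-3])                # lateral observation points (m)

def forward(params):
    mu, eta = params
    c = np.sqrt(mu / rho)                           # shear-wave speed from elasticity
    traces = []
    for x in x_obs:
        amplitude = np.exp(-eta * 50.0 * x)         # toy viscosity-driven attenuation
        traces.append(amplitude * np.exp(-((t - x / c - 2e-3) ** 2) / (2 * (0.8e-3) ** 2)))
    return np.concatenate(traces)

measured = forward([3000.0, 1.5])                   # synthetic "measurement" (mu = 3 kPa)
fit = least_squares(lambda p: forward(p) - measured, x0=[1500.0, 0.5],
                    bounds=([100.0, 0.0], [20000.0, 10.0]))
print("estimated (mu [Pa], eta [Pa s]):", fit.x)
```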
NASA Astrophysics Data System (ADS)
Khellat, M. R.; Mirjalili, A.
2017-03-01
We first consider the idea of renormalization group-induced estimates, in the context of optimization procedures, for the Brodsky-Lepage-Mackenzie approach to generate higher-order contributions to QCD perturbative series. Secondly, we develop the deviation pattern approach (DPA) in which, through a series of comparisons between lower-order RG-induced estimates and the corresponding analytical calculations, one can modify higher-order RG-induced estimates. Finally, using the normal estimation procedure and the DPA, we obtain estimates of the α_s^4 corrections for the Bjorken sum rule of polarized deep-inelastic scattering and for the non-singlet contribution to the Adler function.
Influence of surface defects on the tensile strength of carbon fibers
NASA Astrophysics Data System (ADS)
Vautard, F.; Dentzer, J.; Nardin, M.; Schultz, J.; Defoort, B.
2014-12-01
The mechanical properties of carbon fibers, especially their tensile properties, are affected by internal and surface defects. In order to assess to what extent the generation of surface defects can result in a loss of mechanical properties, non-surface-treated carbon fibers were oxidized with three different surface treatment processes: electro-chemical oxidation, oxidation in nitric acid, and oxidation in oxygen plasma. Different surface topographies and surface chemistries were obtained, as well as different types and densities of surface defects. The density of surface defects was measured with both a physical approach (Raman spectroscopy) and a chemical approach (Active Surface Area). The tensile properties were evaluated by determining the Weibull modulus and the scale parameter of each reference, after measuring the tensile strength for four different gauge lengths. A relationship between the tensile properties and the nature and density of surface defects was observed, as large defects largely control the value of the tensile strength. When optimized, some oxidation surface treatment processes can generate surface functional groups as well as an increase in the mechanical properties of the fibers, because of the removal of the contamination layer of pyrolytic carbon generated during the carbonization of the polyacrylonitrile precursor. Oxidation in oxygen plasma proved to be a promising technology for alternative surface treatment processes, as high levels of functionalization were achieved and a slight improvement of the mechanical properties was also obtained.
Zhang, Wenjun; Wang, Ming L.; Khalili, Sammy
2016-01-01
Abstract We live in exciting times for a new generation of biomarkers being enabled by advances in the design and use of biomaterials for medical and clinical applications, from nano- to macro-materials, and protein to tissue. Key challenges arise, however, due to both scientific complexity and compatibility of the interface of biology and engineered materials. The linking of mechanisms across scales by using a materials science approach to provide structure–process–property relations characterizes the emerging field of ‘materiomics,’ which offers enormous promise to provide the hitherto missing tools for biomaterial development for clinical diagnostics and the next generation biomarker applications towards personal health monitoring. Put in other words, the emerging field of materiomics represents an essentially systematic approach to the investigation of biological material systems, integrating natural functions and processes with traditional materials science perspectives. Here we outline how materiomics provides a game-changing technology platform for disruptive innovation in biomaterial science to enable the design of tailored and functional biomaterials—particularly, the design and screening of DNA aptamers for targeting biomarkers related to oral diseases and oral health monitoring. Rigorous and complementary computational modeling and experimental techniques will provide an efficient means to develop new clinical technologies in silico, greatly accelerating the translation of materiomics-driven oral health diagnostics from concept to practice in the clinic. PMID:26760957
Random field assessment of nanoscopic inhomogeneity of bone.
Dong, X Neil; Luo, Qing; Sparkman, Daniel M; Millwater, Harry R; Wang, Xiaodu
2010-12-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. Copyright © 2010 Elsevier Inc. All rights reserved.
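A sketch of the random-field construction described above: an exponential covariance function over a small 2D grid of points, sampled via Cholesky factorization to produce a spatially correlated elastic-modulus map. The grid size, mean modulus, standard deviation and correlation length are illustrative values, not those measured in the study.

```python
# Sketch of the random-field construction: an exponential covariance over a
# small 2D grid, sampled via Cholesky factorization to give a spatially
# correlated elastic-modulus map.  All numerical values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n, corr_len, mean_E, std_E = 20, 2.0, 20.0, 3.0      # 20x20 grid, modulus in GPa

xx, yy = np.meshgrid(np.arange(n), np.arange(n))
pts = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

cov = std_E ** 2 * np.exp(-dist / corr_len)          # exponential covariance function
L = np.linalg.cholesky(cov + 1e-8 * np.eye(n * n))   # jitter for numerical stability
modulus_map = (mean_E + L @ rng.standard_normal(n * n)).reshape(n, n)
print("simulated modulus range (GPa):", modulus_map.min(), modulus_map.max())
```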
ERIC Educational Resources Information Center
Rabab'ah, Ghaleb
2013-01-01
This study explores the discourse generated by English as a foreign language (EFL) learners using synchronous computer-mediated communication (CMC) as an approach to help English language learners to create social interaction in the classroom. It investigates the impact of synchronous CMC mode on the quantity of total words, lexical range and…
Sequential experimental design based generalised ANOVA
NASA Astrophysics Data System (ADS)
Chakraborty, Souvik; Chowdhury, Rajib
2016-07-01
Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting the probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimates of the failure probability.
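The sketch below illustrates only the generic surrogate-plus-moments workflow that such methods build on, not DA-SED or G-ANOVA themselves: a quadratic least-squares surrogate is fitted to a small experimental design, and the response moments and failure probability are then estimated by cheap Monte Carlo on the surrogate. The limit-state function, design size and failure threshold are illustrative.

```python
# Generic surrogate-based sketch (not DA-SED or G-ANOVA): fit a quadratic
# least-squares surrogate to a small design, then estimate response moments and
# the failure probability by Monte Carlo on the surrogate.  Values illustrative.
import numpy as np

rng = np.random.default_rng(11)

def limit_state(x):                      # expensive model (toy stand-in)
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

def quad_features(x):                    # 2D quadratic basis
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

x_train = rng.standard_normal((25, 2))                     # experimental design
coeffs, *_ = np.linalg.lstsq(quad_features(x_train), limit_state(x_train), rcond=None)

x_mc = rng.standard_normal((200000, 2))                    # cheap MC on the surrogate
g_hat = quad_features(x_mc) @ coeffs
print("mean, std:", g_hat.mean(), g_hat.std())
print("estimated failure probability P(g < 0):", np.mean(g_hat < 0.0))
```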
A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis
Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan
2009-01-01
DNA methylation is an indispensable epigenetic modification of mammalian genomes. Consequently there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301
Modeling of dielectric elastomer oscillators for soft biomimetic applications.
Henke, E-F M; Wilson, Katherine E; Anderson, I A
2018-06-26
Biomimetic, entirely soft robots with animal-like behavior and integrated artificial nervous systems will open up totally new perspectives and applications. However, until now, most published studies on soft robots have been limited to only partly soft designs, since all solutions still needed conventional, stiff electronics to sense, process signals and activate actuators. We present a novel approach for the setup and experimental validation of an artificial pacemaker that is able to drive basic robotic structures and act as an artificial central pattern generator. The structure is based on multi-functional dielectric elastomers (DEs). DE actuators, DE switches and DE resistors are combined to create complex DE oscillators (DEOs). Supplied with only one external DC voltage, the DEO autonomously generates oscillating signals that can be used to clock a robotic structure, control the cyclic motion of artificial muscles in bionic robots or make a whole robotic structure move. We present the basic functionality, derive a mathematical model for predicting the generated signal waveform and verify the model experimentally.
Bannar-Martin, Katherine H; Kremer, Colin T; Ernest, S K Morgan; Leibold, Mathew A; Auge, Harald; Chase, Jonathan; Declerck, Steven A J; Eisenhauer, Nico; Harpole, Stanley; Hillebrand, Helmut; Isbell, Forest; Koffel, Thomas; Larsen, Stefano; Narwani, Anita; Petermann, Jana S; Roscher, Christiane; Cabral, Juliano Sarmento; Supp, Sarah R
2018-02-01
The research of a generation of ecologists was catalysed by the recognition that the number and identity of species in communities influences the functioning of ecosystems. The relationship between biodiversity and ecosystem functioning (BEF) is most often examined by controlling species richness and randomising community composition. In natural systems, biodiversity changes are often part of a bigger community assembly dynamic. Therefore, focusing on community assembly and the functioning of ecosystems (CAFE), by integrating both species richness and composition through species gains, losses and changes in abundance, will better reveal how community changes affect ecosystem function. We synthesise the BEF and CAFE perspectives using an ecological application of the Price equation, which partitions the contributions of richness and composition to function. Using empirical examples, we show how the CAFE approach reveals important contributions of composition to function. These examples show how changes in species richness and composition driven by environmental perturbations can work in concert or antagonistically to influence ecosystem function. Considering how communities change in an integrative fashion, rather than focusing on one axis of community structure at a time, will improve our ability to anticipate and predict changes in ecosystem function. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.
FUSE: a profit maximization approach for functional summarization of biological networks.
Seah, Boon-Siew; Bhowmick, Sourav S; Dewey, C Forbes; Yu, Hanry
2012-03-21
The availability of large-scale curated protein interaction datasets has given rise to the opportunity to investigate higher level organization and modularity within the protein interaction network (PPI) using graph theoretic analysis. Despite the recent progress, systems-level analysis of PPIs remains a daunting task, as it is challenging to make sense out of the deluge of high-dimensional interaction data. Specifically, techniques that automatically abstract and summarize PPIs at multiple resolutions to provide high-level views of their functional landscape are still lacking. We present a novel data-driven and generic algorithm called FUSE (Functional Summary Generator) that generates functional maps of a PPI at different levels of organization, from broad process-process level interactions to in-depth complex-complex level interactions, through a profit maximization approach that exploits the Minimum Description Length (MDL) principle to maximize information gain of the summary graph while satisfying the level-of-detail constraint. We evaluate the performance of FUSE on several real-world PPIs. We also compare FUSE to state-of-the-art graph clustering methods with GO term enrichment by constructing the biological process landscape of the PPIs. Using the AD network as our case study, we further demonstrate the ability of FUSE to quickly summarize the network and identify many different processes and complexes that regulate it. Finally, we study the higher-order connectivity of the human PPI. By simultaneously evaluating interaction and annotation data, FUSE abstracts higher-order interaction maps by reducing the details of the underlying PPI to form a functional summary graph of interconnected functional clusters. Our results demonstrate its effectiveness and superiority over state-of-the-art graph clustering methods with GO term enrichment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Junjing; Hong, Young Pyo; Chen, Si
Modern integrated circuits (ICs) employ a myriad of materials organized at nanoscale dimensions, and certain critical tolerances must be met for them to function. To understand departures from intended functionality, it is essential to examine ICs as manufactured so as to adjust design rules, ideally in a nondestructive way so that imaged structures can be correlated with electrical performance. Electron microscopes can do this on thin regions or on exposed surfaces, but the required processing alters or even destroys functionality. Microscopy with multi-keV x-rays provides an alternative approach with greater penetration, but the spatial resolution of x-ray imaging lenses has not allowed one to see the required detail in the latest generation of ICs. X-ray ptychography provides a way to obtain images of ICs without lens-imposed resolution limits, with past work delivering 20–40-nm resolution on thinned ICs. We describe a simple model for estimating the required exposure and use it to estimate the future potential for this technique. We show that this approach can be used to image circuit detail through an unprocessed 300-μm-thick silicon wafer, with sub-20-nm detail clearly resolved after mechanical polishing to 240-μm thickness was used to eliminate image contrast caused by Si wafer surface scratches. By using continuous x-ray scanning, massively parallel computation, and a new generation of synchrotron light sources, this approach should enable entire nonetched ICs to be imaged at 10-nm resolution or better while maintaining their ability to function in electrical tests.
Iglesias, Daniel; Senokos, Evgeny; Alemán, Belén; Cabana, Laura; Navío, Cristina; Marcilla, Rebeca; Prato, Maurizio; Vilatela, Juan J; Marchesan, Silvia
2018-02-14
The assembly of aligned carbon nanotubes (CNTs) into fibers (CNTFs) is a convenient approach to exploit and apply the unique physico-chemical properties of CNTs in many fields. CNT functionalization has been extensively used for its implementation into composites and devices. However, CNTF functionalization is still in its infancy because of the challenges associated with preservation of CNTF morphology. Here, we report a thorough study of the gas-phase functionalization of CNTF assemblies using ozone which was generated in situ from a UV source. In contrast with liquid-based oxidation methods, this gas-phase approach preserves CNTF morphology, while notably increasing its hydrophilicity. The functionalized material is thoroughly characterized by Raman spectroscopy, X-ray photoelectron spectroscopy, transmission electron microscopy, and scanning electron microscopy. Its newly acquired hydrophilicity enables CNTF electrochemical characterization in aqueous media, which was not possible for the pristine material. Through comparison of electrochemical measurements in aqueous electrolytes and ionic liquids, we decouple the effects of functionalization on pseudocapacitive reactions and quantum capacitance. The functionalized CNTF assembly is successfully used as an active material and a current collector in all-solid supercapacitor flexible devices with an ionic liquid-based polymer electrolyte.
Mills, W; Critcher, R; Lee, C; Farr, C J
1999-05-01
A linear mammalian artificial chromosome (MAC) will require at least three types of functional element: a centromere, two telomeres and origins of replication. As yet, our understanding of these elements, as well as many other aspects of structure and organization which may be critical for a fully functional mammalian chromosome, remains poor. As a way of defining these various requirements, minichromosome reagents are being developed and analysed. Approaches for minichromosome generation fall into two broad categories: de novo assembly from candidate DNA sequences, or the fragmentation of an existing chromosome to reduce it to a minimal size. Here we describe the generation of a human minichromosome using the latter, top-down, approach. A human X chromosome, present in a DT40-human microcell hybrid, has been manipulated using homologous recombination and the targeted seeding of a de novo telomere. This strategy has generated a linear approximately 2.4 Mb human X centromere-based minichromosome capped by two artificially seeded telomeres: one immediately flanking the centromeric alpha-satellite DNA and the other targeted to the zinc finger gene ZXDA in Xp11.21. The chromosome retains an alpha-satellite domain of approximately 1.8 Mb, a small array of gamma-satellite repeat (approximately 40 kb) and approximately 400 kb of Xp proximal DNA sequence. The mitotic stability of this minichromosome has been examined, both in DT40 and following transfer into hamster and human cell lines. In all three backgrounds, the minichromosome is retained efficiently, but in the human and hamster microcell hybrids its copy number is poorly regulated. This approach of engineering well-defined chromosome reagents will allow key questions in MAC development (such as whether a lower size limit exists) to be addressed. In addition, the 2.4 Mb minichromosome described here has potential to be developed as a vector for gene delivery.
A generating function approach to HIV transmission with dynamic contact rates
Romero-Severson, Ethan O.; Meadors, Grant D.; Volz, Erik M.
2014-04-24
The basic reproduction number, R0, is often defined as the average number of infections generated by a newly infected individual in a fully susceptible population. The interpretation, meaning, and derivation of R0 are controversial. However, in the context of mean field models, R0 demarcates the epidemic threshold below which the infected population approaches zero in the limit of time. In this manner, R0 has been proposed as a method for understanding the relative impact of public health interventions with respect to disease elimination from a theoretical perspective. The use of R0 is made more complex by both the strong dependency of R0 on the model form and the stochastic nature of transmission. A common assumption in models of HIV transmission that have closed-form expressions for R0 is that a single individual's behavior is constant over time. For this research, we derive expressions for both R0 and the probability of an epidemic in a finite population under the assumption that people periodically change their sexual behavior over time. We illustrate the use of generating functions as a general framework to model the effects of potentially complex assumptions on the number of transmissions generated by a newly infected person in a susceptible population. In conclusion, we find that the relationship between the probability of an epidemic and R0 is not straightforward but that, as the rate of change in sexual behavior increases, both R0 and the probability of an epidemic decrease.
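As an illustration of the generating-function machinery referred to above, the sketch below (not the authors' model) treats early transmission as a branching process: given an offspring probability generating function g(s), the extinction probability is its smallest fixed point and the epidemic probability is the complement. A Poisson offspring distribution with mean R0 is assumed purely for illustration; dynamic contact rates would change g itself.

```python
import numpy as np

def poisson_pgf(s, r0):
    """PGF of a Poisson(R0) offspring distribution: E[s^X] = exp(R0*(s-1)).
    The Poisson choice is illustrative only."""
    return np.exp(r0 * (s - 1.0))

def extinction_probability(r0, tol=1e-12, max_iter=10_000):
    """Smallest fixed point of the offspring PGF, found by iterating q <- g(q) from 0."""
    q = 0.0
    for _ in range(max_iter):
        q_new = poisson_pgf(q, r0)
        if abs(q_new - q) < tol:
            break
        q = q_new
    return q

for r0 in [0.8, 1.0, 1.5, 3.0]:
    q = extinction_probability(r0)
    print(f"R0 = {r0:4.1f}  P(epidemic) = {1 - q:.3f}")
```

For R0 at or below one the fixed point is one, so the epidemic probability vanishes, which matches the threshold role of R0 discussed in the abstract; above threshold the epidemic probability rises with R0 but is not a simple function of it.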
Bashor, Caleb J; Horwitz, Andrew A; Peisajovich, Sergio G; Lim, Wendell A
2010-01-01
The living cell is an incredibly complex entity, and the goal of predictively and quantitatively understanding its function is one of the next great challenges in biology. Much of what we know about the cell concerns its constituent parts, but to a great extent we have yet to decode how these parts are organized to yield complex physiological function. Classically, we have learned about the organization of cellular networks by disrupting them through genetic or chemical means. The emerging discipline of synthetic biology offers an additional, powerful approach to study systems. By rearranging the parts that comprise existing networks, we can gain valuable insight into the hierarchical logic of the networks and identify the modular building blocks that evolution uses to generate innovative function. In addition, by building minimal toy networks, one can systematically explore the relationship between network structure and function. Here, we outline recent work that uses synthetic biology approaches to investigate the organization and function of cellular networks, and describe a vision for a synthetic biology toolkit that could be used to interrogate the design principles of diverse systems.
Bannasch, Detlev; Mehrle, Alexander; Glatting, Karl-Heinz; Pepperkok, Rainer; Poustka, Annemarie; Wiemann, Stefan
2004-01-01
We have implemented LIFEdb (http://www.dkfz.de/LIFEdb) to link information regarding novel human full-length cDNAs generated and sequenced by the German cDNA Consortium with functional information on the encoded proteins produced in functional genomics and proteomics approaches. The database also serves as a sample-tracking system to manage the process from cDNA to experimental read-out and data interpretation. A web interface enables the scientific community to explore and visualize features of the annotated cDNAs and ORFs combined with experimental results, and thus helps to unravel new features of proteins with as yet unknown functions. PMID:14681468
How can we estimate natural selection on endocrine traits? Lessons from evolutionary biology
2016-01-01
An evolutionary perspective can enrich almost any endeavour in biology, providing a deeper understanding of the variation we see in nature. To this end, evolutionary endocrinologists seek to describe the fitness consequences of variation in endocrine traits. Much of the recent work in our field, however, follows a flawed approach to the study of how selection shapes endocrine traits. Briefly, this approach relies on among-individual correlations between endocrine phenotypes (often circulating hormone levels) and fitness metrics to estimate selection on those endocrine traits. Adaptive plasticity in both endocrine and fitness-related traits can drive these correlations, generating patterns that do not accurately reflect natural selection. We illustrate why this approach to studying selection on endocrine traits is problematic, referring to work from evolutionary biologists who, decades ago, described this problem as it relates to a variety of other plastic traits. We extend these arguments to evolutionary endocrinology, where the likelihood that this flaw generates bias in estimates of selection is unusually high due to the exceptional responsiveness of hormones to environmental conditions, and their function to induce adaptive life-history responses to environmental variation. We end with a review of productive approaches for investigating the fitness consequences of variation in endocrine traits that we expect will generate exciting advances in our understanding of endocrine system evolution. PMID:27881753
Adapted random sampling patterns for accelerated MRI.
Knoll, Florian; Clason, Christian; Diwoky, Clemens; Stollberger, Rudolf
2011-02-01
Variable density random sampling patterns have recently become increasingly popular for accelerated imaging strategies, as they lead to incoherent aliasing artifacts. However, the design of these sampling patterns is still an open problem. Current strategies use model assumptions like polynomials of different order to generate a probability density function that is then used to generate the sampling pattern. This approach relies on the optimization of design parameters which is very time consuming and therefore impractical for daily clinical use. This work presents a new approach that generates sampling patterns by making use of power spectra of existing reference data sets and hence requires neither parameter tuning nor an a priori mathematical model of the density of sampling points. The approach is validated with downsampling experiments, as well as with accelerated in vivo measurements. The proposed approach is compared with established sampling patterns, and the generalization potential is tested by using a range of reference images. Quantitative evaluation is performed for the downsampling experiments using RMS differences to the original, fully sampled data set. Our results demonstrate that the image quality of the method presented in this paper is comparable to that of an established model-based strategy when optimization of the model parameter is carried out and yields superior results to non-optimized model parameters. However, no random sampling pattern showed superior performance when compared to conventional Cartesian subsampling for the considered reconstruction strategy.
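A minimal sketch of the core idea follows, with a synthetic image standing in for an existing, fully sampled reference data set: the reference power spectrum is normalized into a probability density over k-space and a random undersampling mask is drawn from it, with no model parameters to tune. The matrix size and the 25% sampling fraction are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "reference" image standing in for an existing, fully sampled data set.
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
reference = np.exp(-(x**2 + y**2) / 0.1) + 0.1 * rng.standard_normal((n, n))

# Power spectrum of the reference, shifted so low frequencies sit in the centre.
power = np.abs(np.fft.fftshift(np.fft.fft2(reference))) ** 2

# Turn the power spectrum into a probability density over k-space locations.
pdf = power / power.sum()

# Draw an undersampling mask at 25% of full sampling, without replacement.
n_samples = int(0.25 * n * n)
chosen = rng.choice(n * n, size=n_samples, replace=False, p=pdf.ravel())
mask = np.zeros(n * n, dtype=bool)
mask[chosen] = True
mask = mask.reshape(n, n)

print(f"acceleration factor: {n * n / mask.sum():.1f}x")
```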
Human Cardiomyocytes Prior to Birth by Integration-Free Reprogramming of Amniotic Fluid Cells
Jiang, Guihua; Herron, Todd J.; Di Bernardo, Julie; Walker, Kendal A.; O’Shea, K. Sue
2016-01-01
The establishment of an abundant source of autologous cardiac progenitor cells would represent a major advance toward eventual clinical translation of regenerative medicine strategies in children with prenatally diagnosed congenital heart disease. In support of this concept, we sought to examine whether functional, transgene-free human cardiomyocytes (CMs) with potential for patient-specific and autologous applications could be reliably generated following routine amniocentesis. Under institutional review board approval, amniotic fluid specimens (8–10 ml) at 20 weeks gestation were expanded and reprogrammed toward pluripotency using nonintegrating Sendai virus (SeV) expressing OCT4, SOX2, cMYC, and KLF4. Following exposure of these induced pluripotent stem cells to cardiogenic differentiation conditions, spontaneously beating amniotic fluid-derived cardiomyocytes (AF-CMs) were successfully generated with high efficiency. After 6 weeks, quantitative gene expression revealed a mixed population of differentiated atrial, ventricular, and nodal AF-CMs, as demonstrated by upregulation of multiple cardiac markers, including MYH6, MYL7, TNNT2, TTN, and HCN4, which were comparable to levels expressed by neonatal dermal fibroblast-derived CM controls. AF-CMs had a normal karyotype and demonstrated loss of NANOG, OCT4, and the SeV transgene. Functional characterization of SIRPA+ AF-CMs showed a higher spontaneous beat frequency in comparison with dermal fibroblast controls but revealed normal calcium transients and appropriate chronotropic responses after β-adrenergic agonist stimulation. Taken together, these data suggest that somatic cells present within human amniotic fluid can be used to generate a highly scalable source of functional, transgene-free, autologous CMs before a child is born. This approach may be ideally suited for patients with prenatally diagnosed cardiac anomalies. Significance This study presents transgene-free human amniotic fluid-derived cardiomyocytes (AF-CMs) for potential therapy in tissue engineering and regenerative medicine applications. Using 8–10 ml of amniotic fluid harvested at 20 weeks gestation from normal pregnancies, a mixed population of atrial, ventricular, and nodal AF-CMs were reliably generated after Sendai virus reprogramming toward pluripotency. Functional characterization of purified populations of beating AF-CMs revealed normal calcium transients and appropriate chronotropic responses after β-adrenergic agonist stimulation in comparison with dermal fibroblast controls. Because AF-CMs can be generated in fewer than 16 weeks, this approach may be ideally suited for eventual clinical translation at birth in children with prenatally diagnosed cardiac anomalies. PMID:27465073
Roebroek, Anton J M; Van Gool, Bart
2014-01-01
Molecular genetic strategies applying embryonic stem cell (ES cell) technologies to study the function of a gene in mice or to generate a mouse model for a human disease are continuously under development. In addition to (conditional) inactivation of genes, the application and importance of approaches to generate knock-in mutations are increasing. In this chapter the principle and application of recombinase-mediated cassette exchange (RMCE) are discussed as a newly emerging knock-in strategy, which enables easy generation of a series of different knock-in mutations within one gene. An RMCE protocol, which was used to generate a series of different knock-in mutations in the Lrp1 gene of ES cells, is described in detail as an example of how RMCE can be used to generate, with high efficiency, an allelic series of differently modified ES cell clones from a parental modified ES cell clone. Subsequently, the differently modified ES cell clones can be used to generate an allelic series of mutant knock-in mice.
Improved Evolutionary Programming with Various Crossover Techniques for Optimal Power Flow Problem
NASA Astrophysics Data System (ADS)
Tangpatiphan, Kritsana; Yokoyama, Akihiko
This paper presents an Improved Evolutionary Programming (IEP) for solving the Optimal Power Flow (OPF) problem, which is considered a non-linear, non-smooth, and multimodal optimization problem in power system operation. The total generator fuel cost is regarded as the objective function to be minimized. The proposed method is an Evolutionary Programming (EP)-based algorithm that makes use of various crossover techniques normally applied in Real Coded Genetic Algorithms (RCGA). The effectiveness of the proposed approach is investigated on the IEEE 30-bus system with three different types of fuel cost functions: the quadratic cost curve, the piecewise quadratic cost curve, and the quadratic cost curve superimposed with a sine component. These three cost curves represent the generator fuel cost functions of a simplified model and of more accurate models of a combined-cycle generating unit and a thermal unit with valve-point loading effect, respectively. The OPF solutions by the proposed method and Pure Evolutionary Programming (PEP) are observed and compared. The simulation results indicate that IEP requires less computing time than PEP with better solutions in some cases. Moreover, the influences of important IEP parameters on the OPF solution are described in detail.
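A hedged sketch of the flavour of such an algorithm is given below: an EP-style population with Gaussian mutation plus a blend (arithmetic) crossover borrowed from RCGA, minimizing a toy three-generator fuel cost with a valve-point sine term and a soft power-balance penalty. The cost coefficients, demand, and operator settings are invented for illustration; they are not the IEEE 30-bus data or the authors' exact operators.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy generator data (invented for illustration): a, b, c, valve-point e, f, Pmin, Pmax.
gens = np.array([
    # a      b     c      e     f      Pmin  Pmax
    [0.010, 2.00, 100.0, 50.0, 0.063, 10.0, 125.0],
    [0.012, 1.80, 120.0, 40.0, 0.098, 10.0, 150.0],
    [0.008, 2.20,  80.0, 30.0, 0.082, 10.0, 100.0],
])
demand = 250.0  # MW, invented

def cost(p):
    a, b, c, e, f, pmin, _ = gens.T
    fuel = a * p**2 + b * p + c + np.abs(e * np.sin(f * (pmin - p)))  # valve-point term
    penalty = 1e4 * (p.sum() - demand) ** 2                           # soft power balance
    return fuel.sum() + penalty

pmin, pmax = gens[:, 5], gens[:, 6]
pop = rng.uniform(pmin, pmax, size=(40, 3))        # initial population of dispatches

for _ in range(300):
    # EP-style Gaussian mutation of every parent.
    children = np.clip(pop + rng.normal(0, 0.05 * (pmax - pmin), pop.shape), pmin, pmax)
    # Blend (arithmetic) crossover between random pairs, as in RCGA.
    partners = pop[rng.permutation(len(pop))]
    alpha = rng.uniform(0, 1, size=(len(pop), 1))
    blended = np.clip(alpha * children + (1 - alpha) * partners, pmin, pmax)
    # Survivor selection over parents and offspring.
    union = np.vstack([pop, children, blended])
    fitness = np.array([cost(ind) for ind in union])
    pop = union[np.argsort(fitness)[:len(pop)]]

best = pop[0]
print("dispatch [MW]:", np.round(best, 2), " total cost:", round(cost(best), 2))
```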
Hybrid fuzzy cluster ensemble framework for tumor clustering from biomolecular data.
Yu, Zhiwen; Chen, Hantao; You, Jane; Han, Guoqiang; Li, Le
2013-01-01
Cancer class discovery using biomolecular data is one of the most important tasks for cancer diagnosis and treatment. Tumor clustering from gene expression data provides a new way to perform cancer class discovery. Most existing research adopts single clustering algorithms to perform tumor clustering from biomolecular data, which lack robustness, stability, and accuracy. To further improve the performance of tumor clustering from biomolecular data, we introduce fuzzy theory into the cluster ensemble framework and propose four kinds of hybrid fuzzy cluster ensemble frameworks (HFCEF), named HFCEF-I, HFCEF-II, HFCEF-III, and HFCEF-IV, respectively, to identify samples that belong to different types of cancers. The difference between HFCEF-I and HFCEF-II is that they adopt different ensemble generator approaches to generate a set of fuzzy matrices in the ensemble. Specifically, HFCEF-I applies the affinity propagation algorithm (AP) to perform clustering on the sample dimension and generates a set of fuzzy matrices in the ensemble based on the fuzzy membership function and base samples selected by AP. HFCEF-II adopts AP to perform clustering on the attribute dimension, generates a set of subspaces, and obtains a set of fuzzy matrices in the ensemble by performing fuzzy c-means on the subspaces. Compared with HFCEF-I and HFCEF-II, HFCEF-III and HFCEF-IV combine the characteristics of both: HFCEF-III combines HFCEF-I and HFCEF-II in a serial way, while HFCEF-IV integrates HFCEF-I and HFCEF-II in a concurrent way. The HFCEFs adopt suitable consensus functions, such as the fuzzy c-means algorithm or the normalized cut algorithm (Ncut), to summarize the generated fuzzy matrices and obtain the final results. Experiments on real data sets from the UCI machine learning repository and on cancer gene expression profiles illustrate that 1) the proposed hybrid fuzzy cluster ensemble frameworks work well on real data sets, especially biomolecular data, and 2) the proposed approaches provide more robust, stable, and accurate results than state-of-the-art single clustering algorithms and traditional cluster ensemble approaches.
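The sketch below captures the HFCEF-II flavour of the ensemble step under simplifying assumptions: a plain fuzzy c-means (implemented directly, rather than the paper's AP-driven subspace selection) is run on random attribute subspaces of synthetic data to generate fuzzy membership matrices, which are then concatenated and summarized by a final fuzzy c-means run acting as the consensus function.

```python
import numpy as np

rng = np.random.default_rng(2)

def fuzzy_cmeans(X, c, m=2.0, iters=100, tol=1e-5):
    """Plain fuzzy c-means; returns (centers, membership matrix U of shape (n, c))."""
    n = len(X)
    U = rng.dirichlet(np.ones(c), size=n)               # random fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = 1.0 / d ** (2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)    # standard FCM membership update
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Synthetic "biomolecular" data: 60 samples, 50 attributes, 3 hidden classes (invented).
X = np.vstack([rng.normal(mu, 1.0, size=(20, 50)) for mu in (0.0, 3.0, 6.0)])

# Ensemble generation: fuzzy c-means on random attribute subspaces (HFCEF-II flavour).
memberships = []
for _ in range(10):
    cols = rng.choice(X.shape[1], size=15, replace=False)
    _, U = fuzzy_cmeans(X[:, cols], c=3)
    memberships.append(U)

# Consensus function: one more fuzzy c-means run on the concatenated memberships.
_, U_final = fuzzy_cmeans(np.hstack(memberships), c=3)
labels = U_final.argmax(axis=1)
print("consensus cluster sizes:", np.bincount(labels))
```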
The Design of a Practical Enterprise Safety Management System
NASA Astrophysics Data System (ADS)
Gabbar, Hossam A.; Suzuki, Kazuhiko
This book presents design guidelines and implementation approaches for an enterprise safety management system integrated within enterprise integrated systems. It presents a new model-based safety management approach in which process design automation is integrated with enterprise business functions and components, and proposes a new systems engineering approach addressed to the new generation of the chemical industry. It will help both undergraduate and professional readers build basic knowledge about the issues and problems of designing a practical enterprise safety management system, while presenting in a clear way the system and information engineering practices needed to design an enterprise-integrated solution.
Shape Optimization of Rubber Bushing Using Differential Evolution Algorithm
2014-01-01
The objective of this study is to design a rubber bushing with desired stiffness characteristics in order to achieve the required ride quality of the vehicle. A differential evolution algorithm-based approach is developed to optimize the rubber bushing by integrating a finite element code, running in batch mode, to compute the objective function values for each generation. Two case studies are given to illustrate the application of the proposed approach. Optimum shape parameters of a 2D bushing model were determined by shape optimization using the differential evolution algorithm. PMID:25276848
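A minimal sketch of the optimization loop follows, with a made-up analytic stiffness formula standing in for the batch-mode finite element evaluation; the shape parameters, bounds, and target stiffness are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for the batch FE run: an invented analytic stiffness model of two
# shape parameters (say, wall thickness t and void radius r of the bushing).
def radial_stiffness(t, r):
    return 900.0 * t**1.5 / (0.2 + r**2)

target = 1500.0                                   # desired stiffness [N/mm], invented
bounds = np.array([[2.0, 10.0], [1.0, 6.0]])      # [t, r] bounds, invented

def objective(x):
    return (radial_stiffness(*x) - target) ** 2   # match stiffness to the target

# Classic DE/rand/1/bin.
NP, F, CR = 20, 0.7, 0.9
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(NP, 2))
fit = np.array([objective(x) for x in pop])

for _ in range(200):
    for i in range(NP):
        a, b, c = pop[rng.choice([j for j in range(NP) if j != i], 3, replace=False)]
        mutant = np.clip(a + F * (b - c), bounds[:, 0], bounds[:, 1])
        cross = rng.random(2) < CR
        cross[rng.integers(2)] = True             # ensure at least one gene crosses over
        trial = np.where(cross, mutant, pop[i])
        f_trial = objective(trial)
        if f_trial < fit[i]:                      # greedy selection
            pop[i], fit[i] = trial, f_trial

best = pop[fit.argmin()]
print("optimal [t, r]:", np.round(best, 3), " stiffness:", round(radial_stiffness(*best), 1))
```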
Applications of the CRISPR-Cas9 system in cancer biology
Sánchez-Rivera, Francisco J.; Jacks, Tyler
2015-01-01
The prokaryotic type II clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 system is rapidly revolutionizing the field of genetic engineering, allowing researchers to alter the genomes of a large variety of organisms with relative ease. Experimental approaches based on this versatile technology have the potential to transform the field of cancer genetics. Here we review current approaches based on CRISPR-Cas9 for functional studies of cancer genes, with emphasis on their applicability to the development of next-generation models of human cancer. PMID:26040603
Chacko, Anil; Kofler, Michael; Jarrett, Matthew
2014-01-01
Attention-deficit/hyperactivity disorder (ADHD) is a prevalent and chronic mental health condition that often results in substantial impairments throughout life. Although evidence-based pharmacological and psychosocial treatments exist for ADHD, effects of these treatments are acute, do not typically generalize into non-treated settings, rarely sustain over time, and insufficiently affect key areas of functional impairment (i.e., family, social, and academic functioning) and executive functioning. The limitations of current evidence-based treatments may be due to the inability of these treatments to address underlying neurocognitive deficits that are related to the symptoms of ADHD and associated areas of functional impairment. Although efforts have been made to directly target the underlying neurocognitive deficits of ADHD, extant neurocognitive interventions have shown limited efficacy, possibly due to misspecification of training targets and inadequate potency. We argue herein that despite these limitations, next-generation neurocognitive training programs that more precisely and potently target neurocognitive deficits may lead to optimal outcomes when used in combination with specific skill-based psychosocial treatments for ADHD. We discuss the rationale for such a combined treatment approach, prominent examples of this combined treatment approach for other mental health disorders, and potential combined treatment approaches for pediatric ADHD. Finally, we conclude with directions for future research necessary to develop a combined neurocognitive + skill-based treatment for youth with ADHD. PMID:25120200
Loop series for discrete statistical models on graphs
NASA Astrophysics Data System (ADS)
Chertkov, Michael; Chernyak, Vladimir Y.
2006-06-01
In this paper we present the derivation details, logic, and motivation for the three loop calculus introduced in Chertkov and Chernyak (2006 Phys. Rev. E 73 065102(R)). Generating functions for each of the three interrelated discrete statistical models are expressed in terms of a finite series. The first term in the series corresponds to the Bethe-Peierls belief-propagation (BP) contribution; the other terms are labelled by loops on the factor graph. All loop contributions are simple rational functions of spin correlation functions calculated within the BP approach. We discuss two alternative derivations of the loop series. One approach implements a set of local auxiliary integrations over continuous fields with the BP contribution corresponding to an integrand saddle-point value. The integrals are replaced by sums in the complementary approach, briefly explained in Chertkov and Chernyak (2006 Phys. Rev. E 73 065102(R)). Local gauge symmetry transformations that clarify an important invariant feature of the BP solution are revealed in both approaches. The individual terms change under the gauge transformation while the partition function remains invariant. The requirement for all individual terms to be nonzero only for closed loops in the factor graph (as opposed to paths with loose ends) is equivalent to fixing the first term in the series to be exactly equal to the BP contribution. Further applications of the loop calculus to problems in statistical physics, computer and information sciences are discussed.
Multi-Resolution Unstructured Grid-Generation for Geophysical Applications on the Sphere
NASA Technical Reports Server (NTRS)
Engwirda, Darren
2015-01-01
An algorithm for the generation of non-uniform unstructured grids on ellipsoidal geometries is described. This technique is designed to generate high-quality triangular and polygonal meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric and ocean simulation, and numerical weather prediction. Using a recently developed Frontal-Delaunay-refinement technique, a method for the construction of high-quality unstructured ellipsoidal Delaunay triangulations is introduced. A dual polygonal grid, derived from the associated Voronoi diagram, is also optionally generated as a by-product. Compared to existing techniques, it is shown that the Frontal-Delaunay approach typically produces grids with near-optimal element quality and smooth grading characteristics, while imposing relatively low computational expense. Initial results are presented for a selection of uniform and non-uniform ellipsoidal grids appropriate for large-scale geophysical applications. The use of user-defined mesh-sizing functions to generate smoothly graded, non-uniform grids is discussed.
Setoh, Yin Xiang; Prow, Natalie A.; Peng, Nias; Hugo, Leon E.; Devine, Gregor; Hazlewood, Jessamine E.; Suhrbier, Andreas; Khromykh, Alexander A.
2017-01-01
ABSTRACT Zika virus (ZIKV) has recently emerged and is the etiological agent of congenital Zika syndrome (CZS), a spectrum of congenital abnormalities arising from neural tissue infections in utero. Herein, we describe the de novo generation of a new ZIKV isolate, ZIKVNatal, using a modified circular polymerase extension reaction protocol and sequence data obtained from a ZIKV-infected fetus with microcephaly. ZIKVNatal thus has no laboratory passage history and is unequivocally associated with CZS. ZIKVNatal could be used to establish a fetal brain infection model in IFNAR−/− mice (including intrauterine growth restriction) without causing symptomatic infections in dams. ZIKVNatal was also able to be transmitted by Aedes aegypti mosquitoes. ZIKVNatal thus retains key aspects of circulating pathogenic ZIKVs and illustrates a novel methodology for obtaining an authentic functional viral isolate by using data from deep sequencing of infected tissues. IMPORTANCE The major complications of an ongoing Zika virus outbreak in the Americas and Asia are congenital defects caused by the virus’s ability to cross the placenta and infect the fetal brain. The ability to generate molecular tools to analyze viral isolates from the current outbreak is essential for furthering our understanding of how these viruses cause congenital defects. The majority of existing viral isolates and infectious cDNA clones generated from them have undergone various numbers of passages in cell culture and/or suckling mice, which is likely to result in the accumulation of adaptive mutations that may affect viral properties. The approach described herein allows rapid generation of new, fully functional Zika virus isolates directly from deep sequencing data from virus-infected tissues without the need for prior virus passaging and for the generation and propagation of full-length cDNA clones. The approach should be applicable to other medically important flaviviruses and perhaps other positive-strand RNA viruses. PMID:28529976
Nadkarni, Tanvi N; Andreoli, Matthew J; Nair, Veena A; Yin, Peng; Young, Brittany M; Kundu, Bornali; Pankratz, Joshua; Radtke, Andrew; Holdsworth, Ryan; Kuo, John S; Field, Aaron S; Baskaya, Mustafa K; Moritz, Chad H; Meyerand, M Elizabeth; Prabhakaran, Vivek
2015-01-01
Functional magnetic resonance imaging (fMRI) is a non-invasive pre-surgical tool used to assess localization and lateralization of language function in brain tumor and vascular lesion patients in order to guide neurosurgeons as they devise a surgical approach to treat these lesions. We investigated the effect of varying the statistical thresholds as well as the type of language tasks on functional activation patterns and language lateralization. We hypothesized that language lateralization indices (LIs) would be threshold- and task-dependent. Imaging data were collected from brain tumor patients (n = 67, average age 48 years) and vascular lesion patients (n = 25, average age 43 years) who received pre-operative fMRI scanning. Both patient groups performed expressive (antonym and/or letter-word generation) and receptive (tumor patients performed text-reading; vascular lesion patients performed text-listening) language tasks. A control group (n = 25, average age 45 years) performed the letter-word generation task. Brain tumor patients showed left-lateralization during the antonym-word generation and text-reading tasks at high threshold values and bilateral activation during the letter-word generation task, irrespective of the threshold values. Vascular lesion patients showed left-lateralization during the antonym and letter-word generation, and text-listening tasks at high threshold values. Our results suggest that the type of task and the applied statistical threshold influence LI and that the threshold effects on LI may be task-specific. Thus identifying critical functional regions and computing LIs should be conducted on an individual subject basis, using a continuum of threshold values with different tasks to provide the most accurate information for surgical planning to minimize post-operative language deficits.
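The threshold dependence discussed here can be illustrated with the conventional lateralization-index formula LI = (L − R)/(L + R) over suprathreshold voxel counts; in the sketch below, random numbers stand in for voxel-wise activation statistics, and the threshold values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)

# Fake voxel-wise t-statistics for left- and right-hemisphere language ROIs,
# standing in for a patient's word-generation activation map (invented data).
left_t = rng.normal(2.5, 1.5, size=5000)     # slightly stronger left activation
right_t = rng.normal(1.5, 1.5, size=5000)

for thr in [1.0, 2.0, 3.0, 4.0, 5.0]:
    L = np.sum(left_t > thr)                 # suprathreshold voxel counts
    R = np.sum(right_t > thr)
    li = (L - R) / (L + R) if (L + R) > 0 else np.nan
    print(f"threshold {thr:.1f}: L={L:5d}  R={R:5d}  LI={li:+.2f}")
```

Running this shows the LI drifting toward stronger apparent left-lateralization as the threshold rises, which is the task- and threshold-dependence the study warns about.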
Yin, Yuzhi; Bai, Yun; Olivera, Ana; Desai, Avanti; Metcalfe, Dean D
2017-09-01
The culture of mast cells from human tissues such as cord blood, peripheral blood or bone marrow aspirates has advanced our understanding of human mast cell (huMC) degranulation, mediator production and response to pharmacologic agents. However, existing methods for huMC culture tend to be laborious and expensive. Combining technical approaches from several of these protocols, we designed a simplified and more cost-effective approach to the culture of mast cells from human cell populations including peripheral blood and cryopreserved cells from lymphocytapheresis. On average, we reduced by 30-50 fold the amount of culture media compared to our previously reported method, while the total MC number generated by this method (2.46±0.63×10^6 vs. 2.4±0.28×10^6, respectively, from 1.0×10^8 lymphocytapheresis or peripheral blood mononuclear cells [PBMCs]) was similar to that of our previous method (2.36±0.70×10^6), resulting in significant budgetary savings. In addition, we compared the yield of huMCs with or without IL-3 added to early cultures in the presence of stem cell factor (SCF) and interleukin-6 (IL-6) and found that the total MC number generated, while higher with IL-3 in the culture, did not reach statistical significance, suggesting that IL-3, often recommended in the culture of huMCs, is not absolutely required. We then performed a functional analysis by flow cytometry using standard methods, which maximized the data we could obtain from cultured cells. We believe these approaches will allow more laboratories to culture and examine huMC behavior going forward. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Smet, K.; de Neufville, R.; van der Vlist, M.
2017-12-01
This work presents an innovative approach for replacement planning for aging water infrastructure given uncertain future conditions. We draw upon two existing methodologies to develop an integrated long-term replacement planning framework. We first expand the concept of Adaptation Tipping Points to generate long-term planning timelines that incorporate drivers of investment related to both internal structural processes as well as changes in external operating conditions. Then, we use Engineering Options to explore different actions taken at key moments in this timeline. Contrasting to the traditionally more static approach to infrastructure design, designing the next generation of infrastructure so it can be changed incrementally is a promising method to safeguard current investments given future uncertainty. This up-front inclusion of structural options in the system actively facilitates future adaptation, transforming uncertainty management in infrastructure planning from reactive to more proactive. A two-part model underpins this approach. A simulation model generates diverse future conditions, allowing development of timelines of intervention moments in the structure's life. This feeds into an economic model, evaluating the lifetime performance of different replacement strategies, making explicit the value of different designs and their flexibility. A proof of concept study demonstrates this approach for a pumping station. The strategic planning timelines for this structure demonstrate that moments when capital interventions become necessary due to reduced functionality from structural degradation or changed operating conditions are widely spread over the structure's life. The disparate timing of these necessary interventions supports an incremental, adaptive mindset when considering end-of-life and replacement decisions. The analysis then explores different replacement decisions, varying the size and specific options included in the proposed new structure. Results show that incremental adaptive designs and incorporating options can improve economic performance, as compared to traditional, "build it once & build it big" designs. The benefit from incorporating flexibility varies with structural functionality, future conditions and the specific options examined.
McDonald, Jacqueline U.; Kaforou, Myrsini; Clare, Simon; Hale, Christine; Ivanova, Maria; Huntley, Derek; Dorner, Marcus; Wright, Victoria J.; Levin, Michael; Martinon-Torres, Federico; Herberg, Jethro A.
2016-01-01
ABSTRACT Greater understanding of the functions of host gene products in response to infection is required. While many of these genes enable pathogen clearance, some enhance pathogen growth or contribute to disease symptoms. Many studies have profiled transcriptomic and proteomic responses to infection, generating large data sets, but selecting targets for further study is challenging. Here we propose a novel data-mining approach combining multiple heterogeneous data sets to prioritize genes for further study by using respiratory syncytial virus (RSV) infection as a model pathogen with a significant health care impact. The assumption was that the more frequently a gene is detected across multiple studies, the more important its role is. A literature search was performed to find data sets of genes and proteins that change after RSV infection. The data sets were standardized, collated into a single database, and then panned to determine which genes occurred in multiple data sets, generating a candidate gene list. This candidate gene list was validated by using both a clinical cohort and in vitro screening. We identified several genes that were frequently expressed following RSV infection with no assigned function in RSV control, including IFI27, IFIT3, IFI44L, GBP1, OAS3, IFI44, and IRF7. Drilling down into the function of these genes, we demonstrate a role in disease for the gene for interferon regulatory factor 7, which was highly ranked on the list, but not for IRF1, which was not. Thus, we have developed and validated an approach for collating published data sets into a manageable list of candidates, identifying novel targets for future analysis. IMPORTANCE Making the most of “big data” is one of the core challenges of current biology. There is a large array of heterogeneous data sets of host gene responses to infection, but these data sets do not inform us about gene function and require specialized skill sets and training for their utilization. Here we describe an approach that combines and simplifies these data sets, distilling this information into a single list of genes commonly upregulated in response to infection with RSV as a model pathogen. Many of the genes on the list have unknown functions in RSV disease. We validated the gene list with new clinical, in vitro, and in vivo data. This approach allows the rapid selection of genes of interest for further, more-detailed studies, thus reducing time and costs. Furthermore, the approach is simple to use and widely applicable to a range of diseases. PMID:27822537
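A minimal sketch of the collation step is shown below: after standardizing gene identifiers, genes are ranked by how many independent data sets report them. The gene symbols echo those named above, but the data set contents and the cut-off of three data sets are invented for illustration.

```python
from collections import Counter

# Each list stands in for one published RSV response data set after
# standardization of gene identifiers (contents invented for illustration).
datasets = [
    ["IFI27", "IFIT3", "IRF7", "GBP1", "OAS3"],
    ["IFI44L", "IRF7", "IFI27", "IFIT3", "STAT1"],
    ["OAS3", "IFI44", "IRF7", "IFI27", "GBP1"],
    ["IFIT3", "IFI44L", "IRF7", "MX1", "IFI27"],
]

# Count each gene once per data set, then rank by frequency across studies.
counts = Counter(gene for ds in datasets for gene in set(ds))
candidates = [(gene, n) for gene, n in counts.most_common() if n >= 3]

print("candidate gene list (detected in >= 3 data sets):")
for gene, n in candidates:
    print(f"  {gene:7s} {n}/{len(datasets)}")
```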
Autonomous onboard crew operations: A review and developmental approach
NASA Technical Reports Server (NTRS)
Rogers, J. G.
1982-01-01
A review of the literature generated by an intercenter mission approach and consolidation team and their contractors was performed to obtain background information on the development of autonomous operations concepts for future space shuttle and space platform missions. The Boeing 757/767 flight management system was examined to determine the relevance for transfer of the developmental approach and technology to the performance of the crew operations function. Specifically, the engine indication and crew alerting system was studied to determine the relevance of this display for the performance of crew operations onboard the vehicle. It was concluded that the developmental approach and technology utilized in the aeronautics industry would be appropriate for development of an autonomous operations concept for the space platform.
Moving Beyond ERP Components: A Selective Review of Approaches to Integrate EEG and Behavior
Bridwell, David A.; Cavanagh, James F.; Collins, Anne G. E.; Nunez, Michael D.; Srinivasan, Ramesh; Stober, Sebastian; Calhoun, Vince D.
2018-01-01
Relationships between neuroimaging measures and behavior provide important clues about brain function and cognition in healthy and clinical populations. While electroencephalography (EEG) provides a portable, low cost measure of brain dynamics, it has been somewhat underrepresented in the emerging field of model-based inference. We seek to address this gap in this article by highlighting the utility of linking EEG and behavior, with an emphasis on approaches for EEG analysis that move beyond focusing on peaks or “components” derived from averaging EEG responses across trials and subjects (generating the event-related potential, ERP). First, we review methods for deriving features from EEG in order to enhance the signal within single-trials. These methods include filtering based on user-defined features (i.e., frequency decomposition, time-frequency decomposition), filtering based on data-driven properties (i.e., blind source separation, BSS), and generating more abstract representations of data (e.g., using deep learning). We then review cognitive models which extract latent variables from experimental tasks, including the drift diffusion model (DDM) and reinforcement learning (RL) approaches. Next, we discuss ways to access associations among these measures, including statistical models, data-driven joint models and cognitive joint modeling using hierarchical Bayesian models (HBMs). We think that these methodological tools are likely to contribute to theoretical advancements, and will help inform our understandings of brain dynamics that contribute to moment-to-moment cognitive function. PMID:29632480
An Open Source Simulation Model for Soil and Sediment Bioturbation
Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin
2011-01-01
Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches limits the application of such models to a limited range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted by experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data is routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest the predictive power of the approach. PMID:22162997
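A minimal sketch of the framework's core mechanic, under simplifying assumptions: tracer particles on a one-dimensional lattice are displaced each time step by a distance drawn from a parameterisable probability density (a zero-mean Gaussian here, whose standard deviation plays the role of the species-specific mixing parameter). Grid size, time step, and parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

depth_cells = 100          # 1-D sediment column, 1 cell = 1 mm (invented scale)
n_particles = 2000
sigma_cells = 1.5          # species-specific mixing parameter of the displacement PDF
n_steps = 200

# Start with a layer of luminophore tracer particles at the sediment surface.
depths = np.zeros(n_particles, dtype=int)

for _ in range(n_steps):
    # Displacement drawn from a parameterisable PDF (Gaussian here), rounded to cells.
    moves = np.round(rng.normal(0.0, sigma_cells, size=n_particles)).astype(int)
    depths = np.clip(depths + moves, 0, depth_cells - 1)   # clipped at the column bounds

profile, _ = np.histogram(depths, bins=np.arange(depth_cells + 1))
print("tracer fraction remaining in top 10 cells:", round(profile[:10].sum() / n_particles, 2))
```

Fitting sigma_cells (or any other displacement-PDF parameter) to an observed tracer profile is the sense in which species-specific parameters can be estimated within one shared model structure.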
An Intrinsic Algorithm for Parallel Poisson Disk Sampling on Arbitrary Surfaces.
Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying
2013-03-08
Poisson disk sampling plays an important role in a variety of visual computing applications, due to its useful statistical properties in distribution and the absence of aliasing artifacts. While many effective techniques have been proposed to generate Poisson disk distributions in Euclidean space, relatively little work has been reported on the surface counterpart. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. We propose a new technique for parallelizing the dart throwing. Rather than the conventional approaches that explicitly partition the spatial domain to generate the samples in parallel, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. It is worth noting that our algorithm is accurate, as the generated Poisson disks are uniformly and randomly distributed without bias. Our method is intrinsic in that all the computations are based on the intrinsic metric and are independent of the embedding space. This intrinsic feature allows us to generate Poisson disk distributions on arbitrary surfaces. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
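A hedged, planar (not intrinsic) sketch of the priority idea follows: each candidate dart receives a random, unique priority, and a candidate is kept only if it conflicts neither with already accepted samples nor with higher-priority candidates from the same round. The conflict check here is a brute-force serial loop; the paper resolves the same test across parallel threads using the intrinsic surface metric.

```python
import numpy as np

rng = np.random.default_rng(6)

radius = 0.05                       # minimum allowed distance between samples (invented)
accepted = np.empty((0, 2))

def conflicts(p, pts):
    """True if point p lies within `radius` of any point in pts."""
    return len(pts) > 0 and np.min(np.linalg.norm(pts - p, axis=1)) < radius

for _ in range(200):                # rounds of (conceptually parallel) dart throwing
    cand = rng.random((500, 2))                 # candidate darts in the unit square
    prio = rng.random(500)                      # random, unique priorities
    keep = []
    for i in np.argsort(-prio):                 # visit candidates by descending priority
        p = cand[i]
        # Accept only if p conflicts neither with accepted samples nor with
        # higher-priority candidates already kept in this round.
        if not conflicts(p, accepted) and not conflicts(p, cand[keep]):
            keep.append(i)
    accepted = np.vstack([accepted, cand[keep]])

print("Poisson-disk samples generated:", len(accepted))
```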
Next generation of psychiatrists: What is needed in training?
Bernstein, Carol A; Bhugra, Dinesh
2011-06-01
Populations can be divided into generations. Each generation has its own characteristics and even though not every member of the same generation will share characteristics with other members of that generation, it is possible to identify generational differences. Generations frequently have different values and varying styles of functioning and learning. Since the Second World War, the generations can be divided into four cohorts: the Veterans, the Baby Boomers, Generation X, and the Millennials. Each generation has a collective identity and, in addition to understanding cultural and ethnic differences, these generational differences should also be taken into account in the teaching arena. Values and beliefs about work-life balance, learning styles, comfort with technology, methods of communication, and approaches to leadership are the types of parameters which vary across generations. As a result, medical educators would benefit from appreciating these differences in order to enhance the learning of medical students and residents and to better prepare them for delivering patient care in the twenty-first century. In this paper, the authors highlight some of the challenges and issues related to these generational divides. Copyright © 2011 Elsevier B.V. All rights reserved.
Ketelaar, Sarah M.; Nieuwenhuijsen, Karen; Bolier, Linda; Smeets, Odile; Sluiter, Judith K.
2014-01-01
Background Mental health complaints are quite common in health care employees and can have adverse effects on work functioning. The aim of this study was to evaluate an e-mental health (EMH) approach to workers' health surveillance (WHS) for nurses and allied health professionals. Using the waiting-list group of a previous randomized controlled trial with high dropout and low compliance to the intervention, we studied the pre- and posteffects of the EMH approach in a larger group of participants. Methods We applied a pretest–posttest study design. The WHS consisted of online screening on impaired work functioning and mental health followed by online automatically generated personalized feedback, online tailored advice, and access to self-help EMH interventions. The effects on work functioning, stress, and work-related fatigue after 3 months were analyzed using paired t tests and effect sizes. Results One hundred and twenty-eight nurses and allied health professionals participated at pretest as well as posttest. Significant improvements were found on work functioning (p = 0.01) and work-related fatigue (p < 0.01). Work functioning had relevantly improved in 30% of participants. A small meaningful effect on stress was found (Cohen d = .23) in the participants who had logged onto an EMH intervention (20%, n = 26). Conclusion The EMH approach to WHS improves the work functioning and mental health of nurses and allied health professionals. However, because we found small effects and participation in the offered EMH interventions was low, there is ample room for improvement. PMID:25516815
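The pre/post analysis described above reduces to a paired t-test plus an effect size; the sketch below runs both on simulated scores standing in for the work-functioning measure, with the sample size matching the study but the score distributions and direction of improvement invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

n = 128                                      # participants at both pre- and post-test
pre = rng.normal(50.0, 10.0, size=n)         # simulated work-functioning scores (invented)
post = pre - rng.normal(3.0, 8.0, size=n)    # assume lower score = better functioning

t_stat, p_val = stats.ttest_rel(pre, post)   # paired t-test
diff = pre - post
cohen_d = diff.mean() / diff.std(ddof=1)     # effect size for paired samples

print(f"paired t = {t_stat:.2f}, p = {p_val:.4f}, Cohen's d = {cohen_d:.2f}")
```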
Direct determination approach for the multifractal detrending moving average analysis
NASA Astrophysics Data System (ADS)
Xu, Hai-Chuan; Gu, Gao-Feng; Zhou, Wei-Xing
2017-11-01
In the canonical framework, we propose an alternative approach for multifractal analysis based on the detrending moving average method (MF-DMA). We define a canonical measure such that the multifractal mass exponent τ(q) is related to the partition function and the multifractal spectrum f(α) can be directly determined. The performances of the direct determination approach and the traditional approach of the MF-DMA are compared based on three synthetic multifractal and monofractal measures generated from the one-dimensional p-model, the two-dimensional p-model, and fractional Brownian motions. We find that both approaches have comparable performance in unveiling the fractal and multifractal nature. In other words, without loss of accuracy, the multifractal spectrum f(α) can be directly determined using the new approach at lower computational cost. We also apply the new MF-DMA approach to the volatility time series of stock prices and confirm the presence of multifractality.
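For context, the direct-determination idea follows the canonical (Chhabra–Jensen-type) construction; the relations below are written from standard definitions rather than copied from the paper, with P_v(ε) denoting the box measure (replaced in MF-DMA by the detrended fluctuation of segment v at scale s):

    \mu_v(q,\varepsilon) = \frac{P_v(\varepsilon)^q}{\sum_u P_u(\varepsilon)^q}, \qquad
    \alpha(q) = \lim_{\varepsilon \to 0} \frac{\sum_v \mu_v(q,\varepsilon)\,\ln P_v(\varepsilon)}{\ln \varepsilon}, \qquad
    f(q) = \lim_{\varepsilon \to 0} \frac{\sum_v \mu_v(q,\varepsilon)\,\ln \mu_v(q,\varepsilon)}{\ln \varepsilon},

with the mass exponent recovered as \tau(q) = q\,\alpha(q) - f(q). Sign and scaling conventions differ slightly when the limit is taken in the MF-DMA scale s rather than in the box size \varepsilon.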
Fluctuating observation time ensembles in the thermodynamics of trajectories
NASA Astrophysics Data System (ADS)
Budini, Adrián A.; Turner, Robert M.; Garrahan, Juan P.
2014-03-01
The dynamics of stochastic systems, both classical and quantum, can be studied by analysing the statistical properties of dynamical trajectories. The properties of ensembles of such trajectories for long, but fixed, times are described by large-deviation (LD) rate functions. These LD functions play the role of dynamical free energies: they are cumulant generating functions for time-integrated observables, and their analytic structure encodes dynamical phase behaviour. This ‘thermodynamics of trajectories’ approach is to trajectories and dynamics what the equilibrium ensemble method of statistical mechanics is to configurations and statics. Here we show that, just like in the static case, there are a variety of alternative ensembles of trajectories, each defined by their global constraints, with that of trajectories of fixed total time being just one of these. We show how the LD functions that describe an ensemble of trajectories where some time-extensive quantity is constant (and large) but where total observation time fluctuates can be mapped to those of the fixed-time ensemble. We discuss how the correspondence between generalized ensembles can be exploited in path sampling schemes for generating rare dynamical trajectories.
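As a reminder of the underlying formalism (textbook large-deviation relations, not equations quoted from the paper): for a time-integrated observable K_t measured over trajectories of duration t, the probability of a value K concentrates as

    P(K_t = K) \asymp e^{-t\,\varphi(K/t)},

and the dynamical free energy is the scaled cumulant generating function

    \theta(s) = \lim_{t\to\infty} \frac{1}{t}\,\ln\!\left\langle e^{-s K_t} \right\rangle = -\min_{k}\,\bigl[\,s k + \varphi(k)\,\bigr],

i.e. the Legendre transform of the rate function. The contribution described above is to relate these fixed-time quantities to ensembles in which the time-extensive observable is held fixed and the total observation time fluctuates.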
Receiver functions from west Antarctica; crust and mantle properties from POLENET
NASA Astrophysics Data System (ADS)
Aster, R. C.; Chaput, J. A.; Hansen, S. E.; Nyblade, A.; Wiens, D. A.; Huerta, A. D.; Wilson, T. J.; Anandakrishnan, S.
2011-12-01
We use receiver functions to extract crustal thickness and mantle transition zone depths across a wide extent of West Antarctica and the Transantarctic Mountains using POLENET data, including recently recovered data from a 14-station West Antarctic Rift Zone transect. An adaptive approach for generating and analyzing P-receiver functions over ice sheets and sedimentary basins (similar to Winberry and Anandakrishnan, 2004) is applied using an extended-time multitaper deconvolution algorithm and forward modeling of synthetic seismograms. We model P-S receiver functions via a layer-stripping methodology (beginning with the ice sheet, if present), and fit increasingly long sections of synthetic receiver functions to model the multiples observed in the data-derived receiver functions. We additionally calculate S-P receiver functions, which provide complementary structural constraints, to generate consistent common conversion point stacks imaging crustal and upper mantle discontinuities under West Antarctica. Crust throughout West Antarctica is generally thin (23-29 km; comparable to the U.S. Basin and Range) with relative thickening under the Marie Byrd Land volcanic province (to 32 km) and the Transantarctic Mountains. All constrained West Antarctic crust is substantially thicker than that in the vicinity of Ross Island, where crust as thin as 17 km is inferred in the Terror Rift region.
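As a simplified illustration of how a P-receiver function is obtained from three-component data, the sketch below performs a frequency-domain water-level deconvolution of the radial component by the vertical component with a Gaussian low-pass (Python/NumPy). This is a deliberately simpler stand-in for the extended-time multitaper deconvolution used in the study, and the function name and parameter values are assumptions.

    import numpy as np

    def receiver_function(radial, vertical, dt, water_level=0.01, gauss_a=2.5):
        # Pad to a power of two and transform both components.
        n = 1 << (len(radial) + len(vertical) - 1).bit_length()
        R = np.fft.rfft(radial, n)
        Z = np.fft.rfft(vertical, n)
        # Water-level stabilized spectral division R/Z.
        power = (Z * np.conj(Z)).real
        denom = np.maximum(power, water_level * power.max())
        # Gaussian low-pass exp(-(pi f / a)^2) to control the frequency content.
        freqs = np.fft.rfftfreq(n, dt)
        gauss = np.exp(-(np.pi * freqs) ** 2 / gauss_a ** 2)
        rf = np.fft.irfft(R * np.conj(Z) / denom * gauss, n)
        return rf[: len(radial)]

In practice the result would also be time-shifted to place the direct P arrival at a chosen lag; that bookkeeping is omitted here.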
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitehead, Timothy A.; Chevalier, Aaron; Song, Yifan
2012-06-19
We show that comprehensive sequence-function maps obtained by deep sequencing can be used to reprogram interaction specificity and to leapfrog over bottlenecks in affinity maturation by combining many individually small contributions not detectable in conventional approaches. We use this approach to optimize two computationally designed inhibitors against H1N1 influenza hemagglutinin and, in both cases, obtain variants with subnanomolar binding affinity. The most potent of these, a 51-residue protein, is broadly cross-reactive against all influenza group 1 hemagglutinins, including human H2, and neutralizes H1N1 viruses with a potency that rivals that of several human monoclonal antibodies, demonstrating that computational design followed by comprehensive energy landscape mapping can generate proteins with potential therapeutic utility.
Dissipative quantum hydrodynamics model of x-ray Thomson scattering in dense plasmas
NASA Astrophysics Data System (ADS)
Diaw, Abdourahmane; Murillo, Michael
2017-10-01
X-ray Thomson scattering (XRTS) provides detailed diagnostic information about dense plasma experiments. The inferences made rely on an accurate model for the form factor, which is typically expressed in terms of a well-known response function. Here, we develop an alternate approach based on quantum hydrodynamics using a viscous form of dynamical density functional theory. This approach is shown to include the equation of state self-consistently, including sum rules, as well as irreversibility arising from collisions. This framework is used to generate a model for the scattering spectrum, and it offers an avenue for measuring hydrodynamic properties, such as transport coefficients, using XRTS. This work was supported by the Air Force Office of Scientific Research (Grant No. FA9550-12-1-0344).
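For orientation, the dynamic structure factor probed by XRTS is linked to the density response function through the standard fluctuation–dissipation relation (quoted from the general XRTS literature, not from this abstract; normalization conventions vary):

    S(k,\omega) = -\frac{\hbar}{\pi n}\,\frac{\operatorname{Im}\,\chi(k,\omega)}{1 - e^{-\beta\hbar\omega}},

so a hydrodynamic model that supplies \chi(k,\omega), including viscous dissipation, immediately yields a scattering spectrum for comparison with experiment.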
Reconstructing biochemical pathways from time course data.
Srividhya, Jeyaraman; Crampin, Edmund J; McSharry, Patrick E; Schnell, Santiago
2007-03-01
Time series data on biochemical reactions reveal transient behavior, away from chemical equilibrium, and contain information on the dynamic interactions among reacting components. However, this information can be difficult to extract using conventional analysis techniques. We present a new method to infer biochemical pathway mechanisms from time course data using a global nonlinear modeling technique to identify the elementary reaction steps which constitute the pathway. The method involves the generation of a complete dictionary of polynomial basis functions based on the law of mass action. Using these basis functions, there are two approaches to model construction, namely the general to specific and the specific to general approach. We demonstrate that our new methodology reconstructs the chemical reaction steps and connectivity of the glycolytic pathway of Lactococcus lactis from time course experimental data.
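A minimal sketch of the dictionary-construction step is given below (Python/NumPy): it builds candidate mass-action rate terms up to bimolecular order from concentration time series X and fits the measured derivatives by ordinary least squares. The function names are assumptions, and the sketch omits the general-to-specific and specific-to-general model-selection procedures described above.

    import numpy as np
    from itertools import combinations_with_replacement

    def mass_action_dictionary(X, max_order=2):
        """Columns are candidate mass-action terms: 1, x_i, x_i*x_j (elementary steps)."""
        n_t, n_s = X.shape
        cols, names = [np.ones(n_t)], ["1"]
        for order in range(1, max_order + 1):
            for idx in combinations_with_replacement(range(n_s), order):
                cols.append(X[:, list(idx)].prod(axis=1))
                names.append("*".join(f"x{i}" for i in idx))
        return np.column_stack(cols), names

    def fit_rate_laws(X, dXdt):
        """Least-squares fit of dx_k/dt ~ Theta(X) @ w_k for each species k."""
        Theta, names = mass_action_dictionary(X)
        W, *_ = np.linalg.lstsq(Theta, dXdt, rcond=None)
        return W, names

In the full method, candidate terms would then be added or pruned according to one of the two selection strategies, rather than all being retained in a single dense fit.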
The cell cycle as a brake for β-cell regeneration from embryonic stem cells.
El-Badawy, Ahmed; El-Badri, Nagwa
2016-01-13
The generation of insulin-producing β cells from stem cells in vitro provides a promising source of cells for cell transplantation therapy in diabetes. However, insulin-producing cells generated from human stem cells show deficiency in many functional characteristics compared with pancreatic β cells. Recent reports have shown molecular ties between the cell cycle and the differentiation mechanism of embryonic stem (ES) cells, assuming that cell fate decisions are controlled by the cell cycle machinery. Both β cells and ES cells possess unique cell cycle machinery yet with significant contrasts. In this review, we compare the cell cycle control mechanisms in both ES cells and β cells, and highlight the fundamental differences between pluripotent cells of embryonic origin and differentiated β cells. Through critical analysis of the differences of the cell cycle between these two cell types, we propose that the cell cycle of ES cells may act as a brake for β-cell regeneration. Based on these differences, we discuss the potential of modulating the cell cycle of ES cells for the large-scale generation of functionally mature β cells in vitro. Further understanding of the factors that modulate the ES cell cycle will lead to new approaches to enhance the production of functional mature insulin-producing cells, and yield a reliable system to generate bona fide β cells in vitro.
Model-Driven Approach for Body Area Network Application Development.
Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata
2016-05-12
This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain though transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables to obtain the adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and re-generation process for the concrete BAN application.
TRIENNIAL LACTATION SYMPOSIUM: Nutrigenomics in livestock: Systems biology meets nutrition.
Loor, J J; Vailati-Riboni, M; McCann, J C; Zhou, Z; Bionaz, M
2015-12-01
The advent of high-throughput technologies to study an animal's genome, proteome, and metabolome (i.e., "omics" tools) constituted a setback to the use of reductionism in livestock research. More recent development of "next-generation sequencing" tools was instrumental in allowing in-depth studies of the microbiome in the rumen and other sections of the gastrointestinal tract. Omics, along with bioinformatics, constitutes the foundation of modern systems biology, a field of study widely used in model organisms (e.g., rodents, yeast, humans) to enhance understanding of the complex biological interactions occurring within cells and tissues at the gene, protein, and metabolite level. Application of systems biology concepts is ideal for the study of interactions between nutrition and physiological state with tissue and cell metabolism and function during key life stages of livestock species, including the transition from pregnancy to lactation, in utero development, or postnatal growth. Modern bioinformatic tools capable of discerning functional outcomes and biologically meaningful networks complement the ever-increasing ability to generate large molecular, microbial, and metabolite data sets. Simultaneous visualization of the complex intertissue adaptations to physiological state and nutrition can now be discerned. Studies to understand the linkages between the microbiome and the absorptive epithelium using the integrative approach are emerging. We present examples of new knowledge generated through the application of functional analyses of transcriptomic, proteomic, and metabolomic data sets encompassing nutritional management of dairy cows, pigs, and poultry. Published work to date underscores that the integrative approach across and within tissues may prove useful for fine-tuning nutritional management of livestock. An important goal during this process is to uncover key molecular players involved in the organismal adaptations to nutrition.
Kijlstra, Jan David; Hu, Dongjian; van der Meer, Peter; Domian, Ibrahim J
2017-11-15
Human pluripotent stem cell-derived cardiomyocytes (hPSC-CMs) hold great promise for applications in human disease modeling, drug discovery, cardiotoxicity screening, and, ultimately, regenerative medicine. The ability to study multiple parameters of hPSC-CM function, such as contractile and electrical activity, calcium cycling, and force generation, is therefore of paramount importance. hPSC-CMs cultured on stiff substrates such as glass or polystyrene cannot shorten during contraction, making them less suitable for the study of hPSC-CM contractile function. Other approaches require highly specialized hardware and are difficult to reproduce. Here we describe a protocol for the preparation of hPSC-CMs on soft substrates that permit shortening, and subsequently for the simultaneous quantitative analysis of their contractile and electrical activity, calcium cycling, and force generation at single-cell resolution. This protocol requires only affordable and readily available materials and works with standard imaging hardware. Copyright © 2017 John Wiley & Sons, Inc.
Functional cortical neurons and astrocytes from human pluripotent stem cells in 3D culture.
Paşca, Anca M; Sloan, Steven A; Clarke, Laura E; Tian, Yuan; Makinson, Christopher D; Huber, Nina; Kim, Chul Hoon; Park, Jin-Young; O'Rourke, Nancy A; Nguyen, Khoa D; Smith, Stephen J; Huguenard, John R; Geschwind, Daniel H; Barres, Ben A; Paşca, Sergiu P
2015-07-01
The human cerebral cortex develops through an elaborate succession of cellular events that, when disrupted, can lead to neuropsychiatric disease. The ability to reprogram somatic cells into pluripotent cells that can be differentiated in vitro provides a unique opportunity to study normal and abnormal corticogenesis. Here, we present a simple and reproducible 3D culture approach for generating a laminated cerebral cortex-like structure, named human cortical spheroids (hCSs), from pluripotent stem cells. hCSs contain neurons from both deep and superficial cortical layers and map transcriptionally to in vivo fetal development. These neurons are electrophysiologically mature, display spontaneous activity, are surrounded by nonreactive astrocytes and form functional synapses. Experiments in acute hCS slices demonstrate that cortical neurons participate in network activity and produce complex synaptic events. These 3D cultures should allow a detailed interrogation of human cortical development, function and disease, and may prove a versatile platform for generating other neuronal and glial subtypes in vitro.
A Study on the Secure User Profiling Structure and Procedure for Home Healthcare Systems.
Ko, Hoon; Song, MoonBae
2016-01-01
Despite various benefits such as convenience and efficiency, home healthcare systems carry inherent security risks that may cause a serious leak of personal health information. This work presents a Secure User Profiling Structure that holds patient information, including health information. The patient and the hospital each keep a copy and share the updated data. While they share the data and communicate, the data can be leaked. To address these security problems, a secure communication channel between the client and the hospital should be established using a hash function and a One-Time Password (OTP), and a dual hash function is used to generate the input value for the OTP. This work presents a dual hash function-based approach to generating the One-Time Password, ensuring a secure communication channel with a secured key. As a result, attackers are unable to decrypt leaked information because of the secured key; in addition, the proposed method outperforms existing methods in terms of computation cost.
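The paper's exact construction is not reproduced here; the following Python sketch only illustrates the generic idea of deriving a one-time password from a double application of a hash function over a shared secret and a counter. The function name, the choice of SHA-256/HMAC, and the truncation scheme are assumptions.

    import hashlib, hmac, struct

    def dual_hash_otp(shared_secret: bytes, counter: int, digits: int = 6) -> str:
        # First hash: condense the shared secret into a fixed-length key.
        inner = hashlib.sha256(shared_secret).digest()
        # Second hash: keyed hash of the counter with that digest.
        mac = hmac.new(inner, struct.pack(">Q", counter), hashlib.sha256).digest()
        # HOTP-style dynamic truncation to a short numeric code.
        offset = mac[-1] & 0x0F
        code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Example: both client and hospital, holding the same secret and counter,
    # compute dual_hash_otp(b"shared-secret", 42) and compare the results.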
Performance of high-recovery recycling reverse osmosis with wash water
NASA Technical Reports Server (NTRS)
Herrmann, Cal C.
1993-01-01
Inclusion of a recycling loop for partially desalted water from second-stage reverse-osmosis permeate has been shown to be useful for achieving high recovery at moderate applied pressures. This approach has now been applied to simulated wash waters, to obtain data on membrane retention of solutes in a mixture comparable to anticipated spacecraft hygiene wastewaters, and to generate an estimate of the maximum concentration that can be expected without causing membrane fouling. A first set of experiments provides selectivity information for a single membrane and an Igepon detergent, as a function of final concentration. A reject concentration of 3.1% total organic carbon has been reached at a pressure of 1.4 MPa without membrane fouling. Further experiments have generated selectivity values for the recycle configuration from two washwater simulations, as a function of applied pump pressure. Reverse-osmosis removal has also been tested for washwater containing a detergent formulated for plant-growth compatibility (containing nitrogen, phosphorus, and potassium functional groups).
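The connection between water recovery and reject concentration follows from a simple mass balance; the short Python sketch below is an illustration with assumed numbers, not data from the experiments.

    def reject_concentration(feed_conc, recovery, rejection=1.0):
        # Mass balance: C_reject = C_feed * (1 - r*(1 - rejection)) / (1 - r),
        # where r is the water recovery and `rejection` the observed solute rejection.
        return feed_conc * (1.0 - recovery * (1.0 - rejection)) / (1.0 - recovery)

    # Illustrative only: a 0.2% TOC feed concentrated at 94% recovery with
    # near-complete rejection gives roughly 0.2 / (1 - 0.94) = 3.3% TOC in the reject.
    print(reject_concentration(0.2, 0.94))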
Hutchins, James R. A.
2014-01-01
The genomic era has enabled research projects that use approaches including genome-scale screens, microarray analysis, next-generation sequencing, and mass spectrometry–based proteomics to discover genes and proteins involved in biological processes. Such methods generate data sets of gene, transcript, or protein hits that researchers wish to explore to understand their properties and functions and thus their possible roles in biological systems of interest. Recent years have seen a profusion of Internet-based resources to aid this process. This review takes the viewpoint of the curious biologist wishing to explore the properties of protein-coding genes and their products, identified using genome-based technologies. Ten key questions are asked about each hit, addressing functions, phenotypes, expression, evolutionary conservation, disease association, protein structure, interactors, posttranslational modifications, and inhibitors. Answers are provided by presenting the latest publicly available resources, together with methods for hit-specific and data set–wide information retrieval, suited to any genome-based analytical technique and experimental species. The utility of these resources is demonstrated for 20 factors regulating cell proliferation. Results obtained using some of these are discussed in more depth using the p53 tumor suppressor as an example. This flexible and universally applicable approach for characterizing experimental hits helps researchers to maximize the potential of their projects for biological discovery. PMID:24723265
Isolation of Novel CreERT2-Driver Lines in Zebrafish Using an Unbiased Gene Trap Approach
Jungke, Peggy; Hammer, Juliane; Hans, Stefan; Brand, Michael
2015-01-01
Gene manipulation using the Cre/loxP-recombinase system has been successfully employed in zebrafish to study gene functions and lineage relationships. Recently, gene trapping approaches have been applied to produce large collections of transgenic fish expressing conditional alleles in various tissues. However, the limited number of available cell- and tissue-specific Cre/CreERT2-driver lines still constrains widespread application in this model organism. To enlarge the pool of existing CreERT2-driver lines, we performed a genome-wide gene trap screen using a Tol2-based mCherry-T2a-CreERT2 (mCT2aC) gene trap vector. This cassette consists of a splice acceptor and an mCherry-tagged variant of CreERT2, which enables simultaneous labeling of the trapping event as well as CreERT2 expression from the endogenous promoter. Using this strategy, we generated 27 novel functional CreERT2-driver lines expressing in a cell- and tissue-specific manner during development and adulthood. This study summarizes the analysis of the generated CreERT2-driver lines with respect to functionality, expression, integration, as well as associated phenotypes. Our results significantly enlarge the existing pool of CreERT2-driver lines in zebrafish and, combined with Cre-dependent effector lines, the new CreERT2-driver lines will be important tools to manipulate the zebrafish genome. PMID:26083735
Nanosized Building Blocks for Customizing Novel Antibiofilm Approaches
Paula, A.J.; Koo, H.
2016-01-01
Recent advances in nanotechnology provide unparalleled flexibility to control the composition, size, shape, surface chemistry, and functionality of materials. Currently available engineering approaches allow precise synthesis of nanocompounds (e.g., nanoparticles, nanostructures, nanocrystals) with both top-down and bottom-up design principles at the submicron level. In this context, these “nanoelements” (NEs) or “nanosized building blocks” can 1) generate new nanocomposites with antibiofilm properties or 2) be used to coat existing surfaces (e.g., teeth) and exogenously introduced surfaces (e.g., restorative or implant materials) for prevention of bacterial adhesion and biofilm formation. Furthermore, functionalized NEs 3) can be conceived as nanoparticles to carry and selectively release antimicrobial agents after attachment or within oral biofilms, resulting in their disruption. The latter mechanism includes “smart release” of agents when triggered by pathogenic microenvironments (e.g., acidic pH or low oxygen levels) for localized and controlled drug delivery to simultaneously kill bacteria and dismantle the biofilm matrix. Here we discuss inorganic, metallic, polymeric, and carbon-based NEs for their outstanding chemical flexibility, stability, and antibiofilm properties manifested when converted into bioactive materials, assembled on-site or delivered at biofilm-surface interfaces. Details are provided on the emerging concept of the rational design of NEs and recent technological breakthroughs for the development of a new generation of nanocoatings or functional nanoparticles for biofilm control in the oral cavity. PMID:27856967