Scaffolding a Complex Task of Experimental Design in Chemistry with a Computer Environment
ERIC Educational Resources Information Center
Girault, Isabelle; d'Ham, Cédric
2014-01-01
When solving a scientific problem through experimentation, students may have the responsibility to design the experiment. When students work in a conventional condition, with paper and pencil, the designed procedures stay at a very general level. There is a need for additional scaffolds to help the students perform this complex task. We propose a…
Lin, Chenxi; Martínez, Luis Javier; Povinelli, Michelle L
2013-09-09
We design silicon membranes with nanohole structures with optimized complex unit cells that maximize broadband absorption. We fabricate the optimized design and measure the optical absorption. We demonstrate experimental broadband absorption about 3.5 times higher than that of an equally thick thin film.
Near-optimal experimental design for model selection in systems biology.
Busetto, Alberto Giovanni; Hauser, Alain; Krummenacher, Gabriel; Sunnåker, Mikael; Dimopoulos, Sotiris; Ong, Cheng Soon; Stelling, Jörg; Buhmann, Joachim M
2013-10-15
Biological systems are understood through iterations of modeling and experimentation. Not all experiments, however, are equally valuable for predictive modeling. This study introduces an efficient method for experimental design aimed at selecting dynamical models from data. Motivated by biological applications, the method enables the design of crucial experiments: it determines a highly informative selection of measurement readouts and time points. We demonstrate formal guarantees of design efficiency on the basis of previous results. By reducing our task to the setting of graphical models, we prove that the method finds a near-optimal design selection with a polynomial number of evaluations. Moreover, the method exhibits the best polynomial-complexity constant approximation factor, unless P = NP. We measure the performance of the method in comparison with established alternatives, such as ensemble non-centrality, on example models of different complexity. Efficient design accelerates the loop between modeling and experimentation: it enables the inference of complex mechanisms, such as those controlling central metabolic operation. Toolbox 'NearOED' available with source code under GPL on the Machine Learning Open Source Software Web site (mloss.org).
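The greedy strategy behind such near-optimal guarantees can be sketched in a few lines (a generic illustration, not the NearOED implementation; the coverage objective and readout names below are hypothetical stand-ins for the paper's information criterion):

```python
def greedy_select(candidates, objective, k):
    """Greedily pick k elements maximizing a monotone set objective.

    For monotone submodular objectives the greedy value is within a
    (1 - 1/e) factor of the optimum while using only a polynomial
    number of objective evaluations (Nemhauser et al.).
    """
    chosen = []
    for _ in range(k):
        best = max((c for c in candidates if c not in chosen),
                   key=lambda c: objective(chosen + [c]))
        chosen.append(best)
    return chosen

# Hypothetical "information" objective: each readout covers a set of
# time points, and a selection is scored by how many points it covers.
readouts = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}, "D": {1, 6}}

def coverage(selection):
    return len(set().union(*[readouts[r] for r in selection])) if selection else 0

design = greedy_select(list(readouts), coverage, k=2)
```

Coverage is monotone submodular, so the greedy choice here actually attains the optimum on this toy instance.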
NASA Astrophysics Data System (ADS)
Norinder, Ulf
1990-12-01
An experimental design based 3-D QSAR analysis using a combination of principal component and PLS analysis is presented and applied to human corticosteroid-binding globulin complexes. The predictive capability of the created model is good. The technique can also be used as guidance when selecting new compounds to be investigated.
NASA Astrophysics Data System (ADS)
Thubagere, Anupama J.; Thachuk, Chris; Berleant, Joseph; Johnson, Robert F.; Ardelean, Diana A.; Cherry, Kevin M.; Qian, Lulu
2017-02-01
Biochemical circuits made of rationally designed DNA molecules are proofs of concept for embedding control within complex molecular environments. They hold promise for transforming the current technologies in chemistry, biology, medicine and material science by introducing programmable and responsive behaviour to diverse molecular systems. As the transformative power of a technology depends on its accessibility, two main challenges are an automated design process and simple experimental procedures. Here we demonstrate the use of circuit design software, combined with the use of unpurified strands and simplified experimental procedures, for creating a complex DNA strand displacement circuit that consists of 78 distinct species. We develop a systematic procedure for overcoming the challenges involved in using unpurified DNA strands. We also develop a model that takes synthesis errors into consideration and semi-quantitatively reproduces the experimental data. Our methods now enable even novice researchers to successfully design and construct complex DNA strand displacement circuits.
2016-03-18
SPONSORED REPORT SERIES: Understanding Complexity and Self-Organization in a Defense Program Management Organization (Experimental Design)… The experiment will examine the decision-making process within the program office and the self-organization of key program office personnel based upon formal and informal communications links. Additionally, we are interested in the effects of this self-organizing process on the organization's shared…
Vinnakota, Kalyan C; Beard, Daniel A; Dash, Ranjan K
2009-01-01
Identification of a complex biochemical system model requires appropriate experimental data. Models constructed on the basis of data from the literature often contain parameters that are not identifiable with high sensitivity and therefore require additional experimental data to identify those parameters. Here we report the application of a local sensitivity analysis to design experiments that will improve the identifiability of previously unidentifiable model parameters in a model of mitochondrial oxidative phosphorylation and the tricarboxylic acid cycle. Experiments were designed based on measurable biochemical reactants in a dilute suspension of purified cardiac mitochondria with experimentally feasible perturbations to this system. Experimental perturbations and variables yielding the largest number of parameters above a 5% sensitivity level are presented and discussed.
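The idea of ranking candidate measurements by how many parameters clear a sensitivity threshold can be sketched as follows (a hypothetical exponential-decay observable stands in for the mitochondrial model; the 5% threshold mirrors the abstract):

```python
import math

def scaled_sensitivities(model, params, t_grid, rel_step=1e-4):
    """Max over time of the normalized local sensitivity |(p/y) dy/dp|,
    estimated by central finite differences for each parameter."""
    sens = {}
    for name, p in params.items():
        h = rel_step * abs(p)
        up = dict(params, **{name: p + h})
        dn = dict(params, **{name: p - h})
        vals = []
        for t in t_grid:
            y = model(params, t)
            dy_dp = (model(up, t) - model(dn, t)) / (2 * h)
            vals.append(abs(p / y * dy_dp))
        sens[name] = max(vals)
    return sens

# Hypothetical observable y(t) = a*exp(-k*t) + b (not the mitochondrial model).
def model(p, t):
    return p["a"] * math.exp(-p["k"] * t) + p["b"]

params = {"a": 1.0, "k": 2.0, "b": 0.001}
sens = scaled_sensitivities(model, params, t_grid=[0.1, 0.5, 1.0])
identifiable = [n for n, v in sens.items() if v > 0.05]  # 5% sensitivity level
```

On this toy model the baseline offset `b` never rises above the 5% level at these time points, so additional perturbations or readouts would be needed to identify it.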
Casey, F P; Baird, D; Feng, Q; Gutenkunst, R N; Waterfall, J J; Myers, C R; Brown, K S; Cerione, R A; Sethna, J P
2007-05-01
We apply the methods of optimal experimental design to a differential equation model for epidermal growth factor receptor signalling, trafficking and down-regulation. The model incorporates the role of a recently discovered protein complex made up of the E3 ubiquitin ligase Cbl, the guanine nucleotide exchange factor (GEF) Cool-1 (beta-Pix) and the Rho family G protein Cdc42. The complex has been suggested to be important in disrupting receptor down-regulation. We demonstrate that the model interactions can accurately reproduce the experimental observations, that they can be used to make predictions with accompanying uncertainties, and that we can apply ideas of optimal experimental design to suggest new experiments that reduce the uncertainty on unmeasurable components of the system.
Investigations of a Complex, Realistic Task: Intentional, Unsystematic, and Exhaustive Experimenters
ERIC Educational Resources Information Center
McElhaney, Kevin W.; Linn, Marcia C.
2011-01-01
This study examines how students' experimentation with a virtual environment contributes to their understanding of a complex, realistic inquiry problem. We designed a week-long, technology-enhanced inquiry unit on car collisions. The unit uses new technologies to log students' experimentation choices. Physics students (n = 148) in six diverse high…
Assessing the Effectiveness of a Computer Simulation for Teaching Ecological Experimental Design
ERIC Educational Resources Information Center
Stafford, Richard; Goodenough, Anne E.; Davies, Mark S.
2010-01-01
Designing manipulative ecological experiments is a complex and time-consuming process that is problematic to teach in traditional undergraduate classes. This study investigates the effectiveness of using a computer simulation--the Virtual Rocky Shore (VRS)--to facilitate rapid, student-centred learning of experimental design. We gave a series of…
ERIC Educational Resources Information Center
Smith, Rachel A.; Levine, Timothy R.; Lachlan, Kenneth A.; Fediuk, Thomas A.
2002-01-01
Notes that the availability of statistical software packages has led to a sharp increase in use of complex research designs and complex statistical analyses in communication research. Reports a series of Monte Carlo simulations which demonstrate that this complexity may come at a heavier cost than many communication researchers realize. Warns…
Flexible Space-Filling Designs for Complex System Simulations
2013-06-01
interior of the experimental region and cannot fit higher-order models. We present a genetic algorithm that constructs space-filling designs with minimal correlations. Keywords: Computer Experiments, Design of Experiments, Genetic Algorithm, Latin Hypercube, Response Surface Methodology, Nearly Orthogonal.
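A minimal evolutionary search for a nearly orthogonal Latin hypercube conveys the flavor of such constructions (a (1+1)-style hill climber with swap mutations, standing in for the report's genetic algorithm; run counts and the iteration budget are arbitrary):

```python
import numpy as np

def max_abs_corr(design):
    """Largest absolute pairwise correlation among design columns."""
    c = np.corrcoef(design, rowvar=False)
    np.fill_diagonal(c, 0.0)
    return float(np.max(np.abs(c)))

def evolve_lhs(n_runs, n_factors, n_iters=2000, seed=0):
    """Search for a nearly orthogonal Latin hypercube design.

    Each column is a permutation of the run levels (preserving the
    Latin hypercube property); a mutation swaps two entries in one
    column and is kept only if it lowers the worst pairwise correlation.
    """
    rng = np.random.default_rng(seed)
    design = np.array([rng.permutation(n_runs) for _ in range(n_factors)]).T
    best = max_abs_corr(design)
    for _ in range(n_iters):
        cand = design.copy()
        col = rng.integers(n_factors)
        i, j = rng.choice(n_runs, size=2, replace=False)
        cand[[i, j], col] = cand[[j, i], col]
        score = max_abs_corr(cand)
        if score < best:
            design, best = cand, score
    return design, best

design, corr = evolve_lhs(n_runs=17, n_factors=4)
```

Swap mutations keep every column a permutation, so the result remains a Latin hypercube while the column correlations shrink toward orthogonality.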
Lazard, Allison; Mackert, Michael
2014-10-01
This paper highlights the influential role of design complexity for users' first impressions of health websites. An experimental design was utilized to investigate whether a website's level of design complexity impacts user evaluations. An online questionnaire measured the hypothesized impact of design complexity on predictors of message effectiveness. Findings reveal that increased design complexity was positively associated with higher levels of perceived design esthetics, attitude toward the website, perceived message comprehensibility, perceived ease of use, perceived usefulness, perceived message quality, perceived informativeness, and perceived visual informativeness. This research gives further evidence that design complexity should be considered an influential variable for health communicators to effectively reach their audiences, as it embodies the critical first step for message evaluation via electronic platforms.
1983-02-01
blow-off stability and fractional conversion was evaluated for design of an experimental study of these phenomena. The apparatus designed will be… the development of an array of experimental methods and test strategies designed to unravel a complex process that is very difficult to observe directly… this effort of lead field theoretic analysis as a design basis has made that possible. The experimental phase of the effort has three major…
Drewes, Rich; Zou, Quan; Goodman, Philip H
2009-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flathers, M.B.; Bache, G.E.; Rainsberger, R.
1996-04-01
The flow field of a complex three-dimensional radial inlet for an industrial pipeline centrifugal compressor has been experimentally determined on a half-scale model. Based on the experimental results, inlet guide vanes have been designed to correct pressure and swirl angle distribution deficiencies. The unvaned and vaned inlets are analyzed with a commercially available fully three-dimensional viscous Navier-Stokes code. Since experimental results were available prior to the numerical study, the unvaned analysis is considered a postdiction while the vaned analysis is considered a prediction. The computational results of the unvaned inlet have been compared to the previously obtained experimental results. The experimental method utilized for the unvaned inlet is repeated for the vaned inlet and the data have been used to verify the computational results. The paper will discuss experimental, design, and computational procedures, grid generation, boundary conditions, and experimental versus computational methods. Agreement between experimental and computational results is very good, both in prediction and postdiction modes. The results of this investigation indicate that CFD offers a measurable advantage in design, schedule, and cost and can be applied to complex, three-dimensional radial inlets.
Dingus, Cheryl A; Teuschler, Linda K; Rice, Glenn E; Simmons, Jane Ellen; Narotsky, Michael G
2011-10-01
In complex mixture toxicology, there is growing emphasis on testing environmentally representative doses that improve the relevance of results for health risk assessment, but are typically much lower than those used in traditional toxicology studies. Traditional experimental designs with typical sample sizes may have insufficient statistical power to detect effects caused by environmentally relevant doses. Proper study design, with adequate statistical power, is critical to ensuring that experimental results are useful for environmental health risk assessment. Studies with environmentally realistic complex mixtures have practical constraints on sample concentration factor and sample volume as well as the number of animals that can be accommodated. This article describes methodology for calculation of statistical power for non-independent observations for a multigenerational rodent reproductive/developmental bioassay. The use of the methodology is illustrated using the U.S. EPA's Four Lab study in which rodents were exposed to chlorinated water concentrates containing complex mixtures of drinking water disinfection by-products. Possible experimental designs included two single-block designs and a two-block design. Considering the possible study designs and constraints, a design of two blocks of 100 females with a 40:60 ratio of control:treated animals and a significance level of 0.05 yielded maximum prospective power (~90%) to detect pup weight decreases, while providing the most power to detect increased prenatal loss.
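A simplified version of such a prospective power calculation, using a normal approximation for a two-sided two-sample test with the 40:60 allocation, might look like this (the effect size is illustrative, and the litter/non-independence correction developed in the article is deliberately omitted):

```python
from statistics import NormalDist

def two_sample_power(n_control, n_treated, effect_sd, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a mean shift,
    where effect_sd is the true difference in units of the common SD.
    Treats all observations as independent, i.e. omits the litter correction."""
    nd = NormalDist()
    se = (1.0 / n_control + 1.0 / n_treated) ** 0.5
    z_a = nd.inv_cdf(1.0 - alpha / 2.0)
    shift = effect_sd / se
    return nd.cdf(shift - z_a) + nd.cdf(-shift - z_a)

# 40:60 control:treated split across two blocks of 100 females each
power = two_sample_power(n_control=80, n_treated=120, effect_sd=0.45)
```

With 200 animals at a 40:60 split, an effect of 0.45 SD gives power near 0.88 at the 0.05 level; accounting for within-litter correlation, as the article does, would lower this.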
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.
Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan
2016-04-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high-throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field.
The Problem of Size in Robust Design
NASA Technical Reports Server (NTRS)
Koch, Patrick N.; Allen, Janet K.; Mistree, Farrokh; Mavris, Dimitri
1997-01-01
To facilitate the effective solution of multidisciplinary, multiobjective complex design problems, a departure from the traditional parametric design analysis and single objective optimization approaches is necessary in the preliminary stages of design. A necessary tradeoff becomes one of efficiency vs. accuracy as approximate models are sought to allow fast analysis and effective exploration of a preliminary design space. In this paper we apply a general robust design approach for efficient and comprehensive preliminary design to a large complex system: a high speed civil transport (HSCT) aircraft. Specifically, we investigate the HSCT wing configuration design, incorporating life cycle economic uncertainties to identify economically robust solutions. The approach is built on the foundation of statistical experimentation and modeling techniques and robust design principles, and is specialized through incorporation of the compromise Decision Support Problem for multiobjective design. For large problems however, as in the HSCT example, this robust design approach developed for efficient and comprehensive design breaks down with the problem of size (combinatorial explosion in experimentation and model building with the number of variables), and both efficiency and accuracy are sacrificed. Our focus in this paper is on identifying and discussing the implications and open issues associated with the problem of size for the preliminary design of large complex systems.
The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems.
White, Andrew; Tolman, Malachi; Thames, Howard D; Withers, Hubert Rodney; Mason, Kathy A; Transtrum, Mark K
2016-12-01
We explore the relationship among experimental design, parameter estimation, and systematic error in sloppy models. We show that the approximate nature of mathematical models poses challenges for experimental design in sloppy models. In many models of complex biological processes it is unknown which physical mechanisms must be included to explain system behaviors. As a consequence, models are often overly complex, with many practically unidentifiable parameters. Furthermore, which mechanisms are relevant or irrelevant varies among experiments. By selecting complementary experiments, experimental design may inadvertently make details that were omitted from the model become relevant. When this occurs, the model will have a large systematic error and fail to give a good fit to the data. We use a simple hyper-model of model error to quantify a model's discrepancy and apply it to two models of complex biological processes (EGFR signaling and DNA repair) with optimally selected experiments. We find that although parameters may be accurately estimated, the discrepancy in the model renders it less predictive than it was in the sloppy regime where systematic error is small. We introduce the concept of a sloppy system: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. We explore the limits of accurate parameter estimation in sloppy systems and argue that identifying underlying mechanisms controlling system behavior is better approached by considering a hierarchy of models of varying detail rather than focusing on parameter estimation in a single model.
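Sloppiness is commonly diagnosed by the eigenvalue spread of the Fisher information matrix J^T J; a minimal sketch on a classic sum-of-exponentials model (not the EGFR or DNA-repair models used in the paper):

```python
import numpy as np

def fim_eigenvalues(model, p, t, h=1e-6):
    """Eigenvalues of the Fisher information matrix J^T J, where J is the
    Jacobian of model outputs w.r.t. parameters (central differences)."""
    J = np.empty((len(t), len(p)))
    for j in range(len(p)):
        dp = np.zeros_like(p)
        dp[j] = h * max(abs(p[j]), 1.0)
        J[:, j] = (model(p + dp, t) - model(p - dp, t)) / (2 * dp[j])
    return np.linalg.eigvalsh(J.T @ J)

# Sum of two nearby exponential decays: a textbook sloppy model.
def model(p, t):
    return np.exp(-p[0] * t) + np.exp(-p[1] * t)

t = np.linspace(0.1, 3.0, 30)
eig = fim_eigenvalues(model, np.array([1.0, 1.2]), t)
spread = eig.max() / eig.min()  # eigenvalues span orders of magnitude
```

The soft (small-eigenvalue) direction corresponds to moving the two decay rates in opposite ways while barely changing the fit, which is why individual rate estimates are poorly constrained.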
Patterning nonisometric origami in nematic elastomer sheets
NASA Astrophysics Data System (ADS)
Plucinsky, Paul; Kowalski, Benjamin A.; White, Timothy J.; Bhattacharya, Kaushik
Nematic elastomers dramatically change their shape in response to diverse stimuli including light and heat. In this paper, we provide a systematic framework for the design of complex three-dimensional shapes through the actuation of heterogeneously patterned nematic elastomer sheets. These sheets are composed of nonisometric origami building blocks which, when appropriately linked together, can actuate into a diverse array of three-dimensional faceted shapes. We demonstrate both theoretically and experimentally that: 1) the nonisometric origami building blocks actuate in the predicted manner, 2) the integration of multiple building blocks leads to complex multi-stable, yet predictable, shapes, 3) we can bias the actuation experimentally to obtain a desired complex shape amongst the multi-stable shapes. We then show that this experimentally realized functionality enables a rich possible design landscape for actuation using nematic elastomers. We highlight this landscape through theoretical examples, which utilize large arrays of these building blocks to realize a desired three-dimensional origami shape. In combination, these results amount to an engineering design principle, which we hope will provide a template for the application of nematic elastomers to emerging technologies.
NASA Astrophysics Data System (ADS)
Zheng, Jigui; Huang, Yuping; Wu, Hongxing; Zheng, Ping
2016-07-01
Transverse-flux machines with high efficiency have been applied in Stirling engine and permanent magnet synchronous linear generator systems; however, their large-scale application has been restricted by a low power factor and a complex manufacturing process. A novel type of cylindrical, non-overlapping, transverse-flux, permanent-magnet linear motor (TFPLM) is investigated, and a structure with high power factor and reduced process complexity is developed. The impact of the magnetic leakage factor on power factor is discussed; using a finite element analysis (FEA) model of the Stirling engine and TFPLM, an optimization method for the electromagnetic design of the TFPLM is proposed based on the magnetic leakage factor. The relation between power factor and structure parameters is investigated, and a structure-parameter optimization method is proposed that takes maximum power factor as its goal. Finally, a test bench was built, starting and generating experiments were performed, and good agreement between simulation and experiment was achieved. The power factor is improved and the process complexity is decreased. This research provides guidance for the design of high-power-factor permanent-magnet linear generators.
Assessing Consequential Scenarios in a Complex Operational Environment Using Agent Based Simulation
2017-03-16
…RWISE); Conflict Modeling, Planning, and Outcomes Experimentation Program (COMPOEX); Joint Non-Kinetic Effects Model (JNEM)/Athena… experimental design and testing. Types and attributes of agent-based model design patterns: using the aforementioned ABM flowchart design methodology… speed, or flexibility during tactical US Army wargaming. The report considers methodologies to improve analysis of the human domain, identifies…
Experimental Design and Power Calculation for RNA-seq Experiments.
Wu, Zhijin; Wu, Hao
2016-01-01
Power calculation is a critical component of RNA-seq experimental design. The flexibility of RNA-seq experiments and the wide dynamic range of transcription they measure make the technology attractive for whole-transcriptome analysis. These features, in addition to the high dimensionality of RNA-seq data, bring complexity to experimental design, making an analytical power calculation no longer realistic. In this chapter we review the major factors that influence the statistical power of detecting differential expression, and give examples of power assessment using the R package PROPER.
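PROPER is an R package; as a rough illustration of the same idea, a Monte Carlo power estimate for a single gene with negative binomial counts might be sketched as follows (the test, parameter values, and design are simplified stand-ins, not PROPER's algorithm):

```python
import numpy as np

def simulated_power(n_per_group, mean, dispersion, fold_change,
                    n_sim=2000, seed=0):
    """Monte Carlo power for one gene: negative binomial counts in two
    groups, two-sided z-test on log2(count + 1) at the 0.05 level."""
    rng = np.random.default_rng(seed)
    size = 1.0 / dispersion  # numpy parameterization: n = 1/dispersion
    hits = 0
    for _ in range(n_sim):
        a = rng.negative_binomial(size, size / (size + mean), n_per_group)
        b = rng.negative_binomial(size, size / (size + mean * fold_change),
                                  n_per_group)
        la, lb = np.log2(a + 1.0), np.log2(b + 1.0)
        se = np.sqrt(la.var(ddof=1) / n_per_group + lb.var(ddof=1) / n_per_group)
        z = (lb.mean() - la.mean()) / se if se > 0 else 0.0
        hits += abs(z) > 1.96
    return hits / n_sim

power_fc2 = simulated_power(n_per_group=5, mean=100, dispersion=0.1, fold_change=2.0)
power_null = simulated_power(n_per_group=5, mean=100, dispersion=0.1, fold_change=1.0)
```

Because no closed-form power formula fits the count noise model well, simulation of this kind is the standard workaround; PROPER does this genome-wide with empirically calibrated mean and dispersion distributions.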
Tian, Xin; Li, Zengyuan; Chen, Erxue; Liu, Qinhuo; Yan, Guangjian; Wang, Jindi; Niu, Zheng; Zhao, Shaojie; Li, Xin; Pang, Yong; Su, Zhongbo; van der Tol, Christiaan; Liu, Qingwang; Wu, Chaoyang; Xiao, Qing; Yang, Le; Mu, Xihan; Bo, Yanchen; Qu, Yonghua; Zhou, Hongmin; Gao, Shuai; Chai, Linna; Huang, Huaguo; Fan, Wenjie; Li, Shihua; Bai, Junhua; Jiang, Lingmei; Zhou, Ji
2015-01-01
The Complicate Observations and Multi-Parameter Land Information Constructions on Allied Telemetry Experiment (COMPLICATE) comprises a network of remote sensing experiments designed to enhance the dynamic analysis and modeling of remotely sensed information for complex land surfaces. Two types of experimental campaigns were established under the framework of COMPLICATE. The first was designed for continuous and elaborate experiments. This experimental strategy helps enhance our understanding of the radiative and scattering mechanisms of soil and vegetation and the modeling of remotely sensed information for complex land surfaces. To validate the methodologies and models for dynamic analyses of remote sensing for complex land surfaces, the second campaign consisted of simultaneous satellite-borne, airborne, and ground-based experiments. During field campaigns, several continuous and intensive observations were obtained. Measurements were undertaken to address the following key scientific issues: 1) Determine the characteristics of spatial heterogeneity and the radiative and scattering mechanisms of remote sensing on complex land surfaces. 2) Determine the mechanisms of spatial and temporal scale extensions for remote sensing on complex land surfaces. 3) Determine synergistic inversion mechanisms for soil and vegetation parameters using multi-mode remote sensing on complex land surfaces. Here, we introduce the background, objectives, experimental designs, observations and measurements, and overall advances of COMPLICATE. Through the implementation of COMPLICATE over the next several years, we expect to contribute to quantitative remote sensing science and Earth observation techniques.
Bio-inspired computational design of iron catalysts for the hydrogenation of carbon dioxide.
Yang, Xinzheng
2015-08-25
Inspired by the active site structure of monoiron hydrogenase, a series of iron complexes was built using experimentally available acylmethylpyridinol and aliphatic PNP pincer ligands. Density functional theory calculations indicate that the newly designed iron complexes are highly promising catalysts for the formation of formic acid from H2 and CO2.
Subband Image Coding with Jointly Optimized Quantizers
NASA Technical Reports Server (NTRS)
Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.
1995-01-01
An iterative design algorithm for the joint design of complexity- and entropy-constrained subband quantizers and associated entropy coders is proposed. Unlike conventional subband design algorithms, the proposed algorithm does not require the use of various bit allocation algorithms. Multistage residual quantizers are employed here because they provide greater control of the complexity-performance tradeoffs, and also because they allow efficient and effective high-order statistical modeling. The resulting subband coder exploits statistical dependencies within subbands, across subbands, and across stages, mainly through complexity-constrained high-order entropy coding. Experimental results demonstrate that the complexity-rate-distortion performance of the new subband coder is exceptional.
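The multistage residual quantization named above admits a compact sketch: each stage quantizes the residual left by the previous stage, so a few coarse codebooks compose into a fine effective quantizer. The codebooks below are hypothetical illustrations, not the paper's trained, entropy-constrained codebooks.

```python
# Minimal sketch of multistage residual quantization. Codebooks are
# illustrative only; a real coder would design them jointly with the
# entropy coders, as the abstract describes.
def quantize(value, codebook):
    """Nearest-neighbour scalar quantization."""
    return min(codebook, key=lambda c: abs(value - c))

def multistage(value, codebooks):
    """Quantize `value` stage by stage; each stage codes the residual."""
    residual, reconstruction, indices = value, 0.0, []
    for cb in codebooks:
        q = quantize(residual, cb)
        indices.append(cb.index(q))   # index sent to the entropy coder
        reconstruction += q
        residual -= q
    return reconstruction, indices

# Three stages with successively finer (hypothetical) codebooks.
stages = [[-2.0, 0.0, 2.0], [-0.5, 0.0, 0.5], [-0.125, 0.0, 0.125]]
recon, idx = multistage(1.7, stages)
print(recon, idx)   # 1.625 [2, 0, 2]
```

Because each stage only refines the previous residual, stage codebooks stay small, which is what gives the approach its complexity-performance control.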
Using Video Modeling to Teach Complex Social Sequences to Children with Autism
ERIC Educational Resources Information Center
Nikopoulos, Christos K.; Keenan, Mickey
2007-01-01
This study, comprising two experiments, was designed to teach complex social sequences to children with autism. Experimental control was achieved by collecting data by means of a within-system design methodology. Across a number of conditions, children were taken to a room to view one of four short videos of two people engaging in a simple…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
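The Bayesian inference-from-noisy-data setting that this project accelerates can be illustrated with a minimal sampler. This is a hedged sketch only, not the project's tools: a random-walk Metropolis sampler for one parameter of a hypothetical model y = theta*x + noise.

```python
import math
import random

# Toy Bayesian inference: standard-normal prior on theta, Gaussian likelihood.
# Model, names, and data are hypothetical stand-ins, not from the project.
def log_posterior(theta, data, sigma=0.5):
    log_prior = -0.5 * theta ** 2
    log_lik = -0.5 * sum(((y - theta * x) / sigma) ** 2 for x, y in data)
    return log_prior + log_lik

def metropolis(data, n_steps=5000, step=0.1, seed=0):
    """Random-walk Metropolis: accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    theta, samples = 0.0, []
    for _ in range(n_steps):
        proposal = theta + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_posterior(proposal, data) - log_posterior(theta, data):
            theta = proposal
        samples.append(theta)
    return samples

rng = random.Random(1)
data = [(x / 50, 2.0 * (x / 50) + rng.gauss(0.0, 0.5)) for x in range(50)]  # true slope 2.0
samples = metropolis(data)
estimate = sum(samples[1000:]) / len(samples[1000:])  # posterior mean (burn-in discarded)
print(estimate)
```

Every proposal requires a posterior (hence model) evaluation, which is exactly why surrogate and accelerated methods matter when the model is an expensive physical simulation.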
Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis
2015-11-13
CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.
Research designs for studies evaluating the effectiveness of change and improvement strategies.
Eccles, M; Grimshaw, J; Campbell, M; Ramsay, C
2003-02-01
The methods of evaluating change and improvement strategies are not well described. The design and conduct of a range of experimental and non-experimental quantitative designs are considered. Such study designs should usually be used in a context where they build on appropriate theoretical, qualitative and modelling work, particularly in the development of appropriate interventions. A range of experimental designs is discussed, including single and multiple arm randomised controlled trials and the use of more complex factorial and block designs. The impact of randomisation at both group and individual levels and three non-experimental designs (uncontrolled before and after, controlled before and after, and time series analysis) are also considered. The design chosen will reflect both the needs (and resources) in any particular circumstances and also the purpose of the evaluation. The general principle underlying the choice of evaluative design is, however, simple: those conducting such evaluations should use the most robust design possible to minimise bias and maximise generalisability.
1986-02-01
ability level (low, medium and high) showed that the experimental groups performed higher than the non-experimental groups at each ability level. The...instructional system design (ISD) model. The ISD model had the following deficiencies: inadequate methodology for preparing, analyzing, and categorizing... experimental designs will be required in the future and the emphasis upon observation will become more complex. Briggs, .F., & Johnston, W.A. Laboratory
Cankorur-Cetinkaya, Ayca; Dias, Joao M L; Kludas, Jana; Slater, Nigel K H; Rousu, Juho; Oliver, Stephen G; Dikicioglu, Duygu
2017-06-01
Multiple interacting factors affect the performance of engineered biological systems in synthetic biology projects. The complexity of these biological systems means that experimental design should often be treated as a multiparametric optimization problem. However, the available methodologies are either impractical, due to a combinatorial explosion in the number of experiments to be performed, or are inaccessible to most experimentalists due to the lack of publicly available, user-friendly software. Although evolutionary algorithms may be employed as alternative approaches to optimize experimental design, the lack of simple-to-use software again restricts their use to specialist practitioners. In addition, the lack of subsidiary approaches to further investigate critical factors and their interactions prevents the full analysis and exploitation of the biotechnological system. We have addressed these problems and, here, provide a simple-to-use and freely available graphical user interface to empower a broad range of experimental biologists to employ complex evolutionary algorithms to optimize their experimental designs. Our approach exploits a Genetic Algorithm to discover the subspace containing the optimal combination of parameters, and Symbolic Regression to construct a model to evaluate the sensitivity of the experiment to each parameter under investigation. We demonstrate the utility of this method using an example in which the culture conditions for the microbial production of a bioactive human protein are optimized. CamOptimus is available through: (https://doi.org/10.17863/CAM.10257).
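The Genetic Algorithm at the core of the approach described above can be sketched compactly. This is a hedged illustration, not CamOptimus code: the objective function below is a hypothetical stand-in for measured protein yield as a function of two culture conditions.

```python
import random

# Toy GA for experimental-design optimization. The response surface (peak
# "yield" at pH 7.0, 30 degrees C) and all parameter names are hypothetical.
def yield_model(params):
    ph, temp = params
    return -((ph - 7.0) ** 2) - 0.1 * (temp - 30.0) ** 2

def evolve(pop_size=30, generations=60, seed=42):
    rng = random.Random(seed)
    # Random initial population over plausible culture-condition ranges.
    pop = [(rng.uniform(4, 10), rng.uniform(20, 42)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=yield_model, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # Averaging crossover plus Gaussian mutation.
            children.append(((a[0] + b[0]) / 2 + rng.gauss(0, 0.1),
                             (a[1] + b[1]) / 2 + rng.gauss(0, 0.5)))
        pop = parents + children
    return max(pop, key=yield_model)

best = evolve()
print(best)   # converges toward (7.0, 30.0)
```

In the real setting each fitness evaluation is a wet-lab experiment, so the GA's value lies in locating the optimal subspace with far fewer runs than a full factorial design.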
Zeeman, Heidi; Kendall, Elizabeth; Whitty, Jennifer A; Wright, Courtney J; Townsend, Clare; Smith, Dianne; Lakhani, Ali; Kennerley, Samantha
2016-03-15
Identifying the housing preferences of people with complex disabilities is a much needed, but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, Government housing services etc.) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing Analytical Hierarchical Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision making processes to inclusive housing development (e.g., Risk, Opportunity, Cost, Benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested. 
It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much needed blueprint for market investment to develop viable, consumer directed inclusive housing options for people with complex disability.
Computational Approaches to Nucleic Acid Origami.
Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo
2015-10-12
Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami, with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags behind DNA origami by a large gap. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.
Inverse problems in complex material design: Applications to non-crystalline solids
NASA Astrophysics Data System (ADS)
Biswas, Parthapratim; Drabold, David; Elliott, Stephen
The design of complex amorphous materials is one of the fundamental problems in disordered condensed-matter science. While impressive developments of ab-initio simulation methods during the past several decades have brought tremendous success in understanding materials properties from micro- to mesoscopic length scales, a major drawback is that they fail to incorporate existing knowledge of the materials in simulation methodologies. Since an essential feature of materials design is the synergy between experiment and theory, a properly developed approach to design materials should be able to exploit all available knowledge of the materials from measured experimental data. In this talk, we will address the design of complex disordered materials as an inverse problem involving experimental data and available empirical information. We show that the problem can be posed as a multi-objective non-convex optimization program, which can be addressed using a number of recently-developed bio-inspired global optimization techniques. In particular, we will discuss how a population-based stochastic search procedure can be used to determine the structure of non-crystalline solids (e.g. a-Si:H, a-SiO2, amorphous graphene, and Fe and Ni clusters). The work is partially supported by NSF under Grant Nos. DMR 1507166 and 1507670.
Using Learning Analytics to Characterize Student Experimentation Strategies in Engineering Design
ERIC Educational Resources Information Center
Vieira, Camilo; Goldstein, Molly Hathaway; Purzer, Senay; Magana, Alejandra J.
2016-01-01
Engineering design is a complex process both for students to participate in and for instructors to assess. Informed designers use the key strategy of conducting experiments as they test ideas to inform next steps. Conversely, beginning designers experiment less, often with confounding variables. These behaviours are not easy to assess in…
Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter
2015-01-01
Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227
Strategic Teaching: Student Learning through Working the Process
ERIC Educational Resources Information Center
Spanbroek, Nancy
2010-01-01
The designers of our future built environment must possess intellectual tools which will allow them to be disciplined, flexible and analytical thinkers, able to address and resolve new and complex problems. In response, an experimental and collaborative design studio was designed to inspire and build on students' knowledge and their creative…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jernigan, Dann A.; Blanchat, Thomas K.
It is necessary to improve understanding and to develop temporally and spatially resolved, integral-scale validation data of the heat flux incident on a complex object, in addition to measuring the thermal response of that object within the fire plume, for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.
Combinatorial and high-throughput screening of materials libraries: review of state of the art.
Potyrailo, Radislav; Rajan, Krishna; Stoewe, Klaus; Takeuchi, Ichiro; Chisholm, Bret; Lam, Hubert
2011-11-14
Rational materials design based on prior knowledge is attractive because it promises to avoid time-consuming synthesis and testing of numerous materials candidates. However with the increase of complexity of materials, the scientific ability for the rational materials design becomes progressively limited. As a result of this complexity, combinatorial and high-throughput (CHT) experimentation in materials science has been recognized as a new scientific approach to generate new knowledge. This review demonstrates the broad applicability of CHT experimentation technologies in discovery and optimization of new materials. We discuss general principles of CHT materials screening, followed by the detailed discussion of high-throughput materials characterization approaches, advances in data analysis/mining, and new materials developments facilitated by CHT experimentation. We critically analyze results of materials development in the areas most impacted by the CHT approaches, such as catalysis, electronic and functional materials, polymer-based industrial coatings, sensing materials, and biomaterials.
Pyviko: an automated Python tool to design gene knockouts in complex viruses with overlapping genes.
Taylor, Louis J; Strebel, Klaus
2017-01-07
Gene knockouts are a common tool used to study gene function in various organisms. However, designing gene knockouts is complicated in viruses, which frequently contain sequences that code for multiple overlapping genes. Designing mutants that can be traced by the creation of new or elimination of existing restriction sites further compounds the difficulty in experimental design of knockouts of overlapping genes. While software is available to rapidly identify restriction sites in a given nucleotide sequence, no existing software addresses experimental design of mutations involving multiple overlapping amino acid sequences in generating gene knockouts. Pyviko performed well on a test set of over 240,000 gene pairs collected from viral genomes deposited in the National Center for Biotechnology Information Nucleotide database, identifying a point mutation which added a premature stop codon within the first 20 codons of the target gene in 93.2% of all tested gene-overprinted gene pairs. This shows that Pyviko can be used successfully in a wide variety of contexts to facilitate the molecular cloning and study of viral overprinted genes. Pyviko is an extensible and intuitive Python tool for designing knockouts of overlapping genes. Freely available as both a Python package and a web-based interface (http://louiejtaylor.github.io/pyViKO/), Pyviko simplifies the experimental design of gene knockouts in complex viruses with overlapping genes.
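The kind of search Pyviko automates can be illustrated in miniature. This is a hedged sketch, not Pyviko's actual code or API: it looks for a single point mutation that introduces a premature stop codon in a target gene while leaving the protein of an overlapping gene, offset by one base, unchanged.

```python
# Standard genetic code, bases in T/C/A/G order ('*' marks stop codons).
BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
TABLE = {a + b + c: aa for (a, b, c), aa in zip(
    ((a, b, c) for a in BASES for b in BASES for c in BASES), AMINO)}

def translate(seq, offset=0):
    return "".join(TABLE[seq[i:i + 3]] for i in range(offset, len(seq) - 2, 3))

def find_knockout(seq, overlap_offset=1, window=20):
    """Return (position, new_base) for a knockout point mutation, or None."""
    wild_target = translate(seq)
    wild_overlap = translate(seq, overlap_offset)
    for pos in range(min(window * 3, len(seq))):
        for base in BASES:
            if base == seq[pos]:
                continue
            mutant = seq[:pos] + base + seq[pos + 1:]
            # Premature stop in the target frame; overlapping protein intact.
            if ("*" in translate(mutant)[:window] and "*" not in wild_target[:window]
                    and translate(mutant, overlap_offset) == wild_overlap):
                return pos, base
    return None

# Toy example: C->T at position 3 turns CAA (Gln) into the stop codon TAA,
# while the overlapping frame's TGC (Cys) becomes the synonymous TGT (Cys).
print(find_knockout("ATGCAAGGG"))   # (3, 'T')
```

Real genomes add constraints this sketch ignores (restriction-site tracking, multiple overlaps), which is what makes an automated tool worthwhile.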
Chatterji, Madhabi
2016-12-01
This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.
Quasi experimental designs in pharmacist intervention research.
Krass, Ines
2016-06-01
Background In the field of pharmacist intervention research it is often difficult to conform to the rigorous requirements of "true experimental" models, especially the requirement of randomization. When randomization is not feasible, a practice-based researcher can choose from a range of "quasi-experimental designs", i.e., non-randomised and at times non-controlled. Objective The aim of this article was to provide an overview of quasi-experimental designs, discuss their strengths and weaknesses, and investigate their application in pharmacist intervention research over the previous decade. Results In the literature, quasi-experimental studies may be classified into five broad categories: quasi-experimental designs without control groups; quasi-experimental designs that use control groups with no pre-test; quasi-experimental designs that use control groups and pre-tests; interrupted time series; and stepped wedge designs. Quasi-experimental study design has consistently featured in the evolution of pharmacist intervention research. The most commonly applied of all quasi-experimental designs in the practice-based research literature are the one-group pre-post-test design and the non-equivalent control group design (i.e., untreated control group with dependent pre-tests and post-tests); these have been used to test the impact of pharmacist interventions in general medications management as well as in specific disease states. Conclusion Quasi-experimental studies have a role to play as proof of concept, in the pilot phases of interventions, when testing different intervention components, especially in complex interventions. They serve to develop an understanding of possible intervention effects: while in isolation they yield weak evidence of clinical efficacy, taken collectively, they help build a body of evidence in support of the value of pharmacist interventions across different practice settings and countries. 
However, when a traditional RCT is not feasible for logistical and/or ethical reasons researchers should endeavour to use the more robust of the quasi experimental designs.
Pant, Apourv; Rai, J P N
2018-04-15
A two-phase bioreactor was designed, constructed, and developed to evaluate chlorpyrifos remediation. Six biotic and abiotic factors (substrate loading rate, slurry-phase pH, slurry-phase dissolved oxygen (DO), soil:water ratio, temperature, and soil microflora load) were evaluated by a design of experiments (DOE) methodology employing Taguchi's orthogonal array (OA). The six selected factors were considered at two levels in an L-8 array (2^7; 15 experiments). The optimum operating conditions obtained from this methodology enhanced chlorpyrifos degradation from 283.86 µg/g to 955.364 µg/g, an overall enhancement of 70.34%. In the present study, with the help of a few well-defined experimental parameters, a mathematical model was constructed to understand the complex bioremediation process and to optimize the approximate parameters to great accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
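Taguchi's L8 orthogonal array, the screening design used in the study above, can be constructed from three base columns and their XOR combinations. This Hadamard-type recipe is a standard construction, not code taken from the paper.

```python
from itertools import product

# Build Taguchi's L8 array: 8 runs, 7 two-level columns. Each column is a
# nonzero XOR combination of the three base columns a, b, c.
def l8_array():
    rows = []
    for a, b, c in product((0, 1), repeat=3):
        rows.append([a, b, a ^ b, c, a ^ c, b ^ c, a ^ b ^ c])
    return rows

oa = l8_array()
for row in oa:
    print(row)
# Each column is balanced (four 0s, four 1s), and every pair of columns
# contains each of the four level combinations exactly twice -- the
# orthogonality that lets up to 7 main effects be screened in only 8 runs.
```

Assigning the study's six factors to six of the seven columns (leaving one for error estimation) is the usual way such an array is used.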
Three-dimensional broadband omnidirectional acoustic ground cloak
NASA Astrophysics Data System (ADS)
Zigoneanu, Lucian; Popa, Bogdan-Ioan; Cummer, Steven A.
2014-04-01
The control of sound propagation and reflection has always been the goal of engineers involved in the design of acoustic systems. A recent design approach based on coordinate transformations, which is applicable to many physical systems, together with the development of a new class of engineered materials called metamaterials, has opened the road to the unconstrained control of sound. However, the ideal material parameters prescribed by this methodology are complex and challenging to obtain experimentally, even using metamaterial design approaches. Not surprisingly, experimental demonstration of devices obtained using transformation acoustics is difficult, and has been implemented only in two-dimensional configurations. Here, we demonstrate the design and experimental characterization of an almost perfect three-dimensional, broadband, and, most importantly, omnidirectional acoustic device that renders a region of space three wavelengths in diameter invisible to sound.
Didier, Caroline; Forno, Guillermina; Etcheverrigaray, Marina; Kratje, Ricardo; Goicoechea, Héctor
2009-09-21
The optimal blends of six compounds that should be present in culture media used in recombinant protein production were determined by means of artificial neural networks (ANN) coupled with crossed mixture experimental design. This combination constitutes a novel approach to develop a medium for cultivating genetically engineered mammalian cells. The compounds were collected in two mixtures of three elements each, and the experimental space was determined by a crossed mixture design. Empirical data from 51 experimental units were used in a multiresponse analysis to train artificial neural networks which satisfy different requirements, in order to define two new culture media (Medium 1 and Medium 2) to be used in a continuous biopharmaceutical production process. These media were tested in a bioreactor to produce a recombinant protein in CHO cells. Remarkably, for both predicted media all responses satisfied the predefined goals pursued during the analysis, except in the case of the specific growth rate (mu) observed for Medium 1. ANN analysis proved to be a suitable methodology to be used when dealing with complex experimental designs, as frequently occurs in the optimization of production processes in the biotechnology area. The present work is a new example of the use of ANN for the resolution of a complex, real life system, successfully employed in the context of a biopharmaceutical production process.
DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology
Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng
2015-01-01
Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
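The global sensitivity analysis step described above can be sketched with a first-order, variance-based estimate. This is a hedged illustration, not the study's ABM/GSA code: the linear toy model and its parameter names are hypothetical stand-ins for quantities such as cellulase half-life and exoglucanase activity.

```python
import random
from statistics import mean, pvariance

# Hypothetical response surface standing in for "hydrolysis efficiency".
def model(half_life, exo_activity, composition):
    return 3.0 * half_life + 1.0 * exo_activity + 0.1 * composition

def first_order_indices(n=20000, bins=20, seed=0):
    """Sobol-style first-order indices: Var(E[Y|X_k]) / Var(Y), via binning."""
    rng = random.Random(seed)
    xs = [[rng.random(), rng.random(), rng.random()] for _ in range(n)]
    ys = [model(*x) for x in xs]
    total_var = pvariance(ys)
    indices = []
    for k in range(3):
        # Estimate the conditional mean of Y within bins of X_k.
        buckets = [[] for _ in range(bins)]
        for x, y in zip(xs, ys):
            buckets[min(int(x[k] * bins), bins - 1)].append(y)
        cond_means = [mean(b) for b in buckets if b]
        indices.append(pvariance(cond_means) / total_var)
    return indices

s = first_order_indices()
print(s)   # half_life dominates; composition is negligible
```

Ranking parameters this way is what turns a stochastic simulation into a prioritized list of experimental targets, as the abstract argues.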
ERIC Educational Resources Information Center
Roesch, Frank; Nerb, Josef; Riess, Werner
2015-01-01
Our study investigated whether problem-oriented designed ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of "experimental problem-solving ability" better than conventional lessons in science. We used a paper-and-pencil test to assess…
Task design influences prosociality in captive chimpanzees (Pan troglodytes).
House, Bailey R; Silk, Joan B; Lambeth, Susan P; Schapiro, Steven J
2014-01-01
Chimpanzees confer benefits on group members, both in the wild and in captive populations. Experimental studies of how animals allocate resources can provide useful insights about the motivations underlying prosocial behavior, and understanding the relationship between task design and prosocial behavior provides an important foundation for future research exploring these animals' social preferences. A number of studies have been designed to assess chimpanzees' preferences for outcomes that benefit others (prosocial preferences), but these studies vary greatly in both the results obtained and the methods used, and in most cases employ procedures that reduce critical features of naturalistic social interactions, such as partner choice. The focus of the current study is on understanding the link between experimental methodology and prosocial behavior in captive chimpanzees, rather than on describing these animals' social motivations themselves. We introduce a task design that avoids isolating subjects and allows them to freely decide whether to participate in the experiment. We explore key elements of the methods utilized in previous experiments in an effort to evaluate two possibilities that have been offered to explain why different experimental designs produce different results: (a) chimpanzees are less likely to deliver food to others when they obtain food for themselves, and (b) evidence of prosociality may be obscured by more "complex" experimental apparatuses (e.g., those including more components or alternative choices). Our results suggest that the complexity of laboratory tasks may generate the observed variation in prosocial behavior across laboratory experiments, and highlight the need for more naturalistic research designs while also providing one example of such a paradigm.
Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.
Huynh, Linh; Tagkopoulos, Ilias
2015-08-21
In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
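The two-step strategy (cheap model to prune the space, accurate model to refine) can be sketched in a few lines. The part library, strengths, and both scoring models below are invented placeholders, not the characterized parts or circuit models from the paper:

```python
import itertools

# Hypothetical toy library: choose one part per slot to maximize circuit output.
parts = {"promoter": [0.4, 0.9, 0.7], "rbs": [0.5, 0.8], "gene": [0.6, 1.0]}

def cheap_upper_bound(choice):
    # Step-1 surrogate of low computational complexity: product of raw part
    # strengths (optimistic; ignores nonlinear losses).
    s = 1.0
    for v in choice.values():
        s *= v
    return s

def fine_score(choice):
    # Step-2 "accurate" nonlinear model (assumed, illustrative): a
    # saturating response of the same underlying strength.
    s = cheap_upper_bound(choice)
    return s / (0.3 + s)

def two_step_search(parts, keep=3):
    combos = [dict(zip(parts, vals)) for vals in itertools.product(*parts.values())]
    # Step 1: rank by the cheap bound and prune to the top `keep` branches.
    survivors = sorted(combos, key=cheap_upper_bound, reverse=True)[:keep]
    # Step 2: fine-grained evaluation only on the reduced solution space.
    return max(survivors, key=fine_score)

best = two_step_search(parts)
```

A genuine branch-and-bound would prune partial assignments using the bound rather than enumerating full combinations; the sketch only shows how the coarse model shrinks the space the expensive model must search.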
NASA Astrophysics Data System (ADS)
Tian, Shudong; Han, Jun; Yang, Jianwei; Zeng, Xiaoyang
2017-10-01
Electrocardiogram (ECG) can be used as a valid way for diagnosing heart disease. To fulfill ECG processing in wearable devices by reducing computation complexity and hardware cost, two kinds of adaptive filters are designed to perform QRS complex detection and motion artifacts removal, respectively. The proposed design achieves a sensitivity of 99.49% and a positive predictivity of 99.72%, tested under the MIT-BIH ECG database. The proposed design is synthesized under the SMIC 65-nm CMOS technology and verified by post-synthesis simulation. Experimental results show that the power consumption and area cost of this design are 160 μW and 1.09 × 10⁵ μm², respectively. Project supported by the National Natural Science Foundation of China (Nos. 61574040, 61234002, 61525401).
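The motion-artifact removal stage can be illustrated with the textbook adaptive noise canceller: an LMS filter driven by a reference input (for example, an accelerometer channel) learns the artifact and subtracts it from the ECG lead. The signals below are synthetic and the filter is a generic sketch, not the paper's hardware design:

```python
import math

def lms_cancel(primary, reference, mu=0.01, taps=4):
    """Adaptive noise canceller (LMS): remove the component of `primary`
    that is correlated with `reference`, leaving the signal of interest."""
    w = [0.0] * taps
    out = []
    for n in range(len(primary)):
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))    # estimated artifact
        e = primary[n] - y                          # cleaned output sample
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]  # LMS update
        out.append(e)
    return out

# Synthetic demo: a slow "ECG-like" wave plus an artifact correlated with
# the reference channel (both waveforms are illustrative, not MIT-BIH data).
n = 2000
ref = [math.sin(0.12 * i) for i in range(n)]
ecg = [math.sin(0.01 * i) for i in range(n)]
primary = [s + 0.8 * r for s, r in zip(ecg, ref)]
clean = lms_cancel(primary, ref)
```

After a short adaptation transient, the residual artifact in `clean` is much smaller than in `primary`, which is the behaviour the hardware filter exploits at far lower cost than a fixed high-order filter.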
NASA Astrophysics Data System (ADS)
Brown, Alexander; Eviston, Connor
2017-02-01
Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance of several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for alternate project applications.
The Factorial Survey: Design Selection and its Impact on Reliability and Internal Validity
ERIC Educational Resources Information Center
Dülmer, Hermann
2016-01-01
The factorial survey is an experimental design consisting of varying situations (vignettes) that have to be judged by respondents. For more complex research questions, it quickly becomes impossible for an individual respondent to judge all vignettes. To overcome this problem, random designs are recommended most of the time, whereas quota designs…
Learning Problem-Solving through Making Games at the Game Design and Learning Summer Program
ERIC Educational Resources Information Center
Akcaoglu, Mete
2014-01-01
Today's complex and fast-evolving world necessitates young students to possess design and problem-solving skills more than ever. One alternative method of teaching children problem-solving or thinking skills has been using computer programming, and more recently, game-design tasks. In this pre-experimental study, a group of middle school…
ERIC Educational Resources Information Center
Kanu, A. Bakarr; Pajski, Megan; Hartman, Machelle; Kimaru, Irene; Marine, Susan
2015-01-01
In today's complex world, there is a continued demand for recently graduated forensic chemists (criminalists) who have some background in forensic experimental techniques. This article describes modern forensic experimental approaches designed and implemented from a unique instructional perspective to present certain facets of crime scene…
Studying technology use as social practice: the untapped potential of ethnography
2011-01-01
Information and communications technologies (ICTs) in healthcare are often introduced with expectations of higher-quality, more efficient, and safer care. Many fail to meet these expectations. We argue here that the well-documented failures of ICTs in healthcare are partly attributable to the philosophical foundations of much health informatics research. Positivistic assumptions underpinning the design, implementation and evaluation of ICTs (in particular the notion that technology X has an impact which can be measured and reproduced in new settings), and the deterministic experimental and quasi-experimental study designs which follow from these assumptions, have inherent limitations when ICTs are part of complex social practices involving multiple human actors. We suggest that while experimental and quasi-experimental studies have an important place in health informatics research overall, ethnography is the preferred methodological approach for studying ICTs introduced into complex social systems. But for ethnographic approaches to be accepted and used to their full potential, many in the health informatics community will need to revisit their philosophical assumptions about what counts as research rigor. PMID:21521535
Model-Based Optimal Experimental Design for Complex Physical Systems
2015-12-03
for public release. ...magnitude reduction in estimator error required to make solving the exact optimal design problem tractable. Instead of using a naive...for designing a sequence of experiments uses suboptimal approaches: batch design that has no feedback, or greedy (myopic) design that optimally...approved for public release. Equation 1 is difficult to solve directly, but can be expressed in an equivalent form using the principle of dynamic programming
Computer aided design of extrusion forming tools for complex geometry profiles
NASA Astrophysics Data System (ADS)
Goncalves, Nelson Daniel Ferreira
In profile extrusion, the experience of the die designer is crucial for obtaining good results. In industry, several experimental trials are usually needed before a specific extrusion die achieves a balanced flow distribution. This experimentally based trial-and-error procedure is time- and money-consuming, but it works, and most profile extrusion companies rely on it. However, competition is forcing the industry to look for more effective procedures, and the design of profile extrusion dies is no exception. For this purpose, computer aided design seems to be a good route. Nowadays, the available computational rheology codes allow the simulation of complex fluid flows. This permits the die designer to evaluate and optimize the flow channel without needing a physical die or real extrusion trials. In this work, a finite volume based numerical code was developed for the simulation of non-Newtonian (inelastic) and non-isothermal fluid flows using unstructured meshes. The developed code can model the forming and cooling stages of profile extrusion, and can be used to aid the design of forming tools for the production of complex profiles. For code verification, three benchmark problems were tested: flow between parallel plates, flow around a cylinder, and the lid-driven cavity flow. The code was employed to design two extrusion dies for complex cross-section profiles: a medical catheter die and a wood-plastic composite profile for decking applications. The latter was experimentally validated. Simple extrusion dies used to produce L- and T-shaped profiles were studied in detail, allowing a better understanding of the effect of the main geometry parameters on the flow distribution. To model the cooling stage, a new implicit formulation was devised, which achieved better convergence rates and thus reduced computation times.
Having in mind the solution of large dimension problems, the code was parallelized using graphics processing units (GPUs). Speedups of ten times could be obtained, drastically decreasing the time required to obtain results.
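The parallel-plate benchmark mentioned above has a closed-form solution for a power-law (inelastic) fluid, which makes it a convenient verification case for any such solver. Below is a generic 1D finite-volume sketch with Picard iteration on the shear-rate-dependent viscosity (not the thesis code; all numbers illustrative):

```python
def thomas(a, b, c, d):
    # Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal).
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Consistency K, power index n, pressure gradient G, half-gap H, grid nodes N.
K, n_pl, G, H, N = 1.0, 0.5, 1.0, 1.0, 101
dy = 2.0 * H / (N - 1)
u = [0.0] * N                                 # no-slip walls at both ends
for _ in range(300):                          # Picard iteration, 0.5 relaxation
    # face viscosity mu = K * |du/dy|^(n-1), lagged from the previous iterate
    mu = [K * max(abs(u[i + 1] - u[i]) / dy, 1e-8) ** (n_pl - 1) for i in range(N - 1)]
    a = [mu[i] / dy**2 for i in range(N - 2)]          # flux through face i-1/2
    c = [mu[i + 1] / dy**2 for i in range(N - 2)]      # flux through face i+1/2
    b = [-(ai + ci) for ai, ci in zip(a, c)]
    u_new = [0.0] + thomas(a, b, c, [-G] * (N - 2)) + [0.0]
    u = [0.5 * uo + 0.5 * un for uo, un in zip(u, u_new)]

# Analytic centreline velocity: u(0) = n/(n+1) * (G/K)^(1/n) * H^((n+1)/n)
u_exact_center = n_pl / (n_pl + 1) * (G / K) ** (1 / n_pl) * H ** ((n_pl + 1) / n_pl)
```

Agreement with the analytic profile is exactly the kind of check the thesis performs before trusting the solver on complex die geometries.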
Roles of Working Memory Performance and Instructional Strategy in Complex Cognitive Task Performance
ERIC Educational Resources Information Center
Cevik, V.; Altun, A.
2016-01-01
This study aims to investigate how working memory (WM) performances and instructional strategy choices affect learners' complex cognitive task performance in online environments. Three different e-learning environments were designed based on Merrill's (2006a) model of instructional strategies. The lack of experimental research on his framework is…
Rational design and dynamics of self-propelled colloidal bead chains: from rotators to flagella.
Vutukuri, Hanumantha Rao; Bet, Bram; van Roij, René; Dijkstra, Marjolein; Huck, Wilhelm T S
2017-12-01
The quest for designing new self-propelled colloids is fuelled by the demand for simple experimental models to study the collective behaviour of their more complex natural counterparts. Most synthetic self-propelled particles move by converting the input energy into translational motion. In this work we address the question of whether simple self-propelled spheres can assemble into more complex structures that exhibit rotational motion, possibly coupled with translational motion as in flagella. We exploit a combination of induced dipolar interactions and a bonding step to create permanent linear bead chains, composed of self-propelled Janus spheres, with a well-controlled internal structure. Next, we study how flexibility between individual swimmers in a chain affects its swimming behaviour. In the presence of fuel, permanent rigid chains showed only active rotational or spinning motion, whereas longer semi-flexible chains showed both translational and rotational motion resembling flagella-like motion. Moreover, we are able to reproduce our experimental results using numerical calculations with a minimal model, which includes full hydrodynamic interactions with the fluid. Our method is general and opens a new way to design novel self-propelled colloids with complex swimming behaviours, using different complex starting building blocks in combination with the flexibility between them.
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
STRUCTURE METHODOLOGY. Design problem solving is a complex activity involving a number of subtasks, and a number of alternative methods potentially available...Conference on Artificial Intelligence. London: The British Computer Society, pp. 621-633. Friedland, P. (1979). Knowledge-based experimental design...[Computing Milieux]: Management of Computing and Information Systems: system management. General Terms: Design, Methodology. Additional Key Words and Phrases
Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Landman, Drew
2015-01-01
Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.
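The regression-model idea behind Response Surface Methods can be sketched with a one-factor quadratic fit solved via the normal equations. The lift-curve numbers below are invented for illustration, not data from the tunnel test:

```python
def gauss_solve(A, b):
    # Gauss-Jordan elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_quadratic(xs, ys):
    # Least-squares response surface with regressors 1, x, x^2.
    X = [[1.0, x, x * x] for x in xs]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, ys)) for i in range(3)]
    return gauss_solve(XtX, Xty)

# Hypothetical lift-coefficient response vs. angle of attack (degrees),
# with curvature: CL = 0.1 + 0.11*alpha - 0.004*alpha^2.
alphas = [-4, -2, 0, 2, 4, 6, 8]
cl = [0.1 + 0.11 * a - 0.004 * a * a for a in alphas]
b0, b1, b2 = fit_quadratic(alphas, cl)
```

A real MDOE analysis fits all six aerodynamic coefficients over many factors, but the mechanics are the same: a designed set of runs, then a regression model with quantified prediction error.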
The Modern Design of Experiments for Configuration Aerodynamics: A Case Study
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2006-01-01
The effects of slowly varying and persistent covariates on the accuracy and precision of experimental results are reviewed, as is the rationale for run-order randomization as a quality assurance tactic employed in the Modern Design of Experiments (MDOE) to defend against such effects. Considerable analytical complexity is introduced by restrictions on randomization in configuration aerodynamics tests because they involve hard-to-change configuration variables that cannot be randomized conveniently. Tradeoffs are examined between quality and productivity associated with varying degrees of rigor in accounting for such randomization restrictions. Certain characteristics of a configuration aerodynamics test are considered that may justify a relaxed accounting for randomization restrictions to achieve a significant reduction in analytical complexity with a comparably negligible adverse impact on the validity of the experimental results.
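The defensive value of run-order randomization described above can be demonstrated with a toy simulation (all drift rates and effect sizes hypothetical): a factor tested in systematic order confounds with a slow covariate drift, while a randomized run order averages the drift out:

```python
import random

def run_experiment(order, rng, drift=0.05, effect=1.0):
    # response = true factor effect + slowly varying covariate (linear drift
    # with run number) + measurement noise
    return [(lvl, effect * lvl + drift * t + rng.gauss(0, 0.1))
            for t, lvl in enumerate(order)]

def effect_estimate(data):
    hi = [y for lvl, y in data if lvl == 1]
    lo = [y for lvl, y in data if lvl == 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

rng = random.Random(42)
levels = [0] * 10 + [1] * 10              # two-level factor, 20 runs
sys_est, rand_est = [], []
for _ in range(500):
    # Systematic order: all low-level runs first, so drift confounds the effect.
    sys_est.append(effect_estimate(run_experiment(levels, rng)))
    # Randomized order: drift is spread evenly across both levels on average.
    shuffled = levels[:]
    rng.shuffle(shuffled)
    rand_est.append(effect_estimate(run_experiment(shuffled, rng)))
mean_sys = sum(sys_est) / 500
mean_rand = sum(rand_est) / 500
```

With a true effect of 1.0, the systematic order is biased upward by the drift while the randomized order is not, which is exactly the covariate defence MDOE formalizes.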
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noé, F; Daidone, Isabella; Löllmann, Marc
There is a gap between kinetic experiment and simulation in their views of the dynamics of complex biomolecular systems. Whereas experiments typically reveal only a few readily discernible exponential relaxations, simulations often indicate complex multistate behavior. Here, a theoretical framework is presented that reconciles these two approaches. The central concept is dynamical fingerprints which contain peaks at the time scales of the dynamical processes involved with amplitudes determined by the experimental observable. Fingerprints can be generated from both experimental and simulation data, and their comparison by matching peaks permits assignment of structural changes present in the simulation to experimentally observed relaxation processes. The approach is applied here to a test case interpreting single molecule fluorescence correlation spectroscopy experiments on a set of fluorescent peptides with molecular dynamics simulations. The peptides exhibit complex kinetics shown to be consistent with the apparent simplicity of the experimental data. Moreover, the fingerprint approach can be used to design new experiments with site-specific labels that optimally probe specific dynamical processes in the molecule under investigation.
A Combined Theoretical and Experimental Study for Silver Electroplating
Liu, Anmin; Ren, Xuefeng; An, Maozhong; Zhang, Jinqiu; Yang, Peixia; Wang, Bo; Zhu, Yongming; Wang, Chong
2014-01-01
A novel method combining theoretical and experimental studies for environmentally friendly silver electroplating is introduced. Quantum chemical calculations and molecular dynamics (MD) simulations were employed to predict the behaviour and function of the complexing agents. Electronic properties, orbital information, and single-point energies of 5,5-dimethylhydantoin (DMH), nicotinic acid (NA), and their silver(I) complexes were provided by quantum chemical calculations based on density functional theory (DFT). Adsorption behaviours of the agents on copper and silver surfaces were investigated using MD simulations. Based on the data from the quantum chemical calculations and MD simulations, we concluded that DMH and NA could be promising complexing agents for silver electroplating. The experimental results, including electrochemical measurements and silver electroplating, further confirmed this prediction. This efficient and versatile method thus opens a new window for studying or designing complexing agents for metal electroplating in general and should substantially advance this research area. PMID:24452389
Studying light-harvesting models with superconducting circuits.
Potočnik, Anton; Bargerbos, Arno; Schröder, Florian A Y N; Khan, Saeed A; Collodo, Michele C; Gasparinetti, Simone; Salathé, Yves; Creatore, Celestino; Eichler, Christopher; Türeci, Hakan E; Chin, Alex W; Wallraff, Andreas
2018-03-02
The process of photosynthesis, the main source of energy in the living world, converts sunlight into chemical energy. The high efficiency of this process is believed to be enabled by an interplay between the quantum nature of molecular structures in photosynthetic complexes and their interaction with the environment. Investigating these effects in biological samples is challenging due to their complex and disordered structure. Here we experimentally demonstrate a technique for studying photosynthetic models based on superconducting quantum circuits, which complements existing experimental, theoretical, and computational approaches. We demonstrate a high degree of freedom in design and experimental control of our approach based on a simplified three-site model of a pigment protein complex with realistic parameters scaled down in energy by a factor of 10⁵. We show that the excitation transport between quantum-coherent sites disordered in energy can be enabled through the interaction with environmental noise. We also show that the efficiency of the process is maximized for structured noise resembling intramolecular phononic environments found in photosynthetic complexes.
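The noise-enabled transport result can be illustrated with a minimal two-site model (a further simplification of the paper's three-site circuit; all parameters arbitrary): a detuned donor-acceptor pair with a trap, integrated with forward Euler, shows the characteristic non-monotonic dependence of transfer efficiency on dephasing:

```python
def transfer_efficiency(dephasing, detuning=5.0, J=1.0, trap=0.5, T=20.0, dt=1e-3):
    """Two-site exciton model (site 1 -> site 2 -> trap) with pure dephasing.
    State: populations p1, p2 and the coherence x + i*y (density-matrix
    elements). Returns the population delivered to the trap by time T."""
    p1, p2, x, y = 1.0, 0.0, 0.0, 0.0
    delivered = 0.0
    g = dephasing + trap                   # total coherence damping rate
    for _ in range(int(T / dt)):
        dp1 = -2.0 * J * y
        dp2 = 2.0 * J * y - 2.0 * trap * p2
        dx = detuning * y - g * x
        dyc = -detuning * x - J * (p2 - p1) - g * y
        delivered += 2.0 * trap * p2 * dt  # flux into the trap
        p1 += dp1 * dt
        p2 += dp2 * dt
        x += dx * dt
        y += dyc * dt
    return delivered

eff_none = transfer_efficiency(0.0)    # coherent but off-resonant: suppressed
eff_mod = transfer_efficiency(2.0)     # moderate noise bridges the detuning
eff_zeno = transfer_efficiency(100.0)  # very strong noise freezes transport
```

The maximum at intermediate dephasing mirrors, in miniature, the paper's finding that structured environmental noise maximizes transport efficiency.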
Update of the ATTRACT force field for the prediction of protein-protein binding affinity.
Chéron, Jean-Baptiste; Zacharias, Martin; Antonczak, Serge; Fiorucci, Sébastien
2017-06-05
Determining protein-protein interactions is still a major challenge for molecular biology. Docking protocols have come of age in predicting the structure of macromolecular complexes. However, they still lack the accuracy to estimate binding affinities, the thermodynamic quantity that drives the formation of a complex. Here, an updated version of the protein-protein ATTRACT force field aimed at predicting experimental binding affinities is reported. It has been designed on a dataset of 218 protein-protein complexes. The correlation between the experimental and predicted affinities reaches 0.6, outperforming most of the available protocols. Focusing on subsets of rigid and flexible complexes, the performance rises to 0.76 and 0.69, respectively. © 2017 Wiley Periodicals, Inc.
Braithwaite, Miles C; Kumar, Pradeep; Choonara, Yahya E; du Toit, Lisa C; Tomar, Lomas K; Tyagi, Charu; Pillay, Viness
2017-10-30
This study was conducted to provide a mechanistic account of the synthesis, characterization and solubility phenomena of vitamin complexes with cyclodextrins (CDs) for enhanced solubility and stability, employing experimental and in silico molecular modeling strategies. New geometric, molecular and energetic analyses were pursued to explicate experimentally derived cholecalciferol complexes. Various CD molecules (α-, β-, γ-, and hydroxypropyl β-) were complexed with three vitamins: cholecalciferol, ascorbic acid and α-tocopherol. The Inclusion Efficiency (IE%) was computed for each CD-vitamin complex. The highest IE% achieved for a cholecalciferol complex was for 'βCDD3-8', after utilizing a unique CD:cholecalciferol molar synthesis ratio of 2.5:1, never before reported as successful. 2HPβCD-cholecalciferol, γCD-cholecalciferol and α-tocopherol inclusion complexes (ICs) reached maximal IE% with a CD:vitamin molar ratio of 5:1. The results demonstrate that IE%, thermal stability, concentration, carrier solubility, molecular mechanics and intended release profile are key factors to consider when synthesizing vitamin-CD complexes. Phase-solubility data provided insights into the design of formulations with ICs that may provide analogous oral vitamin release profiles even when hydrophobic and hydrophilic vitamins are co-incorporated. Static lattice atomistic simulations were able to validate the experimentally derived cholecalciferol IE phenomena and are invaluable when developing formulation strategies using CDs for improved solubility and efficacy of vitamins. Copyright © 2017 Elsevier B.V. All rights reserved.
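For linear (A_L-type) phase-solubility diagrams of the kind analysed above, the standard Higuchi-Connors treatment extracts a 1:1 stability constant from the slope and intercept. The data below are synthetic, purely to exercise the formula:

```python
def phase_solubility_K(cd_conc, drug_conc):
    """Least-squares line through an A_L-type phase-solubility diagram,
    then K(1:1) = slope / (S0 * (1 - slope))  (Higuchi-Connors),
    where S0 is the intrinsic solubility (the intercept)."""
    n = len(cd_conc)
    mx = sum(cd_conc) / n
    my = sum(drug_conc) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(cd_conc, drug_conc))
             / sum((x - mx) ** 2 for x in cd_conc))
    s0 = my - slope * mx
    return slope / (s0 * (1.0 - slope)), s0, slope

# Synthetic diagram (concentrations in mM, values invented):
# intrinsic solubility 0.1 mM, slope 0.5.
cd = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
drug = [0.1 + 0.5 * c for c in cd]
K, s0, slope = phase_solubility_K(cd, drug)
```

For these numbers the recovered constant is K = 0.5 / (0.1 × 0.5) = 10 mM⁻¹; with real data the fit residuals also indicate whether the 1:1 (linear) model is adequate.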
NASA Astrophysics Data System (ADS)
Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua
2016-12-01
A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work establishes theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking, hydrogen bonding, electrostatic interaction) for the first time. Application guidelines for supporting materials, surfactants and sample matrices were also summarized. The extraction mechanism and platform established in this study make predictable and efficient pretreatment of trace analytes from environmental, biological and clinical samples possible under theory-based experimental design.
Gimenez-Pinto, Vianney; Ye, Fangfu; Mbanga, Badel; Selinger, Jonathan V.; Selinger, Robin L. B.
2017-01-01
Various experimental and theoretical studies demonstrate that complex stimulus-responsive out-of-plane distortions such as twist of different chirality, emergence of cones, and simple and anticlastic bending can be engineered and pre-programmed in a liquid crystalline rubbery material given a well-controlled director microstructure. Via 3D finite element simulations, we demonstrate director-encoded chiral shape actuation in thin-film nematic polymer networks under external stimulus. Furthermore, we design two complex director fields with twisted nematic domains and nematic disclinations that encode a pattern of folds for an auto-origami box. This actuator will be flat at a reference nematic state and form four well-controlled bend distortions as orientational order changes. Device fabrication is applicable via current experimental techniques. These results are in qualitative agreement with theoretical predictions, provide insight into experimental observations, and demonstrate the value of finite element methods at the continuum level for designing and engineering liquid crystal polymeric devices. PMID:28349949
Active mixing of complex fluids at the microscale
Ober, Thomas J.; Foresti, Daniele; Lewis, Jennifer A.
2015-09-22
Mixing of complex fluids at low Reynolds number is fundamental for a broad range of applications, including materials assembly, microfluidics, and biomedical devices. Of these materials, yield stress fluids (and gels) pose the most significant challenges, especially when they must be mixed in low volumes over short timescales. New scaling relationships between mixer dimensions and operating conditions are derived and experimentally verified to create a framework for designing active microfluidic mixers that can efficiently homogenize a wide range of complex fluids. As a result, active mixing printheads are then designed and implemented for multimaterial 3D printing of viscoelastic inks with programmable control of local composition. PMID:26396254
ERIC Educational Resources Information Center
Bozorgian, Hossein; Pillay, Hitendra
2013-01-01
Listening used in language teaching refers to a complex process that allows us to understand spoken language. The current study, conducted in Iran with an experimental design, investigated the effectiveness of teaching listening strategies delivered in L1 (Persian) and its effect on listening comprehension in L2. Five listening strategies:…
The problem of site variation within red pine provenance experiments
Mark J. Holst
1966-01-01
In spite of care taken in the selection of site and experimental design of provenance experiments, site heterogeneity within the experimental area may be more complex than was anticipated when the experiment was established. The present paper describes a problem of this nature encountered in a red pine (Pinus resinosa Ait.) provenance experiment at...
Research and application of embedded real-time operating system
NASA Astrophysics Data System (ADS)
Zhang, Bo
2013-03-01
In this paper, based on an analysis of existing embedded real-time operating systems, the architecture of an operating system is designed and implemented. The experimental results show that the design fully meets the requirements of an embedded real-time operating system, reducing the complexity of embedded software design while improving maintainability, reliability, and flexibility. The design therefore has high practical value.
ERIC Educational Resources Information Center
Perret, Patrick; Bailleux, Christine; Dauvier, Bruno
2011-01-01
The present study focused on children's deductive reasoning when performing the Latin Square Task, an experimental task designed to explore the influence of relational complexity. Building on Birney, Halford, and Andrew's (2006) research, we created a version of the task that minimized nonrelational factors and introduced new categories of items.…
Modeling the assembly order of multimeric heteroprotein complexes
Peterson, Lenna X; Togawa, Yoichiro; Esquivel-Rodriguez, Juan; Terashi, Genki; Christoffer, Charles; Roy, Amitava; Shin, Woong-Hee; Kihara, Daisuke
2018-01-01
Protein-protein interactions are the cornerstone of numerous biological processes. Although an increasing number of protein complex structures have been determined using experimental methods, relatively fewer studies have been performed to determine the assembly order of complexes. In addition to the insights into the molecular mechanisms of biological function provided by the structure of a complex, knowing the assembly order is important for understanding the process of complex formation. Assembly order is also practically useful for constructing subcomplexes as a step toward solving the entire complex experimentally, designing artificial protein complexes, and developing drugs that interrupt a critical step in the complex assembly. There are several experimental methods for determining the assembly order of complexes; however, these techniques are resource-intensive. Here, we present a computational method that predicts the assembly order of protein complexes by building the complex structure. The method, named Path-LZerD, uses a multimeric protein docking algorithm that assembles a protein complex structure from individual subunit structures and predicts assembly order by observing the simulated assembly process of the complex. Benchmarked on a dataset of complexes with experimental evidence of assembly order, Path-LZerD was successful in predicting the assembly pathway for the majority of the cases. Moreover, when compared with a simple approach that infers the assembly path from the buried surface area of subunits in the native complex, Path-LZerD has the strong advantage that it can be used for cases where the complex structure is not known. The path prediction accuracy decreased when starting from unbound monomers, particularly for larger complexes of five or more subunits, for which only a part of the assembly path was correctly identified.
As the first method of its kind, Path-LZerD opens a new area of computational protein structure modeling and will be an indispensable approach for studying protein complexes. PMID:29329283
Modeling the assembly order of multimeric heteroprotein complexes.
Peterson, Lenna X; Togawa, Yoichiro; Esquivel-Rodriguez, Juan; Terashi, Genki; Christoffer, Charles; Roy, Amitava; Shin, Woong-Hee; Kihara, Daisuke
2018-01-01
Protein-protein interactions are the cornerstone of numerous biological processes. Although an increasing number of protein complex structures have been determined using experimental methods, relatively fewer studies have been performed to determine the assembly order of complexes. In addition to the insights into the molecular mechanisms of biological function provided by the structure of a complex, knowing the assembly order is important for understanding the process of complex formation. Assembly order is also practically useful for constructing subcomplexes as a step toward solving the entire complex experimentally, designing artificial protein complexes, and developing drugs that interrupt a critical step in the complex assembly. There are several experimental methods for determining the assembly order of complexes; however, these techniques are resource-intensive. Here, we present a computational method that predicts the assembly order of protein complexes by building the complex structure. The method, named Path-LZerD, uses a multimeric protein docking algorithm that assembles a protein complex structure from individual subunit structures and predicts assembly order by observing the simulated assembly process of the complex. Benchmarked on a dataset of complexes with experimental evidence of assembly order, Path-LZerD was successful in predicting the assembly pathway for the majority of the cases. Moreover, when compared with a simple approach that infers the assembly path from the buried surface area of subunits in the native complex, Path-LZerD has the strong advantage that it can be used for cases where the complex structure is not known. The path prediction accuracy decreased when starting from unbound monomers, particularly for larger complexes of five or more subunits, for which only a part of the assembly path was correctly identified. 
As the first method of its kind, Path-LZerD opens a new area of computational protein structure modeling and will be an indispensable approach for studying protein complexes. PMID:29329283
NASA Astrophysics Data System (ADS)
Xu, Y. L.; Huang, Q.; Zhan, S.; Su, Z. Q.; Liu, H. J.
2014-06-01
How to use control devices to enhance system identification and damage detection in relation to a structure that requires both vibration control and structural health monitoring is an interesting yet practical topic. In this study, the possibility of using the added stiffness provided by control devices and frequency response functions (FRFs) to detect damage in a building complex was explored experimentally. Scale models of a 12-storey main building and a 3-storey podium structure were built to represent a building complex. Given that the connection between the main building and the podium structure is most susceptible to damage, damage to the building complex was experimentally simulated by changing the connection stiffness. To simulate the added stiffness provided by a semi-active friction damper, a steel circular ring was designed and used to add the related stiffness to the building complex. By varying the connection stiffness using an eccentric wheel excitation system and by adding or not adding the circular ring, eight cases were investigated and eight sets of FRFs were measured. The experimental results were used to detect damage (changes in connection stiffness) using a recently proposed FRF-based damage detection method. The experimental results showed that the FRF-based damage detection method could satisfactorily locate and quantify damage.
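As a rough illustration of the FRF-based idea above, the sketch below estimates a frequency response function with the standard H1 estimator (cross-spectrum of input and output over the input auto-spectrum) for a toy one-degree-of-freedom "structure", and shows the resonance peak shifting down when stiffness is reduced. The signal model, filter parameters, and frequencies are invented for illustration; the paper's method works on measured FRFs of the physical building-complex model.

```python
import numpy as np
from scipy import signal

def frf_h1(excitation, response, fs, nperseg=1024):
    """H1 estimator of the frequency response function:
    cross-spectrum of input and output over the input auto-spectrum."""
    f, s_xy = signal.csd(excitation, response, fs=fs, nperseg=nperseg)
    _, s_xx = signal.welch(excitation, fs=fs, nperseg=nperseg)
    return f, s_xy / s_xx

rng = np.random.default_rng(0)
fs = 256.0
x = rng.standard_normal(16384)           # broadband excitation
peaks = {}
for stiffness, label in [(1.0, "intact"), (0.8, "damaged")]:
    wn = 20.0 * np.sqrt(stiffness)       # resonance scales with sqrt(stiffness)
    b, a = signal.iirpeak(wn, Q=30, fs=fs)
    y = signal.lfilter(b, a, x)          # simulated structural response
    f, h = frf_h1(x, y, fs)
    peaks[label] = f[np.argmax(np.abs(h))]
    print(label, "FRF peak near", peaks[label], "Hz")
```

A drop in connection stiffness shows up as a downward shift of the FRF peak, which is the kind of change the detection method localizes and quantifies.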
Design and implementation of an experiment scheduling system for the ACTS satellite
NASA Technical Reports Server (NTRS)
Ringer, Mark J.
1994-01-01
The Advanced Communication Technology Satellite (ACTS) was launched on the 12th of September 1993 aboard STS-51. All events since that time have proceeded as planned with user operations commencing on December 6th, 1993. ACTS is a geosynchronous satellite designed to extend the state of the art in communication satellite design and is available to experimenters on a 'time/bandwidth available' basis. The ACTS satellite requires the advance scheduling of experimental activities based upon a complex set of resource, state, and activity constraints in order to ensure smooth operations. This paper describes the software system developed to schedule experiments for ACTS.
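The paper does not give its scheduling algorithm, but the flavor of "time/bandwidth available" allocation under conflicting requests can be sketched with a toy priority-greedy scheduler. The `Request` fields and the greedy rule here are hypothetical; the real ACTS system handled a far richer set of resource, state, and activity constraints.

```python
from dataclasses import dataclass

@dataclass
class Request:
    experimenter: str
    start: int      # requested start time (hours)
    end: int        # requested end time
    priority: int   # lower value = more important

def schedule(requests):
    """Accept requests in priority order, skipping any that overlap an
    already-accepted slot on the shared resource."""
    accepted = []
    for r in sorted(requests, key=lambda q: q.priority):
        if all(r.end <= a.start or r.start >= a.end for a in accepted):
            accepted.append(r)
    return sorted(accepted, key=lambda q: q.start)

reqs = [Request("A", 0, 4, 2), Request("B", 2, 6, 1), Request("C", 6, 8, 3)]
plan = schedule(reqs)
print([r.experimenter for r in plan])   # A overlaps the higher-priority B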
ERIC Educational Resources Information Center
Barhoumi, Chokri; Rossi, Pier Giuseppe
2013-01-01
The use of hypertext systems for learning and teaching complex and ill-structured domains of knowledge has been attracting attention in the design of instruction. In this context, experimental research has been conducted to explore the effectiveness of instructional-design-oriented hypertext systems. Cognitive flexibility hypertext theory is…
An Empirical Study of Eight Nonparametric Tests in Hierarchical Regression.
ERIC Educational Resources Information Center
Harwell, Michael; Serlin, Ronald C.
When normality does not hold, nonparametric tests represent an important data-analytic alternative to parametric tests. However, the use of nonparametric tests in educational research has been limited by the absence of easily performed tests for complex experimental designs and analyses, such as factorial designs and multiple regression analyses,…
Expert systems for superalloy studies
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Kaukler, William F.
1990-01-01
There are many areas in science and engineering which require knowledge of an extremely complex foundation of experimental results in order to design methodologies for developing new materials or products. Superalloys fit well into this discussion in the sense that they are complex combinations of elements which exhibit certain characteristics. Obviously the use of superalloys in high-performance, high-temperature systems such as the Space Shuttle Main Engine is of interest to NASA. The superalloy manufacturing process is complex, and the implementation of an expert system within the design process requires some thought as to how and where it should be implemented. A major motivation is to develop a methodology to assist metallurgists in the design of superalloy materials using current expert systems technology. Hydrogen embrittlement is disastrous to rocket engines, and the heuristics can be very complex. Attacking this problem as one module in the overall design process represents a significant step forward. In order to describe the objectives of the first phase implementation, the expert system was designated the Hydrogen Environment Embrittlement Expert System (HEEES).
NASA Technical Reports Server (NTRS)
Humenik, F. M.; Bosque, M. A.
1983-01-01
A fundamental experimental database for turbulent flow mixing models is provided, supporting better prediction of the more complex turbulent chemically reacting flows. Analytical application to combustor design is provided, along with a better fundamental understanding of the combustion process.
Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems
NASA Astrophysics Data System (ADS)
Koch, Patrick Nathan
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration, for the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis, (2) Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) Noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
NASA Astrophysics Data System (ADS)
Sellami, Takwa; Jelassi, Sana; Darcherif, Abdel Moumen; Berriri, Hanen; Mimouni, Med Faouzi
2018-04-01
With the advancement of wind turbines towards complex structures, the requirement of trustworthy structural models has become more apparent. Hence, the vibration characteristics of the wind turbine components, like the blades and the tower, have to be extracted under vibration constraints. Although extracting the modal properties of blades is a simple task, calculating precise modal data for the whole wind turbine coupled to its tower/foundation is still a perplexing task. In this framework, this paper focuses on the investigation of the structural modeling approach of modern commercial micro-turbines. Thus, the structural model of a wind turbine of complex design, the Rutland 504, is established based on both experimental and numerical methods. A three-dimensional (3-D) numerical model of the structure was set up based on the finite volume method (FVM) using the academic finite element analysis software ANSYS. To validate the created model, experimental vibration tests were carried out using the vibration test system of the TREVISE platform at ECAM-EPMI. The tests were based on the experimental modal analysis (EMA) technique, which is one of the most efficient techniques for identifying structural parameters. Indeed, the poles and residues of the frequency response functions (FRF), between input and output spectra, were calculated to extract the mode shapes and the natural frequencies of the structure. Based on the obtained modal parameters, the numerical model was updated.
Designing Successful Proteomics Experiments.
Ruderman, Daniel
2017-01-01
Because proteomics experiments are so complex they can readily fail, and do so without clear cause. Using standard experimental design techniques and incorporating quality control can greatly increase the chances of success. This chapter introduces the relevant concepts and provides examples specific to proteomic workflows. Applying these notions to design successful proteomics experiments is straightforward. It can help identify failure causes and greatly increase the likelihood of inter-laboratory reproducibility.
Melero, Cristina; Ollikainen, Noah; Harwood, Ian; ...
2014-10-13
Re-engineering protein–protein recognition is an important route to dissecting and controlling complex interaction networks. Experimental approaches have used the strategy of “second-site suppressors,” where a functional interaction is inferred between two proteins if a mutation in one protein can be compensated by a mutation in the second. Mimicking this strategy, computational design has been applied successfully to change protein recognition specificity by predicting such sets of compensatory mutations in protein–protein interfaces. To extend this approach, it would be advantageous to be able to “transplant” existing engineered and experimentally validated specificity changes to other homologous protein–protein complexes. Here, we test this strategy by designing a pair of mutations that modulates peptide recognition specificity in the Syntrophin PDZ domain, confirming the designed interaction biochemically and structurally, and then transplanting the mutations into the context of five related PDZ domain–peptide complexes. We find a wide range of energetic effects of identical mutations in structurally similar positions, revealing a dramatic context dependence (epistasis) of designed mutations in homologous protein–protein interactions. To better understand the structural basis of this context dependence, we apply a structure-based computational model that recapitulates these energetic effects, and we use this model to make and validate forward predictions. Although the context dependence of these mutations is captured by computational predictions, our results both highlight the considerable difficulties in designing protein–protein interactions and provide challenging benchmark cases for the development of improved protein modeling and design methods that accurately account for the context.
High Fidelity Modeling of Field Reversed Configuration (FRC) Thrusters
2017-04-22
signatures which can be used for direct, non-invasive comparison with experimental diagnostics can be produced. This research will be directly... experimental campaign is critical to developing general design philosophies for low-power plasmoid formation, the complexity of non-linear plasma processes...advanced space propulsion. The work consists of numerical method development, physical model development, and systematic studies of the non-linear
Simulations of DNA stretching by flow field in microchannels with complex geometry.
Huang, Chiou-De; Kang, Dun-Yen; Hsieh, Chih-Chen
2014-01-01
Recently, we have reported the experimental results of DNA stretching by flow field in three microchannels (C. H. Lee and C. C. Hsieh, Biomicrofluidics 7(1), 014109 (2013)) designed specifically for the purpose of preconditioning DNA conformation for easier stretching. The experimental results not only demonstrate the superiority of the new devices but also provide detailed observations of DNA behavior in complex flow fields that were not available before. In this study, we use the Brownian dynamics-finite element method (BD-FEM) to simulate DNA behavior in these microchannels and compare the results against the experiments. Although the hydrodynamic interaction (HI) between DNA segments and between DNA and the device boundaries was not included in the simulations, the simulation results are in fairly good agreement with the experimental data, whether from the aspect of single-molecule behavior or from the aspect of ensemble-averaged properties. The discrepancy between the simulation and the experimental results can be explained by the neglect of the HI effect in the simulations. Considering the huge savings in computational cost from neglecting HI, we conclude that BD-FEM can be used as an efficient and economical design tool for developing new microfluidic devices for DNA manipulation.
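A minimal free-draining (no-HI) Brownian dynamics step for a Hookean bead-spring chain in a planar elongational flow can be sketched as follows. All parameter values are illustrative, and the actual BD-FEM simulations solve the flow field in the real microchannel geometry with finite elements rather than assuming an analytic flow.

```python
import numpy as np

def bd_step(r, dt, eps_dot, k_spring, zeta, kT, rng):
    """One free-draining BD step for a bead-spring chain in the planar
    elongational flow v = (eps_dot*x, -eps_dot*y); HI is neglected,
    as in the simulations described above."""
    v = np.column_stack([eps_dot * r[:, 0], -eps_dot * r[:, 1]])
    f = np.zeros_like(r)                 # spring forces from chain connectivity
    bond = r[1:] - r[:-1]
    f[:-1] += k_spring * bond
    f[1:] -= k_spring * bond
    noise = rng.standard_normal(r.shape) * np.sqrt(2.0 * kT * dt / zeta)
    return r + (v + f / zeta) * dt + noise

rng = np.random.default_rng(1)
r = 0.1 * rng.standard_normal((20, 2))   # 20 beads, initially coiled
for _ in range(2000):
    r = bd_step(r, dt=1e-3, eps_dot=2.0, k_spring=50.0, zeta=1.0, kT=1.0, rng=rng)
print("x extent:", round(np.ptp(r[:, 0]), 2), " y extent:", round(np.ptp(r[:, 1]), 2))
```

The chain stretches along the extensional axis and compresses along the other, which is the preconditioning effect the microchannels exploit.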
ERIC Educational Resources Information Center
Johnson, Mark D.; Mercado, Leonardo; Acevedo, Anthony
2012-01-01
This study contributes to L2 writing research which seeks to tie predictions of the Limited Attentional Capacity Model (Skehan, 1998; Skehan & Foster, 2001) and Cognition Hypothesis (Robinson, 2001, 2005, 2011a, 2011b) to models of working memory in L1 writing (Kellogg, 1996). The study uses a quasi-experimental research design to investigate…
Aeropropulsion 1987. Session 2: Aeropropulsion Structures Research
NASA Technical Reports Server (NTRS)
1987-01-01
Aeropropulsion systems present unique problems to the structural engineer. The extremes in operating temperatures, rotational effects, and behaviors of advanced material systems combine into complexities that require advances in many scientific disciplines involved in structural analysis and design procedures. This session provides an overview of the complexities of aeropropulsion structures and the theoretical, computational, and experimental research conducted to achieve the needed advances.
ERIC Educational Resources Information Center
Mitchell, Christine M.; Govindaraj, T.
1990-01-01
Discusses the use of intelligent tutoring systems as opposed to traditional on-the-job training for training operators of complex dynamic systems and describes the computer architecture for a system for operators of a NASA (National Aeronautics and Space Administration) satellite control system. An experimental evaluation with college students is…
Ge, Hongyu; Chen, Xiangyang; Yang, Xinzheng
2016-10-13
A series of cobalt and manganese cyclopentadienone complexes are proposed and examined computationally as promising catalysts for hydrogenation of CO2 to formic acid, with total free energies as low as 20.0 kcal/mol in aqueous solution. A density functional theory study of the newly designed cobalt and manganese complexes and experimentally reported iron cyclopentadienone complexes reveals a stepwise hydride transfer mechanism, with a water- or methanol-assisted proton transfer for the cleavage of H2 as the rate-determining step.
Optimal design and experimental analyses of a new micro-vibration control payload-platform
NASA Astrophysics Data System (ADS)
Sun, Xiaoqing; Yang, Bintang; Zhao, Long; Sun, Xiaofen
2016-07-01
This paper presents a new payload-platform, for precision devices, which possesses the capability of isolating complex space micro-vibrations in the low-frequency range below 5 Hz. The novel payload-platform equipped with smart material actuators is investigated and designed through an optimization strategy based on the minimum energy loss rate, with the aim of achieving high drive efficiency and reducing the effect of magnetic circuit nonlinearity. Then, the dynamic model of the driving element is established by using the Lagrange method, and the performance of the designed payload-platform is further discussed through the combination of the controlled auto-regressive moving average (CARMA) model with a modified generalized predictive control (MGPC) algorithm. Finally, an experimental prototype is developed and tested. The experimental results demonstrate that the payload-platform has impressive potential for micro-vibration isolation.
Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems
NASA Technical Reports Server (NTRS)
Koch, Patrick N.
1997-01-01
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration, for the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis; Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and Noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques including intermediate responses, linking variables, and compatibility constraints are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigations and modeling of partitioned subproblems. 
A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. This method as well as the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system. The case study is developed in collaboration with Allison Engine Company, Rolls Royce Aerospace, and is based on the existing Allison AE3007 engine designed for midsize commercial, regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
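The response-surface step described above, fitting a cheap second-order polynomial surrogate to a handful of subsystem analyses, can be sketched as follows. The "subsystem" function and the design points are invented stand-ins for the engine-cycle and compressor analyses.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for an expensive subsystem analysis (e.g. one engine-cycle run);
# the functional form and coefficients are invented for illustration.
def subsystem(x1, x2):
    return 3.0 + 1.5 * x1 - 2.0 * x2 + 0.8 * x1 * x2 + 0.5 * x1**2

# Small 3x3 factorial design in coded variables, with a little "analysis noise".
pts = np.array([(a, b) for a in (-1.0, 0.0, 1.0) for b in (-1.0, 0.0, 1.0)])
y = subsystem(pts[:, 0], pts[:, 1]) + rng.normal(0.0, 0.01, len(pts))

# Second-order response surface:
# y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
def x_row(x1, x2):
    return np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

X = np.array([x_row(a, b) for a, b in pts])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# The cheap surrogate now replaces the subsystem during system-level synthesis.
surrogate = lambda x1, x2: float(x_row(x1, x2) @ beta)
print("true:", subsystem(0.5, -0.5), " surrogate:", round(surrogate(0.5, -0.5), 3))
```

Each partitioned subproblem gets its own surrogate of this kind, and the system-level compromise problem then optimizes over the surrogates instead of the expensive analyses.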
Glaholt, Stephen P; Chen, Celia Y; Demidenko, Eugene; Bugge, Deenie M; Folt, Carol L; Shaw, Joseph R
2012-08-15
The study of stressor interactions by eco-toxicologists using nonlinear response variables is limited by the required amount of a priori knowledge, the complexity of experimental designs, the use of linear models, and the lack of use of optimal designs of nonlinear models to characterize complex interactions. Therefore, we developed AID, an adaptive-iterative design for eco-toxicologists to more accurately and efficiently examine complex multiple stressor interactions. AID incorporates the power of the general linear model and A-optimal criteria with an iterative process that: 1) minimizes the required amount of a priori knowledge, 2) simplifies the experimental design, and 3) quantifies both individual and interactive effects. Once a stable model is determined, the best-fit model is identified and the direction and magnitude of stressors, individually and in all combinations (including complex interactions), are quantified. To validate AID, we selected five commonly co-occurring components of polluted aquatic systems, three metal stressors (Cd, Zn, As) and two water chemistry parameters (pH, hardness), to be tested using standard acute toxicity tests in which Daphnia mortality is the (nonlinear) response variable. We found that, after the initial input of experimental data (literature values, e.g. EC-values, may also be used) and only two iterations of AID, our dose-response model was stable. The model ln(Cd)*ln(Zn) was determined to be the best predictor of the Daphnia mortality response to the combined effects of Cd, Zn, As, pH, and hardness. This model was then used to accurately identify and quantify the strength of both greater-than-additive (e.g. As*Cd) and less-than-additive interactions (e.g. Cd*Zn). Interestingly, our study found only binary interactions significant, not higher order interactions. We conclude that AID is more efficient and effective at assessing multiple stressor interactions than current methods. 
Other applications, including life-history endpoints commonly used by regulators, could benefit from AID's efficiency in assessing water quality criteria. Copyright © 2012 Elsevier B.V. All rights reserved.
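A hedged sketch of fitting the kind of dose-response model the study selects: logistic regression of simulated mortality on ln(Cd), ln(Zn), and their interaction. The data and coefficients below are fabricated purely for illustration; AID itself wraps such fits in an A-optimal, iterative design loop.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Fabricated acute-toxicity data: mortality driven by ln(Cd), ln(Zn) and a
# less-than-additive ln(Cd)*ln(Zn) interaction (all coefficients invented).
n = 2000
cd = rng.uniform(0.1, 10.0, n)          # nominal Cd dose
zn = rng.uniform(0.1, 10.0, n)          # nominal Zn dose
eta = -1.0 + 1.2 * np.log(cd) + 1.0 * np.log(zn) - 0.5 * np.log(cd) * np.log(zn)
died = rng.random(n) < 1.0 / (1.0 + np.exp(-eta))   # Bernoulli mortality

# Candidate model from the study: main effects plus the ln(Cd)*ln(Zn) term.
X = np.column_stack([np.log(cd), np.log(zn), np.log(cd) * np.log(zn)])
fit = LogisticRegression(C=1e6, max_iter=1000).fit(X, died)
print("fitted coefficients:", np.round(fit.coef_[0], 2))
```

A negative fitted interaction coefficient corresponds to the less-than-additive Cd*Zn effect reported in the abstract.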
A RANDOM-SCAN DISPLAY OF PREDICTED SATELLITE POSITIONS.
With the completion of the NRL evaluation of the experimental model of the Satellite Position Prediction and Display equipment (SPAD), efforts were...directed toward the design of an operational version of SPAD. Possible design and equipment configurations were proposed which would lead to a...substantial savings in cost and reduced equipment complexity. These designs involve the displaying of the SPAD information by means of a random scanning of
Development of a Naval C2 Capability Evaluation Facility
2014-06-01
designs is required in highly complex systems since sub-system evaluation may not be predictive of the overall system effect. It has been shown by...all individual and team behaviours, communications and interactions must be recordable. From the start of the project the design concept was for a...experimentation requirements of the concept evaluations being developed by the concept development team. A system design that allowed a variable fidelity in
Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter
2017-09-01
To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.
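The meta-regression idea, modeling QE effect sizes as a function of coded design features with inverse-variance weights, can be sketched as follows. The effect sizes, sampling variances, and the "matched comparison group" moderator are all hypothetical numbers chosen for illustration.

```python
import numpy as np

# Hypothetical effect sizes d from eight quasi-experiments, their sampling
# variances v, and one coded design feature: whether the QE used a matched
# comparison group (1) or not (0). All numbers are invented for illustration.
d = np.array([0.42, 0.31, 0.55, 0.12, 0.20, 0.48, 0.05, 0.15])
v = np.array([0.02, 0.03, 0.05, 0.02, 0.04, 0.03, 0.02, 0.03])
matched = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0])

# Fixed-effect meta-regression: weighted least squares with weights 1/v.
X = np.column_stack([np.ones_like(d), matched])
W = np.diag(1.0 / v)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)
print(f"mean effect for unmatched designs:  {beta[0]:.3f}")
print(f"shift when a matched group is used: {beta[1]:.3f}")
```

In a real synthesis the moderator matrix would carry many more coded design and population features, and a random-effects variance component would usually be added.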
A Game Map Complexity Measure Based on Hamming Distance
NASA Astrophysics Data System (ADS)
Li, Yan; Su, Pan; Li, Wenliang
With the booming PC game market, game AI has attracted more and more research. The interest and difficulty of a game are related to the map used in its scenarios. Besides, the path-finding efficiency in a game is also affected by the complexity of the map used. In this paper, a novel complexity measure based on Hamming distance, called the Hamming complexity, is introduced. This measure is able to estimate the complexity of a binary tileworld. We experimentally demonstrated that Hamming complexity is highly correlated with the efficiency of the A* algorithm, and it is therefore a useful reference for designers when developing a game map.
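The abstract does not spell out the exact formula, so the sketch below implements one plausible reading: the mean Hamming distance between adjacent rows and columns of a binary tile map. The function name and normalization are assumptions, not the paper's definition.

```python
import numpy as np

def hamming_complexity(grid):
    """Mean Hamming distance between adjacent rows and columns of a binary
    tile map (0 = free, 1 = obstacle). One plausible reading of the measure;
    the paper's exact definition may differ."""
    g = np.asarray(grid)
    row_diff = np.abs(np.diff(g, axis=0)).sum()   # vertical neighbor pairs
    col_diff = np.abs(np.diff(g, axis=1)).sum()   # horizontal neighbor pairs
    n_pairs = (g.shape[0] - 1) * g.shape[1] + g.shape[0] * (g.shape[1] - 1)
    return (row_diff + col_diff) / n_pairs

uniform = np.zeros((8, 8), dtype=int)             # open field: easy for A*
checker = np.indices((8, 8)).sum(axis=0) % 2      # maximally "busy" map
print(hamming_complexity(uniform), hamming_complexity(checker))  # 0.0 1.0
```

Under this reading, a uniform map scores 0 and a checkerboard scores 1, matching the intuition that "busier" maps are harder for path-finding.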
Moving bed reactor setup to study complex gas-solid reactions.
Gupta, Puneet; Velazquez-Vargas, Luis G; Valentine, Charles; Fan, Liang-Shih
2007-08-01
A moving bed scale reactor setup for studying complex gas-solid reactions has been designed in order to obtain kinetic data for scale-up purposes. In this bench-scale reactor setup, gas and solid reactants can be contacted in a cocurrent and countercurrent manner at high temperatures. Gas and solid sampling can be performed through the reactor bed, with their composition profiles determined at steady state. The reactor setup can be used to evaluate and corroborate model parameters accounting for intrinsic reaction rates in both simple and complex gas-solid reaction systems. The moving bed design allows experimentation over a variety of gas and solid compositions in a single experiment, unlike differential bed reactors where the gas composition is usually fixed. The data obtained from the reactor can also be used for direct scale-up of designs for moving bed reactors.
NASA Technical Reports Server (NTRS)
Gelder, Thomas F.; Moore, Royce D.; Shyne, Rickey J.; Boldman, Donald R.
1987-01-01
Two turning vane designs were experimentally evaluated for the fan-drive corner (corner 2) coupled to an upstream diffuser and the high-speed corner (corner 1) of the 0.1 scale model of NASA Lewis Research Center's proposed Altitude Wind Tunnel. For corner 2 both a controlled-diffusion vane design (vane A4) and a circular-arc vane design (vane B) were studied. The corner 2 total pressure loss coefficient was about 0.12 with either vane design. This was about 25 percent less loss than when corner 2 was tested alone. Although the vane A4 design has the advantage of 20 percent fewer vanes than the vane B design, its vane shape is more complex. The effects of simulated inlet flow distortion on the overall losses for corner 1 or 2 were small.
DNA curtains for high-throughput single-molecule optical imaging.
Greene, Eric C; Wind, Shalom; Fazio, Teresa; Gorman, Jason; Visnapuu, Mari-Liis
2010-01-01
Single-molecule approaches provide a valuable tool in the arsenal of the modern biologist, and new discoveries continue to be made possible through the use of these state-of-the-art technologies. However, it can be inherently difficult to obtain statistically relevant data from experimental approaches specifically designed to probe individual reactions. This problem is compounded with more complex biochemical reactions, heterogeneous systems, and/or reactions requiring the use of long DNA substrates. Here we give an overview of a technology developed in our laboratory, which relies upon simple micro- or nanofabricated structures in combination with "bio-friendly" lipid bilayers, to align thousands of long DNA molecules into defined patterns on the surface of a microfluidic sample chamber. We call these "DNA curtains," and we have developed several different versions varying in complexity and DNA substrate configuration, which are designed to meet different experimental needs. This novel approach to single-molecule imaging provides a powerful experimental platform that offers the potential for concurrent observation of hundreds or even thousands of protein-DNA interactions in real time. Copyright 2010 Elsevier Inc. All rights reserved.
Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua
2016-01-01
A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism have limited the development of this promising method. Herein, this work seeks to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated for the first time the roles of the different forces involved (hydrophobic interaction, π-π stacking, hydrogen bonding and electrostatic interaction). Application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment, under theory-based experimental design, of trace analytes from environmental, biological and clinical samples. PMID:27924944
Development of a new continuous process for mixing of complex non-Newtonian fluids
NASA Astrophysics Data System (ADS)
Migliozzi, Simona; Mazzei, Luca; Sochon, Bob; Angeli, Panagiota; Thames Multiphase Team; Coral Project Collaboration
2017-11-01
Design of new continuous mixing operations poses many challenges, especially when dealing with highly viscous non-Newtonian fluids. Knowledge of complex rheological behaviour of the working mixture is crucial for development of an efficient process. In this work, we investigate the mixing performance of two different static mixers and the effects of the mixture rheology on the manufacturing of novel non-aqueous-based oral care products using experimental and computational fluid dynamic methods. The two liquid phases employed, i.e. a carbomer suspension in polyethylene glycol and glycerol, start to form a gel when they mix. We studied the structure evolution of the liquid mixture using time-resolved rheometry and we obtained viscosity rheograms at different phase ratios from pressure drop measurements in a customized mini-channel. The numerical results and rheological model were validated with experimental measurements carried out in a specifically designed setup. EPSRS-CORAL.
Study/experimental/research design: much more than statistics.
Knight, Kenneth L
2010-01-01
The purpose of study, experimental, or research design in scientific manuscripts has changed significantly over the years. It has evolved from an explanation of the design of the experiment (ie, data gathering or acquisition) to an explanation of the statistical analysis. This practice makes "Methods" sections hard to read and understand. To clarify the difference between study design and statistical analysis, to show the advantages of a properly written study design on article comprehension, and to encourage authors to correctly describe study designs. The role of study design is explored from the introduction of the concept by Fisher through modern-day scientists and the AMA Manual of Style. At one time, when experiments were simpler, the study design and statistical design were identical or very similar. With the complex research that is common today, which often includes manipulating variables to create new variables and multiple (and different) analyses of a single data set, data collection is very different from statistical design. Thus, both a study design and a statistical design are necessary. Scientific manuscripts will be much easier to read and comprehend. A proper experimental design serves as a road map to the study methods, helping readers to understand more clearly how the data were obtained and, therefore, assisting them in properly analyzing the results.
ERIC Educational Resources Information Center
Doganca Kucuk, Zerrin; Saysel, Ali Kerem
2018-01-01
A systems-based classroom intervention on environmental education was designed for seventh grade students; the results were evaluated to see its impact on the development of systems thinking skills and standard science achievement and whether the systems approach is a more effective way to teach environmental issues that are dynamic and complex. A…
Intuitive web-based experimental design for high-throughput biomedical data.
Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven
2015-01-01
Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data is accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be interesting for failure analysis or further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and meta-information for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.
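The core of a factor-based design is expanding the studied conditions into one row per sample with a stable identifier. A minimal sketch of that expansion follows; the factor names, sheet columns and ID scheme are hypothetical, not the system's actual format.

```python
from itertools import product

def sample_sheet(factors, replicates=1):
    """Expand a dict of {factor: levels} into a full-factorial sample sheet,
    one row per sample, with generated IDs. Column layout is illustrative."""
    names = list(factors)
    rows = []
    combos = product(*factors.values())
    for i, (combo, rep) in enumerate(
            product(combos, range(1, replicates + 1)), start=1):
        row = {"sample_id": f"S{i:03d}", "replicate": rep}
        row.update(dict(zip(names, combo)))
        rows.append(row)
    return rows

# Hypothetical three-factor study with two biological replicates:
design = {"genotype": ["WT", "KO"], "treatment": ["ctrl", "drug"], "time_h": [0, 24]}
sheet = sample_sheet(design, replicates=2)
print(len(sheet))   # 2 x 2 x 2 combinations x 2 replicates = 16 rows
print(sheet[0])
```

Each row can then be serialized to a spreadsheet and handed to the data generation facility, so measured files can later be joined back to the design by `sample_id`.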
Evaluation of complex community-based childhood obesity prevention interventions.
Karacabeyli, D; Allender, S; Pinkney, S; Amed, S
2018-05-16
Multi-setting, multi-component community-based interventions have shown promise in preventing childhood obesity; however, evaluation of these complex interventions remains a challenge. The objective of the study is to systematically review published methodological approaches to outcome evaluation for multi-setting community-based childhood obesity prevention interventions and synthesize a set of pragmatic recommendations. MEDLINE, CINAHL and PsycINFO were searched from inception to 6 July 2017. Papers were included if the intervention targeted children ≤18 years, engaged at least two community sectors and described their outcome evaluation methodology. A single reviewer conducted title and abstract scans, full article review and data abstraction. Directed content analysis was performed by three reviewers to identify prevailing themes. Thirty-three studies were included, and of these, 26 employed a quasi-experimental design; the remaining were randomized control trials. Body mass index was the most commonly measured outcome, followed by health behaviour change and psychosocial outcomes. Six themes emerged, highlighting advantages and disadvantages of active vs. passive consent, quasi-experimental vs. randomized control trials, longitudinal vs. repeat cross-sectional designs and the roles of process evaluation and methodological flexibility in evaluating complex interventions. Selection of study designs and outcome measures compatible with community infrastructure, accompanied by process evaluation, may facilitate successful outcome evaluation. © 2018 World Obesity Federation.
Mennini, N; Furlanetto, S; Cirri, M; Mura, P
2012-01-01
The aim of the present work was to develop a new multiparticulate system, designed for colon-specific delivery of celecoxib for both systemic (in chronotherapic treatment of arthritis) and local (in prophylaxis of colon carcinogenesis) therapy. The system simultaneously benefits from ternary complexation with hydroxypropyl-β-cyclodextrin and PVP (polyvinylpyrrolidone), to increase drug solubility, and vectorization in chitosan-Ca-alginate microspheres, to exploit the colon-specific carrier properties of these polymers. Statistical experimental design was employed to investigate the combined effect of four formulation variables, i.e., the percentages of alginate, CaCl₂ and chitosan and the cross-linking time, on microsphere entrapment efficiency (EE%) and on the drug amount released after 4 h in colonic medium, considered as the responses to be optimized. Design of experiments was used in the context of Quality by Design, which requires a multivariate approach for understanding the multifactorial relationships among formulation parameters. A Doehlert design allowed for defining a design space, which revealed that variations of the considered factors had in most cases an opposite influence on the responses. A desirability function was used to attain simultaneous optimization of both responses. The desired goals were achieved for both systemic and local use of celecoxib. Experimental values obtained from the optimized formulations were in both cases very close to the predicted values, thus confirming the validity of the generated mathematical model. These results demonstrated the effectiveness of the proposed joint use of drug-cyclodextrin complexation and chitosan-Ca-alginate microsphere vectorization, as well as the usefulness of the multivariate approach for the preparation of colon-targeted celecoxib microspheres with optimized properties. Copyright © 2011 Elsevier B.V. All rights reserved.
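Desirability-based co-optimization of several responses, as used above, is conventionally done in the Derringer-Suich form: each response is mapped to [0, 1] and the overall desirability is their geometric mean. The sketch below uses made-up bounds and response values, not the paper's data.

```python
from math import prod

def d_max(y, lo, hi):
    """Desirability for a response to be maximised: 0 below lo, 1 above hi."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return (y - lo) / (hi - lo)

def d_target(y, lo, target, hi):
    """Desirability for a response to hit a target inside (lo, hi)."""
    if y <= lo or y >= hi:
        return 0.0
    if y <= target:
        return (y - lo) / (target - lo)
    return (hi - y) / (hi - target)

def overall(ds):
    """Overall desirability: geometric mean of the individual desirabilities."""
    return prod(ds) ** (1 / len(ds))

# Illustrative numbers only: EE% to be maximised between 40 and 90,
# 4-h colonic release targeted at 80 % within acceptable bounds 60-95.
D = overall([d_max(75, 40, 90), d_target(78, 60, 80, 95)])
print(round(D, 3))  # 0.794
```

A formulation is then chosen by searching the design space for the factor settings that maximise `overall`; the geometric mean ensures any single unacceptable response (desirability 0) rejects the whole candidate.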
Exploiting Complexity Information for Brain Activation Detection
Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui
2016-01-01
We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of voxel complexity. Under this hypothesis, voxel complexity is modulated in pertinent cognitive tasks and changes across experimental paradigms. We calculate the complexity of sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general linear model based Statistical Parametric Mapping package (SPM12), and a clear difference is observed. This is because the SampEn method detects brain complexity changes between the two experimental conditions, and as a data-driven method it evaluates only the complexity of the specific fMRI sequence. Moreover, larger and smaller SampEn values carry different meanings, and the neutral-blank design produces higher predictability than the threat-neutral design. Complexity information can be considered a complementary method to existing fMRI analysis strategies, and it may help improve the understanding of human brain function from a different perspective. PMID:27045838
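Sample entropy itself is a standard, well-defined statistic: for a series of length N, count template matches of length m and m+1 within a tolerance r (usually a fraction of the series' standard deviation) and take the negative log of their ratio. A minimal textbook sketch follows (real fMRI pipelines add preprocessing; the parameter choices here are just common defaults):

```python
from math import log
import random

def sampen(x, m=2, r=0.2):
    """Sample entropy of a 1-D series; tolerance r is a fraction of the SD.
    Minimal illustrative implementation (Chebyshev distance, no self-matches)."""
    n = len(x)
    mu = sum(x) / n
    sd = (sum((v - mu) ** 2 for v in x) / n) ** 0.5
    tol = r * sd

    def matches(mm):
        templates = [x[i:i + mm] for i in range(n - mm + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    hits += 1
        return hits

    b, a = matches(m), matches(m + 1)          # length-m vs length-(m+1) matches
    return -log(a / b) if a and b else float("inf")

regular = [0, 1] * 50                          # perfectly predictable signal
random.seed(0)
noisy = [random.random() for _ in range(100)]  # irregular signal
print(sampen(regular) < sampen(noisy))         # True: regularity -> low SampEn
```

Low SampEn means high predictability, which is why the abstract can interpret the neutral-blank condition's higher predictability as lower entropy.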
A Robust Adaptive Autonomous Approach to Optimal Experimental Design
NASA Astrophysics Data System (ADS)
Gu, Hairong
Experimentation is the fundamental tool of scientific inquiry for understanding the laws governing nature and human behavior. Many complex real-world experimental scenarios, particularly those in quest of prediction accuracy, encounter difficulties when conducted with existing experimental procedures, for the following two reasons. First, the existing experimental procedures require a parametric model to serve as the proxy of the latent data structure or data-generating mechanism at the beginning of an experiment. However, for the experimental scenarios of concern, a sound model is often unavailable before an experiment. Second, those experimental scenarios usually contain a large number of design variables, which potentially leads to a lengthy and costly data collection cycle. Moreover, the existing experimental procedures are unable to optimize large-scale experiments so as to minimize experimental length and cost. Facing these two challenges, the aim of the present study is to develop a new experimental procedure that allows an experiment to be conducted without the assumption of a parametric model while still achieving satisfactory prediction, and that optimizes experimental designs to improve the efficiency of an experiment. The new experimental procedure developed in the present study is named the robust adaptive autonomous system (RAAS). RAAS is a procedure for sequential experiments composed of multiple experimental trials, which performs function estimation, variable selection, reverse prediction and design optimization on each trial.
Directly addressing the challenges in those experimental scenarios of concern, function estimation and variable selection are performed by data-driven modeling methods to generate a predictive model from data collected during the course of an experiment, thus exempting the requirement of a parametric model at the beginning of an experiment; design optimization is performed to select experimental designs on the fly, based on their usefulness, so that the fewest designs are needed to reach useful inferential conclusions. Technically, function estimation is realized by Bayesian P-splines, variable selection by a Bayesian spike-and-slab prior, reverse prediction by grid search, and design optimization by concepts from active learning. The present study demonstrated that RAAS achieves statistical robustness by making accurate predictions without assuming a parametric model as the proxy of the latent data structure, whereas the existing procedures can draw poor statistical inferences if a misspecified model is assumed; RAAS also achieves inferential efficiency by taking fewer designs to acquire useful statistical inferences than non-optimal procedures. Thus, RAAS is expected to be a principled solution for real-world experimental scenarios pursuing robust prediction and efficient experimentation.
Design of high-strength refractory complex solid-solution alloys
Singh, Prashant; Sharma, Aayush; Smirnov, A. V.; ...
2018-03-28
Nickel-based superalloys and near-equiatomic high-entropy alloys containing molybdenum are known for higher temperature strength and corrosion resistance. Yet, complex solid-solution alloys offer a huge design space to tune for optimal properties at slightly reduced entropy. For refractory Mo-W-Ta-Ti-Zr, we showcase KKR electronic structure methods via the coherent-potential approximation to identify alloys over a five-dimensional design space with improved mechanical properties and the necessary global (formation enthalpy) and local (short-range order) stability. Deformation is modeled with classical molecular dynamics simulations, validated against our first-principles data. We predict complex solid-solution alloys of improved stability with greatly enhanced modulus of elasticity (3× at 300 K) over near-equiatomic cases, as validated experimentally, and with higher moduli above 500 K over commercial alloys (2.3× at 2000 K). We also show that optimal complex solid-solution alloys are not described well by classical potentials due to critical electronic effects.
NASA Astrophysics Data System (ADS)
Doganca Kucuk, Zerrin; Saysel, Ali Kerem
2017-03-01
A systems-based classroom intervention on environmental education was designed for seventh grade students; the results were evaluated to see its impact on the development of systems thinking skills and standard science achievement and whether the systems approach is a more effective way to teach environmental issues that are dynamic and complex. A quasi-experimental methodology was used to compare performances of the participants in various dimensions, including systems thinking skills, competence in dynamic environmental problem solving and success in science achievement tests. The same pre-, post- and delayed tests were used with both the comparison and experimental groups in the same public middle school in Istanbul. Classroom activities designed for the comparison group (N = 20) followed the directives of the Science and Technology Curriculum, while the experimental group (N = 22) covered the same subject matter through activities benefiting from systems tools and representations such as behaviour over time graphs, causal loop diagrams, stock-flow structures and hands-on dynamic modelling. After a one-month systems-based instruction, the experimental group demonstrated significantly better systems thinking and dynamic environmental problem solving skills. Achievement in dynamic problem solving was found to be relatively stable over time. However, standard science achievement did not improve at all. This paper focuses on the quantitative analysis of the results, the weaknesses of the curriculum and educational implications.
ERIC Educational Resources Information Center
Yanson, Regina
2012-01-01
For e-learning initiatives to succeed, they must be designed to support a variety of trainees, methods, and content. Two important considerations in the design of any learning environment are the complexity of the tasks being learned and the socialization and connections of the trainees. Therefore, the goal of this research was to investigate how…
D'Onofrio, Brian M.; Lahey, Benjamin B.; Turkheimer, Eric; Lichtenstein, Paul
2013-01-01
Researchers have identified environmental risks that predict subsequent psychological and medical problems. Based on these correlational findings, researchers have developed and tested complex developmental models and have examined biological moderating factors (e.g., gene–environment interactions). In this context, we stress the critical need for researchers to use family-based, quasi-experimental designs when trying to integrate genetic and social science research involving environmental variables because these designs rigorously examine causal inferences by testing competing hypotheses. We argue that sibling comparison, offspring of twins or siblings, in vitro fertilization designs, and other genetically informed approaches play a unique role in bridging gaps between basic biological and social science research. We use studies on maternal smoking during pregnancy to exemplify these principles. PMID:23927516
NASA Astrophysics Data System (ADS)
Roesch, Frank; Nerb, Josef; Riess, Werner
2015-03-01
Our study investigated whether problem-oriented designed ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of experimental problem-solving ability better than conventional lessons in science. We used a paper-and-pencil test to assess students' abilities in a quasi-experimental intervention study utilizing a pretest/posttest control-group design (N = 340; average performing sixth-grade students). The treatment group received lessons on forest ecosystems consistent with the principle of education for sustainable development. This learning environment was expected to help students enhance their ecological knowledge and their theoretical and methodological experimental competencies. Two control groups received either the teachers' usual lessons on forest ecosystems or non-specific lessons on other science topics. We found that the treatment promoted specific components of experimental problem-solving ability (generating epistemic questions, planning two-factorial experiments, and identifying correct experimental controls). However, the observed effects were small, and awareness for aspects of higher ecological experimental validity was not promoted by the treatment.
Singh, Vijay Pal; Pratap, Kunal; Sinha, Juhi; Desiraju, Koundinya; Bahal, Devika; Kukreti, Ritushree
2016-12-01
Animal experiments that are conducted worldwide contribute to significant findings and breakthroughs in the understanding of the underlying mechanisms of various diseases, bringing up appropriate clinical interventions. However, their predictive value is often low, leading to translational failure. Problems like translational failure of animal studies and poorly designed animal experiments lead to loss of animal lives and less translatable data which affect research outcomes ethically and economically. Due to increasing complexities in animal usage with changes in public perception and stringent guidelines, it is becoming difficult to use animals for conducting studies. This review deals with challenges like poor experimental design and ethical concerns and discusses key concepts like sample size, statistics in experimental design, humane endpoints, economic assessment, species difference, housing conditions, and systematic reviews and meta-analyses that are often neglected. If practiced, these strategies can refine the procedures effectively and help translate the outcomes efficiently. © The Author(s) 2016.
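Of the key concepts listed above, sample size planning is the most readily made concrete. A common sketch (not from the review) is the normal-approximation formula for a two-group comparison, n = 2((z₁₋α/₂ + z_power)·σ/δ)² per group:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample comparison of means:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta) ** 2.
    delta is the smallest effect worth detecting; sd the common SD."""
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha / 2) + z(power)) * sd / delta) ** 2
    return ceil(n)

# Detecting a 1-SD group difference at 5 % significance and 80 % power:
print(n_per_group(delta=1.0, sd=1.0))  # 16 animals per group
```

Planning at this level, before an experiment, is exactly the kind of design refinement the review argues can reduce both animal use and translational failure; halving the detectable effect roughly quadruples the required group size.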
The Penn State Safety Floor: Part I--Design parameters associated with walking deflections.
Casalena, J A; Ovaert, T C; Cavanagh, P R; Streit, D A
1998-08-01
A new flooring system has been developed to reduce peak impact forces to the hips when humans fall. The new safety floor is designed to remain relatively rigid under normal walking conditions, but to deform elastically when impacted during a fall. Design objectives included minimizing peak force experienced by the femur during a fall-induced impact, while maintaining a maximum of 2 mm of floor deflection during walking. Finite Element Models (FEMs) were developed to capture the complex dynamics of impact response between two deformable bodies. Validation of the finite element models included analytical calculations of theoretical buckling column response, experimental quasi-static loading of full-scale flooring prototypes, and flooring response during walking trials. Finite Element Method results compared well with theoretical and experimental data. Both finite element and experimental data suggest that the proposed safety floor can effectively meet the design goal of 2 mm maximum deflection during walking, while effectively reducing impact forces during a fall.
Army Field-Oriented S&T Experimentation Venues: A Comparative Analysis
2011-09-01
Microclimate Cooling Station (MCCS)). The Fort Benning AEWE provides the venue and the data collection and analysis. The costs to the S&T... forest, fields, etc.) and is designated as an Army experimental station with access to ground and an aerial fleet. Technology developers have optional... YTC), (2) tropical (the Tropic Regions Test Center, Panama Canal Zone), and (3) cold weather (CRTC, Bolio Lake Test Complex, AK).
Jaraíz, Martín; Enríquez, Lourdes; Pinacho, Ruth; Rubio, José E; Lesarri, Alberto; López-Pérez, José L
2017-04-07
A novel DFT-based Reaction Kinetics (DFT-RK) simulation approach, employed in combination with real-time data from reaction monitoring instrumentation (like UV-vis, FTIR, Raman, and 2D NMR benchtop spectrometers), is shown to provide a detailed methodology for the analysis and design of complex synthetic chemistry schemes. As an example, it is applied to the opening of epoxides by titanocene in THF, a catalytic system with abundant experimental data available. Through a DFT-RK analysis of real-time IR data, we have developed a comprehensive mechanistic model that opens new perspectives to understand previous experiments. Although derived specifically from the opening of epoxides, the prediction capabilities of the model, built on elementary reactions, together with its practical side (reaction kinetics simulations of real experimental conditions) make it a useful simulation tool for the design of new experiments, as well as for the conception and development of improved versions of the reagents. From the perspective of the methodology employed, because both the computational (DFT-RK) and the experimental (spectroscopic data) components can follow the time evolution of several species simultaneously, it is expected to provide a helpful tool for the study of complex systems in synthetic chemistry.
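The DFT-RK idea couples DFT-derived rate constants to a kinetic simulation of the monitored concentrations. The minimal numerical core is just an ODE integration of elementary rate laws; the toy two-step scheme A → I → P below, with made-up rate constants, stands in for the paper's titanocene/epoxide mechanism:

```python
def simulate(k1, k2, c0, dt=0.01, t_end=10.0):
    """Explicit-Euler integration of the toy scheme A -> I -> P with
    first-order elementary steps. Rate constants would come from DFT
    barriers in a DFT-RK workflow; the values used here are illustrative."""
    a, i, p = c0, 0.0, 0.0
    t, trace = 0.0, [(0.0, c0, 0.0, 0.0)]
    while t < t_end:
        da = -k1 * a            # A consumed by step 1
        di = k1 * a - k2 * i    # I formed by step 1, consumed by step 2
        dp = k2 * i             # P formed by step 2
        a, i, p = a + da * dt, i + di * dt, p + dp * dt
        t += dt
        trace.append((t, a, i, p))
    return trace

trace = simulate(k1=1.0, k2=0.5, c0=1.0)
t, a, i, p = trace[-1]
print(round(a + i + p, 6))  # mass balance is conserved: 1.0
```

In practice such simulated traces are compared against real-time spectroscopic data (IR band intensities, etc.), and the mechanism or constants are revised until the curves agree; a stiff ODE solver would replace explicit Euler for realistic mechanisms.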
Clima, Lilia; Ursu, Elena L; Cojocaru, Corneliu; Rotaru, Alexandru; Barboiu, Mihail; Pinteala, Mariana
2015-09-28
The complexes formed by DNA and polycations have received great attention owing to their potential application in gene therapy. In this study, the binding efficiency between double-stranded oligonucleotides (dsDNA) and branched polyethylenimine (B-PEI) has been quantified by processing of the images captured from the gel electrophoresis assays. The central composite experimental design has been employed to investigate the effects of controllable factors on the binding efficiency. On the basis of experimental data and the response surface methodology, a multivariate regression model has been constructed and statistically validated. The model has enabled us to predict the binding efficiency depending on experimental factors, such as concentrations of dsDNA and B-PEI as well as the initial pH of solution. The optimization of the binding process has been performed using simplex and gradient methods. The optimal conditions determined for polyplex formation have yielded a maximal binding efficiency close to 100%. In order to reveal the mechanism of complex formation at the atomic-scale, a molecular dynamic simulation has been carried out. According to the computation results, B-PEI amine hydrogen atoms have interacted with oxygen atoms from dsDNA phosphate groups. These interactions have led to the formation of hydrogen bonds between macromolecules, stabilizing the polyplex structure.
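A central composite design, as used above, has a fixed structure in coded units: 2^k factorial corners, 2k axial (star) points at ±α, and replicated centre points. The generator below is a generic sketch; the factor labels and centre-replicate count are illustrative, not the study's actual run table.

```python
from itertools import product

def central_composite(k, alpha=None, n_center=3):
    """Coded points of a central composite design for k factors:
    2^k corners at +/-1, 2k axial points at +/-alpha, n_center centre runs.
    alpha defaults to the rotatable choice (2^k)**0.25."""
    alpha = alpha or (2 ** k) ** 0.25
    corners = [tuple(c) for c in product([-1.0, 1.0], repeat=k)]
    axial = []
    for j in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[j] = s
            axial.append(tuple(pt))
    center = [(0.0,) * k] * n_center
    return corners + axial + center

# Three coded factors, e.g. [dsDNA], [B-PEI] and initial pH:
design = central_composite(3)
print(len(design))  # 8 corners + 6 axial + 3 centre = 17 runs
```

The 17 coded runs are then mapped to real concentration and pH ranges, the binding efficiency measured at each, and a quadratic response surface fitted to those results for optimization.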
Using machine learning tools to model complex toxic interactions with limited sampling regimes.
Bertin, Matthew J; Moeller, Peter; Guillette, Louis J; Chapman, Robert W
2013-03-19
A major impediment to understanding the impact of environmental stress, including toxins and other pollutants, on organisms is that organisms are rarely challenged by only one or a few stressors in natural systems. Thus, linking laboratory experiments, which are limited by practical considerations to a few stressors at a few levels, to real-world conditions is constrained. In addition, while the existence of complex interactions among stressors can be identified by current statistical methods, these methods do not provide a means to construct mathematical models of these interactions. In this paper, we offer a two-step process by which complex interactions of stressors on biological systems can be modeled in an experimental design that is within the limits of practicality. We begin with the notion that environmental conditions circumscribe an n-dimensional hyperspace within which biological processes or end points are embedded. We then randomly sample this hyperspace to establish experimental conditions that span the range of the relevant parameters and conduct the experiment(s) based upon these selected conditions. Models of the complex interactions of the parameters are then extracted using machine learning tools, specifically artificial neural networks. This approach can rapidly generate highly accurate models of biological responses to complex interactions among environmentally relevant toxins, identify critical subspaces where nonlinear responses exist, and provide an expedient means of designing traditional experiments to test the impact of complex mixtures on biological responses. Further, this can be accomplished with an astonishingly small sample size.
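The first step, random sampling of the stressor hyperspace, can be sketched in a few lines; the stressor names and ranges below are hypothetical, and in the paper's workflow the resulting runs would feed a neural-network surrogate model rather than be inspected directly.

```python
import random

def sample_conditions(bounds, n, seed=0):
    """Draw n random experimental conditions from the hyperspace defined by
    per-stressor (low, high) bounds. A fixed seed keeps the design
    reproducible so the same conditions can be re-run or audited."""
    rng = random.Random(seed)
    return [{factor: rng.uniform(lo, hi) for factor, (lo, hi) in bounds.items()}
            for _ in range(n)]

# A hypothetical 4-stressor space: two toxin doses, temperature, salinity.
space = {"toxin_a_ppb": (0, 50), "toxin_b_ppb": (0, 20),
         "temp_C": (15, 30), "salinity_psu": (10, 35)}
runs = sample_conditions(space, n=25)
print(len(runs), sorted(runs[0]))
```

Because the points fall anywhere in the continuous hyperspace rather than on a factorial grid, even a small run count spans interaction regions a classical design would miss, which is what makes the subsequent machine-learning fit informative.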
Chang, Shu-Wei; Kuo, Shih-Yu; Huang, Ting-Hsuan
2017-01-01
This paper presents a novel experimental design for complex structural health monitoring (SHM) studies achieved by integrating 3D printing technologies, high-resolution laser displacement sensors, and multiscale entropy SHM theory. A seven-story structure with a variety of composite bracing systems was constructed using a dual-material 3D printer. A wireless Bluetooth vibration speaker was used to excite the ground floor of the structure, and high-resolution laser displacement sensors (1-μm resolution) were used to monitor the displacement history on different floors. Our results showed that the multiscale entropy SHM method could detect damage on the 3D-printed structures. The results of this study demonstrate that integrating 3D printing technologies and high-resolution laser displacement sensors enables the design of cheap, fast processing, complex, small-scale civil structures for future SHM studies. The novel experimental design proposed in this study provides a suitable platform for investigating the validity and sensitivity of SHM in different composite structures and damage conditions for real life applications in the future. PMID:29271937
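A minimal numpy sketch of the multiscale entropy computation used in such SHM studies is given below. It follows the standard coarse-graining plus sample-entropy recipe; the parameters m and r are conventional defaults, and the white-noise series is a stand-in for a measured displacement history, not data from the paper.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D series; tolerance r scales with the series' std."""
    x = np.asarray(x, float)
    r = r_factor * x.std()
    def match_count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d <= r) - len(t)          # exclude self-matches
    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2):
    """Coarse-grain the series at each scale, then compute sample entropy."""
    out = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = np.asarray(x[:n * tau], float).reshape(n, tau).mean(axis=1)
        out.append(sample_entropy(coarse, m=m))
    return np.array(out)

rng = np.random.default_rng(2)
signal = rng.normal(size=1000)   # stand-in for a measured displacement history
mse = multiscale_entropy(signal)
print(mse)                       # for white noise the MSE curve typically decreases
```

Damage detection then rests on comparing MSE curves from intact and damaged floors rather than on any single entropy value.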
NASA Astrophysics Data System (ADS)
Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun
2017-12-01
Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations at a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples, so as to reduce the computational cost of surrogate construction and consequently improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions of different dimensionality and complexity, in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method to two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
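An adaptive-design loop in the spirit of TEAD can be sketched as follows. This is a loose illustration, not the published algorithm: the 1-D test function, the random candidate pool, and the exact form of the hybrid score (a normalized distance term plus the mismatch between the surrogate and a first-order Taylor expansion from the nearest sample) are simplifying assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def f(x):                                    # stand-in for an expensive model run
    return np.sin(4 * x) + 0.5 * x

rng = np.random.default_rng(3)
X = np.array([[0.0], [0.5], [1.0]])          # small initial design (1-D parameter)
y = f(X[:, 0])

for _ in range(12):                          # adaptive sampling loop
    surr = RBFInterpolator(X, y)
    cand = rng.uniform(0, 1, size=(200, 1))  # random candidate pool
    # Exploration: distance from each candidate to its nearest existing sample
    dists = np.abs(cand - X.T)               # shape (candidates, samples)
    near = np.argmin(dists, axis=1)
    dist = dists[np.arange(len(cand)), near]
    # Exploitation: mismatch between the surrogate and a first-order Taylor
    # expansion from the nearest sample (gradient via central differences)
    h = 1e-4
    grad = (surr(X + h) - surr(X - h)) / (2 * h)
    taylor = y[near] + grad[near] * (cand[:, 0] - X[near, 0])
    mismatch = np.abs(surr(cand) - taylor)
    score = dist / dist.max() + mismatch / (mismatch.max() + 1e-12)
    x_new = cand[np.argmax(score)]           # most informative candidate
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new[0]))

# Accuracy of the final surrogate on a dense grid
xs = np.linspace(0, 1, 200)[:, None]
err = np.max(np.abs(RBFInterpolator(X, y)(xs) - f(xs[:, 0])))
print(f"final design size: {len(X)}, max surrogate error: {err:.3f}")
```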
NASA Astrophysics Data System (ADS)
Zhou, Qunfei
First-principles calculations based on quantum mechanics have proven powerful for accurately reproducing experimental results, uncovering the mysteries underlying experimental phenomena, and accelerating the design of innovative materials. This work was motivated by the demand to design next-generation thermionic emitting cathodes and techniques that allow the synthesis of photo-responsive polymers on complex surfaces with controlled thickness and patterns. For Os-coated tungsten thermionic dispenser cathodes, we used first-principles methods to explore the bulk and surface properties of W-Os alloys in order to explain the previously observed experimental phenomenon that thermionic emission varies significantly with W-Os alloy composition. Meanwhile, we developed a new quantum mechanical approach to quantitatively predict the thermionic emission current density from a materials perspective, without any semi-empirical approximations or complicated analytical models, leading to a better understanding of the thermionic emission mechanism. The methods from this work could be used to accelerate the design of next-generation thermionic cathodes. For photo-responsive materials, we designed a novel type of azobenzene-containing monomer for light-mediated ring-opening metathesis polymerization (ROMP) toward the fabrication of patterned, photo-responsive polymers, by controlling the ring strain energy (RSE) of the monomer that drives ROMP. This allows for unprecedented remote, noninvasive, instantaneous spatial and temporal control of photo-responsive polymer deposition on complex surfaces. This work on the above two different materials systems showed the power of quantum mechanical calculations in predicting, understanding, and discovering the structures and properties of both known and unknown materials in a fast, efficient, and reliable way.
Spanwise morphing trailing edge on a finite wing
NASA Astrophysics Data System (ADS)
Pankonien, Alexander M.; Inman, Daniel J.
2015-04-01
Unmanned Aerial Vehicles are prime targets for morphing implementation as they must adapt to large changes in flight conditions associated with locally varying wind or large changes in mass associated with payload delivery. The Spanwise Morphing Trailing Edge (SMTE) concept locally varies the trailing edge camber of a wing or control surface, functioning as a modular replacement for conventional ailerons without altering the spar box. Utilizing alternating active sections of Macro Fiber Composites (MFCs) driving internal compliant mechanisms and inactive sections of elastomeric honeycombs, the SMTE concept eliminates the geometric discontinuities associated with shape change, increasing aerodynamic performance. Previous work investigated a representative section of the SMTE concept and the effect of various skin designs on actuation authority. The current work experimentally evaluates the aerodynamic gains of the SMTE concept for a representative finite wing as compared with a conventional, articulated wing. The comparative performance of both wings is evaluated by measuring the drag penalty associated with achieving a design lift coefficient from an off-design angle of attack. To reduce experimental complexity, optimal control configurations are predicted with lifting line theory and experimentally measured control derivatives. Evaluated over a range of off-design flight conditions, this metric captures the comparative capability of both concepts to adapt or "morph" to changes in flight conditions. Even with this simplistic model, the SMTE concept is shown to reduce the drag penalty due to adaptation by up to 20% at off-design conditions, justifying the increase in mass and complexity and motivating concepts capable of larger displacement ranges, higher-fidelity modelling, and condition-sensing control.
NASA Astrophysics Data System (ADS)
Junior, Benedito Roberto Alvarenga; Soares, Frederico Luis Felipe; Ardila, Jorge Armando; Durango, Luis Guillermo Cuadrado; Forim, Moacir Rossi; Carneiro, Renato Lajarim
2018-01-01
The aim of this work was to quantify B-complex vitamins in pharmaceutical samples by the surface-enhanced Raman spectroscopy (SERS) technique using a gold colloid substrate. Synthesis of gold nanoparticles was performed according to an adapted Turkevich method. Initial assays suggested the orientation of the molecules on the gold nanoparticle surface. A central composite design was performed to obtain the highest SERS signal for nicotinamide and riboflavin. The parameters evaluated in the experimental design were the volume of AuNPs, the concentration of vitamins, and the sodium chloride concentration. The best condition for nicotinamide was NaCl 2.3 × 10⁻³ mol L⁻¹ and 700 μL of AuNPs colloid, and this same condition proved adequate to quantify thiamine. The experimental design for riboflavin showed the best condition at NaCl 1.15 × 10⁻² mol L⁻¹ and 2.8 mL of AuNPs colloid. It was possible to quantify thiamine and nicotinamide in the presence of other vitamins and excipients in two solid multivitamin formulations using the standard addition procedure. The standard addition curves presented R² higher than 0.96 for both nicotinamide and thiamine, at orders of magnitude of 10⁻⁷ and 10⁻⁸ mol L⁻¹, respectively. The nicotinamide content in a cosmetic gel sample was also quantified by direct analysis, presenting R² of 0.98. Student's t-test showed no significant difference with respect to the HPLC method. Despite the experimental design performed for riboflavin, its quantification in the commercial samples was not possible.
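The standard-addition quantification used above reduces to a linear fit whose x-intercept magnitude gives the unknown concentration. A small worked sketch with invented signal values (not the paper's data):

```python
import numpy as np

# Hypothetical standard-addition series: SERS signal after spiking the sample
# with known amounts of standard (concentrations in units of 1e-7 mol/L).
added  = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # added standard
signal = np.array([2.1, 4.0, 6.1, 7.9, 10.0])  # measured SERS intensity (a.u.)

slope, intercept = np.polyfit(added, signal, 1)
r2 = np.corrcoef(added, signal)[0, 1] ** 2

# The regression line crosses zero signal at added = -intercept/slope;
# the unknown concentration is the magnitude of that x-intercept.
c_unknown = intercept / slope
print(f"R^2 = {r2:.3f}, estimated concentration = {c_unknown:.2f} x 1e-7 mol/L")
```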
Xu, Yun; Muhamadali, Howbeer; Sayqal, Ali; Dixon, Neil; Goodacre, Royston
2016-10-28
Partial least squares (PLS) is one of the most commonly used supervised modelling approaches for analysing multivariate metabolomics data. PLS is typically employed as either a regression model (PLS-R) or a classification model (PLS-DA). However, in metabolomics studies it is common to investigate multiple, potentially interacting, factors simultaneously following a specific experimental design. Such data often cannot be considered as a "pure" regression or a classification problem. Nevertheless, these data have often still been treated as a regression or classification problem and this could lead to ambiguous results. In this study, we investigated the feasibility of designing a hybrid target matrix Y that better reflects the experimental design than simple regression or binary class membership coding commonly used in PLS modelling. The new design of Y coding was based on the same principle used by structural modelling in machine learning techniques. Two real metabolomics datasets were used as examples to illustrate how the new Y coding can improve the interpretability of the PLS model compared to classic regression/classification coding.
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1999-01-01
Unique and innovative graph theory, neural network, organizational modeling, and genetic algorithms are applied to the design and evolution of programmatic and organizational architectures. Graph theory representations of programs and organizations increase modeling capabilities and flexibility, while illuminating preferable programmatic/organizational design features. Treating programs and organizations as neural networks results in better system synthesis, and more robust data modeling. Organizational modeling using covariance structures enhances the determination of organizational risk factors. Genetic algorithms improve programmatic evolution characteristics, while shedding light on rulebase requirements for achieving specified technological readiness levels, given budget and schedule resources. This program of research improves the robustness and verifiability of systems synthesis tools, including the Complex Organizational Metric for Programmatic Risk Environments (COMPRE).
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements for sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. The design of new sensing materials is an important cornerstone of the effort to develop new sensors. Often, sensing materials are too complex for their performance to be predicted quantitatively at the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate the required new data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
Experimental study of adaptive pointing and tracking for large flexible space structures
NASA Technical Reports Server (NTRS)
Boussalis, D.; Bayard, D. S.; Ih, C.; Wang, S. J.; Ahmed, A.
1991-01-01
This paper describes an experimental study of adaptive pointing and tracking control for flexible spacecraft conducted on a complex ground experiment facility. The algorithm used in this study is based on a multivariable direct model reference adaptive control law. Several experimental validation studies were performed earlier using this algorithm for vibration damping and robust regulation, with excellent results. The current work extends previous studies by addressing the pointing and tracking problem. As is consistent with an adaptive control framework, the plant is assumed to be poorly known to the extent that only system level knowledge of its dynamics is available. Explicit bounds on the steady-state pointing error are derived as functions of the adaptive controller design parameters. It is shown that good tracking performance can be achieved in an experimental setting by adjusting adaptive controller design weightings according to the guidelines indicated by the analytical expressions for the error.
NASA Technical Reports Server (NTRS)
Jones, Gregory S.; Yao, Chung-Sheng; Allan, Brian G.
2006-01-01
Recent efforts in extreme short takeoff and landing aircraft configurations have renewed the interest in circulation control wing design and optimization. The key to accurately designing and optimizing these configurations rests in the modeling of the complex physics of these flows. This paper will highlight the physics of the stagnation and separation regions on two typical circulation control airfoil sections.
Study/Experimental/Research Design: Much More Than Statistics
Knight, Kenneth L.
2010-01-01
Context: The purpose of study, experimental, or research design in scientific manuscripts has changed significantly over the years. It has evolved from an explanation of the design of the experiment (ie, data gathering or acquisition) to an explanation of the statistical analysis. This practice makes "Methods" sections hard to read and understand. Objective: To clarify the difference between study design and statistical analysis, to show the advantages of a properly written study design on article comprehension, and to encourage authors to correctly describe study designs. Description: The role of study design is explored from the introduction of the concept by Fisher through modern-day scientists and the AMA Manual of Style. At one time, when experiments were simpler, the study design and statistical design were identical or very similar. With the complex research that is common today, which often includes manipulating variables to create new variables and multiple (and different) analyses of a single data set, data collection is very different from statistical design. Thus, both a study design and a statistical design are necessary. Advantages: Scientific manuscripts will be much easier to read and comprehend. A proper experimental design serves as a road map to the study methods, helping readers to understand more clearly how the data were obtained and, therefore, assisting them in properly analyzing the results. PMID:20064054
Production of footbridge with double curvature made of UHPC
NASA Astrophysics Data System (ADS)
Kolísko, J.; Čítek, D.; Tej, P.; Rydval, M.
2017-09-01
This article presents the mix design, preparation, and production of a thin-walled footbridge made from UHPFRC. An experimental pedestrian bridge was designed and built: a single-span bridge with a span of 10 m and a clear width of 1.50 m. Optimization of the UHPFRC matrix and of the material parameters led to the design of a very thin structure, with a total shell thickness of 30-45 mm. The bridge was cast as a prefabricated element in one piece; the self-compacting character of UHPFRC, with its high flowability, allowed the production of the final structure. Extensive research was done before production of the footbridge. Experimentally obtained data were compared with an extensive numerical analysis, and the final design of the structure and the UHPFRC matrix was optimized in many details. Two versions of large-scale mock-ups were cast and tested. Owing to the complexity of the whole experiment, the casting technology and production of the formwork were tested and optimized many times.
Optical fabrication of large area photonic microstructures by spliced lens
NASA Astrophysics Data System (ADS)
Jin, Wentao; Song, Meng; Zhang, Xuehua; Yin, Li; Li, Hong; Li, Lin
2018-05-01
We experimentally demonstrate a convenient approach to fabricating large area photorefractive photonic microstructures with a spliced lens device. Large area two-dimensional photonic microstructures are optically induced inside an iron-doped lithium niobate crystal. The experimental setup of our method is relatively compact and stable, requiring no complex alignment devices, and can be operated in almost any optics laboratory. We analyze the induced triangular lattice microstructures by plane wave guiding, far-field diffraction pattern imaging, and Brillouin-zone spectroscopy. By designing the spliced lens appropriately, the method can easily be extended to fabricate other complex large area photonic microstructures, such as quasicrystal microstructures. The induced photonic microstructures can be fixed, or erased and re-recorded, in the photorefractive crystal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jordan, Amy B.; Stauffer, Philip H.; Reed, Donald T.
The primary objective of the experimental effort described here is to aid in understanding the complex nature of liquid, vapor, and solid transport occurring around heated nuclear waste in bedded salt. In order to gain confidence in the predictive capability of numerical models, experimental validation must be performed to ensure that (a) hydrological and physiochemical parameters and (b) processes are correctly simulated. The experiments proposed here are designed to study aspects of the system that have not been satisfactorily quantified in prior work. In addition to exploring the complex coupled physical processes in support of numerical model validation, lessons learned from these experiments will facilitate preparations for larger-scale experiments that may utilize similar instrumentation techniques.
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Considering the complex content of virtual simulation experiments and the lack of guidance within them, key-event technology from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging", based on biological morphology, was taken as an example; many objects, behaviors, and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.
Numerical investigation of cavitation flow in journal bearing geometry
NASA Astrophysics Data System (ADS)
Riedel, M.; Schmidt, M.; Stücke, P.
2013-04-01
The appearance of cavitation is still a problem in technical and industrial applications. Especially in automotive internal combustion engines, hydrodynamic journal bearings are used due to their favourable wearing quality and operating characteristics. Cavitation flow inside the bearings reduces the load capacity and leads to a risk of material damage. Therefore an understanding of the complex flow phenomena inside the bearing is necessary for the design and development of hydrodynamic journal bearings. Experimental investigations of the fluid domain of the journal bearing are difficult to realize owing to the small dimensions of the bearing. In recent years, the advantages of computational fluid dynamics (CFD) have increasingly been used to investigate the details of cavitation flows. The analysis in the paper is carried out in a two-step approach. First, an experimental investigation of a journal bearing including cavitation is selected from the literature, and the complex numerical model is validated against the experimentally measured data. In a second step, typical design parameters, such as a groove and a feed hole, which are necessary to distribute the oil supply across the gap, are added to the model. The paper reflects on the influence of the design parameters used and the variation of the additional supply flow rate through the feed hole with regard to cavitation effects in the bearing. Detailed pictures of the three-dimensional flow structures and the cavitation regions inside the fluid film of the bearing are presented.
Complex interactions of multiple aquatic consumers: an experimental mesocosm manipulation
Richardson, William B.; Threlkeld, Stephen T.
1993-01-01
In 7-m³ outdoor tanks filled with lake water, the presence/absence of omnivorous young-of-the-year (Micropterus salmoides), zooplanktivorous Menidia beryllina, and herbivorous larval Hyla chrysocelis was experimentally manipulated. A cross-classified design was used to assess the interactive effects of these vertebrate consumers on the experimental food webs. The primary effects of the experimental manipulations on food web components were two- and three-way interactions in which the effect of a given treatment was dependent on the presence of another treatment. Results suggest that the addition or removal of consumers may not cause linear, additive changes in food webs.
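A cross-classified presence/absence design of this kind is typically analysed with a full-factorial linear model so that the two- and three-way interaction terms can be estimated. A hedged sketch with synthetic data, using statsmodels: the factor names mirror the abstract, but the response values, replicate count, and effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Hypothetical 2x2x2 presence/absence design with 4 replicate tanks per cell;
# the response (e.g. zooplankton density) and effect sizes are invented.
rows = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)
        for _ in range(4)]
d = pd.DataFrame(rows, columns=["bass", "silverside", "tadpole"])
d["zoop"] = (10 - 2 * d.bass - 3 * d.silverside
             + 4 * d.bass * d.silverside          # built-in two-way interaction
             + rng.normal(0, 0.5, len(d)))

# Full-factorial model: all main effects plus two- and three-way interactions
model = smf.ols("zoop ~ bass * silverside * tadpole", data=d).fit()
print(model.params["bass:silverside"])
```

A significant interaction coefficient like `bass:silverside` is exactly the non-additive, consumer-dependent effect the abstract describes.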
Qualitative models and experimental investigation of chaotic NOR gates and set/reset flip-flops
NASA Astrophysics Data System (ADS)
Rahman, Aminur; Jordan, Ian; Blackmore, Denis
2018-01-01
It has been observed through experiments and SPICE simulations that logical circuits based upon Chua's circuit exhibit complex dynamical behaviour. This behaviour can be used to design analogues of more complex logic families and some properties can be exploited for electronics applications. Some of these circuits have been modelled as systems of ordinary differential equations. However, as the number of components in newer circuits increases so does the complexity. This renders continuous dynamical systems models impractical and necessitates new modelling techniques. In recent years, some discrete dynamical models have been developed using various simplifying assumptions. To create a robust modelling framework for chaotic logical circuits, we developed both deterministic and stochastic discrete dynamical models, which exploit the natural recurrence behaviour, for two chaotic NOR gates and a chaotic set/reset flip-flop. This work presents a complete applied mathematical investigation of logical circuits. Experiments on our own designs of the above circuits are modelled and the models are rigorously analysed and simulated showing surprisingly close qualitative agreement with the experiments. Furthermore, the models are designed to accommodate dynamics of similarly designed circuits. This will allow researchers to develop ever more complex chaotic logical circuits with a simple modelling framework.
2010-01-01
Background The aim of this paper is to provide the rationale for an evaluation design for a complex intervention program targeting loneliness among non-institutionalized elderly people in a Dutch community. Complex public health interventions characteristically use the combined approach of intervening on the individual and on the environmental level. It is assumed that the components of a complex intervention interact with and reinforce each other. Furthermore, implementation is highly context-specific and its impact is influenced by external factors. Although the entire community is exposed to the intervention components, each individual is exposed to different components with a different intensity. Methods/Design A logic model of change is used to develop the evaluation design. The model describes what outcomes may logically be expected at different points in time at the individual level. In order to address the complexity of a real-life setting, the evaluation design of the loneliness intervention comprises two types of evaluation studies. The first uses a quasi-experimental pre-test post-test design to evaluate the effectiveness of the overall intervention. A control community comparable to the intervention community was selected, with baseline measurements in 2008 and follow-up measurements scheduled for 2010. This study focuses on changes in the prevalence of loneliness and in the determinants of loneliness within individuals in the general elderly population. Complementarily, the second study is designed to evaluate the individual intervention components and focuses on delivery, reach, acceptance, and short-term outcomes. Different means of project records and surveys among participants are used to collect these data. Discussion Combining these two evaluation strategies has the potential to assess the effectiveness of the overall complex intervention and the contribution of the individual intervention components thereto. PMID:20836840
NASA Technical Reports Server (NTRS)
Aldrin, John C.; Williams, Phillip A.; Wincheski, Russell (Buzz) A.
2008-01-01
A case study is presented for using models in eddy current NDE design for crack detection in Shuttle Reaction Control System thruster components. Numerical methods were used to address the complex geometry of the part and perform parametric studies of potential transducer designs. Simulations were found to show agreement with experimental results. Accurate representation of the coherent noise associated with the measurement and part geometry was found to be critical to properly evaluate the best probe designs.
Wang, Shu-Jian; Li, Ying; Wu, Di; Wang, Yin-Feng; Li, Zhi-Ru
2012-09-13
By means of density functional theory, a hexanuclear sandwich complex, [18]annulene-Li6-[18]annulene, which consists of a central Li6 hexagon ring and large face-capping [18]annulene ligands, is designed and investigated. The large interaction energy and HOMO-LUMO gap suggest that this novel charge-separated complex is highly stable and may be experimentally synthesized. In addition, the stability found in the [18]annulene-Li6-[18]annulene complex extends to multidecker sandwich clusters (Li6)n([18]annulene)n+1 (n = 2-3). The energy gain upon addition of a [18]annulene-Li6 unit to (Li6)n-1([18]annulene)n is substantial (96.97-98.22 kcal/mol), indicating that even larger multideckers will also be very stable. Similar to ferrocene, such a hexanuclear sandwich complex could be considered a versatile building block with potential applications in different areas of chemistry, such as nanoscience and materials science.
Development of Design Rules for Reliable Antisense RNA Behavior in E. coli.
Hoynes-O'Connor, Allison; Moon, Tae Seok
2016-12-16
A key driver of synthetic biology is the development of designable genetic parts with predictable behaviors that can be quickly implemented in complex genetic systems. However, the intrinsic complexity of gene regulation can make the rational design of genetic parts challenging. This challenge is apparent in the design of antisense RNA (asRNA) regulators. Though asRNAs are well-known regulators, the literature governing their design is conflicting and leaves the synthetic biology community without clear asRNA design rules. The goal of this study is to perform a comprehensive experimental characterization and statistical analysis of 121 unique asRNA regulators in order to resolve the conflicts that currently exist in the literature. asRNAs usually consist of two regions, the Hfq binding site and the target binding region (TBR). First, the behaviors of several high-performing Hfq binding sites were compared, in terms of their ability to improve repression efficiencies and their orthogonality. Next, a large-scale analysis of TBR design parameters identified asRNA length, the thermodynamics of asRNA-mRNA complex formation, and the percent of target mismatch as key parameters for TBR design. These parameters were used to develop simple asRNA design rules. Finally, these design rules were applied to construct both a simple and a complex genetic circuit containing different asRNAs, and predictable behavior was observed in both circuits. The results presented in this study will drive synthetic biology forward by providing useful design guidelines for the construction of asRNA regulators with predictable behaviors.
1988-05-01
affordable manpower investment. On the basis of our current experience it seems that the basic design principles are valid. The system developed will... system is operational on various computer networks, and in both industrial and research environments. The design principles for the construction of...to a useful numerical simulation and design system for very complex configurations and flows. 7. REFERENCES 1. Bartlett G. W., "An experimental
The use of experimental design to find the operating maximum power point of PEM fuel cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crăciunescu, Aurelian; Pătularu, Laurenţiu; Ciumbulea, Gloria
2015-03-10
Proton Exchange Membrane (PEM) Fuel Cells are difficult to model due to their complex nonlinear nature. In this paper, the development of a PEM Fuel Cells mathematical model based on the Design of Experiment methodology is described. The Design of Experiment provides a very efficient methodology to obtain a mathematical model for the studied multivariable system with only a few experiments. The obtained results can be used for optimization and control of the PEM Fuel Cells systems.
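The core of a Design of Experiments workflow like the one described is a small, orthogonal design matrix and a least-squares fit. A minimal sketch of a two-level full factorial screening follows; the factor names and simulated responses are hypothetical and not the authors' fuel-cell model.

```python
import itertools

def two_level_factorial(k):
    """All 2^k runs of a two-level full factorial design in coded units."""
    return list(itertools.product((-1.0, 1.0), repeat=k))

def fit_main_effects(design, responses):
    """Least-squares fit of y = b0 + sum(bi * xi). For a coded two-level
    factorial the columns are orthogonal, so each coefficient is simply
    the mean of the column-response products."""
    n = len(responses)
    k = len(design[0])
    b0 = sum(responses) / n
    effects = [sum(run[i] * y for run, y in zip(design, responses)) / n
               for i in range(k)]
    return b0, effects

# Hypothetical coded factors (e.g. cell temperature, membrane humidity)
# with simulated power responses, just to illustrate the fit.
design = two_level_factorial(2)
responses = [10.0 + 2.0 * x1 - 1.5 * x2 for x1, x2 in design]
b0, (b1, b2) = fit_main_effects(design, responses)
```

With only 2^k runs the fitted coefficients recover the main effects exactly when the response is linear in the coded factors, which is what makes the approach so economical in experiments.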
33 CFR 273.13 - Program policy.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Program is designed to deal primarily with weed infestations of major economic significance including... should constitute investigation of a specific problem weed or weed complex, not generalized surveys of... Control Program, except as such areas may be used for experimental purposes in research performed for the...
A multiple-alignment based primer design algorithm for genetically highly variable DNA targets
2013-01-01
Background Primer design for highly variable DNA sequences is difficult, and experimental success requires attention to many interacting constraints. The advent of next-generation sequencing methods allows the investigation of rare variants otherwise hidden deep in large populations, but requires attention to population diversity and primer localization in relatively conserved regions, in addition to recognized constraints typically considered in primer design. Results Design constraints include degenerate sites to maximize population coverage, matching of melting temperatures, optimizing de novo sequence length, finding optimal bio-barcodes to allow efficient downstream analyses, and minimizing risk of dimerization. To facilitate primer design addressing these and other constraints, we created a novel computer program (PrimerDesign) that automates this complex procedure. We show its powers and limitations and give examples of successful designs for the analysis of HIV-1 populations. Conclusions PrimerDesign is useful for researchers who want to design DNA primers and probes for analyzing highly variable DNA populations. It can be used to design primers for PCR, RT-PCR, Sanger sequencing, next-generation sequencing, and other experimental protocols targeting highly variable DNA samples. PMID:23965160
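One of the constraints mentioned, matching of melting temperatures, can be illustrated with the classic Wallace rule of thumb for short primers. This is a generic sketch, not part of the PrimerDesign program; the sequences are invented.

```python
def gc_content(primer):
    """Fraction of G/C bases in a primer sequence."""
    s = primer.upper()
    return (s.count("G") + s.count("C")) / len(s)

def wallace_tm(primer):
    """Wallace rule of thumb for short (~<14 nt) primers:
    Tm ~ 2*(A+T) + 4*(G+C), in degrees Celsius."""
    s = primer.upper()
    at = s.count("A") + s.count("T")
    gc = s.count("G") + s.count("C")
    return 2 * at + 4 * gc

def tm_matched(p1, p2, tol=2.0):
    """Check that two primers' estimated melting temperatures
    are within tol degrees of each other."""
    return abs(wallace_tm(p1) - wallace_tm(p2)) <= tol
```

Real primer design tools use nearest-neighbor thermodynamic models rather than this rule, but the matching constraint takes the same form: a pairwise tolerance check over candidate primers.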
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirkov, Leonid; Makarewicz, Jan, E-mail: jama@amu.edu.pl
An ab initio intermolecular potential energy surface (PES) has been constructed for the benzene-krypton (BKr) van der Waals (vdW) complex. The interaction energy has been calculated at the coupled cluster level of theory with single, double, and perturbatively included triple excitations using different basis sets. As a result, a few analytical PESs of the complex have been determined. They allowed a prediction of the complex structure and its vibrational vdW states. The vibrational energy level pattern exhibits a distinct polyad structure. Comparison of the equilibrium structure, the dipole moment, and vibrational levels of BKr with their experimental counterparts has allowed us to design an optimal basis set composed of a small Dunning's basis set for the benzene monomer, a larger effective core potential adapted basis set for Kr and additional midbond functions. Such a basis set yields vibrational energy levels that agree very well with the experimental ones as well as with those calculated from the available empirical PES derived from the microwave spectra of the BKr complex. The basis proposed can be applied to larger complexes including Kr because of a reasonable computational cost and accurate results.
NASA Astrophysics Data System (ADS)
Balaji, P. A.
1999-07-01
A cricket's ear is a directional acoustic sensor. It has a remarkable level of sensitivity to the direction of sound propagation in a narrow frequency bandwidth of 4-5 kHz. Because of its complexity, the directional sensitivity has long intrigued researchers. The cricket's ear is a four-acoustic-inputs/two-vibration-outputs system. In this dissertation, this system is examined in depth, both experimentally and theoretically, with a primary goal to understand the mechanics involved in directional hearing. Experimental identification of the system is done by using random signal processing techniques. Theoretical identification of the system is accomplished by analyzing sound transmission through the complex trachea of the ear. Finally, a description of how the cricket achieves directional hearing sensitivity is proposed. The fundamental principle involved in directional hearing of the cricket has been utilized to design a device to obtain a directional signal from non-directional inputs.
Boudreaux, Edwin D; Miller, Ivan; Goldstein, Amy B; Sullivan, Ashley F; Allen, Michael H; Manton, Anne P; Arias, Sarah A; Camargo, Carlos A
2013-09-01
Due to the concentration of individuals at risk for suicide, an emergency department visit represents an opportune time for suicide risk screening and intervention. The Emergency Department Safety Assessment and Follow-up Evaluation (ED-SAFE) uses a quasi-experimental, interrupted time series design to evaluate whether (1) a practical approach to universally screening ED patients for suicide risk leads to improved detection of suicide risk and (2) a multi-component intervention delivered during and after the ED visit improves suicide-related outcomes. This paper summarizes the ED-SAFE's study design and methods within the context of considerations relevant to effectiveness research in suicide prevention and pertinent human participants concerns. 1440 suicidal individuals from 8 general EDs nationally will be enrolled during three sequential phases of data collection (480 individuals/phase): (1) Treatment as Usual; (2) Universal Screening; and (3) Intervention. Data from the three phases will inform two separate evaluations: Screening Outcome (Phases 1 and 2) and Intervention (Phases 2 and 3). Individuals will be followed for 12 months. The primary study outcome is a composite reflecting completed suicide, attempted suicide, aborted or interrupted attempts, and implementation of rescue procedures during an outcome assessment. While 'classic' randomized controlled trials (RCTs) are typically selected over quasi-experimental designs, ethical and methodological issues may make an RCT a poor fit for complex interventions in an applied setting, such as the ED. ED-SAFE represents an innovative approach to examining the complex public health issue of suicide prevention through a multi-phase, quasi-experimental design embedded in 'real world' clinical settings. Copyright © 2013 Elsevier Inc. All rights reserved.
Big insights from small volumes: deciphering complex leukocyte behaviors using microfluidics
Irimia, Daniel; Ellett, Felix
2016-01-01
Inflammation is an indispensable component of the immune response, and leukocytes provide the first line of defense against infection. Although the major stereotypic leukocyte behaviors in response to infection are well known, the complexities and idiosyncrasies of these phenotypes in conditions of disease are still emerging. Novel tools are indispensable for gaining insights into leukocyte behavior, and in the past decade, microfluidic technologies have emerged as an exciting development in the field. Microfluidic devices are readily customizable, provide tight control of experimental conditions, enable high precision of ex vivo measurements of individual as well as integrated leukocyte functions, and have facilitated the discovery of novel leukocyte phenotypes. Here, we review some of the most interesting insights resulting from the application of microfluidic approaches to the study of the inflammatory response. The aim is to encourage leukocyte biologists to integrate these new tools into increasingly more sophisticated experimental designs for probing complex leukocyte functions. PMID:27194799
Design and Evaluation of Complex Moving HIFU Treatment Protocols
NASA Astrophysics Data System (ADS)
Kargl, Steven G.; Andrew, Marilee A.; Kaczkowski, Peter J.; Brayman, Andrew A.; Crum, Lawrence A.
2005-03-01
The use of moving high-intensity focused ultrasound (HIFU) treatment protocols is of interest in achieving efficient formation of large-volume thermal lesions in tissue. Judicious protocol design is critical in order to avoid collateral damage to healthy tissues outside the treatment zone. A KZK-BHTE model, extended to simulate multiple, moving scans in tissue, is used to investigate protocol design considerations. Predictions and experimental observations are presented which 1) validate the model, 2) illustrate how to assess the effects of acoustic nonlinearity, and 3) demonstrate how to assess and control collateral damage such as prefocal lesion formation and lesion formation resulting from thermal conduction without direct HIFU exposure. Experimental data consist of linear and circular scan protocols delivered over a range of exposure regimes in ex vivo bovine liver.
Failure behavior of generic metallic and composite aircraft structural components under crash loads
NASA Technical Reports Server (NTRS)
Carden, Huey D.; Robinson, Martha P.
1990-01-01
Failure behavior results are presented from crash dynamics research using concepts of aircraft elements and substructure not necessarily designed or optimized for energy absorption or crash loading considerations. To achieve desired new designs incorporating improved energy absorption capabilities often requires an understanding of how more conventional designs behave under crash loadings. Experimental and analytical data are presented which indicate some general trends in the failure behavior of a class of composite structures including individual fuselage frames, skeleton subfloors with stringers and floor beams without skin covering, and subfloors with skin added to the frame-stringer arrangement. Although the behavior is complex, a strong similarity in the static/dynamic failure behavior among these structures is illustrated through photographs of the experimental results and through analytical data of generic composite structural models.
Prediction of physical protein protein interactions
NASA Astrophysics Data System (ADS)
Szilágyi, András; Grimm, Vera; Arakaki, Adrián K.; Skolnick, Jeffrey
2005-06-01
Many essential cellular processes such as signal transduction, transport, cellular motion and most regulatory mechanisms are mediated by protein-protein interactions. In recent years, new experimental techniques have been developed to discover the protein-protein interaction networks of several organisms. However, the accuracy and coverage of these techniques have proven to be limited, and computational approaches remain essential both to assist in the design and validation of experimental studies and for the prediction of interaction partners and detailed structures of protein complexes. Here, we provide a critical overview of existing structure-independent and structure-based computational methods. Although these techniques have significantly advanced in the past few years, we find that most of them are still in their infancy. We also provide an overview of experimental techniques for the detection of protein-protein interactions. Although the developments are promising, false positive and false negative results are common, and reliable detection is possible only by taking a consensus of different experimental approaches. The shortcomings of experimental techniques affect both the further development and the fair evaluation of computational prediction methods. For an adequate comparative evaluation of prediction and high-throughput experimental methods, an appropriately large benchmark set of biophysically characterized protein complexes would be needed, but is sorely lacking.
Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...
2017-12-27
Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
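The hybrid "explore plus first-order Taylor residual" idea behind a score function of this kind can be illustrated in one dimension. This is a simplified sketch of the general concept, not the published TEAD algorithm; `f` and `df` are toy stand-ins for the expensive model and its derivative.

```python
def nearest(x, samples):
    """Existing sample closest to candidate x."""
    return min(samples, key=lambda s: abs(x - s))

def hybrid_score(f, df, x, samples, w=0.5):
    """TEAD-style hybrid score in 1-D: exploration (distance to the
    nearest sample) plus exploitation (first-order Taylor residual
    of the model at the nearest sample)."""
    s = nearest(x, samples)
    exploration = abs(x - s)
    exploitation = abs(f(x) - (f(s) + df(s) * (x - s)))
    return w * exploration + (1.0 - w) * exploitation

def next_sample(f, df, candidates, samples, w=0.5):
    """Greedy adaptive design: pick the highest-scoring candidate."""
    return max(candidates, key=lambda x: hybrid_score(f, df, x, samples, w))

# Toy "expensive model" and its derivative (stand-ins for a simulator).
f = lambda x: x * x
df = lambda x: 2.0 * x
chosen = next_sample(f, df, candidates=[0.5, 1.0, 1.5], samples=[0.0, 2.0])
```

Points far from existing samples and points where the model is poorly approximated by a local linear expansion both score highly, which is what steers sampling toward informative regions.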
NASA Astrophysics Data System (ADS)
Ribeiro, Eduardo Afonso; Lopes, Eduardo Márcio de Oliveira; Bavastri, Carlos Alberto
2017-12-01
Viscoelastic materials have played an important role in passive vibration control. Nevertheless, the use of such materials in supports of rotating machines, aiming at controlling vibration, is more recent, mainly when these supports present additional complexities like multiple degrees of freedom and require accurate models to predict the dynamic behavior of viscoelastic materials working in a broad band of frequencies and temperatures. Previously, the authors proposed a methodology for an optimal design of viscoelastic supports (VES) for vibration suppression in rotordynamics, which improves the dynamic prediction accuracy, the calculation speed, and the modeling of VES as complex structures. However, a comprehensive numerical study of the dynamics of rotor-VES systems, regarding the types and combinations of translational and rotational degrees of freedom (DOFs), accompanied by the corresponding experimental validation, is still lacking. This paper presents such a study considering different types and combinations of DOFs in addition to the simulation of their number of additional masses/inertias, as well as the kind and association of the applied viscoelastic materials (VEMs). The results - regarding unbalance frequency response, transmissibility and displacement due to static loads - support: 1) modeling VES as complex structures, which improves the efficacy of passive vibration control; 2) identifying the best configuration of DOFs and of VEM choice and association for practical applications involving passive vibration control and load resistance. The specific outcomes of the conducted experimental validation attest to the accuracy of the proposed methodology.
When Theater Comes to Engineering Design: Oh How Creative They Can Be.
Pfeiffer, Ferris M; Bauer, Rachel E; Borgelt, Steve; Burgoyne, Suzanne; Grant, Sheila; Hunt, Heather K; Pardoe, Jennie J; Schmidt, David C
2017-07-01
The creative process is fun, complex, and sometimes frustrating, but it is critical to the future of our nation and progress in science, technology, engineering, and mathematics (STEM), as well as other fields. Thus, we set out to see if implementing methods of active learning typical of the theater department could impact the creativity of senior capstone design students in the bioengineering (BE) department. Senior bioengineering capstone design students were allowed to self-select into groups. Prior to the beginning of coursework, all students completed a validated survey measuring engineering design self-efficacy. The control and experimental groups both received standard instruction, but in addition the experimental group received 1 h per week of creativity training developed by a theater professor. Following the semester, the students again completed the self-efficacy survey. The surveys were examined to identify differences in the initial and final self-efficacy in the experimental and control groups over the course of the semester. An analysis of variance was used to compare the experimental and control groups, with p < 0.05 considered significant. Students in the experimental group reported more than a twofold increase in confidence (4.8 (control) versus 10.9 (experimental)). Additionally, students in the experimental group were more motivated and less anxious when engaging in engineering design following the semester of creativity instruction. The results of this pilot study indicate that there is a significant potential to improve engineering students' creative self-efficacy through the implementation of a "curriculum of creativity" developed using theater methods.
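The between-group comparison described (analysis of variance at p < 0.05) reduces to a ratio of between-group to within-group variance. A hand-rolled one-way ANOVA F statistic is sketched below; the sample numbers are invented and are not the study's data.

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across independent groups:
    between-group mean square over within-group mean square."""
    all_vals = [v for g in groups for v in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented self-efficacy gains for a control and an experimental group.
F = one_way_anova_F([[4.8, 5.0, 4.6], [10.9, 11.1, 10.7]])
```

In practice the F value would be compared against the F distribution with (k-1, n-k) degrees of freedom (e.g. via `scipy.stats.f_oneway`) to obtain the p-value.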
Experimental Design for Parameter Estimation of Gene Regulatory Networks
Timmer, Jens
2012-01-01
Systems biology aims for building quantitative models to address unresolved issues in molecular biology. In order to describe the behavior of biological cells adequately, gene regulatory networks (GRNs) are intensively investigated. As the validity of models built for GRNs depends crucially on the kinetic rates, various methods have been developed to estimate these parameters from experimental data. For this purpose, it is favorable to choose the experimental conditions yielding maximal information. However, existing experimental design principles often rely on unfulfilled mathematical assumptions or become computationally demanding with growing model complexity. To solve this problem, we combined advanced methods for parameter and uncertainty estimation with experimental design considerations. As a showcase, we optimized three simulated GRNs in one of the challenges from the Dialogue for Reverse Engineering Assessment and Methods (DREAM). This article presents our approach, which was awarded the best performing procedure at the DREAM6 Estimation of Model Parameters challenge. For fast and reliable parameter estimation, local deterministic optimization of the likelihood was applied. We analyzed identifiability and precision of the estimates by calculating the profile likelihood. Furthermore, the profiles provided a way to uncover a selection of most informative experiments, from which the optimal one was chosen using additional criteria at every step of the design process. In conclusion, we provide a strategy for optimal experimental design and show its successful application on three highly nonlinear dynamic models. Although presented in the context of the GRNs to be inferred for the DREAM6 challenge, the approach is generic and applicable to most types of quantitative models in systems biology and other disciplines. PMID:22815723
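The profile-likelihood step described above can be sketched for a toy linear model: fix the parameter of interest and minimize the negative log-likelihood over the nuisance parameter. This is a generic illustration with synthetic data, not the DREAM6 models; a crude grid search stands in for a real optimizer.

```python
def nll(a, b, data):
    """Negative log-likelihood (up to a constant) of y = a*x + b
    with unit-variance Gaussian noise."""
    return 0.5 * sum((y - (a * x + b)) ** 2 for x, y in data)

def profile_nll(a, data, b_grid):
    """Profile likelihood for a: minimize the NLL over the nuisance
    parameter b."""
    return min(nll(a, b, data) for b in b_grid)

data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]   # synthetic, exactly y = 2x + 1
b_grid = [0.5 + 0.1 * i for i in range(11)]   # b on a grid over [0.5, 1.5]
profile = {a: profile_nll(a, data, b_grid) for a in (1.5, 2.0, 2.5)}
```

The shape of the profile around its minimum is what reveals identifiability: a flat profile means the data do not constrain the parameter, and steep flanks give confidence intervals via a likelihood-ratio threshold.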
Junior, Benedito Roberto Alvarenga; Soares, Frederico Luis Felipe; Ardila, Jorge Armando; Durango, Luis Guillermo Cuadrado; Forim, Moacir Rossi; Carneiro, Renato Lajarim
2018-01-05
The aim of this work was to quantify B-complex vitamins in pharmaceutical samples by the surface-enhanced Raman spectroscopy (SERS) technique using a gold colloid substrate. Synthesis of gold nanoparticles (AuNPs) was performed according to an adapted Turkevich method. Initial assays suggested the orientation of the molecules on the gold nanoparticle surface. A central composite design was performed to obtain the highest SERS signal for nicotinamide and riboflavin. The parameters evaluated in the experimental design were the volume of AuNPs, the concentration of vitamins, and the sodium chloride concentration. The best condition for nicotinamide was NaCl 2.3×10⁻³ mol L⁻¹ and 700 μL of AuNP colloid, and this same condition proved adequate to quantify thiamine. The experimental design for riboflavin gave the best condition at NaCl 1.15×10⁻² mol L⁻¹ and 2.8 mL of AuNP colloid. It was possible to quantify thiamine and nicotinamide in the presence of other vitamins and excipients in two solid multivitamin formulations using the standard addition procedure. The standard addition curves presented R² higher than 0.96 for both nicotinamide and thiamine, at orders of magnitude of 10⁻⁷ and 10⁻⁸ mol L⁻¹, respectively. The nicotinamide content in a cosmetic gel sample was also quantified by direct analysis, presenting R² of 0.98. Student's t-test showed no significant difference with respect to the HPLC method. Despite the experimental design performed for riboflavin, its quantification in the commercial samples was not possible. Copyright © 2017 Elsevier B.V. All rights reserved.
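A central composite design like the one used here is straightforward to enumerate in coded units: two-level factorial corners, axial "star" points at ±α, and a center point. This is a generic sketch of the design geometry, not the authors' experimental matrix.

```python
import itertools

def central_composite(k, alpha=None):
    """Coded points of a central composite design: 2^k factorial
    corners, 2k axial (star) points at +/-alpha, and a center point."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25  # rotatability criterion: alpha = (2^k)^(1/4)
    corners = [list(p) for p in itertools.product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = sign
            axial.append(pt)
    return corners + axial + [[0.0] * k]

# Two factors, e.g. AuNP colloid volume and NaCl concentration, coded.
design = central_composite(2)
```

The axial points are what allow a second-order (quadratic) response surface to be fitted, so the optimum found inside the design region is a genuine curvature-based optimum rather than an edge of a factorial box.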
NASA Technical Reports Server (NTRS)
Okhio, Cyril B.
1995-01-01
A theoretical and an experimental design study of subsonic flow through curved-wall annular diffusers is being carried out in order to establish the most pertinent design parameters for such devices and the implications of their application in the design of engine components in the aerospace industries. This investigation consists of solving numerically the full Navier-Stokes and continuity equations for the time-mean flow. Various models of turbulence are being evaluated for adoption throughout the study, and comparisons will be made with experimental data where they exist. Assessment of diffuser performance based on the dissipated mechanical energy will also be made. The experimental work involves the application of a Computer Aided Design software tool to the development of a suitable annular diffuser geometry and the subsequent downloading of such data to a CNC machine at Central State University. The results of the investigations are expected to indicate that more cost-effective component design of devices such as diffusers, which normally contain complex flows, can still be achieved. In this regard, a review paper was accepted and presented at the First International Conference on High Speed Civil Transportation Research held at North Carolina A&T in December of 1994.
Khan, Imtiaz A; Fraser, Adam; Bray, Mark-Anthony; Smith, Paul J; White, Nick S; Carpenter, Anne E; Errington, Rachel J
2014-12-01
Experimental reproducibility is fundamental to the progress of science. Irreproducible research decreases the efficiency of basic biological research and drug discovery and impedes experimental data reuse. A major contributing factor to irreproducibility is difficulty in interpreting complex experimental methodologies and designs from written text and in assessing variations among different experiments. Current bioinformatics initiatives are focused either on computational research reproducibility (i.e. data analysis) or on laboratory information management systems. Here, we present a software tool, ProtocolNavigator, which addresses the largely overlooked challenges of interpretation and assessment. It provides a biologist-friendly open-source emulation-based tool for designing, documenting and reproducing biological experiments. ProtocolNavigator was implemented in Python 2.7, using the wx module to build the graphical user interface. It is platform-independent software and is freely available from http://protocolnavigator.org/index.html under the GPL v2 license. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Active vibration control with model correction on a flexible laboratory grid structure
NASA Technical Reports Server (NTRS)
Schamel, George C., II; Haftka, Raphael T.
1991-01-01
This paper presents experimental and computational comparisons of three active damping control laws applied to a complex laboratory structure. Two reduced structural models were used with one model being corrected on the basis of measured mode shapes and frequencies. Three control laws were investigated, a time-invariant linear quadratic regulator with state estimation and two direct rate feedback control laws. Experimental results for all designs were obtained with digital implementation. It was found that model correction improved the agreement between analytical and experimental results. The best agreement was obtained with the simplest direct rate feedback control.
High Level Analysis, Design and Validation of Distributed Mobile Systems with
NASA Astrophysics Data System (ADS)
Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.
System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.
NASA Astrophysics Data System (ADS)
Daravath, Sreenu; Kumar, Marri Pradeep; Rambabu, Aveli; Vamsikrishna, Narendrula; Ganji, Nirmala; Shivaraj
2017-09-01
Two novel Schiff bases, L1 = (2-benzo[d]thiazol-6-ylimino)methyl)-4,6-dichlorophenol), L2 = (1-benzo[d]thiazol-6-ylimino)methyl)-6-bromo-4-chlorophenol) and their bivalent transition metal complexes [M(L1)2] and [M(L2)2], where M = Cu(II), Co(II) and Ni(II) were synthesized and characterized by elemental analysis, NMR, IR, UV-visible, mass, magnetic moments, ESR, TGA, SEM, EDX and powder XRD. Based on the experimental data a square planar geometry around the metal ion is assigned to all the complexes (1a-2c). The interaction of synthesized metal complexes with calf thymus DNA was explored using UV-visible absorption spectra, fluorescence and viscosity measurements. The experimental evidence indicated that all the metal complexes strongly bound to CT-DNA through an intercalation mode. DNA cleavage experiments of metal(II) complexes with supercoiled pBR322 DNA have also been explored by gel electrophoresis in the presence of H2O2 as well as UV light, and it is found that the Cu(II) complexes cleaved DNA more effectively compared to Co(II), Ni(II) complexes. In addition, the ligands and their metal complexes were screened for antimicrobial activity and it is found that all the metal complexes were more potent than free ligands.
Shi, Zhenyu; Vickers, Claudia E
2016-12-01
Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.
Adsorption of saturated fatty acid in urea complexation: Kinetics and equilibrium studies
NASA Astrophysics Data System (ADS)
Setyawardhani, Dwi Ardiana; Sulistyo, Hary; Sediawan, Wahyudi Budi; Fahrurrozi, Mohammad
2018-02-01
Urea complexation is a fractionation process for concentrating poly-unsaturated fatty acids (PUFAs) from vegetable oils or animal fats. For process design and optimization in commercial industries, it is necessary to provide kinetics and equilibrium data. The product, a urea inclusion compound (UIC), is a unique complex in which one molecule (the guest) is enclosed within another (the host). In urea complexation, the guest-host bonding exists between saturated fatty acids (SFAs) and crystalline urea. This research treated the complexation as analogous to an adsorption process. A batch adsorption process was developed to obtain the experimental data. The ethanolic urea solution was mixed with SFA in certain compositions and for certain adsorption times. The mixture was heated until it formed a homogeneous, clear solution, then cooled very slowly until the first crystals appeared. Adsorption times for the kinetic data were counted from the moment the crystals formed. The temperature was kept constant at room temperature. The experimental data sets were analyzed with adsorption kinetics and equilibrium models. A high concentration of SFA was used to determine the adsorption kinetics and equilibrium parameters. Kinetic data were examined with pseudo first-order, pseudo second-order, and intra-particle diffusion models. Linear, Freundlich, and Langmuir isotherms were used to study the equilibrium of this adsorption. The experimental data showed that SFA adsorption in urea crystals followed the pseudo second-order model. The compatibility of the data with the Langmuir isotherm showed that urea complexation is a monolayer adsorption.
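The pseudo second-order kinetic model mentioned above, q(t) = k·qe²·t / (1 + k·qe·t), is usually fitted through its standard linearization t/q = 1/(k·qe²) + t/qe. A minimal sketch follows; the rate constants and time points are invented for illustration.

```python
def pseudo_second_order(t, qe, k):
    """Adsorbed amount q(t) under the pseudo second-order model."""
    return (k * qe ** 2 * t) / (1.0 + k * qe * t)

def fit_pseudo_second_order(ts, qs):
    """Recover (qe, k) from the linearization t/q = 1/(k*qe^2) + t/qe
    by ordinary least squares on the points (t, t/q)."""
    ys = [t / q for t, q in zip(ts, qs)]
    n = len(ts)
    tbar = sum(ts) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    intercept = ybar - slope * tbar
    qe = 1.0 / slope            # slope = 1/qe
    k = slope ** 2 / intercept  # intercept = 1/(k*qe^2)
    return qe, k

# Synthetic noiseless data with invented parameters qe = 2.0, k = 0.5.
ts = [1.0, 2.0, 4.0, 8.0]
qs = [pseudo_second_order(t, 2.0, 0.5) for t in ts]
qe_fit, k_fit = fit_pseudo_second_order(ts, qs)
```

Because the linearization is exact for noiseless model data, the fit recovers the generating parameters; with real measurements the quality of the straight line on (t, t/q) is itself the usual diagnostic for the pseudo second-order model.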
On designing low pressure loss working spaces for a planar Stirling micromachine
NASA Astrophysics Data System (ADS)
Hachey, M.-A.; Léveillé, É.; Fréchette, L. G.; Formosa, F.
2015-12-01
This paper presents research undertaken to design low pressure loss working spaces for a Stirling cycle micro heat engine operating from low temperature waste heat. This planar free-piston heat engine is anticipated to operate at the kHz level with mm3 displacement. Given the resonant nature of the free-piston configuration, the complex flow geometry of its working gas, and its projected high operating frequency, flow analysis is relatively complex. Design considerations were thus based on fast prototyping and experimentation. Results show that geometrical features, such as a sharp 90° corner between the regenerator and working spaces, are strong contributors to pressure losses. This research culminated in a promising revised working space configuration for engine start-up, which considerably reduced total pressure losses, by more than 80% at Re = 700 relative to the original design.
de Vlaming, Rianne; Haveman-Nies, Annemien; Van't Veer, Pieter; de Groot, Lisette Cpgm
2010-09-13
The aim of this paper is to provide the rationale for an evaluation design for a complex intervention program targeting loneliness among non-institutionalized elderly people in a Dutch community. Complex public health interventions characteristically use the combined approach of intervening on the individual and on the environmental level. It is assumed that the components of a complex intervention interact with and reinforce each other. Furthermore, implementation is highly context-specific and its impact is influenced by external factors. Although the entire community is exposed to the intervention components, each individual is exposed to different components with a different intensity. A logic model of change is used to develop the evaluation design. The model describes what outcomes may logically be expected at different points in time at the individual level. In order to address the complexity of a real-life setting, the evaluation design of the loneliness intervention comprises two types of evaluation studies. The first uses a quasi-experimental pre-test post-test design to evaluate the effectiveness of the overall intervention. A control community comparable to the intervention community was selected, with baseline measurements in 2008 and follow-up measurements scheduled for 2010. This study focuses on changes in the prevalence of loneliness and in the determinants of loneliness within individuals in the general elderly population. Complementarily, the second study is designed to evaluate the individual intervention components and focuses on delivery, reach, acceptance, and short-term outcomes. Project records and surveys among participants are used to collect these data. Combining these two evaluation strategies has the potential to assess the effectiveness of the overall complex intervention and the contribution of the individual intervention components thereto.
An experimental investigation of the effects of alarm processing and display on operator performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Hara, J.; Brown, W.; Hallbert, B.
1998-03-01
This paper describes a research program sponsored by the US Nuclear Regulatory Commission to address the human factors engineering (HFE) aspects of nuclear power plant alarm systems. The overall objective of the program is to develop HFE review guidance for advanced alarm systems. As part of this program, guidance has been developed based on a broad base of technical and research literature. In the course of guidance development, aspects of alarm system design for which the technical basis was insufficient to support complete guidance development were identified. The primary purpose of the research reported in this paper was to evaluate the effects of three of these alarm system design characteristics on operator performance, in order to contribute to the understanding of potential safety issues and to provide data to support the development of design review guidance in these areas. The three alarm system design characteristics studied were (1) alarm processing (degree of alarm reduction), (2) alarm availability (dynamic prioritization and suppression), and (3) alarm display (a dedicated tile format, a mixed tile and message list format, and a format in which alarm information is integrated into the process displays). A secondary purpose was to provide confirmatory evidence for selected alarm system guidance developed in an earlier phase of the project. The alarm characteristics were combined into eight separate experimental conditions. Six two-person crews of professional nuclear power plant operators participated in the study. Following training, each crew completed 16 test trials, consisting of two trials in each of the eight experimental conditions (one with a low-complexity scenario and one with a high-complexity scenario). Measures of process performance, operator task performance, situation awareness, and workload were obtained. In addition, operator opinions and evaluations of the alarm processing and display conditions were collected.
No performance deficiencies were observed in any of the experimental conditions, providing confirmatory support for many design review guidelines. The operators identified numerous strengths and weaknesses associated with individual alarm design characteristics.
Graça, Cátia A L; Correia de Velosa, Adriana; Teixeira, Antonio Carlos S C
2017-10-01
Photochemical redox reactions of Fe(III) complexes in surface waters are important sources of radical species, thereby contributing to the sunlight-driven elimination of waterborne recalcitrant contaminants. In this study, the effects of three Fe(III)-carboxylates (i.e., oxalate, citrate, and tartrate) on the UVA photoinduced oxidation of the herbicide amicarbazone (AMZ) were investigated. A Doehlert experimental design was applied to find the Fe(III):ligand ratios and pH that achieved the fastest AMZ degradation rate. The results indicated optimal ratios of 1:10 (Fe(III):oxalate), 1:4 (Fe(III):citrate), and 1:1 (Fe(III):tartrate), with [Fe(III)]0 set at 0.1 mmol L-1 and the best pH found to be 3.5 for all the complexes. In addition, a statistical model that predicts the observed degradation rate constant (kobs) as a function of pH and Fe(III):carboxylate ratio was obtained for each complex, enabling AMZ-photodegradation predictions based on these two variables. To the best of our knowledge, this is the first time that such models have been proposed. Not only the pH-dependent speciation of Fe(III) in solution but also the time profiles of photogenerated OH radicals, Fe(II), and H2O2 gave appropriate support to the experimental results. Additional experiments using a sampled sewage treatment plant effluent suggest that the addition of aqua and/or Fe(III)-oxalate complexes to the matrix may also be effective for AMZ removal from natural waters in case their natural occurrence is not high enough to promote pollutant degradation. Therefore, the inclusion of Fe(III)-complexes in investigations dealing with the environmental fate of emerging pollutants in natural waterbodies is strongly recommended.
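Doehlert designs are typically used to fit a second-order polynomial in the factors (here pH and the Fe(III):carboxylate ratio) and then locate the stationary point of the fitted surface. The sketch below uses invented coefficients and design points, not the study's data; it only illustrates the least-squares fit and the stationary-point calculation behind such a kobs model.

```python
import numpy as np

# Hypothetical sketch (values are not from the study): fit the second-order model
#   k_obs = b0 + b1*pH + b2*r + b11*pH^2 + b22*r^2 + b12*pH*r
# and locate the pH / Fe(III):ligand ratio giving the fastest degradation.
true_b = np.array([-2.0, 1.4, 0.30, -0.20, -0.015, 0.0])  # assumed coefficients
pH = np.array([2.5, 3.0, 3.5, 4.0, 4.5, 3.5, 3.5, 3.0, 4.0])
r  = np.array([1.0, 4.0, 10., 4.0, 1.0, 1.0, 7.0, 10., 10.])  # Fe:ligand ratio

X = np.column_stack([np.ones_like(pH), pH, r, pH**2, r**2, pH * r])
k_obs = X @ true_b                       # noiseless synthetic response

b, *_ = np.linalg.lstsq(X, k_obs, rcond=None)

# Stationary point of the fitted quadratic surface: solve gradient = 0,
# i.e. H @ [pH, r] = -[b1, b2] with H the Hessian of the quadratic part.
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
g = -np.array([b[1], b[2]])
pH_opt, r_opt = np.linalg.solve(H, g)
print(round(pH_opt, 2), round(r_opt, 2))
```

Whether the stationary point is a maximum is checked from the sign of the Hessian's eigenvalues; with real replicated data one would also report confidence regions for the coefficients.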
Energy design for protein-protein interactions
Ravikant, D. V. S.; Elber, Ron
2011-01-01
Proteins bind to other proteins efficiently and specifically to carry out many cell functions such as signaling, activation, transport, enzymatic reactions, and more. To determine the geometry and strength of binding of a protein pair, an energy function is required. An algorithm to design an optimal energy function, based on empirical data of protein complexes, is proposed and applied. Emphasis is placed on negative design, in which incorrect geometries are presented to the algorithm, which learns to avoid them. For the docking problem, the search for plausible geometries can be performed exhaustively: the possible geometries of the complex are generated on a grid with the help of a fast Fourier transform algorithm. A novel formulation of negative design makes it possible to investigate iteratively hundreds of millions of negative examples while monotonically improving the quality of the potential. Experimental structures for 640 protein complexes are used to generate positive and negative examples for learning parameters. The algorithm designed in this work finds the correct binding structure as the lowest energy minimum in 318 of the 640 examples. Further benchmarks on independent sets confirm the significant capacity of the scoring function to recognize correct modes of interactions. PMID:21842951
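The grid-based exhaustive search mentioned above rests on the correlation theorem: scoring every rigid translation of a ligand grid against a receptor grid is a cross-correlation, which FFTs evaluate in O(N log N) instead of O(N²) per rotation. A toy 2D sketch with invented grids (real docking uses 3D grids and chemically meaningful scores):

```python
import numpy as np

# Toy sketch of the FFT trick used in grid-based docking: the score of every
# rigid translation of a "ligand" grid over a "receptor" grid is a circular
# cross-correlation. All shapes and values here are invented.
N = 16
receptor = np.zeros((N, N))
receptor[5:8, 9:12] = 1.0          # favorable surface patch

ligand = np.zeros((N, N))
ligand[0:3, 0:3] = 1.0             # ligand footprint placed at the origin

# Correlation theorem: score[t] = sum_x ligand[x] * receptor[x + t]
score = np.real(np.fft.ifft2(np.fft.fft2(receptor) * np.conj(np.fft.fft2(ligand))))
dy, dx = np.unravel_index(np.argmax(score), score.shape)
print(dy, dx)                      # best shift overlays the ligand on the patch
```

In a full docking run this correlation is repeated over a set of sampled rotations, and the top-scoring translations per rotation become the candidate geometries fed to the learned potential.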
USDA-ARS?s Scientific Manuscript database
One of the challenges of hydraulic experimentation is designing experiments that are complex enough to capture relevant processes while retaining the simplicity necessary for useful, accurate measurements. The intricacy of the interactions between turbulent flows and mobile beds in rivers and stream...
ERIC Educational Resources Information Center
Lee, Hyeon Woo
2011-01-01
As the technology-enriched learning environments and theoretical constructs involved in instructional design become more sophisticated and complex, a need arises for equally sophisticated analytic methods to research these environments, theories, and models. Thus, this paper illustrates a comprehensive approach for analyzing data arising from…
Simulations for the Test Flight of an Experimental HALE Aircraft
2011-06-01
as a plant representation for HALE aircraft control design. It focuses on a reduced number of states to represent the complex nonlinear problem... Atkins, Ella M., Shearer, Christopher M., and Nathan A. Pitcher. “X-HALE: A Very Flexible UAV for Nonlinear Aeroelastic Tests.” (AIAA 2010-2715), April
The Interplay of News Frames on Cognitive Complexity
ERIC Educational Resources Information Center
Shah, Dhavan V.; Kwak, Nojin; Schmierbach, Mike; Zubric, Jessica
2004-01-01
This research considers how distinct news frames work in combination to influence information processing. It extends framing research grounded in prospect theory (Tversky & Kahneman, 1981) and attribution theory (Iyengar, 1991) to study conditional framing effects on associative memory. Using a 2 x 3 experimental design embedded within a…
Practical aspects of running DOE for improving growth media for in vitro plants
USDA-ARS?s Scientific Manuscript database
Experiments using DOE software to improve plant tissue culture growth medium are complicated and require complex setups. Once the experimental design is set and the treatment points calculated, media sheets and mixing charts must be developed. Since these experiments require three passages on the sa...
Designing, Implementing and Evaluating Preclinical Simulation Lab for Maternity Nursing Course
ERIC Educational Resources Information Center
ALFozan, Haya; El Sayed, Yousria; Habib, Farida
2015-01-01
Background: The opportunity for students to deliver care safely in today's complex health care environment is limited. Simulation allows students to practice skills in a safe environment. Purpose: To assess the students' perception, satisfaction, and learning outcomes after a simulation-based maternity course. Method: A quasi-experimental design…
Greased Lightning (GL-10) Performance Flight Research: Flight Data Report
NASA Technical Reports Server (NTRS)
McSwain, Robert G.; Glaab, Louis J.; Theodore, Colin R.; Rhew, Ray D. (Editor); North, David D. (Editor)
2017-01-01
Modern aircraft design methods have produced acceptable designs for large conventional aircraft performance. With revolutionary electric propulsion technologies fueled by the growth in the small UAS (Unmanned Aerial Systems) industry, these same prediction models are being applied to new, smaller, experimental design concepts requiring a VTOL (Vertical Take Off and Landing) capability for ODM (On Demand Mobility). A 50% sub-scale GL-10 flight model was built and tested to demonstrate the transition from hover to forward flight utilizing DEP (Distributed Electric Propulsion) [1][2]. In 2016, plans were put in place to conduct performance flight testing on the 50% sub-scale GL-10 flight model to support a NASA project called DELIVER (Design Environment for Novel Vertical Lift Vehicles), which was investigating the feasibility of including smaller and more experimental aircraft configurations in a NASA design tool called NDARC (NASA Design and Analysis of Rotorcraft) [3]. This report covers the performance flight data collected during flight testing of the GL-10 50% sub-scale flight model conducted at Beaver Dam Airpark, VA. Overall, the flight test data provide great insight into how well our existing conceptual design tools predict the performance of small-scale experimental DEP concepts. Low-fidelity conceptual design tools estimated the (L/D)max of the GL-10 50% sub-scale flight model to be 16; the experimentally measured (L/D)max was 7.2. The gap between predicted and measured aerodynamic performance highlights the complexity of wing and nacelle interactions, which is not currently accounted for in existing low-fidelity tools.
Okumu, Fredros O.; Moore, Jason; Mbeyela, Edgar; Sherlock, Mark; Sangusangu, Robert; Ligamba, Godfrey; Russell, Tanya; Moore, Sarah J.
2012-01-01
Differences between individual human houses can confound results of studies aimed at evaluating indoor vector control interventions such as insecticide treated nets (ITNs) and indoor residual insecticide spraying (IRS). Specially designed and standardised experimental huts have historically provided a solution to this challenge, with the added advantage that they can be fitted with special interception traps to sample entering or exiting mosquitoes. However, many of these experimental hut designs have a number of limitations, for example: 1) inability to sample mosquitoes on all sides of huts, 2) increased likelihood of live mosquitoes flying out of the huts, leaving mainly dead ones, 3) difficulties of cleaning the huts when a new insecticide is to be tested, and 4) the generally small size of the experimental huts, which can misrepresent actual local house sizes or airflow dynamics in the local houses. Here, we describe a modified experimental hut design - the Ifakara Experimental Huts - and explain how these huts can be used to more realistically monitor behavioural and physiological responses of wild, free-flying disease-transmitting mosquitoes, including the African malaria vectors of the species complexes Anopheles gambiae and Anopheles funestus, to indoor vector control technologies including ITNs and IRS. Important characteristics of the Ifakara experimental huts include: 1) interception traps fitted onto eave spaces and windows, 2) use of eave baffles (panels that direct mosquito movement) to control exit of live mosquitoes through the eave spaces, 3) use of replaceable wall panels and ceilings, which allow safe insecticide disposal and reuse of the huts to test different insecticides in successive periods, 4) the kit format of the huts, allowing portability, and 5) an improved suite of entomological procedures to maximise data quality. PMID:22347415
Stated Choice design comparison in a developing country: recall and attribute nonattendance
2014-01-01
Background Experimental designs constitute a vital component of all Stated Choice (aka discrete choice experiment) studies. However, there exists limited empirical evaluation of the statistical benefits of Stated Choice (SC) experimental designs that employ non-zero prior estimates in constructing non-orthogonal constrained designs. This paper statistically compares the performance of contrasting SC experimental designs. In so doing, the effect of respondent literacy on patterns of Attribute non-Attendance (ANA) across fractional factorial orthogonal and efficient designs is also evaluated. The study uses a ‘real’ SC design to model consumer choice of primary health care providers in rural north India. A total of 623 respondents were sampled across four villages in Uttar Pradesh, India. Methods Comparison of orthogonal and efficient SC experimental designs is based on several measures. Appropriate comparison of each design’s respective efficiency measure is made using D-error results. Standardised Akaike Information Criteria are compared between designs and across recall periods. Comparisons control for stated and inferred ANA. Coefficient and standard error estimates are also compared. Results The added complexity of the efficient SC design, theorised elsewhere, is reflected in higher estimated amounts of ANA among illiterate respondents. However, controlling for ANA using stated and inferred methods consistently shows that the efficient design performs statistically better. Modelling SC data from the orthogonal and efficient design shows that model-fit of the efficient design outperform the orthogonal design when using a 14-day recall period. The performance of the orthogonal design, with respect to standardised AIC model-fit, is better when longer recall periods of 30-days, 6-months and 12-months are used. 
Conclusions The effect of the efficient design’s cognitive demand is apparent among literate and illiterate respondents, although, more pronounced among illiterate respondents. This study empirically confirms that relaxing the orthogonality constraint of SC experimental designs increases the information collected in choice tasks, subject to the accuracy of the non-zero priors in the design and the correct specification of a ‘real’ SC recall period. PMID:25386388
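The D-error used above to compare designs is, for a multinomial logit (MNL) model, det(Ω)^(1/K), where Ω is the inverse of the Fisher information evaluated at the prior parameter estimates and K is the number of parameters. A minimal sketch with invented attribute levels and priors (not the study's design); a lower D-error means a statistically more informative design given those priors:

```python
import numpy as np

def d_error(choice_sets, beta):
    """D-error of an MNL design: det(inverse Fisher information)^(1/K)."""
    K = len(beta)
    info = np.zeros((K, K))
    for X in choice_sets:            # X: one (alternatives x attributes) task
        u = X @ beta
        p = np.exp(u) / np.exp(u).sum()          # MNL choice probabilities
        info += X.T @ (np.diag(p) - np.outer(p, p)) @ X
    return np.linalg.det(np.linalg.inv(info)) ** (1.0 / K)

beta = np.array([-0.5, 0.8])         # non-zero prior estimates
design_a = [np.array([[1.0, 1.0], [-1.0, -1.0]]),
            np.array([[1.0, -1.0], [-1.0, 1.0]])]
design_b = [np.array([[1.0, 0.0], [-1.0, 1.0]]),
            np.array([[0.0, 1.0], [1.0, -1.0]])]

# The design with the lower D-error extracts more information per choice task.
print(d_error(design_a, beta) < d_error(design_b, beta))
```

Efficient-design software searches over candidate level combinations to minimize exactly this criterion, which is why the accuracy of the non-zero priors matters so much to the comparison reported above.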
Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.
Lyons, Rhonda
2012-01-01
According to Federal Aviation Administration traffic predictions, our Air Traffic Management (ATM) system is currently operating at 150 percent of capacity, and traffic is forecast to increase to a staggering 250 percent within the next two decades [17]. This will require a major redesign of the system. Today's ATM system is complex. It is designed to provide air traffic services safely, economically, and efficiently through the cost-effective provision of facilities and seamless services in collaboration with multiple agents; however, contrary to this vision, the system is loosely integrated and suffering tremendously from antiquated equipment and saturated airways. The new Next Generation (NextGen) ATM system is designed to transform the current system into an agile, robust, and responsive set of operations that can safely manage the growing needs of a projected, increasingly complex and diverse set of air transportation system users and massive projected worldwide traffic rates. This revolutionary technology-centric system is dynamically complex and much more sophisticated than its soon-to-be predecessor. ATM system failures could yield large-scale catastrophic consequences, as it is a safety-critical system. This work describes complexity and the complex nature of the NextGen ATM system and Trajectory Based Operations (TBO). Complex human factors interactions within NextGen are analyzed using a proposed dual experimental approach designed to identify hazards and gaps and to elicit emergent hazards that would not be visible if each analysis were conducted in isolation. Suggestions are made, along with a proposal for future human factors research in the safety-critical NextGen TBO environment.
NASA Astrophysics Data System (ADS)
Henry, Christine; Kramb, Victoria; Welter, John T.; Wertz, John N.; Lindgren, Eric A.; Aldrin, John C.; Zainey, David
2018-04-01
Advances in NDE method development are greatly improved through model-guided experimentation. In the case of ultrasonic inspections, models which provide insight into complex mode conversion processes and sound propagation paths are essential for understanding the experimental data and inverting the experimental data into relevant information. However, models must also be verified using experimental data obtained under well-documented and understood conditions. Ideally, researchers would utilize model simulations and experiments together to converge efficiently on the optimal solution. However, variability in experimental parameters introduces extraneous signals that are difficult to differentiate from the anticipated response. This paper discusses the results of an ultrasonic experiment designed to evaluate the effect of controllable variables on the anticipated signal, and the effect of unaccounted-for experimental variables on the uncertainty in those results. Controlled experimental parameters include the transducer frequency, incidence beam angle, and focal depth.
Control of complex physically simulated robot groups
NASA Astrophysics Data System (ADS)
Brogan, David C.
2001-10-01
Actuated systems such as robots take many forms and sizes but each requires solving the difficult task of utilizing available control inputs to accomplish desired system performance. Coordinated groups of robots provide the opportunity to accomplish more complex tasks, to adapt to changing environmental conditions, and to survive individual failures. Similarly, groups of simulated robots, represented as graphical characters, can test the design of experimental scenarios and provide autonomous interactive counterparts for video games. The complexity of writing control algorithms for these groups currently hinders their use. A combination of biologically inspired heuristics, search strategies, and optimization techniques serve to reduce the complexity of controlling these real and simulated characters and to provide computationally feasible solutions.
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (outputs), using Design of Experiments (DOE) methods: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed, and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
Jiang, Jianfei; Bakan, Ahmet; Kapralov, Alexandr A.; Silva, K. Ishara; Huang, Zhentai; Amoscato, Andrew A.; Peterson, James; Garapati, Venkata Krishna; Saxena, Sunil; Bayir, Hülya; Atkinson, Jeffrey; Bahar, Ivet; Kagan, Valerian E.
2014-01-01
Mitochondria have emerged as the major regulatory platform responsible for coordinating numerous metabolic reactions as well as cell death processes, whereby the execution of intrinsic apoptosis includes the production of reactive oxygen species fueling oxidation of cardiolipin (CL) catalyzed by cytochrome (cyt) c. As this oxidation occurs within the peroxidase complex of cyt c with CL, the latter represents a promising target for the discovery and design of drugs with an anti-apoptotic mechanism of action. In this work, we designed and synthesized a new group of mitochondria-targeted imidazole-substituted analogues of stearic acid (TPP-n-ISA) with different positions of the attached imidazole group on the fatty acid (n = 6, 8, 10, 13, and 14). By using a combination of absorption spectroscopy and EPR protocols (continuous wave electron paramagnetic resonance and electron spin echo envelope modulation), we demonstrated that TPP-n-ISA were indeed able to potently suppress CL-induced structural rearrangements in cyt c, paving the way to its peroxidase competence. TPP-n-ISA analogues preserved the low-spin hexa-coordinated heme iron state in cyt c/CL complexes, whereby TPP-6-ISA displayed a significantly more effective preservation pattern than TPP-14-ISA. Elucidation of these intermolecular stabilization mechanisms of cyt c identified TPP-6-ISA as an effective inhibitor of the peroxidase function of cyt c/CL complexes with a significant anti-apoptotic potential, realized in mouse embryonic cells exposed to ionizing irradiation. These experimental findings were detailed and supported by all-atom molecular dynamics simulations. Based on the experimental data and computational predictions, we identified TPP-6-ISA as a candidate drug with optimized anti-apoptotic potency. PMID:24631490
Engineering Design of Safe Automobile Front Strut Tower Brace with Predetermined Destruction
NASA Astrophysics Data System (ADS)
Mironenko, R. Ye; Balaev, E. Yu; Blednova, Zh M.
2018-03-01
This paper presents the developed design of an automobile front strut tower brace that breaks instantly on reaching a predetermined impact load, so that the impact load is not transferred to the opposite strut. A front strut tower brace with a V-shaped directed-destruction element was developed, designed, and analyzed using the SolidWorks and SolidWorks Simulation software suite. The obtained data were confirmed experimentally. By changing the geometric features of the V-shaped element, it is possible to change the impact load required for its destruction.
A computer simulator for development of engineering system design methodologies
NASA Technical Reports Server (NTRS)
Padula, S. L.; Sobieszczanski-Sobieski, J.
1987-01-01
A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
Water facilities in retrospect and prospect: An illuminating tool for vehicle design
NASA Technical Reports Server (NTRS)
Erickson, G. E.; Peak, D. J.; Delfrate, J.; Skow, A. M.; Malcolm, G. N.
1986-01-01
Water facilities play a fundamental role in the design of air, ground, and marine vehicles by providing a qualitative, and sometimes quantitative, description of complex flow phenomena. Water tunnels, channels, and tow tanks used as flow-diagnostic tools have experienced a renaissance in recent years in response to the increased complexity of designs suitable for advanced technology vehicles. These vehicles are frequently characterized by large regions of steady and unsteady three-dimensional flow separation and ensuing vortical flows. The visualization and interpretation of the complicated fluid motions about isolated vehicle components and complete configurations in a time and cost effective manner in hydrodynamic test facilities is a key element in the development of flow control concepts, and, hence, improved vehicle designs. A historical perspective of the role of water facilities in the vehicle design process is presented. The application of water facilities to specific aerodynamic and hydrodynamic flow problems is discussed, and the strengths and limitations of these important experimental tools are emphasized.
NASA Technical Reports Server (NTRS)
Haynes, Davy A.; Miller, David S.; Klein, John R.; Louie, Check M.
1988-01-01
A method by which a simple equivalent faired body can be designed to replace a more complex body with flowing inlets has been demonstrated for supersonic flow. An analytically defined, geometrically simple faired inlet forebody has been designed using a linear potential code to generate flow perturbations equivalent to those produced by a much more complex forebody with inlets. An equivalent forebody wind-tunnel model was fabricated and a test was conducted in NASA Langley Research Center's Unitary Plan Wind Tunnel. The test Mach number range was 1.60 to 2.16 for angles of attack of -4 to 16 deg. Test results indicate that, for the purposes considered here, the equivalent forebody simulates the original flowfield disturbances to an acceptable degree of accuracy.
Castorena-Cortés, G; Roldán-Carrillo, T; Zapata-Peñasco, I; Reyes-Avila, J; Quej-Aké, L; Marín-Cruz, J; Olguín-Lora, P
2009-12-01
Microcosm assays and a Taguchi experimental design were used to assess the biodegradation of an oil sludge produced by a gas processing unit. The study showed that biodegradation of the sludge sample is feasible despite the high level of pollutants and the complexity of the sludge. Physicochemical and microbiological characterization of the sludge revealed a high concentration of hydrocarbons (334,766 +/- 7001 mg kg(-1) dry matter, d.m.) comprising compounds with between 6 and 73 carbon atoms in their structure, as well as 60,000 mg kg(-1) d.m. of Fe and 26,800 mg kg(-1) d.m. of sulfide. A Taguchi L(9) experimental design comprising four variables at three levels (moisture, nitrogen source, surfactant concentration, and oxidant agent) was performed, showing that moisture and nitrogen source are the major variables affecting CO(2) production and total petroleum hydrocarbons (TPH) degradation. The best experimental treatment yielded a TPH removal of 56,092 mg kg(-1) d.m. and was carried out under the following conditions: 70% moisture, no oxidant agent, 0.5% surfactant, and NH(4)Cl as the nitrogen source.
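A Taguchi L9(3^4) array assigns four three-level factors to nine runs so that every pair of levels of any two factors co-occurs exactly once, and factor importance is ranked by the range of level-mean responses. The sketch below uses invented response values, not the study's measurements; it only illustrates the array and the main-effects calculation.

```python
import numpy as np

# Sketch of a Taguchi L9(3^4) orthogonal array and a main-effects analysis.
# The response values are invented, not the study's measurements.
L9 = np.array([[0, 0, 0, 0],
               [0, 1, 1, 1],
               [0, 2, 2, 2],
               [1, 0, 1, 2],
               [1, 1, 2, 0],
               [1, 2, 0, 1],
               [2, 0, 2, 1],
               [2, 1, 0, 2],
               [2, 2, 1, 0]])   # columns: moisture, N source, surfactant, oxidant

# Orthogonality check: all 9 level pairs appear once for every column pair
for i in range(4):
    for j in range(i + 1, 4):
        assert len(set(zip(L9[:, i], L9[:, j]))) == 9

response = np.array([30., 35., 33., 48., 52., 50., 40., 44., 42.])  # toy responses

# Main effect of each factor: mean response at each of its three levels,
# summarized by the range (max level mean - min level mean)
effects = {f: [response[L9[:, f] == lev].mean() for lev in range(3)]
           for f in range(4)}
ranges = {f: max(v) - min(v) for f, v in effects.items()}
print(max(ranges, key=ranges.get))   # factor 0 (moisture) dominates in this toy
```

With real data one would typically analyze a signal-to-noise ratio rather than the raw response, but the level-mean bookkeeping is identical.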
Preprogramming Complex Hydrogel Responses using Enzymatic Reaction Networks.
Postma, Sjoerd G J; Vialshin, Ilia N; Gerritsen, Casper Y; Bao, Min; Huck, Wilhelm T S
2017-02-06
The creation of adaptive matter is heavily inspired by biological systems. However, it remains challenging to design complex material responses that are governed by reaction networks, which lie at the heart of cellular complexity. The main reason for this slow progress is the lack of a general strategy to integrate reaction networks with materials. Herein we use a systematic approach to preprogram the response of a hydrogel to a trigger, in this case the enzyme trypsin, which activates a reaction network embedded within the hydrogel. A full characterization of all the kinetic rate constants in the system enabled the construction of a computational model, which predicted different hydrogel responses depending on the input concentration of the trigger. The results of the simulation are in good agreement with experimental findings. Our methodology can be used to design new, adaptive materials of which the properties are governed by reaction networks of arbitrary complexity. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Improving knowledge of garlic paste greening through the design of an experimental strategy.
Aguilar, Miguel; Rincón, Francisco
2007-12-12
The furthering of scientific knowledge depends in part upon the reproducibility of experimental results. When experimental conditions are not set with sufficient precision, the resulting background noise often leads to poorly reproduced and even faulty experiments. An example of the catastrophic consequences of this background noise can be found in the design of strategies for the development of solutions aimed at preventing garlic paste greening, where reported results are contradictory. To avoid such consequences, this paper presents a two-step strategy based on the concept of experimental design. In the first step, the critical factors inherent to the problem are identified, using a 2(7-4) resolution III Plackett-Burman experimental design, from a list of seven apparent critical factors (ACF); subsequently, the critical factors thus identified are considered as the factors to be optimized (FO), and optimization is performed using a Box and Wilson experimental design to identify the stationary point of the system. Optimal conditions for preventing garlic greening are examined after analysis of the complex process of green-pigment development, which involves both chemical and enzymatic reactions and is strongly influenced by pH, with an overall pH optimum of 4.5. The critical step in the greening process is the synthesis of thiosulfinates (allicin) from cysteine sulfoxides (alliin). Cysteine inhibits the greening process at this critical stage; no greening precursors are formed in the presence of around 1% cysteine. However, the optimal conditions for greening prevention are very sensitive both to the type of garlic and to manufacturing conditions. This suggests that optimal solutions for garlic greening prevention should be sought on a case-by-case basis, using the strategy presented here.
Muzyka, Kateryna; Karim, Khalku; Guerreiro, Antonio; Poma, Alessandro; Piletsky, Sergey
2014-03-31
A novel optimized protocol for solid-state synthesis of molecularly imprinted polymer nanoparticles (nanoMIPs) with specificity for the antibiotic vancomycin is described. The experimental objective was optimization of the synthesis parameters (factors) affecting the yield of nanoparticles synthesized using the first prototype of an automated solid-phase synthesizer. Experimental design (design of experiments) was applied to the optimization of nanoMIP yield using MODDE 9.0 software. The factors chosen for the model were the amount of functional monomers in the polymerization mixture, irradiation time, temperature during polymerization, and elution temperature. In general, irradiation time was the most important, and temperature the least important, factor influencing the yield of nanoparticles. Overall, response surface methodology proved to be an effective tool for reducing the time required to optimize complex experimental conditions. PMID:24685151
NASA Astrophysics Data System (ADS)
Nishida, R. T.; Beale, S. B.; Pharoah, J. G.; de Haart, L. G. J.; Blum, L.
2018-01-01
This work is among the first in which the results of an extensive experimental research programme are compared to performance calculations of a comprehensive computational fluid dynamics model for a solid oxide fuel cell stack. The model, which combines electrochemical reactions with momentum, heat, and mass transport, is used to obtain results for an established industrial-scale fuel cell stack design with complex manifolds. To validate the model, comparisons with experimentally gathered voltage and temperature data are made for the Jülich Mark-F, 18-cell stack operating in a test furnace. Good agreement is obtained between the model and experiment results for cell voltages and temperature distributions, confirming the validity of the computational methodology for stack design. Transient effects during current ramp-up in the experiment may explain why the measured average voltage is lower than the model prediction for the power curve.
Enzymatic catalysis treatment of meat industry wastewater using laccase.
Thirugnanasambandham, K; Sivakumar, V
2015-01-01
Meat processing produces a large amount of wastewater with high levels of colour and chemical oxygen demand (COD), which must be pretreated before discharge into the ecological system. In this paper, enzymatic catalysis (EC) was adopted to treat the meat wastewater. A Box-Behnken design (BBD), an experimental design for response surface methodology (RSM), was used to create the set of 29 experimental runs needed to optimize the operating conditions. Quadratic regression models with estimated coefficients were developed to describe the colour and COD removals. The experimental results show that EC could effectively reduce colour (95 %) and COD (86 %) at the optimum conditions of an enzyme dose of 110 U/L, incubation time of 100 min, pH of 7 and temperature of 40 °C. RSM can thus be effectively adopted to optimize multiple operating factors in a complex EC process.
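A four-factor Box-Behnken design with five centre points has exactly 2·4·3 + 5 = 29 runs, matching the run count above. The coded design can be generated with a short Python sketch; this is the standard textbook construction, not code from the paper:

```python
from itertools import combinations, product

def box_behnken(k, center_runs=5):
    """Box-Behnken design in coded units (-1, 0, +1): for every pair of the
    k factors, take the four (+/-1, +/-1) corners with all other factors held
    at 0, then append replicated centre points. Total runs: 2*k*(k-1) + c."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            pt = [0] * k
            pt[i], pt[j] = a, b
            runs.append(tuple(pt))
    runs += [(0,) * k] * center_runs
    return runs

# Four factors (e.g. enzyme dose, incubation time, pH, temperature) give the
# 29-run design used in studies of this kind.
design = box_behnken(4)
```

Avoiding the extreme corners of the factor space (no run has more than two factors away from centre) is what makes BBD popular for fitting quadratic RSM models with few runs.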
NASA Astrophysics Data System (ADS)
Trautz, Andrew C.; Illangasekare, Tissa H.; Rodriguez-Iturbe, Ignacio; Heck, Katharina; Helmig, Rainer
2017-04-01
The atmosphere, soils, and vegetation near the land-atmosphere interface are in a state of continuous dynamic interaction via a myriad of complex interrelated feedback processes which, collectively, remain poorly understood. Studying the fundamental nature and dynamics of such processes in atmospheric, ecological, and/or hydrological contexts in the field setting presents many challenges; current experimental approaches are limited by a general lack of control and high measurement uncertainty. In an effort to address these issues and reduce overall complexity, new experimental design considerations (two-dimensional intermediate-scale coupled wind tunnel-synthetic aquifer testing using synthetic plants) for studying soil-plant-atmosphere continuum soil moisture dynamics are introduced and tested in this study. Validation of these experimental considerations, particularly the adoption of synthetic plants, is required prior to their application in future research. A comparison of three experiments with bare soil surfaces or transplanted with a Stargazer lily/limestone block was used to evaluate the feasibility of the proposed approaches. Results demonstrate that coupled wind tunnel-porous media experimentation, used to simulate field conditions, reduces complexity and enhances control while allowing fine spatial-temporal resolution measurements to be made using state-of-the-art technologies. Synthetic plants further help reduce system complexity (e.g., airflow) while preserving the basic hydrodynamic functions of plants (e.g., water uptake and transpiration). The trends and distributions of key measured atmospheric and subsurface spatial and temporal variables (e.g., soil moisture, relative humidity, temperature, air velocity) were comparable, showing that synthetic plants can be used as simple, idealized, nonbiological analogs for living vegetation in fundamental hydrodynamic studies.
Murado, M A; Prieto, M A
2013-09-01
NOEC and LOEC (no and lowest observed effect concentrations, respectively) are toxicological concepts derived from analysis of variance (ANOVA), a not very sensitive method that produces ambiguous results and does not provide confidence intervals (CI) for its estimates. Despite long-standing and abundant criticism of these concepts, the field of ecotoxicology has been reluctant to abandon them (two possible reasons will be discussed), citing the lack of clear alternatives. This work, however, proves that a debugged dose-response (DR) modeling approach, through explicit algebraic equations, enables two simple options to accurately calculate the CI of doses substantially lower than the NOEC. Both ANOVA and DR analyses are affected by the experimental error, response profile, number of observations and experimental design. The study of these effects, analytically complex and experimentally unfeasible, was carried out using systematic simulations with realistic data, including different error levels. Results revealed the weakness of the NOEC and LOEC notions, confirmed the feasibility of the proposed alternatives and made it possible to discuss the often-violated conditions that minimize the CI of the parametric estimates from DR assays. In addition, a table was developed providing the experimental design that minimizes the parametric CI for a given set of working conditions. This makes it possible to reduce the experimental effort and to avoid the inconclusive results that are frequently obtained from intuitive experimental plans. Copyright © 2013 Elsevier B.V. All rights reserved.
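As an illustration of the DR-based alternative, the Python sketch below inverts a log-logistic (Hill) model algebraically to obtain a low-effect dose (ECx) and attaches a Monte Carlo confidence interval. The normality/independence assumptions and all parameter values are illustrative stand-ins, not the paper's explicit CI expressions:

```python
import numpy as np

def ecx(ec50, b, x):
    """Effective dose for effect fraction x under a log-logistic (Hill)
    model E(D) = D**b / (ec50**b + D**b), inverted algebraically:
    ECx = EC50 * (x / (1 - x))**(1/b)."""
    return ec50 * (x / (1.0 - x)) ** (1.0 / b)

def ecx_ci(ec50, b, se_ec50, se_b, x, n=20000, level=0.95, seed=0):
    """Parametric Monte Carlo CI for ECx, assuming the fitted parameters
    are approximately normal and independent (an illustrative stand-in for
    an analytically derived interval)."""
    rng = np.random.default_rng(seed)
    ec50_s = rng.normal(ec50, se_ec50, n)
    b_s = rng.normal(b, se_b, n)
    keep = (ec50_s > 0) & (b_s > 0)   # discard non-physical draws
    draws = ecx(ec50_s[keep], b_s[keep], x)
    q = 50 * (1 - level)
    lo, hi = np.percentile(draws, [q, 100 - q])
    return lo, hi
```

Unlike a NOEC, an EC10 computed this way comes with an explicit interval, e.g. `ecx_ci(10.0, 2.0, 0.5, 0.1, 0.1)` brackets the point estimate `ecx(10.0, 2.0, 0.1)`.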
Schlötterer, C; Kofler, R; Versace, E; Tobler, R; Franssen, S U
2015-05-01
Evolve and resequence (E&R) is a new approach to investigate the genomic responses to selection during experimental evolution. By using whole genome sequencing of pools of individuals (Pool-Seq), this method can identify selected variants in controlled and replicable experimental settings. Reviewing the current state of the field, we show that E&R can be powerful enough to identify causative genes and possibly even single-nucleotide polymorphisms. We also discuss how the experimental design and the complexity of the trait could result in a large number of false positive candidates. We suggest experimental and analytical strategies to maximize the power of E&R to uncover the genotype-phenotype link and serve as an important research tool for a broad range of evolutionary questions.
The Role of the New mTOR Complex, mTORC2, in Autism Spectrum Disorders
2016-10-01
(Fragmentary report excerpt.) Figure legend: a, schematic of experimental design; b, for contextual fear conditioning, freezing times were recorded 24 hr after training in Pten fb-KO mice. In fb-DKO mice the opposite is true: mTORC2 activity remains up-regulated but mTORC1 activity is normalized (Fig. 1).
Heat transfer correlations for multilayer insulation systems
NASA Astrophysics Data System (ADS)
Krishnaprakas, C. K.; Badari Narayana, K.; Dutta, Pradip
2000-01-01
Multilayer insulation (MLI) blankets are extensively used in spacecraft as lightweight thermal protection systems. Heat transfer analysis of MLI is sometimes too complex to use in practical design applications. Hence, for practical engineering design purposes, it is necessary to have simpler procedures to evaluate the heat transfer rate through MLI. In this paper, four different empirical models for heat transfer are evaluated by fitting against experimentally observed heat flux through MLI blankets of various configurations, and the results are discussed.
A theoretical and experimental technique to measure fracture properties in viscoelastic solids
NASA Astrophysics Data System (ADS)
Freitas, Felipe Araujo Colares De
Prediction of crack growth in engineering structures is necessary for better analysis and design. However, this prediction becomes quite complex for certain materials in which the fracture behavior is both rate and path dependent. Asphaltic materials used in pavements have that intrinsic complexity in their behavior. Considerable research effort has been devoted to better understanding viscoelastic behavior and fracture in such materials. This dissertation presents a further refinement of an experimental test setup, which is significantly different from standard testing protocols, to measure viscoelastic and fracture properties of nonlinear viscoelastic solids, such as asphaltic materials. The results presented herein are primarily for experiments with asphalt, but the test procedure can be used for other viscoelastic materials as well. Even though the test is designed as a fracture test, experiments on the investigated materials have uncovered very complex phenomena prior to fracture. Viscoelasticity and micromechanics are used to explain some of the physical phenomena observed in the tests. The material behavior prior to fracture includes both viscoelastic behavior and a necking effect, which is further discussed in the appendix of the present study. The dissertation outlines a theoretical model for the prediction of tractions ahead of the crack tip. The major contribution herein lies in the development of the experimental procedure for evaluating the material parameters necessary for deploying the model in the prediction of ductile crack growth. Finally, predictions of crack growth in double cantilever beam specimens and asphalt concrete samples are presented in order to demonstrate the power of this approach for predicting crack growth in viscoelastic media.
A computational proposal for designing structured RNA pools for in vitro selection of RNAs.
Kim, Namhee; Gan, Hin Hark; Schlick, Tamar
2007-04-01
Although in vitro selection technology is a versatile experimental tool for discovering novel synthetic RNA molecules, finding complex RNA molecules is difficult because most RNAs identified from random sequence pools are simple motifs, consistent with recent computational analysis of such sequence pools. Thus, enriching in vitro selection pools with complex structures could increase the probability of discovering novel RNAs. Here we develop an approach for engineering sequence pools that links RNA sequence space regions with corresponding structural distributions via a "mixing matrix" approach combined with a graph theory analysis. We define five classes of mixing matrices motivated by covariance mutations in RNA; these constructs define nucleotide transition rates and are applied to chosen starting sequences to yield specific nonrandom pools. We examine the coverage of sequence space as a function of the mixing matrix and starting sequence via clustering analysis. We show that, in contrast to random sequences, which are associated only with a local region of sequence space, our designed pools, including a structured pool for GTP aptamers, can target specific motifs. It follows that experimental synthesis of designed pools can benefit from using optimized starting sequences, mixing matrices, and pool fractions associated with each of our constructed pools as a guide. Automation of our approach could provide practical tools for pool design applications for in vitro selection of RNAs and related problems.
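A minimal Python sketch of the mixing-matrix idea: each position of a starting sequence is resampled from the row of a 4x4 nucleotide transition matrix corresponding to the seed base, yielding a structured (nonrandom) pool centred on the seed. The matrix values and seed sequence below are illustrative assumptions, not the authors' constructs:

```python
import random

BASES = "ACGU"

def mutate_pool(seed_seq, mix, n_seqs, seed=0):
    """Sample a pool of sequences from a seed sequence: position by position,
    the new base is drawn from the row of the 4x4 mixing matrix `mix` that
    corresponds to the seed base (rows/columns ordered A, C, G, U)."""
    rng = random.Random(seed)
    pool = []
    for _ in range(n_seqs):
        seq = "".join(
            rng.choices(BASES, weights=mix[BASES.index(base)])[0]
            for base in seed_seq
        )
        pool.append(seq)
    return pool

# Illustrative mixing matrix: 85% chance of retaining the seed base, with
# mutations spread evenly over the other three bases.
mix = [
    [0.85, 0.05, 0.05, 0.05],
    [0.05, 0.85, 0.05, 0.05],
    [0.05, 0.05, 0.85, 0.05],
    [0.05, 0.05, 0.05, 0.85],
]
pool = mutate_pool("GGGAUACCC", mix, 100)
```

Tuning the off-diagonal weights (e.g. biasing toward covariance-preserving substitutions) is what lets a designed pool concentrate on a target structural motif rather than diffusing over all of sequence space.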
Biological basis for space-variant sensor design I: parameters of monkey and human spatial vision
NASA Astrophysics Data System (ADS)
Rojer, Alan S.; Schwartz, Eric L.
1991-02-01
Biological sensor design has long provided inspiration for sensor design in machine vision. However, relatively little attention has been paid to the actual design parameters provided by biological systems, as opposed to the general nature of biological vision architectures. In the present paper we provide a review of current knowledge of primate spatial vision design parameters and present recent experimental and modeling work from our lab which demonstrates that a numerical conformal mapping, a refinement of our previous complex logarithmic model, provides the best current summary of this feature of the primate visual system. In particular, we review experimental and modeling studies which indicate that: (1) the global spatial architecture of primate visual cortex is well summarized by a numerical conformal mapping whose simplest analytic approximation is the complex logarithm function; (2) the columnar sub-structure of primate visual cortex can be well summarized by a model based on band-pass filtered white noise. We also refer to ongoing work in our lab which demonstrates that the joint columnar/map structure of primate visual cortex can be modeled and summarized in terms of a new algorithm, the 'proto-column' algorithm. This work provides a reference point for current engineering approaches to novel architectures for
Experimental Investigation of Fibre Reinforced Composite Materials Under Impact Load
NASA Astrophysics Data System (ADS)
Koppula, Sravani; Kaviti, Ajay kumar; Namala, Kiran kumar
2018-03-01
Composite materials are extensively used in various engineering applications. They offer very high design flexibility, allowing material properties to be tailored by laminating composite fibres with resin reinforcement. Complex failure conditions prevail in composite materials under the action of impact loads; major failure modes in composites include matrix cracking, fibre-matrix de-bonding, fibre breakage, and de-lamination between composite plies. This paper describes the mechanical properties of glass fibre reinforced composite material under impact loading conditions through an experimental setup. Experimental tests are performed according to ASTM standards using impact testing machines such as the Charpy tester and a computerized universal testing machine.
Gene Profiling in Experimental Models of Eye Growth: Clues to Myopia Pathogenesis
Stone, Richard A.; Khurana, Tejvir S.
2010-01-01
To understand the complex regulatory pathways that underlie the development of refractive errors, expression profiling has evaluated gene expression in ocular tissues of well-characterized experimental models that alter postnatal eye growth and induce refractive errors. Derived from a variety of platforms (e.g. differential display, spotted microarrays or Affymetrix GeneChips), gene expression patterns are now being identified in species that include chicken, mouse and primate. Reconciling available results is hindered by varied experimental designs and analytical/statistical features. Continued application of these methods offers promise to provide the much-needed mechanistic framework to develop therapies to normalize refractive development in children. PMID:20363242
Single-stranded DNA and RNA origami.
Han, Dongran; Qi, Xiaodong; Myhrvold, Cameron; Wang, Bei; Dai, Mingjie; Jiang, Shuoxing; Bates, Maxwell; Liu, Yan; An, Byoungkwon; Zhang, Fei; Yan, Hao; Yin, Peng
2017-12-15
Self-folding of an information-carrying polymer into a defined structure is foundational to biology and offers attractive potential as a synthetic strategy. Although multicomponent self-assembly has produced complex synthetic nanostructures, unimolecular folding has seen limited progress. We describe a framework to design and synthesize a single DNA or RNA strand to self-fold into a complex yet unknotted structure that approximates an arbitrary user-prescribed shape. We experimentally construct diverse multikilobase single-stranded structures, including a ~10,000-nucleotide (nt) DNA structure and a ~6000-nt RNA structure. We demonstrate facile replication of the strand in vitro and in living cells. The work here thus establishes unimolecular folding as a general strategy for constructing complex and replicable nucleic acid nanostructures, and expands the design space and material scalability for bottom-up nanotechnology. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
Bacterial flagella and Type III secretion: case studies in the evolution of complexity.
Pallen, M J; Gophna, U
2007-01-01
Bacterial flagella at first sight appear uniquely sophisticated in structure, so much so that they have even been considered 'irreducibly complex' by the intelligent design movement. However, a more detailed analysis reveals that these remarkable pieces of molecular machinery are the product of processes that are fully compatible with Darwinian evolution. In this chapter we present evidence for such processes, based on a review of experimental studies, molecular phylogeny and microbial genomics. Several processes have played important roles in flagellar evolution: self-assembly of simple repeating subunits, gene duplication with subsequent divergence, recruitment of elements from other systems ('molecular bricolage'), and recombination. We also discuss additional tentative new assignments of homology (FliG with MgtE, FliO with YscJ). In conclusion, rather than providing evidence of intelligent design, flagellar and non-flagellar Type III secretion systems instead provide excellent case studies in the evolution of complex systems from simpler components.
Modelling and identification for control of gas bearings
NASA Astrophysics Data System (ADS)
Theisen, Lukas R. S.; Niemann, Hans H.; Santos, Ilmar F.; Galeazzi, Roberto; Blanke, Mogens
2016-03-01
Gas bearings are popular for their high speed capabilities, low friction and clean operation, but suffer from poor damping, which poses challenges for safe operation in presence of disturbances. Feedback control can achieve enhanced damping but requires low complexity models of the dominant dynamics over its entire operating range. Models from first principles are complex and sensitive to parameter uncertainty. This paper presents an experimental technique for "in situ" identification of a low complexity model of a rotor-bearing-actuator system and demonstrates identification over relevant ranges of rotational speed and gas injection pressure. This is obtained using parameter-varying linear models that are found to capture the dominant dynamics. The approach is shown to be easily applied and to suit subsequent control design. Based on the identified models, decentralised proportional control is designed and shown to obtain the required damping in theory and in a laboratory test rig.
HERMIES-3: A step toward autonomous mobility, manipulation, and perception
NASA Technical Reports Server (NTRS)
Weisbin, C. R.; Burks, B. L.; Einstein, J. R.; Feezell, R. R.; Manges, W. W.; Thompson, D. H.
1989-01-01
HERMIES-III is an autonomous robot comprised of a seven degree-of-freedom (DOF) manipulator designed for human scale tasks, a laser range finder, a sonar array, an omni-directional wheel-driven chassis, multiple cameras, and a dual computer system containing a 16-node hypercube expandable to 128 nodes. The current experimental program involves performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The environment in which the robots operate has been designed to include multiple valves, pipes, meters, obstacles on the floor, valves occluded from view, and multiple paths of differing navigation complexity. The ongoing research program supports the development of autonomous capability for HERMIES-IIB and III to perform complex navigation and manipulation under time constraints, while dealing with imprecise sensory information.
Human Systems Integration at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
McCandless, Jeffrey
2017-01-01
The Human Systems Integration Division focuses on the design and operations of complex aerospace systems through analysis, experimentation and modeling. With over a dozen labs and over 120 people, the division conducts research to improve safety, efficiency and mission success. Areas of investigation include applied vision research which will be discussed during this seminar.
Two-Year Impacts of Opportunity NYC by Families' Likelihood of Earning Rewards
ERIC Educational Resources Information Center
Berg, Juliette; Morris, Pamela; Aber, J. Lawrence
2011-01-01
Experimental approaches can help disentangle the impacts of policies from the effects of individual characteristics, but the heterogeneity of implementation inherent in studies with complex program designs may mask average treatment impacts (Morris & Hendra, 2009). In the case of the Opportunity NYC-Family Rewards (ONYC-Family Rewards),…
Towards a rational design of ruthenium CO2 hydrogenation catalysts by Ab initio metadynamics.
Urakawa, Atsushi; Iannuzzi, Marcella; Hutter, Jürg; Baiker, Alfons
2007-01-01
Complete reaction pathways relevant to CO2 hydrogenation by using a homogeneous ruthenium dihydride catalyst ([Ru(dmpe)2H2], dmpe=Me2PCH2CH2PMe2) have been investigated by ab initio metadynamics. This approach has allowed reaction intermediates to be identified and free-energy profiles to be calculated, which provide new insights into the experimentally observed reaction pathway. Our simulations indicate that CO2 insertion, which leads to the formation of formate complexes, proceeds by a concerted insertion mechanism. It is a rapid and direct process with a relatively low activation barrier, which is in agreement with experimental observations. Subsequent H2 insertion into the formate--Ru complex, which leads to the formation of formic acid, instead occurs via an intermediate [Ru(eta2-H2)] complex in which the molecular hydrogen coordinates to the ruthenium center and interacts weakly with the formate group. This step has been identified as the rate-limiting step. The reaction completes by hydrogen transfer from the [Ru(eta2-H2)] complex to the formate oxygen atom, which forms a dihydrogen-bonded Ru--HHO(CHO) complex. The activation energy for the H2 insertion step is lower for the trans isomer than for the cis isomer. A simple measure of the catalytic activity was proposed based on the structure of the transition state of the identified rate-limiting step. From this measure, the relationship between catalysts with different ligands and their experimental catalytic activities can be explained.
designGG: an R-package and web tool for the optimal design of genetical genomics experiments.
Li, Yang; Swertz, Morris A; Vera, Gonzalo; Fu, Jingyuan; Breitling, Rainer; Jansen, Ritsert C
2009-06-18
High-dimensional biomolecular profiling of genetically different individuals in one or more environmental conditions is an increasingly popular strategy for exploring the functioning of complex biological systems. The optimal design of such genetical genomics experiments in a cost-efficient and effective way is not trivial. This paper presents designGG, an R package for designing optimal genetical genomics experiments. A web implementation for designGG is available at http://gbic.biol.rug.nl/designGG. All software, including source code and documentation, is freely available. DesignGG allows users to intelligently select and allocate individuals to experimental units and conditions such as drug treatment. The user can maximize the power and resolution of detecting genetic, environmental and interaction effects in a genome-wide or local mode by giving more weight to genome regions of special interest, such as previously detected phenotypic quantitative trait loci. This will help to achieve high power and more accurate estimates of the effects of interesting factors, and thus yield a more reliable biological interpretation of data. DesignGG is applicable to linkage analysis of experimental crosses, e.g. recombinant inbred lines, as well as to association analysis of natural populations.
Proteome Dynamics: Revisiting Turnover with a Global Perspective*
Claydon, Amy J.; Beynon, Robert
2012-01-01
Although bulk protein turnover has been measured with the use of stable isotope labeled tracers for over half a century, it is only recently that the same approach has become applicable to the level of the proteome, permitting analysis of the turnover of many proteins instead of single proteins or an aggregated protein pool. The optimal experimental design for turnover studies is dependent on the nature of the biological system under study, which dictates the choice of precursor label, protein pool sampling strategy, and treatment of data. In this review we discuss different approaches and, in particular, explore how complexity in experimental design and data processing increases as we shift from unicellular to multicellular systems, in particular animals. PMID:23125033
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2015-01-01
This report documents a case study on the application of Reliability Engineering techniques to achieve an optimal balance between performance and robustness by tuning the functional parameters of a complex non-linear control system. For complex systems with intricate and non-linear patterns of interaction between system components, analytical derivation of a mathematical model of system performance and robustness in terms of functional parameters may not be feasible or cost-effective. The demonstrated approach is simple, structured, effective, repeatable, and cost and time efficient. This general approach is suitable for a wide range of systems.
Elegant Gaussian beams for enhanced optical manipulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alpmann, Christina, E-mail: c.alpmann@uni-muenster.de; Schöler, Christoph; Denz, Cornelia
2015-06-15
Generation of micro- and nanostructured complex light beams is attaining increasing importance in photonics and laser applications. In this contribution, we demonstrate the implementation and experimental realization of the relatively unknown but highly versatile class of complex-valued Elegant Hermite- and Laguerre-Gaussian beams. These beams create higher trapping forces than standard Gaussian light fields due to their propagation-changing properties. We demonstrate optical trapping and alignment of complex functional particles as nanocontainers with standard and Elegant Gaussian light beams. Elegant Gaussian beams will inspire manifold applications in optical manipulation, direct laser writing, or microscopy, where the design of the point-spread function is relevant.
Linear control theory for gene network modeling.
Shin, Yong-Jun; Bleris, Leonidas
2010-09-16
Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through several case studies, including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide a rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) representations can be used to reliably predict the properties and transient behavior of complex network topologies, and point to specific design strategies for synthetic networks.
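As a concrete illustration of the state-space (time domain) approach described above, the following sketch simulates a hypothetical two-gene cascade as a linear system. The rate constants and the Euler integration scheme are illustrative choices, not taken from the paper.

```python
import numpy as np

# Hypothetical two-gene cascade written as a linear state-space model:
#   x1' = u - a1*x1        (gene 1 product, driven by input u)
#   x2' = k*x1 - a2*x2     (gene 2 product, driven by gene 1)
# Rate constants below are illustrative.
a1, a2, k = 1.0, 0.5, 2.0
A = np.array([[-a1, 0.0],
              [k, -a2]])
B = np.array([1.0, 0.0])

def simulate(u, t_end=50.0, dt=1e-3):
    """Forward-Euler integration of x' = A x + B u with constant input."""
    x = np.zeros(2)
    for _ in range(int(t_end / dt)):
        x = x + dt * (A @ x + B * u)
    return x

x_ss = simulate(u=1.0)
# analytic steady state: x1 = u/a1 = 1.0, x2 = k*x1/a2 = 4.0
print(np.round(x_ss, 3))
```

The same model written as a transfer function would expose the frequency-domain view the abstract mentions; here the time-domain simulation suffices to check the steady state against the analytic result.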
Conformational Transitions upon Ligand Binding: Holo-Structure Prediction from Apo Conformations
Seeliger, Daniel; de Groot, Bert L.
2010-01-01
Biological function of proteins is frequently associated with the formation of complexes with small-molecule ligands. Experimental structure determination of such complexes at atomic resolution, however, can be time-consuming and costly. Computational methods for structure prediction of protein/ligand complexes, particularly docking, are as yet restricted by their limited consideration of receptor flexibility, rendering them inapplicable for predicting protein/ligand complexes when large conformational changes of the receptor upon ligand binding are involved. Accurate receptor models in the ligand-bound state (holo structures), however, are a prerequisite for successful structure-based drug design. Hence, if only an unbound (apo) structure is available distinct from the ligand-bound conformation, structure-based drug design is severely limited. We present a method to predict the structure of protein/ligand complexes based solely on the apo structure, the ligand and the radius of gyration of the holo structure. The method is applied to ten cases in which proteins undergo structural rearrangements of up to 7.1 Å backbone RMSD upon ligand binding. In all cases, receptor models within 1.6 Å backbone RMSD to the target were predicted and close-to-native ligand binding poses were obtained for 8 of 10 cases in the top-ranked complex models. A protocol is presented that is expected to enable structure modeling of protein/ligand complexes and structure-based drug design for cases where crystal structures of ligand-bound conformations are not available. PMID:20066034
SUMOFLUX: A Generalized Method for Targeted 13C Metabolic Flux Ratio Analysis
Kogadeeva, Maria
2016-01-01
Metabolic fluxes are a cornerstone of cellular physiology that emerge from a complex interplay of enzymes, carriers, and nutrients. The experimental assessment of in vivo intracellular fluxes using stable isotopic tracers is essential if we are to understand metabolic function and regulation. Flux estimation based on 13C or 2H labeling relies on complex simulation and iterative fitting, processes that demand a level of expertise that ordinarily excludes the non-expert user. To overcome this, we have developed SUMOFLUX, a methodology that is broadly applicable to the targeted analysis of 13C-metabolic fluxes. By combining surrogate modeling and machine learning, we trained a predictor to specialize in estimating flux ratios from measurable 13C data. SUMOFLUX targets specific flux features individually, which makes it fast, user-friendly, applicable to experimental design, and robust to experimental noise and exchange flux magnitude. Collectively, we predict that SUMOFLUX's properties realistically pave the way to high-throughput flux analyses. PMID:27626798
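The surrogate-modeling-plus-machine-learning idea can be sketched in miniature: simulate labeling measurements from known flux ratios with a toy forward model, then train a predictor to invert the mapping. The label patterns and the linear least-squares "learner" below are stand-ins of my own; SUMOFLUX itself uses a full 13C simulator and a more capable regressor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a 13C label-propagation simulator: a metabolite pool is
# fed by two pathways with distinct (hypothetical) mass-isotopomer
# patterns, and the measurement is a flux-ratio-weighted mixture.
p_a = np.array([0.9, 0.1, 0.0])
p_b = np.array([0.2, 0.5, 0.3])

def simulate(ratio, noise=0.01):
    return ratio * p_a + (1.0 - ratio) * p_b + rng.normal(0.0, noise, size=3)

# Surrogate training set: sample flux ratios, simulate their measurements.
ratios = rng.uniform(0.0, 1.0, size=500)
X = np.array([simulate(r) for r in ratios])

# Minimal "learner": linear least squares from measured patterns to the
# flux ratio (a placeholder for the trained predictor in SUMOFLUX).
X1 = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(X1, ratios, rcond=None)

pred = float(np.append(simulate(0.7), 1.0) @ w)  # estimate for a new sample
print(round(pred, 3))
```

The point of the construction is that the expensive simulator is only run offline to build the training set; at measurement time, flux-ratio estimation reduces to a cheap prediction.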
Experimental Simulations of Lunar Magma Ocean Crystallization: The Plot (But Not the Crust) Thickens
NASA Technical Reports Server (NTRS)
Draper, D. S.; Rapp, J. F.; Elardo, S. M.; Shearer, C. K., Jr.; Neal, C. R.
2016-01-01
Numerical models of differentiation of a global-scale lunar magma ocean (LMO) have raised as many questions as they have answered. Recent orbital missions and sample studies have provided new context for a large range of lithologies, from the comparatively magnesian "purest anorthosite" to Si-rich domes and spinel-rich clasts with widespread areal distributions. In addition, the GRAIL mission provided strong constraints on lunar crustal density and average thickness. Can this increasingly complex geology be accounted for via the formation and evolution of the LMO? We have in recent years been conducting extensive sets of petrologic experiments designed to fully simulate LMO crystallization, which had not been attempted previously. Here we review the key results from these experiments, which show that LMO differentiation is more complex than initial models suggested. Several important features expected from LMO crystallization models have yet to be reproduced experimentally; combined modelling and experimental work by our group is ongoing.
Experimental observations of a complex, supersonic nozzle concept
NASA Astrophysics Data System (ADS)
Magstadt, Andrew; Berry, Matthew; Glauser, Mark; Ruscher, Christopher; Gogineni, Sivaram; Kiel, Barry; Skytop Turbulence Labs, Syracuse University Team; Spectral Energies, LLC. Team; Air Force Research Laboratory Team
2015-11-01
A complex nozzle concept, which fuses multiple canonical flows together, has been experimentally investigated via pressure, schlieren and PIV in the anechoic chamber at Syracuse University. Motivated by future engine designs of high-performance aircraft, the rectangular, supersonic jet under investigation has a single plane of symmetry, an additional shear layer (referred to as a wall jet) and an aft deck representative of airframe integration. Operating near a Reynolds number of 3 × 10^6, the nozzle architecture creates an intricate flow field comprised of high turbulence levels, shocks, shear and boundary layers, and powerful corner vortices. Current data suggest that the wall jet, which is an order of magnitude less energetic than the core, has significant control authority over the acoustic power through some non-linear process. As sound is a direct product of turbulence, experimental and analytical efforts further explore this interesting phenomenon associated with the turbulent flow. The authors acknowledge the funding source, a SBIR Phase II project with Spectral Energies, LLC. and AFRL turbine engine branch under the direction of Dr. Barry Kiel.
Pessêgo, Márcia; Basílio, Nuno; Muñiz, M Carmen; García-Río, Luis
2016-07-06
Counterion competitive complexation is a background process that is currently ignored in studies employing ionic hosts. Consequently, guest binding constants are strongly affected by the design of the titration experiments, in such a way that the results depend on the guest concentration and on the presence of added salts, usually buffers. In the present manuscript we show that these experimental difficulties can be overcome simply by taking the counterion competitive complexation into account. Moreover, a single titration allows us to obtain not only the true binding constants but also the stoichiometry of the complex, showing the formation of 1 : 1 : 1 (host : guest : counterion) complexes. The detection of high-stoichiometry complexes is not restricted to a single titration experiment; it is also possible in a displacement assay where both competitive and competitive-cooperative complexation models are taken into consideration.
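The distortion described above, an apparent binding constant that depends on the counterion, can be reproduced with a minimal equilibrium model. The association constants and concentrations below are hypothetical, and the 1 : 1 : 1 ternary complex of the paper is omitted to keep the sketch small.

```python
# Competitive 1:1 binding equilibria for host H, guest G and counterion X:
#   H + G <-> HG (K_G)      H + X <-> HX (K_X)
# All constants and concentrations are illustrative.
K_G, K_X = 1.0e4, 1.0e3   # association constants, M^-1

def free_host(H0, G0, X0):
    """Solve the host mass balance for free [H] by bisection."""
    def residual(h):
        g = G0 / (1.0 + K_G * h)        # free guest given free host h
        x = X0 / (1.0 + K_X * h)        # free counterion
        return h + K_G * h * g + K_X * h * x - H0
    lo, hi = 0.0, H0                    # residual is monotone in h
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def apparent_KG(H0, G0, X0):
    """Binding constant an experimenter would fit while ignoring X."""
    h = free_host(H0, G0, X0)
    g = G0 / (1.0 + K_G * h)
    hg = K_G * h * g
    return hg / ((H0 - hg) * (G0 - hg))

app0 = apparent_KG(1e-4, 1e-4, 0.0)     # no counterion: recovers K_G
app_x = apparent_KG(1e-4, 1e-4, 1e-2)   # 10 mM competing counterion
print(round(app0), round(app_x))
```

With no counterion the fitted constant equals the true K_G; adding a competing counterion depresses the apparent constant by roughly an order of magnitude in this toy setting, which is the effect the paper corrects for.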
Deciphering assumptions about stepped wedge designs: the case of Ebola vaccine research.
Doussau, Adélaïde; Grady, Christine
2016-12-01
Ethical concerns about randomising persons to a no-treatment arm in the context of the Ebola epidemic led to consideration of alternative designs. The stepped wedge (SW) design, in which participants or clusters are randomised to receive an intervention at different time points, gained popularity. Common arguments in favour of using this design are (1) that an intervention is likely to do more good than harm, (2) that all participants should receive the experimental intervention at some time point during the study and (3) that the design might be preferable for practical reasons. We examine these assumptions in the setting of Ebola vaccine research. First, based on the claim that a stepped wedge design is indicated when it is likely that the intervention will do more good than harm, we reviewed published and ongoing SW trials to explore previous use of this design to test experimental drugs or vaccines, and found that the SW design has never been used for trials of experimental drugs or vaccines. Given that Ebola vaccines were all experimental with no prior efficacy data, the use of a stepped wedge design would have been unprecedented. Second, we show that it is rarely true that all participants receive the intervention in SW studies; rather, depending on certain design features, all clusters receive the intervention. Third, we explore whether the SW design is appealing for feasibility reasons and point out that there is significant complexity. In the setting of the Ebola epidemic, spatiotemporal variation may have posed substantial challenges to a stepped wedge design for vaccine research. Finally, we propose a set of points to consider for scientific reviewers and ethics committees regarding proposals for SW designs. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
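The design feature discussed in the second point, that all clusters (not all participants) eventually receive the intervention, is easy to see from a toy stepped-wedge schedule. Cluster and step counts below are illustrative, not from any Ebola trial.

```python
import random

# Minimal stepped-wedge schedule: clusters are randomised to the step at
# which they cross from control (0) to intervention (1).
def stepped_wedge(n_clusters=6, n_steps=3, seed=1):
    per_step = n_clusters // n_steps
    order = list(range(n_clusters))
    random.Random(seed).shuffle(order)       # randomised crossover order
    schedule = {}
    for i, cluster in enumerate(order):
        cross = 1 + i // per_step            # step at which this cluster crosses
        # one row per cluster: exposure status at each measurement period
        schedule[cluster] = [0] * cross + [1] * (n_steps + 1 - cross)
    return schedule

schedule = stepped_wedge()
for cluster, row in sorted(schedule.items()):
    print(cluster, row)
```

Every row starts unexposed and ends exposed, so every cluster receives the intervention by the final period; individual participants measured only in early periods of a late-crossing cluster never do, which is exactly the distinction the authors draw.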
Improved Modeling of Side-Chain–Base Interactions and Plasticity in Protein–DNA Interface Design
Thyme, Summer B.; Baker, David; Bradley, Philip
2012-01-01
Combinatorial sequence optimization for protein design requires libraries of discrete side-chain conformations. The discreteness of these libraries is problematic, particularly for long, polar side chains, since favorable interactions can be missed. Previously, an approach to loop remodeling where protein backbone movement is directed by side-chain rotamers predicted to form interactions previously observed in native complexes (termed “motifs”) was described. Here, we show how such motif libraries can be incorporated into combinatorial sequence optimization protocols and improve native complex recapitulation. Guided by the motif rotamer searches, we made improvements to the underlying energy function, increasing recapitulation of native interactions. To further test the methods, we carried out a comprehensive experimental scan of amino acid preferences in the I-AniI protein–DNA interface and found that many positions tolerated multiple amino acids. This sequence plasticity is not observed in the computational results because of the fixed-backbone approximation of the model. We improved modeling of this diversity by introducing DNA flexibility and reducing the convergence of the simulated annealing algorithm that drives the design process. In addition to serving as a benchmark, this extensive experimental data set provides insight into the types of interactions essential to maintain the function of this potential gene therapy reagent. PMID:22426128
BμG@Sbase—a microbial gene expression and comparative genomic database
Witney, Adam A.; Waldron, Denise E.; Brooks, Lucy A.; Tyler, Richard H.; Withers, Michael; Stoker, Neil G.; Wren, Brendan W.; Butcher, Philip D.; Hinds, Jason
2012-01-01
The reducing cost of high-throughput functional genomic technologies is creating a deluge of high-volume, complex data, placing the burden on bioinformatics resources and tool development. The Bacterial Microarray Group at St George's (BμG@S) has been at the forefront of bacterial microarray design and analysis for over a decade and, while serving as a hub of a global network of microbial research groups, has developed BμG@Sbase, a microbial gene expression and comparative genomic database. BμG@Sbase (http://bugs.sgul.ac.uk/bugsbase/) is a web-browsable, expertly curated, MIAME-compliant database that stores comprehensive experimental annotation and multiple raw and analysed data formats. Consistent annotation is enabled through a structured set of web forms, which guide the user through the process following a set of best practices and controlled vocabulary. The database currently contains 86 expertly curated publicly available data sets (with a further 124 not yet published) and full annotation information for 59 bacterial microarray designs. The data can be browsed and queried using an explorer-like interface, integrating intuitive tree diagrams to present complex experimental details clearly and concisely. Furthermore, the modular design of the database will provide a robust platform for integrating other data types beyond microarrays into a more systems-analysis-based future. PMID:21948792
Interactive orbital proximity operations planning system
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Ellis, Stephen R.
1988-01-01
An interactive graphical proximity operations planning system was developed, which allows on-site design of efficient, complex, multiburn maneuvers in a dynamic multispacecraft environment. Maneuvering takes place in and out of the orbital plane. The difficulty in planning such missions results from the unusual and counterintuitive character of orbital dynamics and complex time-varying operational constraints. This difficulty is greatly overcome by visualizing the relative trajectories and the relevant constraints in an easily interpretable graphical format, which provides the operator with immediate feedback on design actions. The display shows a perspective bird's-eye view of a Space Station and co-orbiting spacecraft on the background of the Station's orbital plane. The operator has control over the two modes of operation: a viewing system mode, which enables the exploration of the spatial situation about the Space Station and thus the ability to choose and zoom in on areas of interest; and a trajectory design mode, which allows the interactive editing of a series of way points and maneuvering burns to obtain a trajectory that complies with all operational constraints. A first version of this display was completed. An experimental program is planned in which operators will carry out a series of design missions which vary in complexity and constraints.
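The relative-motion dynamics underlying such a planning display are conventionally modeled with the Clohessy-Wiltshire (Hill) equations. The sketch below solves a single-burn in-plane transfer using their closed-form state-transition matrices; the orbit rate, offsets and transfer time are illustrative, and the actual planning system described above is not reproduced.

```python
import numpy as np

# Clohessy-Wiltshire (Hill) equations: linearised relative motion about a
# circular orbit, the standard dynamics for proximity-operations planning.
n = 0.0011  # chief mean motion, rad/s (roughly a ~95-minute LEO orbit)

def cw_matrices(t):
    """In-plane position-to-position and velocity-to-position blocks of
    the CW state-transition matrix (x radial, y along-track)."""
    s, c = np.sin(n * t), np.cos(n * t)
    Prr = np.array([[4.0 - 3.0 * c, 0.0],
                    [6.0 * (s - n * t), 1.0]])
    Prv = np.array([[s / n, 2.0 * (1.0 - c) / n],
                    [2.0 * (c - 1.0) / n, (4.0 * s - 3.0 * n * t) / n]])
    return Prr, Prv

def intercept_velocity(r0, rf, t):
    """Velocity needed immediately after a burn to reach rf in time t."""
    Prr, Prv = cw_matrices(t)
    return np.linalg.solve(Prv, rf - Prr @ r0)

r0 = np.array([0.0, -1000.0])   # start 1 km behind the station (metres)
rf = np.array([0.0, -100.0])    # target: arrive 100 m behind
t = 1800.0                      # half-hour transfer
v0 = intercept_velocity(r0, rf, t)

Prr, Prv = cw_matrices(t)
final = Prr @ r0 + Prv @ v0     # propagate the post-burn state forward
print(np.round(final, 3))
```

A planner of the kind described would chain several such solves, one per way point, and check each leg against the operational constraints.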
Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao
2009-01-01
Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have advanced methodologies for the analysis of DNA, RNA, proteins, and low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlight various sources of experimental variation in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provide guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine growth retardation). PMID:20233650
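One of the listed topics, sample size calculation, can be illustrated with the standard normal-approximation formula for comparing two group means. The effect sizes below are the conventional "medium" and "small" benchmarks, not values from any particular nutrition study.

```python
import math
from statistics import NormalDist

# Normal-approximation sample size for comparing two group means:
#   n per group = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
# where d is the standardised effect size (Cohen's d).
def n_per_group(effect_size, alpha=0.05, power=0.80):
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2
    return math.ceil(n)

print(n_per_group(0.5))   # "medium" effect -> 63 per group
print(n_per_group(0.2))   # "small" effect  -> 393 per group
```

The quadratic dependence on 1/d is the practical point: halving the detectable effect size roughly quadruples the required sample, which is why power analysis belongs at the design stage rather than after data collection.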
Proposal for probing energy transfer pathway by single-molecule pump-dump experiment.
Tao, Ming-Jie; Ai, Qing; Deng, Fu-Guo; Cheng, Yuan-Chung
2016-06-09
The structure of the Fenna-Matthews-Olson (FMO) light-harvesting complex had long been recognized as containing seven bacteriochlorophyll (BChl) molecules. Recently, an additional BChl molecule was discovered in the crystal structure of the FMO complex; it may serve as a link between the baseplate and the remaining seven molecules. Here, we investigate the excitation energy transfer (EET) process by simulating a single-molecule pump-dump experiment in the eight-molecule complex. We adopt the coherent modified Redfield theory and the non-Markovian quantum jump method to simulate EET dynamics. This scheme provides a practical approach for detecting the realistic EET pathway in BChl complexes with currently available experimental technology, and it may assist in optimizing the design of artificial light-harvesting devices.
Materials-by-design: computation, synthesis, and characterization from atoms to structures
NASA Astrophysics Data System (ADS)
Yeo, Jingjie; Jung, Gang Seob; Martín-Martínez, Francisco J.; Ling, Shengjie; Gu, Grace X.; Qin, Zhao; Buehler, Markus J.
2018-05-01
In the 50 years since Richard Feynman’s exposition of the idea that there is ‘plenty of room at the bottom’ for manipulating individual atoms for the synthesis and manufacturing processing of materials, the materials-by-design paradigm has been developed gradually through synergistic integration of experimental material synthesis and characterization with predictive computational modeling and optimization. This paper reviews how this paradigm makes it possible to develop materials according to specific, rational designs from the molecular to the macroscopic scale. We discuss promising techniques in experimental small-scale material synthesis and large-scale fabrication methods to manipulate atomistic or macroscale structures, which can be designed by computational modeling. These include recombinant protein technology to produce peptides and proteins with tailored sequences encoded by recombinant DNA, self-assembly processes induced by conformational transition of proteins, additive manufacturing for designing complex structures, and qualitative and quantitative characterization of materials at different length scales. We describe important material characterization techniques using numerous methods of spectroscopy and microscopy. We detail numerous multi-scale computational modeling techniques that complement these experimental techniques: density functional theory at the atomistic scale; fully atomistic and coarse-grained molecular dynamics at the molecular to mesoscale; continuum modeling at the macroscale. Additionally, we present case studies that utilize experimental and computational approaches in an integrated manner to broaden our understanding of the properties of two-dimensional materials and materials based on silk and silk-elastin-like proteins.
[Radiotherapy phase I trials' methodology: Features].
Rivoirard, R; Vallard, A; Langrand-Escure, J; Guy, J-B; Ben Mrad, M; Yaoxiong, X; Diao, P; Méry, B; Pigne, G; Rancoule, C; Magné, N
2016-12-01
In clinical research, biostatistical methods allow the rigorous analysis of data collection and should be defined from the trial design stage to obtain the appropriate experimental approach. Thus, if the main purpose of phase I is to determine the dose to use during phase II, the methodology should be finely adjusted to the experimental treatment(s). Today, the methodology for chemotherapy and targeted therapy is well known. For radiotherapy and chemoradiotherapy phase I trials, the primary endpoint must reflect both effectiveness and potential treatment toxicities, and the methodology will probably need to be more elaborate to limit failures in the subsequent phases. However, there are very few data about methodology design in the literature. The present study focuses on these particular trials and their characteristics. It should help to highlight the shortcomings of existing methodological patterns and to propose new and better-suited designs. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
Bomboi, Francesca; Romano, Flavio; Leo, Manuela; Fernandez-Castanon, Javier; Cerbino, Roberto; Bellini, Tommaso; Bordi, Federico; Filetici, Patrizia; Sciortino, Francesco
2016-01-01
DNA is acquiring a primary role in material development, self-assembling by design into complex supramolecular aggregates, the building blocks of a new-materials world. Using DNA nanoconstructs to translate sophisticated theoretical intuitions into experimental realizations by closely matching idealized models of colloidal particles is a much less explored avenue. Here we experimentally show that an appropriate selection of competing interactions enciphered in multiple DNA sequences results in the successful design of a one-pot DNA hydrogel that melts both on heating and on cooling. The relaxation time, measured by light scattering, slows down dramatically in a limited window of temperatures. The phase diagram displays a peculiar re-entrant shape, the hallmark of the competition between different bonding patterns. Our study shows that it is possible to rationally design biocompatible bulk materials with unconventional phase diagrams and tuneable properties by encoding into DNA sequences both the particle shape and the physics of the collective response. PMID:27767029
Experimental and computational studies of electromagnetic cloaking at microwaves
NASA Astrophysics Data System (ADS)
Wang, Xiaohui
An invisibility cloak is a device that can hide the target by enclosing it from the incident radiation. This intriguing device has attracted a lot of attention since it was first implemented at a microwave frequency in 2006. However, the problems of existing cloak designs prevent them from being widely applied in practice. In this dissertation, we try to remove or alleviate the three constraints for practical applications imposed by lossy cloaking media, high implementation complexity, and small size of hidden objects compared to the incident wavelength. To facilitate cloaking design and experimental characterization, several devices and relevant techniques for measuring the complex permittivity of dielectric materials at microwave frequencies are developed. In particular, a unique parallel plate waveguide chamber has been set up to automatically map the electromagnetic (EM) field distribution for wave propagation through the resonator arrays and cloaking structures. The total scattering cross section of the cloaking structures was derived based on the measured scattering field by using this apparatus. To overcome the adverse effects of lossy cloaking media, microwave cloaks composed of identical dielectric resonators made of low loss ceramic materials are designed and implemented. The effective permeability dispersion was provided by tailoring dielectric resonator filling fractions. The cloak performances were verified by full-wave simulation of true multi-resonator structures and experimental measurements of the fabricated prototypes. With the aim to reduce the implementation complexity caused by metamaterials employment for cloaking, we proposed to design 2-D cylindrical cloaks and 3-D spherical cloaks by using multi-layer ordinary dielectric material (epsilon r>1) coating. Genetic algorithm was employed to optimize the dielectric profiles of the cloaking shells to provide the minimum scattering cross sections of the cloaked targets.
The designed cloaks can be easily scaled to various operating frequencies. The simulation results show that the multi-layer cylindrical cloak essentially outperforms the similarly sized metamaterial-based cloak designed by using the transformation-optics-based reduced parameters. For the designed spherical cloak, the simulated scattering pattern shows that the total scattering cross section is greatly reduced. In addition, the scattering in specific directions can be significantly reduced. It is shown that the cloaking efficiency for larger targets can be improved by employing lossy materials in the shell. Finally, we propose to hide a target inside a waveguide structure filled only with epsilon-near-zero materials, which are easy to implement in practice. The cloaking efficiency of this method, which was found to increase for large targets, has been confirmed both theoretically and by simulations.
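The genetic-algorithm optimization of multilayer dielectric profiles can be sketched as follows. A real design would score each candidate profile with a full-wave scattering computation; here a stand-in objective (distance from a hypothetical ideal permittivity profile) keeps the loop self-contained while still exercising the selection, crossover and mutation machinery.

```python
import random

random.seed(3)

# GA sketch for choosing layer permittivities of a multilayer cloak shell.
# The fitness function is a toy surrogate, NOT a scattering solver.
N_LAYERS = 8
TARGET = [1.0 + 0.5 * i for i in range(N_LAYERS)]   # hypothetical ideal profile

def fitness(profile):            # lower is better (toy "scattering")
    return sum((p - t) ** 2 for p, t in zip(profile, TARGET))

def random_profile():
    return [random.uniform(1.0, 6.0) for _ in range(N_LAYERS)]

def crossover(a, b):
    cut = random.randrange(1, N_LAYERS)
    return a[:cut] + b[cut:]

def mutate(p, rate=0.2, sigma=0.3):
    return [min(6.0, max(1.0, x + random.gauss(0.0, sigma)))
            if random.random() < rate else x for x in p]

pop = [random_profile() for _ in range(40)]
for _ in range(150):
    pop.sort(key=fitness)
    parents = pop[:10]           # truncation selection with elitism
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(30)]
    pop = parents + children

best = min(pop, key=fitness)
print(round(fitness(best), 3))
```

Because the elite parents are carried over unchanged, the best fitness is non-increasing across generations; swapping the toy objective for a solver call turns the same loop into the design procedure the dissertation describes.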
Engineering applications of metaheuristics: an introduction
NASA Astrophysics Data System (ADS)
Oliva, Diego; Hinojosa, Salvador; Demeshko, M. V.
2017-01-01
Metaheuristic algorithms are important tools that in recent years have been used extensively in several fields. In engineering, a large number of problems can be solved from an optimization point of view. This paper is an introduction to how metaheuristics can be used to solve complex engineering problems. Their use produces accurate results in problems that are computationally expensive. Experimental results support the performance obtained by the selected algorithms in such specific problems as digital filter design, image processing and solar cell design.
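As a small worked example of a metaheuristic applied to one of the named problems, digital filter design, the sketch below uses simulated annealing (one common metaheuristic; the paper's selected algorithms are not specified in the abstract) to fit a 5-tap FIR filter's magnitude response to an ideal low-pass template. All numerical choices are illustrative.

```python
import cmath
import math
import random

random.seed(7)

# Design target: ideal low-pass magnitude response sampled on a frequency grid.
FREQS = [i * math.pi / 32 for i in range(33)]
IDEAL = [1.0 if w <= math.pi / 4 else 0.0 for w in FREQS]

def response(h, w):
    """Magnitude of the FIR filter's frequency response at frequency w."""
    return abs(sum(hk * cmath.exp(-1j * w * k) for k, hk in enumerate(h)))

def cost(h):
    return sum((response(h, w) - d) ** 2 for w, d in zip(FREQS, IDEAL))

h = [0.2] * 5                      # start from a crude moving-average filter
c = c0 = cost(h)
best_h, best_c = h, c
T = 1.0
while T > 1e-3:                    # geometric cooling schedule
    cand = [hk + random.gauss(0.0, 0.05) for hk in h]
    cc = cost(cand)
    # accept improvements always, worse moves with Boltzmann probability
    if cc < c or random.random() < math.exp((c - cc) / T):
        h, c = cand, cc
        if c < best_c:
            best_h, best_c = h, c
    T *= 0.999

print(round(c0, 3), round(best_c, 3))
```

The acceptance of occasional uphill moves at high temperature is what distinguishes this from plain hill climbing and is the generic mechanism by which metaheuristics escape local optima in expensive engineering objectives.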
The Experimental Measurement of Aerodynamic Heating About Complex Shapes at Supersonic Mach Numbers
NASA Technical Reports Server (NTRS)
Neumann, Richard D.; Freeman, Delma C.
2011-01-01
In 2008 a wind tunnel test program was implemented to update the experimental data available for predicting protuberance heating at supersonic Mach numbers. For this test the Langley Unitary Wind Tunnel was also used. The significant differences for this current test were the advances in the state of the art in model design, fabrication techniques, instrumentation and data acquisition capabilities. This current paper provides a focused discussion of the results of an in-depth analysis of unique measurements of recovery temperature obtained during the test.
The Langley Research Center CSI phase-0 evolutionary model testbed-design and experimental results
NASA Technical Reports Server (NTRS)
Belvin, W. K.; Horta, Lucas G.; Elliott, K. B.
1991-01-01
A testbed for the development of Controls Structures Interaction (CSI) technology is described. The design philosophy, capabilities, and early experimental results are presented to introduce some of the ongoing CSI research at NASA-Langley. The testbed, referred to as the Phase 0 version of the CSI Evolutionary model (CEM), is the first stage of model complexity designed to show the benefits of CSI technology and to identify weaknesses in current capabilities. Early closed loop test results have shown non-model based controllers can provide an order of magnitude increase in damping in the first few flexible vibration modes. Model based controllers for higher performance will need to be robust to model uncertainty as verified by System ID tests. Data are presented that show finite element model predictions of frequency differ from those obtained from tests. Plans are also presented for evolution of the CEM to study integrated controller and structure design as well as multiple payload dynamics.
Kirigami artificial muscles with complex biologically inspired morphologies
NASA Astrophysics Data System (ADS)
Sareh, Sina; Rossiter, Jonathan
2013-01-01
In this paper we present bio-inspired smart structures which exploit the actuation of flexible ionic polymer composites and the kirigami design principle. Kirigami design is used to convert planar actuators into active 3D structures capable of large out-of-plane displacements that replicate biological mechanisms. Here we present the burstbot, a fluid control and propulsion mechanism based on the atrioventricular cuspid valve, and the vortibot, a spiral actuator based on Vorticella campanula, a ciliate protozoan. Models derived from biological counterparts are used as a platform for design optimization and actuator performance measurement. The symmetric and asymmetric fluid interactions of the burstbot are investigated and the effectiveness in fluid transport applications is demonstrated. The vortibot actuator is geometrically optimized as a camera positioner capable of 360° scanning. Experimental results for a one-turn spiral actuator show complex actuation derived from a single degree-of-freedom control signal.
Spectral-spatial classification of hyperspectral image using three-dimensional convolution network
NASA Astrophysics Data System (ADS)
Liu, Bing; Yu, Xuchu; Zhang, Pengqiang; Tan, Xiong; Wang, Ruirui; Zhi, Lu
2018-01-01
Recently, hyperspectral image (HSI) classification has become a focus of research. However, the complex structure of an HSI makes feature extraction difficult to achieve. Most current methods build classifiers based on complex handcrafted features computed from the raw inputs. The design of an improved 3-D convolutional neural network (3D-CNN) model for HSI classification is described. This model extracts features from both the spectral and spatial dimensions through the application of 3-D convolutions, thereby capturing the important discrimination information encoded in multiple adjacent bands. The designed model views the HSI cube data altogether without relying on any pre- or postprocessing. In addition, the model is trained in an end-to-end fashion without any handcrafted features. The designed model was applied to three widely used HSI datasets. The experimental results demonstrate that the 3D-CNN-based method outperforms conventional methods even with limited labeled training samples.
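The core operation of the 3D-CNN described above is a three-dimensional convolution spanning adjacent spectral bands as well as the spatial neighborhood. A minimal NumPy sketch of that operation (in the correlation form deep-learning frameworks use; the cube and kernel sizes are illustrative, not the paper's architecture):

```python
import numpy as np

def conv3d_valid(cube, kernel):
    """'Valid' 3-D convolution (no padding, stride 1) over a hyperspectral
    cube shaped (bands, height, width).  Each output value sums the
    elementwise product of the kernel with a band x row x column patch,
    capturing joint spectral-spatial structure."""
    kb, kh, kw = kernel.shape
    B, H, W = cube.shape
    out = np.zeros((B - kb + 1, H - kh + 1, W - kw + 1))
    for b in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                patch = cube[b:b + kb, i:i + kh, j:j + kw]
                out[b, i, j] = np.sum(patch * kernel)
    return out

cube = np.random.rand(8, 5, 5)      # 8 spectral bands, 5x5 spatial patch (illustrative)
kernel = np.random.rand(3, 3, 3)    # spans 3 adjacent bands
features = conv3d_valid(cube, kernel)  # shape (6, 3, 3)
```

Because the kernel extends across three bands, each feature responds to discrimination information encoded in multiple adjacent bands, which is the point the abstract makes about 3-D versus 2-D convolutions.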
Task Design Influences Prosociality in Captive Chimpanzees (Pan troglodytes)
House, Bailey R.; Silk, Joan B.; Lambeth, Susan P.; Schapiro, Steven J.
2014-01-01
Chimpanzees confer benefits on group members, both in the wild and in captive populations. Experimental studies of how animals allocate resources can provide useful insights about the motivations underlying prosocial behavior, and understanding the relationship between task design and prosocial behavior provides an important foundation for future research exploring these animals' social preferences. A number of studies have been designed to assess chimpanzees' preferences for outcomes that benefit others (prosocial preferences), but these studies vary greatly in both the results obtained and the methods used, and in most cases employ procedures that reduce critical features of naturalistic social interactions, such as partner choice. The focus of the current study is on understanding the link between experimental methodology and prosocial behavior in captive chimpanzees, rather than on describing these animals' social motivations themselves. We introduce a task design that avoids isolating subjects and allows them to freely decide whether to participate in the experiment. We explore key elements of the methods utilized in previous experiments in an effort to evaluate two possibilities that have been offered to explain why different experimental designs produce different results: (a) chimpanzees are less likely to deliver food to others when they obtain food for themselves, and (b) evidence of prosociality may be obscured by more “complex” experimental apparatuses (e.g., those including more components or alternative choices). Our results suggest that the complexity of laboratory tasks may generate the observed variation in prosocial behavior in laboratory experiments; they highlight the need for more naturalistic research designs, and our task provides one example of such a paradigm. PMID:25191860
Quantum computing gates via optimal control
NASA Astrophysics Data System (ADS)
Atia, Yosi; Elias, Yuval; Mor, Tal; Weinstein, Yossi
2014-10-01
We demonstrate the use of optimal control to design two entropy-manipulating quantum gates which are more complex than the corresponding, commonly used, gates, such as CNOT and Toffoli (CCNOT): A two-qubit gate called polarization exchange (PE) and a three-qubit gate called polarization compression (COMP) were designed using GRAPE, an optimal control algorithm. Both gates were designed for a three-spin system. Our design provided efficient and robust nuclear magnetic resonance (NMR) radio frequency (RF) pulses for 13C2-trichloroethylene (TCE), our chosen three-spin system. We then experimentally applied these two quantum gates onto TCE at the NMR lab. Such design of these gates and others could be relevant for near-future applications of quantum computing devices.
Computer modeling and simulation of human movement. Applications in sport and rehabilitation.
Neptune, R R
2000-05-01
Computer modeling and simulation of human movement plays an increasingly important role in sport and rehabilitation, with applications ranging from sport equipment design to understanding pathologic gait. The complex dynamic interactions within the musculoskeletal and neuromuscular systems make analyzing human movement with existing experimental techniques difficult, but computer modeling and simulation allow for the identification of these complex interactions and of causal relationships between input and output variables. This article provides an overview of computer modeling and simulation and presents an example application in the field of rehabilitation.
Toward the establishment of design guidelines for effective 3D perspective interfaces
NASA Astrophysics Data System (ADS)
Fitzhugh, Elisabeth; Dixon, Sharon; Aleva, Denise; Smith, Eric; Ghrayeb, Joseph; Douglas, Lisa
2009-05-01
The propagation of information operation technologies, with correspondingly vast amounts of complex network information to be conveyed, significantly impacts operator workload. Information management research is rife with efforts to develop schemes to aid operators to identify, review, organize, and retrieve the wealth of available data. Data may take on such distinct forms as intelligence libraries, logistics databases, operational environment models, or network topologies. Increased use of taxonomies and semantic technologies opens opportunities to employ network visualization as a display mechanism for diverse information aggregations. The broad applicability of network visualizations is still being tested, but in current usage, the complexity of densely populated abstract networks suggests the potential utility of 3D. Employment of 2.5D in network visualization, using classic perceptual cues, creates a 3D experience within a 2D medium. It is anticipated that use of 3D perspective (2.5D) will enhance user ability to visually inspect large, complex, multidimensional networks. Current research for 2.5D visualizations demonstrates that display attributes, including color, shape, size, lighting, atmospheric effects, and shadows, significantly impact operator experience. However, guidelines for utilization of attributes in display design are limited. This paper discusses pilot experimentation intended to identify potential problem areas arising from these cues and determine how best to optimize perceptual cue settings. Development of optimized design guidelines will ensure that future experiments, comparing network displays with other visualizations, are not confounded or impeded by suboptimal attribute characterization. Current experimentation is anticipated to support development of cost-effective, visually effective methods to implement 3D in military applications.
Experimental design and statistical analysis for three-drug combination studies.
Fang, Hong-Bin; Chen, Xuerong; Pei, Xin-Yan; Grant, Steven; Tan, Ming
2017-06-01
Drug combination is a critically important therapeutic approach for complex diseases such as cancer and HIV, owing to its potential for efficacy at lower, less toxic doses and the need to move new therapies rapidly into clinical trials. One of the key issues is to identify which combinations are additive, synergistic, or antagonistic. While the value of multidrug combinations has been well recognized in the cancer research community, to the best of our knowledge all existing experimental studies rely on fixing the dose of one drug to reduce the dimensionality, e.g., looking at pairwise two-drug combinations, a suboptimal design. Hence, there is an urgent need to develop experimental design and analysis methods for studying multidrug combinations directly. Because the complexity of the problem increases exponentially with the number of constituent drugs, there has been little progress in the development of methods for the design and analysis of high-dimensional drug combinations. In fact, contrary to common mathematical reasoning, the case of three-drug combinations is fundamentally more difficult than that of two-drug combinations. Finding the doses of the combination, the number of combinations, and the replicates needed to detect departures from additivity depends on the dose-response shapes of the individual constituent drugs; thus, classes of drugs with different dose-response shapes need to be treated as separate cases. We develop dose-finding and sample-size methods for detecting departures from additivity for several common (linear and log-linear) classes of single-drug dose-response curves. Furthermore, utilizing the geometric features of the interaction index, we propose a nonparametric model to estimate the interaction index surface by B-spline approximation and derive its asymptotic properties. Using these methods, we designed and analyzed a combination study of three anticancer drugs, PD184, HA14-1, and CEP3891, inhibiting the myeloma H929 cell line. To the best of our knowledge, this is the first three-drug combination study based on the original 4D dose-response surface formed by the dose ranges of the three drugs.
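The interaction index referenced in this abstract can be illustrated with the standard Loewe additivity index. This sketch assumes hypothetical linear single-drug dose-response curves with made-up slopes and doses; it is not the paper's B-spline surface estimator.

```python
def interaction_index(combo_doses, equieffective_doses):
    """Loewe interaction index: sum of d_i / D_i(y), where d_i is the dose of
    drug i in the combination and D_i(y) is the dose of drug i alone that
    produces the same effect y as the combination.
    tau < 1 suggests synergy, tau == 1 additivity, tau > 1 antagonism."""
    return sum(d / D for d, D in zip(combo_doses, equieffective_doses))

# Hypothetical linear single-drug curves y = slope * dose (slopes assumed).
slopes = [1.0, 0.5, 2.0]
combo = [1.0, 1.0, 0.5]          # doses of the three drugs given together
observed_effect = 4.0            # effect measured for the combination
equieffective = [observed_effect / m for m in slopes]   # D_i(y) = y / slope_i
tau = interaction_index(combo, equieffective)
# tau = 1/4 + 1/8 + 1/4 = 0.625, a departure from additivity toward synergy
```

Detecting whether such a tau differs significantly from 1 across the 4D dose-response surface is exactly the design and sample-size problem the paper addresses.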
Three types of solid state remote power controllers
NASA Technical Reports Server (NTRS)
Baker, D. E.
1975-01-01
Three types of solid state Remote Power Controller (RPC) circuits for 120 Vdc spacecraft distribution systems have been developed and evaluated. Both current-limiting and non-current-limiting modes of overload protection were developed and demonstrated to be feasible. A second generation of circuits was developed which offers comparable performance with substantially less cost and complexity. Electrical efficiency for both generations is 98.5 to 99%. This paper describes various aspects of the circuit design, trade-off studies, and experimental test results. Comparisons of design parameters, component requirements, and engineering model evaluations emphasize the high efficiency and reliability of the designs.
Programming Language Software For Graphics Applications
NASA Technical Reports Server (NTRS)
Beckman, Brian C.
1993-01-01
New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.
ERIC Educational Resources Information Center
Fialkov, M. Jerome; And Others
1983-01-01
The diagnosis and treatment of a 13-year-old female who manifested postencephalitic behavioral syndrome 9 years after an acute measles infection are described, along with the history of the case. The case illustrates that an interdisciplinary approach using a single-case experimental design can be clinically effective. (SEW)
Modeling Complex Marine Ecosystems: An Investigation of Two Teaching Approaches with Fifth Graders
ERIC Educational Resources Information Center
Papaevripidou, M.; Constantinou, C. P.; Zacharia, Z. C.
2007-01-01
This study investigated acquisition and transfer of the modeling ability of fifth graders in various domains. Teaching interventions concentrated on the topic of marine ecosystems either through a modeling-based approach or a worksheet-based approach. A quasi-experimental (pre-post comparison study) design was used. The control group (n = 17)…
ERIC Educational Resources Information Center
Wang, Jack T. H.; Schembri, Mark A.; Ramakrishna, Mathitha; Sagulenko, Evgeny; Fuerst, John A.
2012-01-01
Molecular cloning skills are an essential component of biological research, yet students often do not receive this training during their undergraduate studies. This can be attributed to the complexities of the cloning process, which may require many weeks of progressive design and experimentation. To address this issue, we incorporated an…
A Meta-Analysis of Single Subject Design Writing Intervention Research
ERIC Educational Resources Information Center
Rogers, Leslie Ann; Graham, Steve
2008-01-01
There is considerable concern that students do not develop the writing skills needed for school, occupational, or personal success. A frequent explanation for this is that schools do not do a good job of teaching this complex skill. A recent meta-analysis of true- and quasi-experimental writing intervention research (S. Graham & D. Perin,…
Can a Multimedia Tool Help Students' Learning Performance in Complex Biology Subjects?
ERIC Educational Resources Information Center
Koseoglu, Pinar; Efendioglu, Akin
2015-01-01
The aim of the present study was to determine the effects of multimedia-based biology teaching (Mbio) and teacher-centered biology (TCbio) instruction approaches on learners' biology achievements, as well as their views towards learning approaches. During the research process, an experimental design with two groups, TCbio (n = 22) and Mbio (n =…
The Effectiveness of Multimedia Application on Students Listening Comprehension
ERIC Educational Resources Information Center
Pangaribuan, Tagor; Sinaga, Andromeda; Sipayung, Kammer Tuahman
2017-01-01
Listening comprehension is a complex skill that is particularly difficult to master in non-native-speaker settings. This research aimed at finding out the effect of a multimedia application on students' listening. The research design is experimental, with a t-test. The population is the sixth semester of HKBP Nommensen University in the academic year 2016/2017,…
A Parallel Rendering Algorithm for MIMD Architectures
NASA Technical Reports Server (NTRS)
Crockett, Thomas W.; Orloff, Tobias
1991-01-01
Applications such as animation and scientific visualization demand high performance rendering of complex three dimensional scenes. To deliver the necessary rendering rates, highly parallel hardware architectures are required. The challenge is then to design algorithms and software which effectively use the hardware parallelism. A rendering algorithm targeted to distributed memory MIMD architectures is described. For maximum performance, the algorithm exploits both object-level and pixel-level parallelism. The behavior of the algorithm is examined both analytically and experimentally. Its performance for large numbers of processors is found to be limited primarily by communication overheads. An experimental implementation for the Intel iPSC/860 shows increasing performance from 1 to 128 processors across a wide range of scene complexities. It is shown that minimal modifications to the algorithm will adapt it for use on shared memory architectures as well.
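The scanline decomposition behind pixel-level parallelism can be sketched with a thread pool: rows are independent, so they can be shaded concurrently and reassembled in order. The shading function and image size here are illustrative assumptions, not the iPSC/860 implementation described above.

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 32, 16

def shade(x, y):
    """Toy per-pixel shading function standing in for real rasterization."""
    return (x * 31 + y * 17) % 256

def render_scanline(y):
    # One unit of pixel-level work: shade every pixel of row y.
    return [shade(x, y) for x in range(WIDTH)]

def render_parallel(workers=4):
    # map() preserves input order, so rows come back ready to assemble
    # into the final image regardless of which worker finished first.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_scanline, range(HEIGHT)))

image = render_parallel()
```

As the abstract notes, the real limiter at scale is communication overhead between processors, which this shared-memory toy does not model.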
Applying "intelligent" materials for materials education: The Labless Lab™
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrade, J.D.; Scheer, R.
1994-12-31
A very large number of science and engineering courses taught in colleges and universities today do not involve laboratories. Although good instructors incorporate class demonstrations, hands-on homework, and various teaching aids, including computer simulations, the fact is that students in such courses often accept key concepts and experimental results without discovering them for themselves. The only partial solution to this problem has been the increasing use of class demonstrations and computer simulations. The authors feel strongly that many complex concepts can be observed and assimilated through experimentation with properly designed materials. They propose the development of materials and specimens designed specifically for educational purposes. Intelligent and communicative materials are ideal for this purpose. Specimens which respond in an observable fashion to new environments and situations provided by the student/experimenter provide a far more effective materials science and engineering experience than readouts and data generated by complex and expensive machines, particularly in an introductory course. Modern materials can be designed to literally communicate with the observer. The authors embarked on a project to develop a series of Labless Labs™ utilizing various degrees and levels of intelligence in materials. Such Labless Labs™ are expected to be complementary to textbooks and computer simulations and to provide a laboratory reality for students in courses and other learning situations where access to a laboratory is non-existent or limited.
The fluid mechanics of channel fracturing flows: experiment
NASA Astrophysics Data System (ADS)
Rashedi, Ahmadreza; Tucker, Zachery; Ovarlez, Guillaume; Hormozi, Sarah
2017-11-01
We show our preliminary experimental results on the role of fluid mechanics in channel fracturing flows, particularly with yield stress fracturing fluids. Recent trends in the oil industry have included cyclic pumping of a proppant slurry interspersed with a yield stress fracturing fluid, which is found to increase well productivity if the particles disperse in a certain fashion. Our experimental study aims to investigate the physical mechanisms responsible for dispersing the particles (proppant) within a yield stress carrier fluid, and to measure the dispersion of proppant slugs in various fracturing regimes. To this end we have designed and built a unique experimental setup that resembles a fracture configuration, coupled with a particle image/tracking velocimetry setup operating at micro to macro dimensions. Moreover, we have designed optically engineered suspensions of complex fluids with tunable yield stress and consistency, well-controlled density match-mismatch properties, and refractive indices suited to both X-rays and visible light. We present our experimental system and preliminary results. NSF (Grant No. CBET-1554044-CAREER), ACS PRF (Grant No. 55661-DNI9).
Joint histogram-based cost aggregation for stereo matching.
Min, Dongbo; Lu, Jiangbo; Do, Minh N
2013-10-01
This paper presents a novel method for performing efficient cost aggregation in stereo matching. The cost aggregation problem is reformulated from the perspective of a histogram, giving us the potential to reduce the complexity of cost aggregation in stereo matching significantly. Unlike previous methods, which have tried to reduce the complexity in terms of the size of an image and a matching window, our approach focuses on reducing the computational redundancy that exists across the search range, caused by repeated filtering for all the hypotheses. Moreover, we also reduce the complexity of the window-based filtering through an efficient sampling scheme inside the matching window. The tradeoff between accuracy and complexity is extensively investigated by varying the parameters used in the proposed method. Experimental results show that the proposed method provides high-quality disparity maps with low complexity and outperforms existing local methods. This paper also provides new insights into complexity-constrained stereo-matching algorithm design.
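For contrast, the conventional per-hypothesis aggregation whose redundancy this abstract targets looks like the following plain SAD winner-take-all baseline: the same window filtering is repeated once per disparity hypothesis. Window size and test images are illustrative; this is the baseline, not the paper's histogram-based method.

```python
import numpy as np

def disparity_wta(left, right, max_disp, win=1):
    """Local stereo matching: for each disparity hypothesis d, compute the
    absolute difference between the left image and the right image shifted
    by d, box-filter it over a (2*win+1)^2 window, and keep the
    winner-take-all disparity per pixel."""
    H, W = left.shape
    pad = win
    L = np.pad(left.astype(float), pad, mode='edge')
    disp = np.zeros((H, W), dtype=int)
    best = np.full((H, W), np.inf)
    for d in range(max_disp + 1):                 # repeated filtering per hypothesis
        shifted = np.roll(right, d, axis=1)       # right image under hypothesis d
        R = np.pad(shifted.astype(float), pad, mode='edge')
        cost = np.abs(L - R)
        agg = np.zeros((H, W))
        for i in range(H):                        # box-filter aggregation
            for j in range(W):
                agg[i, j] = cost[i:i + 2*pad + 1, j:j + 2*pad + 1].sum()
        better = agg < best
        disp[better] = d
        best[better] = agg[better]
    return disp

rng = np.random.default_rng(0)
right = rng.random((8, 10))
left = np.roll(right, 2, axis=1)   # synthetic pair with known disparity 2
disp = disparity_wta(left, right, max_disp=4, win=1)
```

The loop over `d` is the redundancy the paper removes: each hypothesis refilters the whole image, so the cost grows linearly with the search range.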
NASA Astrophysics Data System (ADS)
Bezruchko, Konstantin; Davidov, Albert
2009-01-01
This article describes a scientific and technical complex for the modeling, research, and testing of power installations for rocket-space vehicles, created in the Power Source Laboratory of the National Aerospace University "KhAI". The complex makes it possible to replace full-scale tests with model tests and to reduce the financial and time costs of modeling, researching, and testing these power installations. Using the complex, problems in the design and study of rocket-space vehicles' power installations can be solved efficiently, and experimental investigations of physical processes and tests of solar and chemical batteries for rocket-space complexes and space vehicles can be carried out. The complex also supports accelerated testing, diagnostics, lifetime monitoring, and restoration of the chemical accumulators used in rocket-space vehicles' power supply systems.
Computational design and experimental verification of a symmetric protein homodimer.
Mou, Yun; Huang, Po-Ssu; Hsu, Fang-Ciao; Huang, Shing-Jong; Mayo, Stephen L
2015-08-25
Homodimers are the most common type of protein assembly in nature and have distinct features compared with heterodimers and higher order oligomers. Understanding homodimer interactions at the atomic level is critical both for elucidating their biological mechanisms of action and for accurate modeling of complexes of unknown structure. Computation-based design of novel protein-protein interfaces can serve as a bottom-up method to further our understanding of protein interactions. Previous studies have demonstrated that the de novo design of homodimers can be achieved to atomic-level accuracy by β-strand assembly or through metal-mediated interactions. Here, we report the design and experimental characterization of an α-helix-mediated homodimer with C2 symmetry based on a monomeric Drosophila engrailed homeodomain scaffold. A solution NMR structure shows that the homodimer exhibits parallel helical packing similar to the design model. Because the mutations leading to dimer formation resulted in poor thermostability of the system, design success was facilitated by the introduction of independent thermostabilizing mutations into the scaffold. This two-step design approach, function and stabilization, is likely to be generally applicable, especially if the desired scaffold is of low thermostability.
NASA Technical Reports Server (NTRS)
Carden, Huey D.; Boitnott, Richard L.; Fasanella, Edwin L.
1990-01-01
Failure behavior results are presented from crash dynamics research using concepts of aircraft elements and substructure not necessarily designed or optimized for energy absorption or crash loading considerations. To achieve desired new designs which incorporate improved energy absorption capabilities often requires an understanding of how more conventional designs behave under crash loadings. Experimental and analytical data are presented which indicate some general trends in the failure behavior of a class of composite structures which include individual fuselage frames, skeleton subfloors with stringers and floor beams but without skin covering, and subfloors with skin added to the frame-stringer arrangement. Although the behavior is complex, a strong similarity in the static and dynamic failure behavior among these structures is illustrated through photographs of the experimental results and through analytical data of generic composite structural models. It is believed that the similarity in behavior is giving the designer and dynamists much information about what to expect in the crash behavior of these structures and can guide designs for improving the energy absorption and crash behavior of such structures.
Unique failure behavior of metal/composite aircraft structural components under crash type loads
NASA Technical Reports Server (NTRS)
Carden, Huey D.
1990-01-01
Failure behavior results are presented on some of the crash dynamics research conducted with concepts of aircraft elements and substructure which have not necessarily been designed or optimized for energy absorption or crash loading considerations. To achieve desired new designs which incorporate improved energy absorption capabilities often requires an understanding of how more conventional designs behave under crash type loadings. Experimental and analytical data are presented which indicate some general trends in the failure behavior of a class of composite structures which include individual fuselage frames, skeleton subfloors with stringers and floor beams but without skin covering, and subfloors with skin added to the frame-stringer arrangement. Although the behavior is complex, a strong similarity in the static/dynamic failure behavior among these structures is illustrated through photographs of the experimental results and through analytical data of generic composite structural models. It is believed that the thread of similarity in behavior is telling the designer and dynamists a great deal about what to expect in the crash behavior of these structures and can guide designs for improving the energy absorption and crash behavior of such structures.
Studies on Stress-Strain Relationships of Polymeric Materials Used in Space Applications
NASA Technical Reports Server (NTRS)
Jana, Sadhan C.; Freed, Alan
2002-01-01
A two-year research plan was undertaken in association with the Polymers Branch, NASA Glenn Research Center, to carry out experimental and modeling work relating the stress and strain behavior of polymeric materials, especially elastomers and vulcanized rubber. An experimental system based on the MTS (Mechanical Testing and Simulation) A/T-4 test facility environment has been developed for a broader range of polymeric materials, in addition to the design of a laser-compatible temperature-control chamber for online measurement of various strains. Necessary material processing has been accomplished, including rubber compounding and thermoplastic elastomer processing via injection molding. A broad suite of testing methodologies has been identified to reveal the complex nonlinear mechanical behaviors of rubbery materials when subjected to complex modes of deformation. This suite of tests required the conceptualization, design, and development of new specimen geometries, test fixtures, and test systems, including the development of a new laser-based technique to measure large multi-axial deformations. Test data have been generated for some of these new fixtures and have revealed complex coupling effects generated during multi-axial deformations. In addition, fundamental research has been conducted concerning the foundational principles of rubber thermodynamics and the resulting theories of rubber elasticity. Studies have been completed on the morphological properties of several thermoplastic elastomers. Finally, a series of steps has been identified to further advance the goals of NASA's ongoing effort.
Simplified paraboloid phase model-based phase tracker for demodulation of a single complex fringe.
He, A; Deepan, B; Quan, C
2017-09-01
A regularized phase tracker (RPT) is an effective method for the demodulation of single closed-fringe patterns. However, lengthy calculation times, the need for a specially designed scanning strategy, and sign-ambiguity problems caused by noise and saddle points reduce its effectiveness, especially for demodulating large and complex fringe patterns. In this paper, a simplified paraboloid phase model-based regularized phase tracker (SPRPT) is proposed. In SPRPT, first and second phase derivatives are pre-determined by the density-direction-combined method and a discrete higher-order demodulation algorithm, respectively. Hence, the cost function is effectively simplified to reduce the computation time significantly. Moreover, the pre-determined phase derivatives improve the robustness of the demodulation of closed, complex fringe patterns. Thus, no specially designed scanning strategy is needed; nevertheless, the method is robust against the sign-ambiguity problem. The paraboloid phase model also assures better accuracy and robustness against noise. Both simulated and experimental fringe patterns (obtained using electronic speckle pattern interferometry) are used to validate the proposed method, and a comparison with existing RPT methods is carried out. The simulation results show that the proposed method achieves the highest accuracy with less computational time. The experimental results prove the robustness and accuracy of the proposed method for the demodulation of noisy fringe patterns and its feasibility for static and dynamic applications.
Correale, Stefania; de Paola, Ivan; Morgillo, Carmine Marco; Federico, Antonella; Zaccaro, Laura; Pallante, Pierlorenzo; Galeone, Aldo; Fusco, Alfredo; Pedone, Emilia; Luque, F Javier; Catalanotti, Bruno
2014-01-01
UbcH10 is a component of the Ubiquitin Conjugation Enzymes (Ubc; E2) involved in the ubiquitination cascade controlling the cell cycle progression, whereby ubiquitin, activated by E1, is transferred through E2 to the target protein with the involvement of E3 enzymes. In this work we propose the first three dimensional model of the tetrameric complex formed by the human UbA1 (E1), two ubiquitin molecules and UbcH10 (E2), leading to the transthiolation reaction. The 3D model was built up by using an experimentally guided incremental docking strategy that combined homology modeling, protein-protein docking and refinement by means of molecular dynamics simulations. The structural features of the in silico model allowed us to identify the regions that mediate the recognition between the interacting proteins, revealing the active role of the ubiquitin crosslinked to E1 in the complex formation. Finally, the role of these regions involved in the E1-E2 binding was validated by designing short peptides that specifically interfere with the binding of UbcH10, thus supporting the reliability of the proposed model and representing valuable scaffolds for the design of peptidomimetic compounds that can bind selectively to Ubcs and inhibit the ubiquitylation process in pathological disorders.
Zhang, Junming; Wu, Yan
2018-03-28
Many systems have been developed for automatic sleep stage classification. However, nearly all models are based on handcrafted features. Because the feature space is large, feature selection must be used. Meanwhile, designing handcrafted features is a difficult and time-consuming task that requires the domain knowledge of experienced experts. Results vary when different sets of features are chosen to identify sleep stages. Additionally, potentially important features may exist that have not yet been identified. Therefore, a new sleep stage classification system, based on the complex-valued convolutional neural network (CCNN), is proposed in this study. Unlike existing sleep stage methods, our method can automatically extract features from raw electroencephalography data and then classify sleep stages based on the learned features. Additionally, we prove that the decision boundaries for the real and imaginary parts of a complex-valued convolutional neuron intersect orthogonally. The classification performance of handcrafted features is compared with that of features learned via the CCNN. Experimental results show that the proposed method is comparable to existing methods, and that the CCNN obtains better classification performance and a considerably faster convergence speed than a real-valued convolutional neural network. Experimental results also show that the proposed method is a useful decision-support tool for automatic sleep stage classification.
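The arithmetic inside a complex-valued convolutional neuron can be illustrated by expanding the complex product into four real convolutions (a minimal sketch of the underlying identity only; the CCNN architecture and the orthogonality proof are in the paper itself):

```python
import numpy as np

def complex_conv1d(x, w):
    """Valid-mode 1-D convolution of a complex signal x with a complex
    kernel w, computed from four real convolutions via
    (x_r + i x_i) * (w_r + i w_i) = (x_r*w_r - x_i*w_i) + i (x_r*w_i + x_i*w_r).
    """
    conv = lambda a, b: np.convolve(a, b, mode="valid")
    real = conv(x.real, w.real) - conv(x.imag, w.imag)
    imag = conv(x.real, w.imag) + conv(x.imag, w.real)
    return real + 1j * imag

rng = np.random.default_rng(1)
x = rng.normal(size=16) + 1j * rng.normal(size=16)   # toy complex signal
w = rng.normal(size=4) + 1j * rng.normal(size=4)     # toy complex kernel
out = complex_conv1d(x, w)
print(np.allclose(out, np.convolve(x, w, mode="valid")))  # True
```

The decomposition is exact by linearity, which is why frameworks without native complex support can implement such layers with paired real-valued feature maps.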
NASA Technical Reports Server (NTRS)
Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.
1987-01-01
The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how the time and time-averages are treated, as well as on how the space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful, and their numbers are growing at a staggering pace. As a result of this power, a case can be made that information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working up a stairstep hierarchy of mathematical and experimental complexity to build a system of tools which enable one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.
Boron-selective reactions as powerful tools for modular synthesis of diverse complex molecules.
Xu, Liang; Zhang, Shuai; Li, Pengfei
2015-12-21
In the context of modular and rapid construction of molecular diversity and complexity for applications in organic synthesis, biomedical and materials sciences, a generally useful strategy has emerged based on boron-selective chemical transformations. In the last decade, these types of reactions have evolved from proof-of-concept to some advanced applications in the efficient preparation of complex natural products and even automated precise manufacturing on the molecular level. These advances have shown the great potential of boron-selective reactions in simplifying synthetic design and experimental operations, and should inspire new developments in related chemical and technological areas. This tutorial review will highlight the original contributions and representative advances in this emerging field.
Single- and two-phase flow in microfluidic porous media analogs based on Voronoi tessellation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Mengjie; Xiao, Feng; Johnson-Paben, Rebecca
2012-01-01
The objective of this study was to create a microfluidic model of complex porous media for studying single and multiphase flows. Most experimental porous media models consist of periodic geometries that lend themselves to comparison with well-developed theoretical predictions. However, most real porous media such as geological formations and biological tissues contain a degree of randomness and complexity that is not adequately represented in periodic geometries. To design an experimental tool to study these complex geometries, we created microfluidic models of random homogeneous and heterogeneous networks based on Voronoi tessellations. These networks consisted of approximately 600 grains separated by a highly connected network of channels with an overall porosity of 0.11-0.20. We found that introducing heterogeneities in the form of large cavities within the network changed the permeability in a way that cannot be predicted by the classical porosity-permeability relationship known as the Kozeny equation. The values of permeability found in experiments were in excellent agreement with those calculated from three-dimensional lattice Boltzmann simulations. In two-phase flow experiments of oil displacement with water, we found that the surface energy of channel walls determined the pattern of water invasion, while the network topology determined the residual oil saturation. These results suggest that complex network topologies lead to fluid flow behavior that is difficult to predict based solely on porosity. The microfluidic models developed in this study using a novel geometry generation algorithm based on Voronoi tessellation are a new experimental tool for studying fluid and solute transport problems within complex porous media.
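The geometry-generation idea can be sketched by shrinking each Voronoi cell toward its seed point, leaving channels between grains (an illustrative reconstruction, not the authors' exact algorithm; the shrink factor is a hypothetical parameter, while the ~600-grain count follows the study):

```python
import numpy as np
from scipy.spatial import Voronoi

def voronoi_grain_network(n_grains=600, shrink=0.9, seed=0):
    """Shrink each bounded Voronoi cell toward its seed point; the gaps
    left between shrunken cells form the connected channel network.

    `shrink` (hypothetical value) sets channel width and hence porosity.
    """
    rng = np.random.default_rng(seed)
    pts = rng.random((n_grains, 2))          # grain seeds in a unit square
    vor = Voronoi(pts)
    grains = []
    for seed_pt, region_idx in zip(pts, vor.point_region):
        region = vor.regions[region_idx]
        if -1 in region or not region:       # skip unbounded boundary cells
            continue
        poly = vor.vertices[region]          # cell vertices, in order
        grains.append(seed_pt + shrink * (poly - seed_pt))
    return grains

grains = voronoi_grain_network()
print(len(grains))                           # bounded interior cells only
```

Each polygon in `grains` would be etched as a solid pillar; randomness in the seed points gives the non-periodic topology the study targets.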
OFMTutor: An operator function model intelligent tutoring system
NASA Technical Reports Server (NTRS)
Jones, Patricia M.
1989-01-01
The design, implementation, and evaluation of an Operator Function Model intelligent tutoring system (OFMTutor) is presented. OFMTutor is intended to provide intelligent tutoring in the context of complex dynamic systems for which an operator function model (OFM) can be constructed. The human operator's role in such complex, dynamic, and highly automated systems is that of a supervisory controller whose primary responsibilities are routine monitoring and fine-tuning of system parameters and occasional compensation for system abnormalities. The automated systems must support the human operator. One potentially useful form of support is the use of intelligent tutoring systems to teach the operator about the system and how to function within that system. Previous research on intelligent tutoring systems (ITS) is considered. The proposed design for OFMTutor is presented, and an experimental evaluation is described.
Mehio, Nada; Ivanov, Alexander S.; Ladshaw, Austin P.; ...
2015-11-22
Poly(acrylamidoxime) fibers are the current state-of-the-art adsorbent for mining uranium from seawater. However, the competition between uranyl (UO2^2+) and vanadium ions poses a challenge to mining on the industrial scale. In this work, we employ density functional theory (DFT) and coupled-cluster methods (CCSD(T)) in the restricted formalism to investigate potential binding motifs of the oxovanadium(IV) ion (VO^2+) with the formamidoximate ligand. Consistent with experimental EXAFS data, the hydrated six-coordinate complex is predicted to be preferred over the hydrated five-coordinate complex. Here, our investigation of formamidoximate-VO^2+ complexes universally identified the most stable binding motif as that formed by chelating a tautomerically rearranged imino hydroxylamine via the imino nitrogen and hydroxylamine oxygen. The alternative binding motifs of amidoxime chelation via a non-rearranged tautomer and η2 coordination are found to be ~11 kcal/mol less stable. Ultimately, the difference in the most stable VO^2+ and UO2^2+ binding conformations has important implications for the design of more selective UO2^2+ ligands.
Fedorova, Elena V.; Buryakina, Anna V.; Zakharov, Alexey V.; Filimonov, Dmitry A.; Lagunin, Alexey A.; Poroikov, Vladimir V.
2014-01-01
Based on data about the structure and antidiabetic activity of twenty-seven vanadium and zinc coordination complexes collected from the literature, we developed QSAR models using the GUSAR program. These QSAR models were applied to 10 novel vanadium coordination complexes designed in silico in order to predict their hypoglycemic action. The five most promising substances with predicted potent hypoglycemic action were selected for chemical synthesis and pharmacological evaluation. The selected vanadium coordination complexes were synthesized and tested in vitro and in vivo for their hypoglycemic activities and acute rat toxicity. Estimation of the acute rat toxicity of these five vanadium complexes was performed using a freely available web resource (http://way2drug.com/GUSAR/acutoxpredict.html). The prediction indicated that the selected compounds belong to the class of moderately toxic pharmaceutical agents, according to the scale of Hodge and Sterner. Comparison with the predicted data has demonstrated a reasonable correspondence between the experimental and predicted values of hypoglycemic activity and toxicity. Bis{tert-butyl[amino(imino)methyl]carbamato}oxovanadium (IV) and sodium(2,2′-Bipyridyl)oxo-diperoxovanadate(V) octahydrate were identified as the most potent hypoglycemic agents among the synthesized compounds. PMID:25057899
β-Cyclodextrin inclusion complex: preparation, characterization, and its aspirin release in vitro
NASA Astrophysics Data System (ADS)
Zhou, Hui-Yun; Jiang, Ling-Juan; Zhang, Yan-Ping; Li, Jun-Bo
2012-09-01
In this work, the optimal clathration condition was investigated for the preparation of the aspirin-β-cyclodextrin (Asp-β-CD) inclusion complex using design of experiment (DOE) methodology. A 3-level, 3-factor Box-Behnken design with a total of 17 experimental runs was used. The Asp-β-CD inclusion complex was prepared by the saturated solution method. The influence of three factors on the embedding rate was investigated: the molar ratio of β-CD to Asp, the clathration temperature, and the clathration time; the optimum values of these three variables were found to be 0.82, 49°C, and 2.0 h, respectively. The embedding rate could be up to 61.19%. According to FT-IR spectra, the formation of bonding between the -COOH group of Asp and the O-H group of β-CD might play an important role in the clathration process. Release kinetics of Asp from the inclusion complex were studied to evaluate the drug release mechanism and diffusion coefficients. The results showed that drug release from the matrix occurred through a Fickian diffusion mechanism. The cumulative release of Asp reached only 40% over 24 h, so the inclusion complex could potentially be applied as a long-acting delivery system.
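A 3-level, 3-factor Box-Behnken design of this kind can be written down directly in coded units: 12 edge-midpoint runs plus 5 replicated centre points give the 17 runs reported in the study (a sketch of the standard design only; the mapping of coded levels to the actual molar ratio, temperature, and time ranges is the authors'):

```python
from itertools import product

def box_behnken_3factor(n_center=5):
    """Coded design matrix (-1, 0, +1) for a 3-level, 3-factor
    Box-Behnken design: vary each pair of factors over the four
    corner combinations while holding the third factor at its centre."""
    runs = []
    for pair in [(0, 1), (0, 2), (1, 2)]:
        for a, b in product((-1, 1), repeat=2):
            run = [0, 0, 0]
            run[pair[0]], run[pair[1]] = a, b
            runs.append(tuple(run))
    runs += [(0, 0, 0)] * n_center          # replicated centre points
    return runs

design = box_behnken_3factor()
print(len(design))                          # 17 runs, as in the study
```

Replicated centre points estimate pure error, which is what lets a response-surface fit over such a design test for lack of fit before the optimum is located.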
NASA Astrophysics Data System (ADS)
Marcus, Kelvin
2014-06-01
The U.S. Army Research Laboratory (ARL) has built a "Network Science Research Lab" to support research that aims to improve its ability to analyze, predict, design, and govern complex systems that interweave the social/cognitive, information, and communication network genres. Researchers at ARL and the Network Science Collaborative Technology Alliance (NS-CTA), a collaborative research alliance funded by ARL, conducted experimentation to determine if automated network monitoring tools and task-aware agents deployed within an emulated tactical wireless network could potentially increase the retrieval of relevant data from heterogeneous distributed information nodes. ARL and the NS-CTA required the capability to perform this experimentation over clusters of heterogeneous nodes with emulated wireless tactical networks where each node could contain different operating systems, application sets, and physical hardware attributes. Researchers utilized the Dynamically Allocated Virtual Clustering Management System (DAVC) to address each of the infrastructure support requirements necessary for conducting their experimentation. The DAVC is an experimentation infrastructure that provides the means to dynamically create, deploy, and manage virtual clusters of heterogeneous nodes within a cloud computing environment based upon resource utilization such as CPU load, available RAM, and hard disk space. The DAVC uses 802.1Q Virtual LANs (VLANs) to prevent experimentation crosstalk and to allow for complex private networks. Clusters created by the DAVC system can be utilized for software development, experimentation, and integration with existing hardware and software. The goal of this paper is to explore how ARL and the NS-CTA leveraged the DAVC to create, deploy, and manage multiple experimentation clusters to support their experimentation goals.
Theofilatos, Konstantinos; Pavlopoulou, Niki; Papasavvas, Christoforos; Likothanassis, Spiros; Dimitrakopoulos, Christos; Georgopoulos, Efstratios; Moschopoulos, Charalampos; Mavroudi, Seferina
2015-03-01
Proteins are considered to be the most important individual components of biological systems, and they combine to form physical protein complexes which are responsible for certain molecular functions. Despite the large availability of protein-protein interaction (PPI) information, not much information is available about protein complexes. Experimental methods are limited in terms of time, efficiency, cost, and performance constraints. Existing computational methods have provided encouraging preliminary results, but they face certain disadvantages: they require parameter tuning, some of them cannot handle weighted PPI data, and others do not allow a protein to participate in more than one protein complex. In the present paper, we propose a new fully unsupervised methodology for predicting protein complexes from weighted PPI graphs. The proposed methodology is called evolutionary enhanced Markov clustering (EE-MC), and it is a hybrid combination of an adaptive evolutionary algorithm and a state-of-the-art clustering algorithm named enhanced Markov clustering. EE-MC was compared with state-of-the-art methodologies when applied to datasets from the human and the yeast Saccharomyces cerevisiae organisms. Using publicly available datasets, EE-MC outperformed existing methodologies (in some datasets the separation metric was increased by 10-20%). Moreover, when applied to new human datasets, its performance was encouraging in the prediction of protein complexes which consist of proteins with high functional similarity. Specifically, 5737 protein complexes were predicted, and 72.58% of them are enriched for at least one gene ontology (GO) function term. EE-MC is by design able to overcome intrinsic limitations of existing methodologies, such as their inability to handle weighted PPI networks, their constraint of assigning every protein to exactly one cluster, and the difficulties they face concerning parameter tuning.
This was experimentally validated, and moreover, new potentially true human protein complexes were suggested as candidates for further validation using experimental techniques. Copyright © 2015 Elsevier B.V. All rights reserved.
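The core that EE-MC wraps in its adaptive evolutionary algorithm is Markov clustering, which alternates expansion and inflation steps on a column-stochastic flow matrix. A baseline sketch (plain MCL, not the enhanced variant; the two-triangle graph is a toy stand-in for a weighted PPI network):

```python
import numpy as np

def markov_cluster(adj, inflation=2.0, n_iter=50):
    """Baseline Markov clustering (MCL) on a weighted adjacency matrix.

    Expansion (M @ M) spreads flow through the graph; inflation
    (elementwise power + renormalisation) strengthens strong flow.
    EE-MC tunes parameters such as `inflation` evolutionarily.
    """
    M = adj + np.eye(len(adj))            # self-loops stabilise the iteration
    M = M / M.sum(axis=0)                 # make columns stochastic
    for _ in range(n_iter):
        M = M @ M                         # expansion
        M = M ** inflation                # inflation
        M = M / M.sum(axis=0)
    # rows of attractor nodes list the members of each cluster
    return {tuple(np.nonzero(row > 1e-6)[0]) for row in M if row.max() > 1e-6}

# two weighted triangles joined by a single weak edge
adj = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    adj[i, j] = adj[j, i] = 1.0
adj[2, 3] = adj[3, 2] = 0.2
clusters = markov_cluster(adj)
print(clusters)
```

Because clusters are read off attractor rows, a node can in principle appear in more than one cluster, which is one of the constraints of competing methods that the abstract says EE-MC avoids.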
Single-Vector Calibration of Wind-Tunnel Force Balances
NASA Technical Reports Server (NTRS)
Parker, P. A.; DeLoach, R.
2003-01-01
An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instruments that provide these measurements on a scale model of an aircraft during wind tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load and have no response to the other components. This is not entirely possible, even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the data necessary to generate a mathematical model and determine the force measurement accuracy. In order to set the independent variables of applied load for the calibration experiment, a high-precision mechanical system is required. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high-confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in even more complex systems that degrade load application quality.
The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, in which each independent variable is incremented individually throughout its full-scale range while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research into a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process, covering all three major aspects of an experiment: the design of the experiment, the execution of the experiment, and the statistical analysis of the data. In order to overcome the weaknesses of the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
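The mathematical model fitted from the calibration data is typically a least-squares regression of each bridge output on the six load components plus their pairwise interaction terms, which is exactly what captures the "undesirable interaction effects" described above. A generic sketch with synthetic data (the hypothetical sensitivities below are illustrative, not LaRC values):

```python
import numpy as np

def calibration_fit(loads, response):
    """Least-squares balance calibration model: response ~ intercept +
    linear terms + pairwise products (including squares) of the six
    load components."""
    n, k = loads.shape
    cols = [np.ones(n)] + [loads[:, i] for i in range(k)]
    cols += [loads[:, i] * loads[:, j] for i in range(k) for j in range(i, k)]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, response, rcond=None)
    return X, coef

# synthetic channel: mainly normal force, plus a small normal-axial
# interaction (hypothetical sensitivities of 2.0 and 0.05)
rng = np.random.default_rng(0)
loads = rng.uniform(-1, 1, size=(200, 6))
response = 2.0 * loads[:, 0] + 0.05 * loads[:, 0] * loads[:, 1]
X, coef = calibration_fit(loads, response)
print(float(np.max(np.abs(X @ coef - response))))   # near-zero residual
```

An MDOE calibration chooses the rows of `loads` (the applied load combinations) so that these coefficients are estimated efficiently, rather than sweeping one factor at a time.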
Experimental Evaluation of the Free Piston Engine - Linear Alternator (FPLA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leick, Michael T.; Moses, Ronald W.
2015-03-01
This report describes the experimental evaluation of a prototype free piston engine - linear alternator (FPLA) system developed at Sandia National Laboratories. The opposed piston design was developed to investigate its potential for use in hybrid electric vehicles (HEVs). The system is mechanically simple, with two-stroke uniflow scavenging for gas exchange and timed port fuel injection for fuel delivery, i.e., no complex valving. Electrical power is extracted from piston motion through linear alternators, which also provide a means for passive piston synchronization through electromagnetic coupling. In an HEV application, this electrical power would be used to charge the batteries. The engine-alternator system was designed, assembled, and operated over a 2-year period at Sandia National Laboratories in Livermore, CA. This report primarily contains a description of the as-built system, modifications to the system to enable better performance, and experimental results from start-up, motoring, and hydrogen combustion tests.
A practical model for pressure probe system response estimation (with review of existing models)
NASA Astrophysics Data System (ADS)
Hall, B. F.; Povey, T.
2018-04-01
The accurate estimation of the unsteady response (bandwidth) of pneumatic pressure probe systems (probe, line and transducer volume) is a common practical problem encountered in the design of aerodynamic experiments. Understanding the bandwidth of the probe system is necessary to capture unsteady flow features accurately. Where traversing probes are used, the desired traverse speed and spatial gradients in the flow dictate the minimum probe system bandwidth required to resolve the flow. Existing approaches for bandwidth estimation are either complex or inaccurate in implementation, so probes are often designed based on experience. Where probe system bandwidth is characterized, it is often done experimentally, requiring careful experimental set-up and analysis. There is a need for a relatively simple but accurate model for estimation of probe system bandwidth. A new model is presented for the accurate estimation of pressure probe bandwidth for simple probes commonly used in wind tunnel environments; experimental validation is provided. An additional, simple graphical method for air is included for convenience.
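As a point of comparison for such models, a first-order textbook estimate treats the probe line and transducer cavity as a Helmholtz resonator, f = (c/2π)·sqrt(A/(L·V)). This is the classical rough guess, not the model proposed in the paper, and the probe dimensions below are hypothetical:

```python
import math

def helmholtz_frequency(c, d, L, V):
    """Classical Helmholtz-resonator estimate of a pneumatic probe
    system's natural frequency: f = (c / 2*pi) * sqrt(A / (L * V)),
    where A is the bore area of the line.  A first-order textbook
    estimate only; it neglects line volume and viscous damping.
    """
    A = math.pi * (d / 2) ** 2
    return (c / (2 * math.pi)) * math.sqrt(A / (L * V))

# hypothetical probe: 0.5 mm bore, 0.5 m line, 50 mm^3 transducer
# cavity, air at c = 343 m/s
f = helmholtz_frequency(c=343.0, d=0.5e-3, L=0.5, V=50e-9)
print(round(f, 1))
```

Estimates of this order (here ~150 Hz) show why traverse speed matters: spatial flow gradients swept past the probe faster than the system bandwidth are simply filtered out.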
NASA Technical Reports Server (NTRS)
Kuo, Kenneth K.; Lu, Y. C.; Chiaverini, Martin J.; Harting, George C.
1994-01-01
An experimental study on the fundamental processes involved in fuel decomposition and boundary layer combustion in hybrid rocket motors is being conducted at the High Pressure Combustion Laboratory of the Pennsylvania State University. This research should provide an engineering technology base for development of large scale hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high pressure slab motor has been designed for conducting experimental investigations. Oxidizer (LOX or GOX) is injected through the head-end over a solid fuel (HTPB) surface. Experiments using fuels supplied by NASA designated industrial companies will also be conducted. The study focuses on the following areas: measurement and observation of solid fuel burning with LOX or GOX, correlation of solid fuel regression rate with operating conditions, measurement of flame temperature and radical species concentrations, determination of the solid fuel subsurface temperature profile, and utilization of experimental data for validation of a companion theoretical study also being conducted at PSU.
Robust parameter design for automatically controlled systems and nanostructure synthesis
NASA Astrophysics Data System (ADS)
Dasgupta, Tirthankar
2007-12-01
This research focuses on developing comprehensive frameworks for robust parameter design methodology for dynamic systems with automatic control and for the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered, and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large-scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, at large quantity, and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations.
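The model class here, a multinomial GLM, links process variables to the probabilities of each morphology through a softmax of linear predictors. A stand-in sketch fitted by plain gradient ascent (the dissertation proposes its own iterative fitting algorithm; the toy data below is hypothetical):

```python
import numpy as np

def fit_multinomial_glm(X, y, n_classes, lr=0.5, n_iter=2000):
    """Multinomial-logit fit by gradient ascent on the log-likelihood.
    P(class k | x) = softmax(x @ W)[k]; W is p x n_classes."""
    n, p = X.shape
    W = np.zeros((p, n_classes))
    Y = np.eye(n_classes)[y]                          # one-hot targets
    for _ in range(n_iter):
        Z = X @ W
        P = np.exp(Z - Z.max(axis=1, keepdims=True))  # stable softmax
        P /= P.sum(axis=1, keepdims=True)
        W += lr * X.T @ (Y - P) / n                   # score-function step
    return W

# toy data: one process variable, three morphology classes
x = np.array([-3.0, -2.0, 0.0, 0.2, 2.0, 3.0])
X = np.column_stack([np.ones_like(x), x])             # intercept + variable
y = np.array([0, 0, 1, 1, 2, 2])
W = fit_multinomial_glm(X, y, 3)
pred = (X @ W).argmax(axis=1)
print(pred.tolist())
```

Once such a model is fitted, Monte Carlo perturbation of the process variables around candidate set points gives the robustness criterion the passage describes.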
The second part of the research deals with the development of an experimental design methodology tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design called Sequential Minimum Energy Design (SMED) is proposed for exploring the best process conditions for the synthesis of nanowires. SMED is a novel approach to generating sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
ERIC Educational Resources Information Center
Chan, Man Ching Esther; Clarke, David; Cao, Yiming
2018-01-01
Interactive problem solving and learning are priorities in contemporary education, but these complex processes have proved difficult to research. This project addresses the question "How do we optimise social interaction for the promotion of learning in a mathematics classroom?" Employing the logic of multi-theoretic research design,…
ERIC Educational Resources Information Center
Hesse-Biber, Sharlene
2013-01-01
Some evaluators employ randomized controlled trials (RCTs) as the gold standard of evidence-based practice (EBP). Critics of RCT designs argue that RCTs do not include the complexity of program participants' experiences or clinical expertise, and couple this with criticisms that it is difficult to transfer RCT findings from the laboratory to…
H.T. Odum and the Luquillo Experimental Forest
Ariel E. Lugo
2004-01-01
How does the forest operate, develop its patterns, retain information in its memory sites, and transmit the great message to the future? (Odum, 1970a, p. x) The rain forest achieves complexity, high metabolism, and stability over geological time periods without surges and waste. Can we find in this example the clues for designing our own equally effective systems of...
ERIC Educational Resources Information Center
Plimmer, Geoff
2012-01-01
This study examined the effectiveness of an adult career development program designed to reflect the diversity and demands of career choices, the low level of comfort many have with career choices, and the limited resources available to resolve complex adult career problems. A possible selves process was used, delivered through a blend of computer…
ERIC Educational Resources Information Center
Valenti, Elizabeth C.
2013-01-01
College dropout is a complex problem resulting in an array of negative repercussions for students, universities, and society. The study explored the impact of reading proficiency on academic success in a college-level introductory psychology course offered in both traditional and accelerated formats. A quantitative, quasi-experimental design was…
ERIC Educational Resources Information Center
Stark, Robin; Kopp, Veronika; Fischer, Martin R.
2011-01-01
To investigate the effects of example format (erroneous examples vs. correct examples) and feedback format (elaborated feedback vs. knowledge of results feedback) on medical students' diagnostic competence in the context of a web-based learning environment containing case-based worked examples, two studies with a 2 x 2 design were conducted in the…
NASA Astrophysics Data System (ADS)
Rusu-Anghel, S.
2017-01-01
Analytical modeling of the flow of manufacturing process of the cement is difficult because of their complexity and has not resulted in sufficiently precise mathematical models. In this paper, based on a statistical model of the process and using the knowledge of human experts, was designed a fuzzy system for automatic control of clinkering process.
Present understanding of MHD and heat transfer phenomena for liquid metal blankets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirillov, I.R.; Barleon, L.; Reed, C.B.
1994-12-31
Liquid metals (Li, Li17Pb83, Pb) are considered as coolants in many designs of fusion reactor blankets. To estimate their potential and to make an optimal design, one has to know the magnetohydrodynamic (MHD) and heat transfer characteristics of liquid metal flow in a magnetic field. Such flows with high characteristic parameter values (Hartmann number M and interaction parameter N) open up a relatively new field in magnetohydrodynamics requiring both theoretical and experimental efforts. A review of experimental work done over the last ten years in different countries shows that there are some data on MHD/HT characteristics in straight channels of simple geometry under fusion reactor relevant conditions (M>>1, N>>1) and not enough data for complex flow geometries. Future efforts should be directed to the investigation of MHD/HT in straight channels with perfect and imperfect electroinsulated walls, including those with controlled imperfections, and in channels of complex geometry. The experiments are not simple, since fusion-relevant conditions require facilities with magnetic fields at, or even higher than, 5-7 T in comparatively large volumes. International cooperation in constructing and operating these facilities may be of great help.
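The two characteristic parameters follow directly from the flow and field quantities: the Hartmann number M = B·L·sqrt(σ/μ) (magnetic vs. viscous forces) and the interaction parameter N = σ·B²·L/(ρ·u) (magnetic vs. inertial forces), related by N = M²/Re. A quick check with placeholder fluid properties (illustrative values, not data from the review):

```python
import math

def mhd_parameters(B, L, u, sigma, rho, mu):
    """Hartmann number M, interaction parameter N, and Reynolds number
    Re for liquid-metal duct flow in a transverse magnetic field B,
    with half-width L, velocity u, electrical conductivity sigma,
    density rho, and dynamic viscosity mu."""
    M = B * L * math.sqrt(sigma / mu)      # magnetic / viscous forces
    Re = rho * u * L / mu
    N = sigma * B**2 * L / (rho * u)       # magnetic / inertial forces
    return M, N, Re

# placeholder liquid-metal properties at B = 5 T (illustrative only)
M, N, Re = mhd_parameters(B=5.0, L=0.05, u=0.1,
                          sigma=3.0e6, rho=9.4e3, mu=2.0e-3)
print(M > 1e3, N > 1)   # fusion-relevant regime: M >> 1, N >> 1
```

Even these rough numbers land well inside the M>>1, N>>1 regime the review identifies, which is why ordinary hydraulic correlations do not carry over to blanket channels.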
Development of a complex experimental system for controlled ecological life support technique
NASA Astrophysics Data System (ADS)
Guo, S.; Tang, Y.; Zhu, J.; Wang, X.; Feng, H.; Ai, W.; Qin, L.; Deng, Y.
A complex experimental system for controlled ecological life support technique can be used as a test platform for plant-man integrated experiments and material closed-loop experiments of the controlled ecological life support system (CELSS). Based on extensive investigation, planning, and drawing design, the system was built through the steps of processing, installation, and joint debugging. The system contains a volume of about 40.0 m^3; its interior atmospheric parameters, such as temperature, relative humidity, oxygen concentration, carbon dioxide concentration, total pressure, lighting intensity, photoperiod, water content in the growing matrix, and ethylene concentration, are all monitored and controlled automatically and effectively. Its growing system consists of two rows of racks along its left and right sides, each of which holds two layers (upper and lower); eight growing beds provide a total area of about 8.4 m^2, and their vertical spacing can be adjusted automatically and independently. Lighting sources consist of both red and blue light-emitting diodes. Successful development of the test platform will create an essential condition for the next large-scale integrated study of controlled ecological life support technique.
Mechanical-Kinetic Modeling of a Molecular Walker from a Modular Design Principle
NASA Astrophysics Data System (ADS)
Hou, Ruizheng; Loh, Iong Ying; Li, Hongrong; Wang, Zhisong
2017-02-01
Artificial molecular walkers beyond burnt-bridge designs are complex nanomachines that potentially replicate biological walkers in mechanisms and functionalities. Improving man-made walkers to the performance required for widespread applications remains difficult, largely because their biomimetic design principles involve entangled kinetic and mechanical effects that complicate the link between a walker's construction and its ultimate performance. Here, a synergic mechanical-kinetic model is developed for a recently reported DNA bipedal walker, which is based on a modular design principle that potentially enables many directional walkers driven by a length-switching engine. The model reproduces the experimental data for the walker and identifies its performance-limiting factors. The model also captures features common to the underlying design principle, including counterintuitive performance-construction relations that are explained by detailed balance, entropy production, and bias cancellation. While indicating a low directional fidelity for the present walker, the model suggests the possibility of improving the fidelity above 90% with a more powerful engine, which may be an improved version of the present engine or an entirely new engine motif, thanks to the flexible design principle. The model is readily adaptable to aid these experimental developments towards high-performance molecular walkers.
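The role of detailed balance in setting directional bias can be made concrete. For a single stepping event whose forward and backward rates obey detailed balance, the forward probability depends only on the free-energy drop per step. A minimal sketch in kT units (this is an illustration of the principle, not the paper's full mechanical-kinetic model):

```python
import math

def forward_bias(delta_g_over_kt):
    """Probability of a forward step when the forward/backward rate ratio
    obeys detailed balance: k_fwd / k_bwd = exp(dG / kT)."""
    r = math.exp(delta_g_over_kt)
    return r / (1.0 + r)

# With no engine bias the walker is a random walker; a drop of ~3 kT per
# step already pushes the per-step directionality above 90%.
print(f"bias at 0 kT: {forward_bias(0.0):.3f}")
print(f"bias at 3 kT: {forward_bias(3.0):.3f}")
```

This shows why a "more powerful engine" (a larger free-energy bias per step) is the natural route to fidelity above 90%.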
NASA Technical Reports Server (NTRS)
Okhio, Cyril B.
1996-01-01
A theoretical and experimental design study of subsonic flow through curved-wall annular diffusers has been initiated under this award in order to establish the most pertinent design parameters, and hence performance characteristics, for such devices, and the implications of their application in the design of engine components in the aerospace industries. The diffusers under study are expected to contain appreciable regions of stall, and the effects of swirl on their performance are being studied. The experimental work involves the application of a Computer Aided Design software tool to the development of a suitable annular diffuser geometry and the subsequent downloading of such data to a CNC machine at Central State University (CSU). Two experimental run segments have been completed so far during FY-95, involving flow visualization and diffuser performance evaluation based on kinetic energy dissipation. The method of calculating diffuser performance based on the pressure recovery coefficient has been shown to have some shortcomings, so the kinetic energy dissipation approach was introduced in run segment two with some success. The application of the discretized, full Navier-Stokes and continuity equations to the numerical study of the time-mean flow for the problem described above is expected to follow. Various models of turbulence are being evaluated for adoption throughout the study, and comparisons will be made with experimental data where they exist. Assessment of diffuser performance based on the dissipated mechanical energy will also be made. The results of the investigations are expected to indicate that more cost-effective component design of devices such as diffusers, which normally contain complex flows, can still be achieved.
Trickey, Heather; Thomson, Gill; Grant, Aimee; Sanders, Julia; Mann, Mala; Murphy, Simon; Paranjothy, Shantini
2018-01-01
The World Health Organisation guidance recommends breastfeeding peer support (BFPS) as part of a strategy to improve breastfeeding rates. In the UK, BFPS is supported by National Institute for Health and Care Excellence guidance, and a variety of models are in use. The experimental evidence for BFPS in developed countries is mixed, and traditional methods of systematic review are ill-equipped to explore heterogeneity, complexity, and the influence of context on effectiveness. This review aimed to enhance learning from the experimental evidence base for one-to-one BFPS intervention. Principles of realist review were applied to intervention case studies associated with published experimental studies. The review aimed to (a) explore heterogeneity in theoretical underpinnings and intervention design for one-to-one BFPS interventions; (b) inform design decisions by identifying transferable lessons developed from cross-case comparison of context-mechanism-outcome relationships; and (c) inform evaluation design by identifying context-mechanism-outcome relationships associated with experimental conditions. Findings highlighted poor attention to intervention theory and considerable heterogeneity in BFPS intervention design. Transferable mid-range theories to inform design emerged, which could be grouped into seven categories: (a) congruence with local infant feeding norms, (b) integration with the existing system of health care, (c) overcoming practical and emotional barriers to access, (d) ensuring friendly, competent, and proactive peers, (e) facilitating authentic peer-mother interactions, (f) motivating peers to ensure positive within-intervention amplification, and (g) ensuring positive legacy and maintenance of gains. There is a need to integrate realist principles into evaluation design to improve our understanding of what forms of BFPS work, for whom, and under what circumstances. © 2017 John Wiley & Sons Ltd.
Reducing cognitive load while teaching complex instruction to occupational therapy students.
Pociask, Fredrick D; DiZazzo-Miller, Rosanne; Samuel, Preethy S
2013-01-01
Cognitive load theory is a field of research used to improve the learning of complex cognitive tasks by matching instruction to the learner's cognitive architecture. We used an experimental posttest control-group design to test the effectiveness of instruction designed to reduce cognitive load (CL) and improve instructional effectiveness in teaching complex instruction to 24 first-year master's students under authentic classroom conditions. We modified historically taught instruction using an isolated-to-interacting-elements sequencing approach intended to reduce high CL levels. We compared control and modified instructional formats using written assessment scores, subjective ratings of CL, and task completion times. Analysis of variance revealed significant differences for postinstruction, posttest CL ratings, and delayed written posttest scores (p < .05). No significant differences were identified for posttest completion times. Findings suggest that this approach can be used to improve instructional efficiency in teaching human locomotion to occupational therapy students. Copyright © 2013 by the American Occupational Therapy Association, Inc.
Venkata Mohan, S; Chandrasekhara Rao, N; Krishna Prasad, K; Murali Krishna, P; Sreenivas Rao, R; Sarma, P N
2005-06-20
The Taguchi robust experimental design (DOE) methodology has been applied to a dynamic anaerobic process treating complex wastewater in an anaerobic sequencing batch biofilm reactor (AnSBBR). To optimize the process, as well as to evaluate the influence of different factors on it, the uncontrollable (noise) factors have been considered. The Taguchi methodology adopting a dynamic approach is the first of its kind for studying anaerobic process evaluation and optimization. The designed experimental methodology consisted of four phases--planning, conducting, analysis, and validation--connected sequence-wise to achieve the overall optimization. In the experimental design, five controllable factors, i.e., organic loading rate (OLR), inlet pH, biodegradability (BOD/COD ratio), temperature, and sulfate concentration, along with the two uncontrollable (noise) factors, volatile fatty acids (VFA) and alkalinity, at two levels were considered for optimization of the anaerobic system. Thirty-two anaerobic experiments were conducted with different combinations of factors, and the results obtained in terms of substrate degradation rates were processed in Qualitek-4 software to study the main effects of individual factors, the interactions between factors, and the signal-to-noise (S/N) ratio. Attempts were also made to achieve optimum conditions. Studies on the influence of individual factors on process performance revealed the intensive effect of OLR. In multiple-factor interaction studies, biodegradability in combination with other factors, such as temperature, pH, and sulfate, showed the greatest influence on process performance.
The optimum conditions obtained for efficient performance of the anaerobic system in treating complex wastewater, considering the dynamic (noise) factors, were a high organic loading rate of 3.5 kg COD/m3-day, neutral pH, high biodegradability (BOD/COD ratio of 0.5), a mesophilic temperature (40 degrees C), and a low sulfate concentration (700 mg/L). The optimization resulted in enhanced anaerobic performance (56.7%), from a substrate degradation rate (SDR) of 1.99 to 3.13 kg COD/m3-day. Using the obtained optimum factors, further validation experiments were carried out, which showed enhanced process performance (3.04 kg COD/m3-day from 1.99 kg COD/m3-day), accounting for a 52.13% improvement under the optimized process conditions. The proposed method facilitated a systematic mathematical approach to understanding the complex, multi-species anaerobic process treating complex chemical wastewater while considering the uncontrollable factors. Copyright (c) 2005 Wiley Periodicals, Inc.
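The S/N analysis mentioned above has a compact closed form for a "larger-the-better" response such as substrate degradation rate. A minimal sketch with made-up replicate values (the actual factor levels and Qualitek-4 computations are in the paper):

```python
import math

def sn_larger_the_better(responses):
    """Taguchi signal-to-noise ratio for a 'larger-the-better' response
    (here, substrate degradation rate): S/N = -10 * log10(mean(1 / y^2))."""
    n = len(responses)
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in responses) / n)

# Hypothetical SDR replicates (kg COD/m3-day) under two factor combinations;
# the replicates stand in for repetitions under the VFA/alkalinity noise factors.
run_a = [1.9, 2.0, 2.1]
run_b = [3.0, 3.1, 3.2]
sn_a = sn_larger_the_better(run_a)
sn_b = sn_larger_the_better(run_b)
print(f"S/N(A) = {sn_a:.2f} dB, S/N(B) = {sn_b:.2f} dB")
```

The factor combination with the higher mean and tighter spread earns the higher S/N ratio, which is how the method ranks settings for robustness rather than for mean performance alone.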
3D-glass molds for facile production of complex droplet microfluidic chips.
Tovar, Miguel; Weber, Thomas; Hengoju, Sundar; Lovera, Andrea; Munser, Anne-Sophie; Shvydkiv, Oksana; Roth, Martin
2018-03-01
In order to leverage the immense potential of droplet microfluidics, it is necessary to simplify the process of chip design and fabrication. While polydimethylsiloxane (PDMS) replica molding has greatly revolutionized the chip-production process, its dependence on 2D-limited photolithography has restricted the design possibilities, as well as further dissemination of microfluidics to non-specialized labs. To break free from these restrictions while keeping fabrication straightforward, we introduce an approach to produce complex multi-height (3D) droplet microfluidic glass molds and subsequent chip production by PDMS replica molding. The glass molds are fabricated with sub-micrometric resolution using femtosecond laser machining technology, which allows directly realizing designs with multiple levels or even continuously changing heights. The presented technique significantly expands the experimental capabilities of the droplet microfluidic chip. It allows direct fabrication of multilevel structures such as droplet traps for prolonged observation and optical fiber integration for fluorescence detection. Furthermore, the fabrication of novel structures based on sloped channels (ramps) enables improved droplet reinjection and picoinjection or even a multi-parallelized drop generator based on gradients of confinement. The fabrication of these and other 3D-features is currently only available at such resolution by the presented strategy. Together with the simplicity of PDMS replica molding, this provides an accessible solution for both specialized and non-specialized labs to customize microfluidic experimentation and expand their possibilities.
Design and Initial Characterization of the SC-200 Proteomics Standard Mixture
Bauman, Andrew; Higdon, Roger; Rapson, Sean; Louie, Brenton; Hogan, Jason; Stacy, Robin; Napuli, Alberto; Guo, Wenjin; van Voorhis, Wesley; Roach, Jared; Lu, Vincent; Landorf, Elizabeth; Stewart, Elizabeth; Kolker, Natali; Collart, Frank; Myler, Peter; van Belle, Gerald; Kolker, Eugene
2011-01-01
High-throughput (HTP) proteomics studies generate large amounts of data. Interpretation of these data requires effective approaches to distinguish noise from biological signal, particularly as instrument and computational capacity increase and studies become more complex. Resolving this issue requires validated and reproducible methods and models, which in turn requires complex experimental and computational standards. The absence of appropriate standards and data sets for validating experimental and computational workflows hinders the development of HTP proteomics methods. Most protein standards are simple mixtures of proteins or peptides, or undercharacterized reference standards in which the identity and concentration of the constituent proteins is unknown. The Seattle Children's 200 (SC-200) proposed proteomics standard mixture is the next step toward developing realistic, fully characterized HTP proteomics standards. The SC-200 exhibits a unique modular design to extend its functionality, and consists of 200 proteins of known identities and molar concentrations from 6 microbial genomes, distributed into 10 molar concentration tiers spanning a 1,000-fold range. We describe the SC-200's design, potential uses, and initial characterization. We identified 84% of SC-200 proteins with an LTQ-Orbitrap and 65% with an LTQ-Velos (false discovery rate = 1% for both). There were obvious trends in success rate, sequence coverage, and spectral counts with protein concentration; however, protein identification, sequence coverage, and spectral counts vary greatly within concentration levels. PMID:21250827
Rosier, Bas J. H. M.; Cremers, Glenn A. O.; Engelen, Wouter; Merkx, Maarten; Brunsveld, Luc
2017-01-01
A photocrosslinkable protein G variant was used as an adapter protein to covalently and site-specifically conjugate an antibody and an Fc-fusion protein to an oligonucleotide. This modular approach enables straightforward decoration of DNA nanostructures with complex native proteins while retaining their innate binding affinity, allowing precise control over the nanoscale spatial organization of such proteins for in vitro and in vivo biomedical applications. PMID:28617516
Techniques for video compression
NASA Technical Reports Server (NTRS)
Wu, Chwan-Hwa
1995-01-01
In this report, we present our study on the multiprocessor implementation of an MPEG2 encoding algorithm. First, we compare two approaches to implementing video standards, VLSI technology and multiprocessor processing, in terms of design complexity, applications, and cost. Then we evaluate the functional modules of the MPEG2 encoding process in terms of their computation time. Two crucial modules are identified based on this evaluation. We then present our experimental study on the multiprocessor implementation of these two crucial modules. Data partitioning is used for job assignment. Experimental results show that a high speedup ratio and good scalability can be achieved with this job assignment strategy.
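The data-partitioning job assignment described above amounts to splitting the frame sequence into contiguous slices, one per processor, and processing them concurrently. A minimal sketch, in which threads stand in for the multiprocessor system and `encode_slice` is a hypothetical placeholder rather than actual MPEG2 code:

```python
from concurrent.futures import ThreadPoolExecutor

def partition(frames, n_workers):
    """Split a frame sequence into near-equal contiguous slices,
    one per worker -- the data-partitioning job-assignment idea."""
    k, r = divmod(len(frames), n_workers)
    slices, start = [], 0
    for i in range(n_workers):
        size = k + (1 if i < r else 0)
        slices.append(frames[start:start + size])
        start += size
    return slices

def encode_slice(frames):
    """Stand-in for a compute-heavy module (e.g., motion estimation);
    each worker runs this on its own slice independently."""
    return [f * 2 for f in frames]  # placeholder transform

frames = list(range(10))
parts = partition(frames, 3)  # [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
with ThreadPoolExecutor(max_workers=3) as pool:
    encoded = [f for part in pool.map(encode_slice, parts) for f in part]
```

Because the slices are independent, adding workers shrinks each slice roughly proportionally, which is the source of the reported speedup and scalability.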
Zhao, Xi; Wu, Xiaoli; Zhou, Hui; Jiang, Tao; Chen, Chun; Liu, Mingshi; Jin, Yuanbao; Yang, Dongsheng
2014-11-01
The aim was to optimize the preparation factors for argan oil microcapsules produced by complex coacervation of chitosan cross-linked with gelatin, based on a hybrid-level orthogonal array design modeled in SPSS. From the ten factors affecting the preparation of argan oil microcapsules, eight relatively significant factors were first identified by the single-factor variable method and selected as computational factors for the orthogonal array design. The hybrid-level orthogonal array was built on these eight factors with 9, 9, 9, 9, 7, 6, 2, and 2 levels, respectively. The preparation factors were investigated and optimized according to the results of the design; their priority order and optimum levels were determined in SPSS, using the percentage of microcapsules with diameters of 30~40 μm as the evaluation criterion. Experimental data showed that the optimum conditions were a chitosan/gelatin ratio of 1:2, a system concentration of 1.5%, and a core/shell ratio of 1:7, with the complex coacervation pH preset at 6.4, cross-linking and complex coacervation times of 75 min and 30 min respectively, glucono-delta-lactone as the cross-linking agent, and chitosan with a molecular weight of 2000~3000.
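The main-effect ranking behind an orthogonal array analysis (performed here in SPSS) reduces to averaging the response over the runs at each level of each factor. A minimal sketch on a toy L4(2^3) array with invented responses (the study's real array is hybrid-level with eight factors):

```python
def main_effects(design, responses):
    """Average response at each level of each factor in an orthogonal
    array -- the basis for ranking factor influence and picking optimum
    levels. `design` is a list of runs, each a list of level codes."""
    n_factors = len(design[0])
    effects = []
    for f in range(n_factors):
        by_level = {}
        for run, y in zip(design, responses):
            by_level.setdefault(run[f], []).append(y)
        effects.append({lvl: sum(v) / len(v) for lvl, v in by_level.items()})
    return effects

# Toy L4(2^3) array; responses are made up (e.g., % of 30~40 um capsules)
design = [[1, 1, 1], [1, 2, 2], [2, 1, 2], [2, 2, 1]]
responses = [60.0, 70.0, 75.0, 85.0]
effects = main_effects(design, responses)
# Factor 0 shows a large effect (65 vs 80); factor 2 shows none (72.5 vs 72.5)
print(effects)
```

The factor whose level means differ most is ranked as most influential, and the level with the best mean is chosen as the optimum.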
Interactive orbital proximity operations planning system
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Ellis, Stephen R.
1989-01-01
An interactive, graphical proximity operations planning system was developed which allows on-site design of efficient, complex, multiburn maneuvers in the dynamic multispacecraft environment about the space station. Maneuvering takes place in, as well as out of, the orbital plane. The difficulty in planning such missions results from the unusual and counterintuitive character of relative orbital motion trajectories and complex operational constraints, which are both time varying and highly dependent on the mission scenario. This difficulty is greatly overcome by visualizing the relative trajectories and the relative constraints in an easily interpretable, graphical format, which provides the operator with immediate feedback on design actions. The display shows a perspective bird's-eye view of the space station and co-orbiting spacecraft on the background of the station's orbital plane. The operator has control over two modes of operation: (1) a viewing system mode, which enables him or her to explore the spatial situation about the space station and thus choose and frame in on areas of interest; and (2) a trajectory design mode, which allows the interactive editing of a series of way-points and maneuvering burns to obtain a trajectory which complies with all operational constraints. Through a graphical interactive process, the operator will continue to modify the trajectory design until all operational constraints are met. The effectiveness of this display format in complex trajectory design is presently being evaluated in an ongoing experimental program.
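The counterintuitive character of relative orbital motion mentioned above is captured by the Clohessy-Wiltshire (Hill) equations for motion about a station in circular orbit. A sketch of the standard in-plane closed-form solution (not the planning system's actual code) shows one classic surprise: a purely prograde burn leaves the chaser behind the station after one full orbit:

```python
import math

def cw_state(t, n, x0, y0, vx0, vy0):
    """In-plane Clohessy-Wiltshire position relative to a station in
    circular orbit (x: radial, y: along-track, n: mean motion, rad/s)."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 + (2 / n) * (c - 1) * vx0 \
        + (4 * s - 3 * n * t) / n * vy0
    return x, y

period = 5400.0          # ~90-minute orbit
n = 2 * math.pi / period  # mean motion
# A small prograde burn (vy0 = 0.1 m/s) from the station's position...
x, y = cw_state(period, n, 0.0, 0.0, 0.0, 0.1)
# ...drifts the spacecraft *backward* (y < 0) after one orbit.
print(f"after one orbit: x = {x:.1f} m, y = {y:.1f} m")
```

Intuition says thrusting forward moves you forward; the secular -3nt term in the along-track solution says otherwise, which is exactly why graphical feedback helps operators plan such maneuvers.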
Considerations on the construction of a Powder Bed Fusion platform for Additive Manufacturing
NASA Astrophysics Data System (ADS)
Andersen, Sebastian Aagaard; Nielsen, Karl-Emil; Pedersen, David Bue; Nielsen, Jakob Skov
As the demand for moulds and other tools becomes increasingly specific and complex, an additive manufacturing approach to production is making its way into industry through laser-based consolidation of metal powder particles by a method known as powder bed fusion. This paper concerns a variety of design choices facilitating the development of an experimental powder bed fusion machine tool, capable of manufacturing metal parts with strength matching that of conventionally manufactured parts and a complexity surpassing that of subtractive processes. To understand the different mechanisms acting within such an experimental machine tool, a fully open and customizable rig was constructed. Emphasizing modularity in the rig allows interchange of lasers, scanner systems, optical elements, powder deposition, layer height, temperature, atmosphere, and powder type. Through a custom-made software platform, control of the process is achieved; this extends into a graphical user interface, easing adjustment of process parameters and job file generation.
NASA Astrophysics Data System (ADS)
González-Montiel, Simplicio; Valdez-Calderón, Alejandro; Vásquez-Pérez, J. Manuel; Torres-Valencia, J. Martín; Martínez-Otero, Diego; López, Jorge A.; Cruz-Borbolla, Julián
2017-10-01
A new series of chrysin derivatives containing the di-(2-picolyl)amine moiety (2a-d) has been designed, synthesized, and treated with PdCl2·2CH3CN, allowing the preparation of new cationic palladium(II) complexes (3a-d). Solution-phase studies of 3a-d by 1H NMR spectroscopy revealed that the protons of the methylene groups of the di-(2-picolyl)amine fragment are diastereotopic. GIAO/DFT studies were performed to predict the molecular structures of 3a-d by comparing the experimental and theoretical 1H NMR chemical shifts. The molecular structure of 3c was determined by X-ray crystallographic analysis, revealing that the di-(2-picolyl)amine fragment is coordinated to the palladium center in a κ3-N,N,N-tridentate fashion in an overall square-planar geometry completed by a chloride ligand.
Molecular propulsion: chemical sensing and chemotaxis of DNA driven by RNA polymerase.
Yu, Hua; Jo, Kyubong; Kounovsky, Kristy L; de Pablo, Juan J; Schwartz, David C
2009-04-29
Living cells sense extracellular signals and direct their movements in response to stimuli in the environment. Such autonomous movement allows these machines to sample chemical change over a distance, leading to chemotaxis. Synthetic catalytic rods have been reported to chemotax toward hydrogen peroxide fuel. Nevertheless, individualized autonomous control of the movement of a population of biomolecules under physiological conditions has not been demonstrated. Here we show the first experimental evidence that a molecular complex consisting of a DNA template and associated RNA polymerases (RNAPs) displays chemokinetic motion driven by the transcription substrates, nucleoside triphosphates (NTPs). Furthermore, this molecular complex exhibits a biased migration into a concentration gradient of NTPs, resembling chemotaxis. We describe this behavior as "molecular propulsion", in which RNAP transcriptional actions deform the DNA template conformation, engendering a measurable enhancement of motility. Our results provide new opportunities for designing and directing nanomachines by imposing external triggers within an experimental system.
NASA Technical Reports Server (NTRS)
Boelens, Okko J.; Luckring, James M.; Breitsamter, Christian; Hovelmann, Andreas; Knoth, Florian; Malloy, Donald J.; Deck, Sebatien
2015-01-01
A diamond-wing configuration has been developed to isolate and study blunt-leading edge vortex separation with both computations and experiments. The wing has been designed so that the results are relevant to a more complex Uninhabited Combat Air Vehicle concept known as SACCON. The numerical and theoretical development process for this diamond wing is presented, including a view toward planned wind tunnel experiments. This work was conducted under the NATO Science and Technology Organization, Applied Vehicle Technology panel. All information is in the public domain.
NASA Astrophysics Data System (ADS)
Grinyok, A.; Boychuk, I.; Perelygin, D.; Dantsevich, I.
2018-03-01
A complex method for the simulation and production design of open rotor propellers was studied. An end-to-end scheme was proposed for evaluating, designing, and experimentally testing the optimal geometry of the propeller surface, for generating the machine control path, and for simulating the force conditions in the cutting zone and their relationship with the treatment accuracy, which is defined by the propeller's elastic deformation. The simulation data enabled the realization of combined automated path control of the cutting tool.
Theoretical and experimental study of flow-control devices for inlets of indraft wind tunnels
NASA Technical Reports Server (NTRS)
Ross, James C.
1989-01-01
The design of closed-circuit wind tunnels has historically been performed using rules of thumb which have evolved over the years into a body of useful guidelines. The development of indraft wind tunnels, however, has not been as well documented. The design of indraft wind tunnels is therefore generally performed using a more intuitive approach, often resulting in a facility with disappointing flow quality. The primary problem is a lack of understanding of the flow in the inlet as it passes through the required antiturbulence treatment. For wind tunnels which employ large-contraction-ratio inlets, this lack of understanding is not serious, since the relatively low velocity of the flow through the inlet treatment reduces the sensitivity to improper inlet design. When designing a small-contraction-ratio inlet, much more careful design is needed in order to reduce the flow distortions generated by the inlet treatment. As part of the National Full Scale Aerodynamics Complex Modification Project, 2-D computational methods were developed which account for the effect of both inlet screens and guide vanes on the test section velocity distribution. Comparisons with experimental data are presented which indicate that the methods accurately compute the flow distortions generated by a screen in a nonuniform velocity field. The use of inlet guide vanes to eliminate the screen-induced distortion is also demonstrated both computationally and experimentally. Extension of the results to 3-D is demonstrated, and a successful wind tunnel design is presented.
Experimental investigation of solid rocket motors for small sounding rockets
NASA Astrophysics Data System (ADS)
Suksila, Thada
2018-01-01
Experimentation and research on solid rocket motors are important subjects for aerospace engineering students. However, many institutes in Thailand rarely include experiments on solid rocket motors in the research projects of aerospace engineering students, mainly because of the complexity of mixing the explosive propellants. This paper focuses on the design and construction of a solid rocket motor with total impulse in the I-J class that can be utilised as a small sounding rocket by researchers in the near future. Initially, test stands intended for measuring the pressure in the combustion chamber and the thrust of the solid rocket motor were designed and constructed. The basic design of the propellant configuration was evaluated. Several formulas and ratios of solid propellants were compared for achieving the maximum thrust. The convenience of manufacturing and casting the fabricated solid rocket motors was a critical consideration. Motor structural analysis, such as the combustion chamber wall thickness, was also discussed. Several types of nozzles were compared and evaluated to ensure the maximum thrust of the solid rocket motors during the experiments. The theory of heat transfer analysis in the combustion chamber was discussed and compared with the experimental data.
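Chamber pressure and thrust, the two quantities the test stands measure, are linked by the ideal-nozzle relation F = Cf * At * Pc. A back-of-the-envelope sketch with hypothetical motor parameters (not values from this paper; Cf depends on the nozzle and typically falls around 1.2-1.7):

```python
import math

def thrust(p_c, d_throat, c_f):
    """Ideal thrust F = Cf * At * Pc from chamber pressure Pc (Pa),
    throat diameter (m), and an assumed thrust coefficient Cf."""
    a_t = math.pi * (d_throat / 2) ** 2  # throat area
    return c_f * a_t * p_c

# Hypothetical I/J-class motor: 4 MPa chamber pressure, 8 mm throat, Cf = 1.4
F = thrust(4.0e6, 0.008, 1.4)
print(f"estimated thrust ~ {F:.0f} N")
```

Measuring Pc and F independently, as on the test stands described, lets one back out the achieved Cf and judge nozzle performance.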
Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.
Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian
2018-05-08
Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.
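The cooperativity constraint described above, that a network only counts when every buried polar group in it is satisfied, can be illustrated with a toy enumerator. This is not the Rosetta protocol itself, just the acceptance test it must enforce, shown on made-up residue labels and candidate bonds:

```python
from itertools import combinations

def satisfied_networks(polar_groups, candidate_bonds, min_size=3):
    """Enumerate subsets of polar groups and keep only those in which
    *every* member is hydrogen-bonded to another member, i.e. no
    unsatisfied buried donors or acceptors remain in the network."""
    bonds = {frozenset(b) for b in candidate_bonds}
    networks = []
    for size in range(min_size, len(polar_groups) + 1):
        for subset in combinations(polar_groups, size):
            if all(any(frozenset((g, h)) in bonds for h in subset if h != g)
                   for g in subset):
                networks.append(set(subset))
    return networks

# Hypothetical polar groups and geometrically feasible bonds
groups = ["Ser1", "Thr2", "Asn3", "Tyr4"]
bonds = [("Ser1", "Thr2"), ("Thr2", "Asn3"), ("Asn3", "Ser1")]
nets = satisfied_networks(groups, bonds)
print(nets)  # Tyr4 has no partner, so only the fully satisfied triad survives
```

The brute-force enumeration here is exponential, which is precisely why a dedicated sampling protocol is needed for large interfaces; the example only shows why pairwise scoring cannot express the all-or-nothing acceptance criterion.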
Medical Image Compression Using a New Subband Coding Method
NASA Technical Reports Server (NTRS)
Kossentini, Faouzi; Smith, Mark J. T.; Scales, Allen; Tucker, Doug
1995-01-01
A recently introduced iterative complexity- and entropy-constrained subband quantization design algorithm is generalized and applied to medical image compression. In particular, the corresponding subband coder is used to encode Computed Tomography (CT) axial slice head images, where statistical dependencies between neighboring image subbands are exploited. Inter-slice conditioning is also employed for further improvements in compression performance. The subband coder features many advantages such as relatively low complexity and operation over a very wide range of bit rates. Experimental results demonstrate that the performance of the new subband coder is relatively good, both objectively and subjectively.
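The analysis/synthesis structure underlying any subband coder can be shown with the simplest two-band (Haar) filter bank; the paper's coder adds complexity- and entropy-constrained quantization on top of such a decomposition. A minimal sketch of the decomposition and its perfect reconstruction:

```python
def analysis(x):
    """One level of a two-band (Haar) subband decomposition:
    a lowpass band of pair averages and a highpass band of differences.
    Assumes an even-length signal."""
    low = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    high = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return low, high

def synthesis(low, high):
    """Perfect reconstruction of the signal from its two subbands."""
    x = []
    for l, h in zip(low, high):
        x += [l + h, l - h]
    return x

signal = [4.0, 6.0, 10.0, 12.0, 8.0, 8.0]
low, high = analysis(signal)
print(low, high)  # most energy lands in the lowpass band
```

A coder quantizes the subbands, spending fewer bits on the low-energy highpass band; it is the quantizer design, not the filter bank, that the iterative constrained algorithm optimizes.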
NASA Astrophysics Data System (ADS)
Yousef, T. A.; Abu El-Reash, G. M.; El Morshedy, R. M.
2013-08-01
The paper presents a combined experimental and computational study of novel Cr(III), Fe(III), Co(II), Hg(II) and U(VI) complexes of (E)-2-((3-hydroxynaphthalen-2-yl)methylene)-N-(pyridin-2-yl)hydrazinecarbothioamide (H2L). The ligand and its complexes have been characterized by elemental analyses, spectral (IR, UV-vis, 1H NMR and 13C NMR), magnetic and thermal studies. IR spectra show that H2L is coordinated to the metal ions in a mononegative bi or tri manner. The structures are suggested to be octahedral for all complexes except Hg(II) complex is tetrahedral. Theoretical calculations have been performed to obtain IR spectra of ligand and its complexes using AM1, MM, Zindo/1, MM+ and PM3, methods. Satisfactory theoretical-experimental agreements were achieved by MM method for the ligand and PM3 for its complexes. DOS calculations carried out by MM (ADF) method for ligand Hg complex from which we concluded that the thiol form of the ligand is more active than thione form and this explains that the most complexation take place in that form. The calculated IR vibrations of the metal complexes, using the PM3 method was the nearest method for the experimental data, and it could be used for all complexes. Also, valuable information are obtained from calculation of molecular parameters for all compounds carried out by the previous methods of calculation (electronegativity of the coordination sites, net dipole moment of the metal complexes, values of heat of formation and binding energy) which approved that the complexes are more stable than ligand. The low value of ΔE could be expected to indicate H2L molecule has high inclination to bind with the metal ions. Furthermore, the kinetic and thermodynamic parameters for the different decomposition steps were calculated using the Coats-Redfern and Horowitz-Metzger methods. Finally, the biochemical studies showed that, complex 2, 4 have powerful and complete degradation effect on DNA. 
In the vast majority of cases the activity of the ligand is greatly enhanced by the presence of a metal ion. The presented results may therefore be useful in designing new, more active or more specific structures.
Sky-blue emitting bridged diiridium complexes: beneficial effects of intramolecular π-π stacking.
Congrave, Daniel G; Hsu, Yu-Ting; Batsanov, Andrei S; Beeby, Andrew; Bryce, Martin R
2018-02-06
The potential of intramolecular π-π interactions to influence the photophysical properties of diiridium complexes is an unexplored topic, and provides the motivation for the present study. A series of diarylhydrazide-bridged diiridium complexes functionalised with phenylpyridine (ppy)-based cyclometalating ligands is reported. It is shown by NMR studies in solution and single crystal X-ray analysis that intramolecular π-π interactions between the bridging and cyclometalating ligands rigidify the complexes leading to high luminescence quantum efficiencies in solution and in doped films. Fluorine substituents on the phenyl rings of the bridge promote the intramolecular π-π interactions. Notably, these non-covalent interactions are harnessed in the rational design and synthesis of the first examples of highly emissive sky-blue diiridium complexes featuring conjugated bridging ligands, for which they play a vital role in the structural and photophysical properties. Experimental results are supported by computational studies.
MAJIQ-SPEL: Web-tool to interrogate classical and complex splicing variations from RNA-Seq data.
Green, Christopher J; Gazzara, Matthew R; Barash, Yoseph
2017-09-11
Analysis of RNA sequencing (RNA-Seq) data has highlighted the fact that most genes undergo alternative splicing (AS) and that these patterns are tightly regulated. Many of these events are complex, resulting in numerous possible isoforms that quickly become difficult to visualize, interpret, and experimentally validate. To address these challenges we developed MAJIQ-SPEL, a web-tool that takes as input local splicing variations (LSVs) quantified from RNA-Seq data and provides users with visualization and quantification of the gene isoforms associated with them. Importantly, MAJIQ-SPEL is able to handle both classical (binary) and complex, non-binary, splicing variations. Using a matching primer design algorithm, it also suggests to users possible primers for experimental validation by RT-PCR and displays those, along with the matching protein domains affected by the LSV, on the UCSC Genome Browser for further downstream analysis. Program and code will be available at http://majiq.biociphers.org/majiq-spel. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
NASA Astrophysics Data System (ADS)
Moreira, I. S.; Fernandes, P. A.; Ramos, M. J.
The definition and comprehension of hot spots at an interface is a subject of primary interest for a variety of fields, including structure-based drug design. Achieving a computational alanine mutagenesis approach that is at once accurate and predictive, capable of reproducing experimental mutagenesis values, is therefore a major challenge in computational biochemistry. Antibody/protein antigen complexes provide one of the best models for studying the protein-protein recognition process because they have three fundamental features: specificity, highly complementary association, and a small epitope restricted to the diminutive complementarity-determining regions (CDRs), while the remainder of the antibody is largely invariant. Thus, we apply a computational mutational methodology to the study of the antigen-antibody complex formed between hen egg white lysozyme (HEL) and the antibody HyHEL-10. A critical evaluation is presented that focuses on the limitations and advantages of different computational methods for hot spot determination, as well as on the differences between experimental and computational approaches.
Silva, Camilla Fonseca; Borges, Keyller Bastos; do Nascimento, Clebio Soares
2017-12-18
In this work, we studied theoretically the formation process of a molecularly imprinted polymer (MIP) for dinotefuran (DNF), testing distinct functional monomers (FM) in various solvents through density functional theory calculations. The results revealed that the best conditions for MIP synthesis were established with methacrylic acid (MAA) as FM in a 1 : 4 stoichiometry and with chloroform as the solvent. This protocol showed the most favourable stabilization energies for the pre-polymerization complexes. Furthermore, the formation of the FM/template complex is enthalpy driven and the occurrence of hydrogen bonds between the DNF and MAA plays a major role in the complex stability. To confirm the theoretical results, MIP was experimentally synthesized considering the best conditions found at the molecular level and characterized by scanning electron microscopy and thermogravimetric analysis. After that, the synthesized material was efficiently employed in microextraction by packed sorbent combined with high-performance liquid chromatography in a preliminary study of the recovery of DNF from water and artificial saliva samples.
Neural control of magnetic suspension systems
NASA Technical Reports Server (NTRS)
Gray, W. Steven
1993-01-01
The purpose of this research program is to design, build and test (in cooperation with NASA personnel from the NASA Langley Research Center) neural controllers for two different small air-gap magnetic suspension systems. The general objective of the program is to study neural network architectures for the purpose of control in an experimental setting and to demonstrate the feasibility of the concept. The specific objectives of the research program are: (1) to demonstrate through simulation and experimentation the feasibility of using neural controllers to stabilize a nonlinear magnetic suspension system; (2) to investigate through simulation and experimentation the performance of neural controllers designs under various types of parametric and nonparametric uncertainty; (3) to investigate through simulation and experimentation various types of neural architectures for real-time control with respect to performance and complexity; and (4) to benchmark in an experimental setting the performance of neural controllers against other types of existing linear and nonlinear compensator designs. To date, the first one-dimensional, small air-gap magnetic suspension system has been built, tested and delivered to the NASA Langley Research Center. The device is currently being stabilized with a digital linear phase-lead controller. The neural controller hardware is under construction. Two different neural network paradigms are under consideration, one based on hidden layer feedforward networks trained via back propagation and one based on using Gaussian radial basis functions trained by analytical methods related to stability conditions. Some advanced nonlinear control algorithms using feedback linearization and sliding mode control are in simulation studies.
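The Gaussian radial-basis-function paradigm mentioned in this abstract reduces, at its core, to a linear least-squares fit of the output weights once centers and widths are fixed, which is what makes analytical training possible. The sketch below is a hypothetical 1-D function-approximation example, not the suspension controller itself.

```python
import numpy as np

def rbf_net(x, centers, widths, weights):
    """Gaussian RBF network output: y = sum_i w_i * exp(-||x-c_i||^2 / (2 s_i^2))."""
    phi = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2 * widths ** 2))
    return phi @ weights

def fit_rbf_weights(X, y, centers, widths):
    """Analytical (least-squares) fit of the linear output weights,
    with centers and widths held fixed."""
    Phi = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
                 / (2 * widths ** 2))
    return np.linalg.lstsq(Phi, y, rcond=None)[0]

# Hypothetical target: approximate sin on [0, pi] with 8 fixed Gaussians
X = np.linspace(0, np.pi, 40)[:, None]
y = np.sin(X[:, 0])
centers = np.linspace(0, np.pi, 8)[:, None]
widths = np.full(8, 0.5)
w = fit_rbf_weights(X, y, centers, widths)
err = max(abs(rbf_net(x, centers, widths, w) - np.sin(x[0])) for x in X)
print(err < 0.05)  # small residual on the training grid
```

In a control setting the same linear-in-weights structure is what permits stability-based weight-update laws, in contrast to backpropagation through a hidden-layer network.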
Handley, Margaret A; Schillinger, Dean; Shiboski, Stephen
2011-01-01
Although randomized controlled trials are often a gold standard for determining intervention effects, in the area of practice-based research (PBR) there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that retain elements of randomization such that they can be considered "controlled" trials. Methodological design elements and practical implementation considerations for two quasi-experimental design approaches that have considerable promise in PBR settings--the stepped-wedge design, and a variant of this design, a wait-list cross-over design--are presented along with a case study from a recent PBR intervention for patients with diabetes. PBR-relevant design features include: creation of a cohort over time that collects control data but allows all participants (clusters or patients) to receive the intervention; staggered introduction of clusters; multiple data collection points; and one-way cross-over into the intervention arm. Practical considerations include: randomization versus stratification; training run-in phases; and an extended time period for overall study completion. Several design features of practice-based research studies can be adapted to local circumstances yet retain elements that improve methodological rigor. Studies that utilize these methods, such as the stepped-wedge design and the wait-list cross-over design, can increase the evidence base for controlled studies conducted within the complex environment of PBR.
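The one-way cross-over structure described above is easy to make concrete: every cluster starts under the control condition and crosses into the intervention at a staggered, randomized step, never crossing back. A minimal schedule generator (cluster and period counts are hypothetical, not from the case study):

```python
import random

def stepped_wedge_schedule(n_clusters, n_periods, seed=0):
    """Build a one-way crossover (stepped-wedge) schedule. Rows are
    clusters, columns are data-collection periods; 0 = control,
    1 = intervention. Period 0 is an all-control baseline, and each
    cluster's crossover time is randomized via the cluster ordering."""
    rng = random.Random(seed)
    order = list(range(n_clusters))
    rng.shuffle(order)  # randomize which cluster crosses over first
    schedule = [[0] * n_periods for _ in range(n_clusters)]
    steps = n_periods - 1  # periods available for staggered crossover
    for rank, cluster in enumerate(order):
        start = 1 + (rank * steps) // n_clusters  # staggered entry period
        for t in range(start, n_periods):
            schedule[cluster][t] = 1  # one-way: stays in intervention
    return schedule

for row in stepped_wedge_schedule(6, 4, seed=2):
    print(row)
```

Every cluster contributes control data early and intervention data late, which is exactly the design feature that lets such trials remain "controlled" while still offering the intervention to all participants.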
Advanced Supersonic Nozzle Concepts: Experimental Flow Visualization Results Paired With LES
NASA Astrophysics Data System (ADS)
Berry, Matthew; Magstadt, Andrew; Stack, Cory; Gaitonde, Datta; Glauser, Mark; Syracuse University Team; The Ohio State University Team
2015-11-01
Advanced supersonic nozzle concepts are currently under investigation, utilizing multiple bypass streams and airframe integration to bolster performance and efficiency. This work focuses on a parametric study of a supersonic, multi-stream jet with an aft deck. The rectangular nozzle, with a single plane of symmetry, displays very complex and unique flow characteristics. Flow visualization techniques in the form of PIV and schlieren capture flow features at various deck lengths and Mach numbers. LES is compared to the experimental results both to validate the computational model and to identify limitations of the simulation. By comparing experimental results to LES, this study will help create a foundation of knowledge for advanced nozzle designs in future aircraft. SBIR Phase II with Spectral Energies, LLC under direction of Barry Kiel.
Experimental validation of predicted cancer genes using FRET
NASA Astrophysics Data System (ADS)
Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.
2018-07-01
Huge amounts of data are generated in genome wide experiments, designed to investigate diseases with complex genetic causes. Follow up of all potential leads produced by such experiments is currently cost prohibitive and time consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large scale in silico benchmark. An experimental validation of predictions made by MaxLink has however been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle with polygenic diseases.
Diller, Kyle I; Bayden, Alexander S; Audie, Joseph; Diller, David J
2018-01-01
There is growing interest in peptide-based drug design and discovery. Due to their relatively large size, polymeric nature, and chemical complexity, the design of peptide-based drugs presents an interesting "big data" challenge. Here, we describe an interactive computational environment, PeptideNavigator, for naturally exploring the tremendous amount of information generated during a peptide drug design project. The purpose of PeptideNavigator is the presentation of large and complex experimental and computational data sets, particularly 3D data, so as to enable multidisciplinary scientists to make optimal decisions during a peptide drug discovery project. PeptideNavigator provides users with numerous viewing options, such as scatter plots, sequence views, and sequence frequency diagrams. These views allow for the collective visualization and exploration of many peptides and their properties, ultimately enabling the user to focus on a small number of peptides of interest. To drill down into the details of individual peptides, PeptideNavigator provides users with a Ramachandran plot viewer and a fully featured 3D visualization tool. Each view is linked, allowing the user to seamlessly navigate from collective views of large peptide data sets to the details of individual peptides with promising property profiles. Two case studies, based on MHC-1A activating peptides and MDM2 scaffold design, are presented to demonstrate the utility of PeptideNavigator in the context of disparate peptide-design projects. Copyright © 2017 Elsevier Ltd. All rights reserved.
Hajdukiewicz, John R; Vicente, Kim J
2002-01-01
Ecological interface design (EID) is a theoretical framework that aims to support worker adaptation to change and novelty in complex systems. Previous evaluations of EID have emphasized representativeness to enhance generalizability of results to operational settings. The research presented here is complementary, emphasizing experimental control to enhance theory building. Two experiments were conducted to test the impact of functional information and emergent feature graphics on adaptation to novelty and change in a thermal-hydraulic process control microworld. Presenting functional information in an interface using emergent features encouraged experienced participants to become perceptually coupled to the interface and thereby to exhibit higher-level control and more successful adaptation to unanticipated events. The absence of functional information or of emergent features generally led to lower-level control and less success at adaptation, the exception being a minority of participants who compensated by relying on analytical reasoning. These findings may have practical implications for shaping coordination in complex systems and fundamental implications for the development of a general unified theory of coordination for the technical, human, and social sciences. Actual or potential applications of this research include the design of human-computer interfaces that improve safety in complex sociotechnical systems.
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.
Chen, Xiangyang; Jing, Yuanyuan; Yang, Xinzheng
2016-06-20
Inspired by the active-site structure of the [NiFe] hydrogenase, we have computationally designed the iron complex [(P(tBu)2N(tBu)2)Fe(CN)2CO] using an experimentally ready-made diphosphine ligand with pendant amines for the hydrogenation of CO2 to methanol. Density functional theory calculations indicate that the rate-determining step of the whole catalytic reaction is the direct hydride transfer from the Fe center to the carbon atom of formic acid, with a total free energy barrier of 28.4 kcal mol(-1) in aqueous solution. Such a barrier indicates that the designed iron complex is a promising low-cost catalyst for the formation of methanol from CO2 and H2 under mild conditions. The key role of the diphosphine ligand with pendant amine groups is to assist the cleavage of H2 by forming an Fe-H(δ-)⋅⋅⋅H(δ+)-N dihydrogen bond in the fashion of frustrated Lewis pairs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
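For context on the reported 28.4 kcal mol(-1) barrier, transition-state theory's Eyring equation converts a free-energy barrier into a rate constant. The sketch below assumes a transmission coefficient of 1 and room temperature; it is a generic illustration of the conversion, not part of the paper's calculations.

```python
import math

def eyring_rate(dg_kcal, T=298.15):
    """Rate constant (s^-1) from a free-energy barrier via the Eyring
    equation, k = (kB*T/h) * exp(-dG/(R*T)), with transmission
    coefficient assumed to be 1."""
    kB = 1.380649e-23   # Boltzmann constant, J/K
    h = 6.62607015e-34  # Planck constant, J s
    R = 1.98720425e-3   # gas constant, kcal/(mol K)
    return (kB * T / h) * math.exp(-dg_kcal / (R * T))

print(f"{eyring_rate(28.4):.2e}")  # on the order of 1e-8 s^-1 at 298 K
```

A barrier near 28 kcal/mol corresponds to a very slow room-temperature rate, consistent with the paper's framing of "mild conditions" as feasible but demanding for this step.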
An experimental design method leading to chemical Turing patterns.
Horváth, Judit; Szalai, István; De Kepper, Patrick
2009-05-08
Chemical reaction-diffusion patterns often serve as prototypes for pattern formation in living systems, but only two isothermal single-phase reaction systems have produced sustained stationary reaction-diffusion patterns so far. We designed an experimental method to search for additional systems on the basis of three steps: (i) generate spatial bistability by operating autoactivated reactions in open spatial reactors; (ii) use an independent negative-feedback species to produce spatiotemporal oscillations; and (iii) induce a space-scale separation of the activatory and inhibitory processes with a low-mobility complexing agent. We successfully applied this method to a hydrogen-ion autoactivated reaction, the thiourea-iodate-sulfite (TuIS) reaction, and noticeably produced stationary hexagonal arrays of spots and parallel stripes of pH patterns attributed to a Turing bifurcation. This method could be extended to biochemical reactions.
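Step (iii) of the method above works because lowering the activator's effective mobility is precisely what the classical linear analysis of the Turing instability requires: a steady state that is stable without diffusion becomes unstable to a band of spatial wavenumbers when the inhibitor diffuses much faster than the activator. A sketch of that dispersion-relation test follows; the Jacobian entries are hypothetical, chosen only to satisfy the textbook conditions, and are not fitted to the TuIS reaction.

```python
import numpy as np

def turing_unstable(J, Du, Dv, q2_max=50.0, n=2000):
    """Linear Turing analysis for a two-species reaction-diffusion system.
    J is the reaction Jacobian at a steady state that is stable without
    diffusion; returns True if the max real eigenvalue of J - q^2 * D
    becomes positive for some wavenumber q (diffusion-driven instability)."""
    D = np.diag([Du, Dv])
    assert max(np.linalg.eigvals(J).real) < 0, "need stability without diffusion"
    for q2 in np.linspace(0.0, q2_max, n):
        if max(np.linalg.eigvals(J - q2 * D).real) > 0:
            return True
    return False

# Hypothetical activator-inhibitor Jacobian (tr < 0, det > 0: stable mixture)
J = np.array([[1.0, -1.0],
              [3.0, -2.0]])
print(turing_unstable(J, Du=0.05, Dv=1.0))  # slow activator: pattern-forming
print(turing_unstable(J, Du=1.0, Dv=1.0))   # equal diffusion: no patterns
```

Binding the hydrogen-ion activator to a low-mobility complexing agent plays the role of shrinking Du relative to Dv in this analysis.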
NASA Technical Reports Server (NTRS)
Allen, B. Danette; Alexandrov, Natalia
2016-01-01
Incremental approaches to air transportation system development inherit current architectural constraints, which, in turn, place hard bounds on system capacity, efficiency of performance, and complexity. To enable airspace operations of the future, a clean-slate (ab initio) airspace design(s) must be considered. This ab initio National Airspace System (NAS) must be capable of accommodating increased traffic density, a broader diversity of aircraft, and on-demand mobility. System and subsystem designs should scale to accommodate the inevitable demand for airspace services that include large numbers of autonomous Unmanned Aerial Vehicles and a paradigm shift in general aviation (e.g., personal air vehicles) in addition to more traditional aerial vehicles such as commercial jetliners and weather balloons. The complex and adaptive nature of ab initio designs for the future NAS requires new approaches to validation, adding a significant physical experimentation component to analytical and simulation tools. In addition to software modeling and simulation, the ability to exercise system solutions in a flight environment will be an essential aspect of validation. The NASA Langley Research Center (LaRC) Autonomy Incubator seeks to develop a flight simulation infrastructure for ab initio modeling and simulation that assumes no specific NAS architecture and models vehicle-to-vehicle behavior to examine interactions and emergent behaviors among hundreds of intelligent aerial agents exhibiting collaborative, cooperative, coordinative, selfish, and malicious behaviors. The air transportation system of the future will be a complex adaptive system (CAS) characterized by complex and sometimes unpredictable (or unpredicted) behaviors that result from temporal and spatial interactions among large numbers of participants. A CAS not only evolves with a changing environment and adapts to it, it is closely coupled to all systems that constitute the environment. 
Thus, the ecosystem that contains the system and other systems evolves with the CAS as well. The effects of the emerging adaptation and co-evolution are difficult to capture with only combined mathematical and computational experimentation. Therefore, an ab initio flight simulation environment must accommodate individual vehicles, groups of self-organizing vehicles, and large-scale infrastructure behavior. Inspired by Massively Multiplayer Online Role Playing Games (MMORPG) and Serious Gaming, the proposed ab initio simulation environment is similar to online gaming environments in which player participants interact with each other, affect their environment, and expect the simulation to persist and change regardless of any individual player's active participation.
OVERVIEW OF NEUTRON MEASUREMENTS IN JET FUSION DEVICE.
Batistoni, P; Villari, R; Obryk, B; Packer, L W; Stamatelatos, I E; Popovichev, S; Colangeli, A; Colling, B; Fonnesu, N; Loreti, S; Klix, A; Klosowski, M; Malik, K; Naish, J; Pillon, M; Vasilopoulou, T; De Felice, P; Pimpinella, M; Quintieri, L
2017-10-05
The design and operation of the ITER experimental fusion reactor require the development of neutron measurement techniques and numerical tools to derive the fusion power and the radiation field in the device and in the surrounding areas. Nuclear analyses provide essential input to the conceptual design, optimisation, engineering and safety case in ITER and power plant studies. The required radiation transport calculations are extremely challenging because of the large physical extent of the reactor plant, the complexity of the geometry, and the combination of deep penetration and streaming paths. This article reports the experimental activities carried out at JET to validate the neutronics measurement methods and numerical tools used in ITER and power plant design. A new deuterium-tritium campaign is proposed at JET in 2019: the unique 14 MeV neutron yields produced will be exploited as much as possible to validate measurement techniques, codes, procedures and data currently used in ITER design, thus reducing the related uncertainties and the associated risks in machine operation.
Synthetic in vitro transcriptional oscillators
Kim, Jongmin; Winfree, Erik
2011-01-01
The construction of synthetic biochemical circuits from simple components illuminates how complex behaviors can arise in chemistry and builds a foundation for future biological technologies. A simplified analog of genetic regulatory networks, in vitro transcriptional circuits, provides a modular platform for the systematic construction of arbitrary circuits and requires only two essential enzymes, bacteriophage T7 RNA polymerase and Escherichia coli ribonuclease H, to produce and degrade RNA signals. In this study, we design and experimentally demonstrate three transcriptional oscillators in vitro. First, a negative feedback oscillator comprising two switches, regulated by excitatory and inhibitory RNA signals, showed up to five complete cycles. To demonstrate modularity and to explore the design space further, a positive-feedback loop was added that modulates and extends the oscillatory regime. Finally, a three-switch ring oscillator was constructed and analyzed. Mathematical modeling guided the design process, identified experimental conditions likely to yield oscillations, and explained the system's robust response to interference by short degradation products. Synthetic transcriptional oscillators could prove valuable for systematic exploration of biochemical circuit design principles and for controlling nanoscale devices and orchestrating processes within artificial cells. PMID:21283141
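The three-switch ring oscillator mentioned above has a well-known minimal ODE caricature: a symmetric ring of three mutually repressing species (repressilator-style). The sketch below integrates that caricature, not the authors' transcriptional model, and its parameters are illustrative values chosen to lie in the oscillatory regime.

```python
import numpy as np

def simulate_ring_oscillator(beta=10.0, n=3, dt=0.01, t_max=100.0):
    """Euler integration of a minimal three-node repression ring:
    dx_i/dt = beta / (1 + x_{i-1}^n) - x_i.
    Each node is produced at a rate repressed by the previous node
    and decays linearly; returns the trace of node 0."""
    x = np.array([1.0, 1.5, 2.0])  # asymmetric start to break symmetry
    trace = []
    for _ in range(int(t_max / dt)):
        x = x + dt * (beta / (1.0 + np.roll(x, 1) ** n) - x)
        trace.append(x[0])
    return np.array(trace)

trace = simulate_ring_oscillator()
late = trace[len(trace) // 2:]  # discard the initial transient
crossings = np.sum(np.diff(np.sign(late - late.mean())) != 0)
print(crossings >= 4)  # sustained oscillation: repeated mean crossings
```

With three repressors the symmetric fixed point loses stability once the repression is steep enough (here Hill coefficient 3 and strong production), so the trajectory settles onto a limit cycle, the deterministic analog of the sustained cycles observed in vitro.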
Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics
NASA Astrophysics Data System (ADS)
Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.
2006-06-01
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
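The ensemble Kalman filter update mentioned at the end can be sketched compactly in its perturbed-observations form: each ensemble member is shifted toward the data using the sample covariance between states and predicted observations. The 1-D example below is hypothetical.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_var, seed=0):
    """One ensemble Kalman filter analysis step (perturbed observations).
    ensemble: (n_members, n_state) array; obs_op: linear observation
    matrix H; obs_var: observation error variance."""
    rng = np.random.default_rng(seed)
    H = np.atleast_2d(obs_op)
    X = ensemble
    Y = X @ H.T                                # predicted observations
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    Pxy = Xc.T @ Yc / (n - 1)                  # state-observation covariance
    Pyy = Yc.T @ Yc / (n - 1) + obs_var * np.eye(H.shape[0])
    K = Pxy @ np.linalg.inv(Pyy)               # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=Y.shape)
    return X + (perturbed - Y) @ K.T           # shift members toward data

# Hypothetical 1-D example: prior ensemble centered far from the observation
prior = np.random.default_rng(1).normal(0.0, 1.0, size=(200, 1))
post = enkf_update(prior, obs=np.array([2.0]), obs_op=[[1.0]], obs_var=0.25)
print(abs(post.mean() - prior.mean()) > 0.5)  # posterior pulled toward obs
```

With prior variance 1 and observation variance 0.25, the gain is about 0.8, so the posterior mean lands most of the way toward the observed value, which is the behavior the update is designed to produce.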
Recent Experiments Conducted with the Wide-Field Imaging Interferometry Testbed (WIIT)
NASA Technical Reports Server (NTRS)
Leisawitz, David T.; Juanola-Parramon, Roser; Bolcar, Matthew; Iacchetta, Alexander S.; Maher, Stephen F.; Rinehart, Stephen A.
2016-01-01
The Wide-field Imaging Interferometry Testbed (WIIT) was developed at NASA's Goddard Space Flight Center to demonstrate and explore the practical limitations inherent in wide field-of-view double Fourier (spatio-spectral) interferometry. The testbed delivers high-quality interferometric data and is capable of observing spatially and spectrally complex hyperspectral test scenes. Although WIIT operates at visible wavelengths, by design the data are representative of those from a space-based far-infrared observatory. We used WIIT to observe a calibrated, independently characterized test scene of modest spatial and spectral complexity, and an astronomically realistic test scene of much greater spatial and spectral complexity. This paper describes the experimental setup, summarizes the performance of the testbed, and presents representative data.
Verification of the CFD simulation system SAUNA for complex aircraft configurations
NASA Astrophysics Data System (ADS)
Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.
1994-04-01
This paper is concerned with the verification for complex aircraft configurations of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.
Protein-Protein Docking in Drug Design and Discovery.
Kaczor, Agnieszka A; Bartuzi, Damian; Stępniewski, Tomasz Maciej; Matosiuk, Dariusz; Selent, Jana
2018-01-01
Protein-protein interactions (PPIs) are responsible for a number of key physiological processes in the living cells and underlie the pathomechanism of many diseases. Nowadays, along with the concept of so-called "hot spots" in protein-protein interactions, which are well-defined interface regions responsible for most of the binding energy, these interfaces can be targeted with modulators. In order to apply structure-based design techniques to the design of PPI modulators, a three-dimensional structure of the protein complex has to be available. In this context, in silico approaches, in particular protein-protein docking, are a valuable complement to experimental methods for elucidating the 3D structure of protein complexes. Protein-protein docking is easy to use and does not require significant computer resources or time (in contrast to molecular dynamics), and it results in a 3D structure of a protein complex (in contrast to sequence-based methods of predicting binding interfaces). However, protein-protein docking cannot address all the aspects of protein dynamics, in particular the global conformational changes during protein complex formation. In spite of this fact, protein-protein docking is widely used to model complexes of water-soluble proteins and less commonly to predict structures of transmembrane protein assemblies, including dimers and oligomers of G protein-coupled receptors (GPCRs). In this chapter we review the principles of protein-protein docking, available algorithms and software and discuss recent examples, benefits, and drawbacks of protein-protein docking applied to water-soluble proteins, membrane-anchored proteins and transmembrane proteins, including GPCRs.
Methodology for extracting local constants from petroleum cracking flows
Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.
2000-01-01
A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting-flow computational fluid dynamics (CFD) computer code with chemical kinetic computations, to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetics computer code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology has application in virtually any reaction set, to extract constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
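Step (4) of the methodology, adjusting kinetic constants until calculated yields match experiment, can be illustrated with a toy first-order yield model standing in for the coupled CFD/kinetics code; all data and the candidate grid below are hypothetical.

```python
import numpy as np

def fit_rate_constant(times, measured_yield, k_grid):
    """Toy version of step (4): scan candidate rate constants for a
    first-order yield model y(t) = 1 - exp(-k t) and keep the one that
    minimizes squared error against the experimental yields. In the real
    method the forward model is the coupled CFD/kinetics computation."""
    best_k, best_err = None, float("inf")
    for k in k_grid:
        pred = 1.0 - np.exp(-k * times)
        err = float(np.sum((pred - measured_yield) ** 2))
        if err < best_err:
            best_k, best_err = k, err
    return best_k

# Hypothetical "experimental" yields generated with k = 0.30 plus noise
t = np.array([1.0, 2.0, 4.0, 8.0])
y = 1.0 - np.exp(-0.30 * t) + np.array([0.01, -0.01, 0.02, -0.02])
k = fit_rate_constant(t, y, np.arange(0.05, 1.0, 0.01))
print(round(k, 2))  # recovers ~0.3
```

The same fit-then-validate loop, with a far more expensive forward model, is what steps (3)-(5) describe: the constants are tuned on one set of operating conditions and then checked against held-out test runs.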
Design complexity in termite-fishing tools of chimpanzees (Pan troglodytes)
Sanz, Crickette; Call, Josep; Morgan, David
2009-01-01
Adopting the approach taken with New Caledonian crows (Corvus moneduloides), we present evidence of design complexity in one of the termite-fishing tools of chimpanzees (Pan troglodytes) in the Goualougo Triangle, Republic of Congo. Prior to termite fishing, chimpanzees applied a set of deliberate, distinguishable actions to modify herb stems to fashion a brush-tipped probe, which is different from the form of fishing tools used by chimpanzees in East and West Africa. This means that ‘brush-tipped fishing probes’, unlike ‘brush sticks’, are not a by-product of use but a deliberate design feature absent in other chimpanzee populations. The specialized modifications to prepare the tool for termite fishing, measures taken to repair non-functional brushes and appropriate orientation of the modified end suggest that these wild chimpanzees are attentive to tool modifications. We also conducted experimental trials that showed that a brush-tipped probe is more effective in gathering insects than an unmodified fishing probe. Based on these findings, we suggest that chimpanzees in the Congo Basin have developed an improved fishing probe design. PMID:19324641
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that, in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while functional verification is not explicitly considered at the earlier stages at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel-processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.
PIV measurements in a compact return diffuser under multi-conditions
NASA Astrophysics Data System (ADS)
Zhou, L.; Lu, W. G.; Shi, W. D.
2013-12-01
Due to the complex three-dimensional geometries of impellers and diffusers, their design is a delicate and difficult task: slight changes can lead to significant changes in hydraulic performance and internal flow structure. Conversely, a good grasp of the pump's internal flow patterns can inform design improvements. The internal flow fields in a compact return diffuser have been investigated experimentally under multiple operating conditions. A special Particle Image Velocimetry (PIV) test rig was designed, and two-dimensional PIV measurements were successfully conducted in the diffuser mid-plane to capture the complex flow patterns. The analysis of the results focuses on the flow structure in the diffuser, especially under part-load conditions, and the vortex and recirculation flow patterns in the diffuser are captured and analysed accordingly. Under the design and over-load conditions the flow fields in the diffuser are uniform, whereas strong flow separation and back flow appear at part-load flow rates; under 0.2Qdes, strong back flow is captured in one diffuser passage.
New strategy for protein interactions and application to structure-based drug design
NASA Astrophysics Data System (ADS)
Zou, Xiaoqin
One of the greatest challenges in computational biophysics is to predict interactions between biological molecules, which play critical roles in biological processes and in the rational design of therapeutic drugs. Biomolecular interactions involve a delicate interplay of multiple contributions, including electrostatic and van der Waals interactions, solvent effects, and conformational entropy. Accurate determination of these complex and subtle interactions is challenging. Moreover, a biological molecule such as a protein usually consists of thousands of atoms, and thus occupies a huge conformational space. The large number of degrees of freedom poses further challenges for accurate prediction of biomolecular interactions. Here, I will present our development of physics-based theory and computational modeling of protein interactions with other molecules. The major strategy is to extract microscopic energetics from the information embedded in the experimentally determined structures of protein complexes. I will also present applications of the methods to structure-based therapeutic design. Supported by NSF CAREER Award DBI-0953839, NIH R01GM109980, and the American Heart Association (Midwest Affiliate) [13GRNT16990076].
Design Considerations for Developing Biodegradable Magnesium Implants
NASA Astrophysics Data System (ADS)
Brar, Harpreet S.; Keselowsky, Benjamin G.; Sarntinoranont, Malisa; Manuel, Michele V.
The integration of biodegradable and bioabsorbable magnesium implants into the human body is a complex undertaking that faces major challenges. The complexity arises from the fact that biomaterials must meet both engineering and physiological requirements to ensure the desired properties. Historically, efforts have focused on the behavior of commercial magnesium alloys in biological environments and their resultant effect on cell-mediated processes. Developing causal relationships between alloy chemistry and microstructure, and their effect on cellular behavior, can be a difficult and time-intensive process. A systems design approach driven by thermodynamics has the power to make significant contributions to developing the next generation of magnesium alloy implants with controlled degradability, biocompatibility, and optimized mechanical properties, at reduced time and cost. This approach couples experimental research with theory and mechanistic modeling for the accelerated development of materials. The aim of this article is to describe this strategy and the design considerations and hurdles involved in developing new magnesium alloys for use as biodegradable implant materials [1].
Kim, Yong Sun; Choi, Hyeong Ho; Cho, Young Nam; Park, Yong Jae; Lee, Jong B; Yang, King H; King, Albert I
2005-11-01
Although biomechanical studies on the knee-thigh-hip (KTH) complex have been extensive, interactions between the KTH and various vehicular interior design parameters in frontal automotive crashes for newer models have not been reported in the open literature to the best of our knowledge. A 3D finite element (FE) model of a 50th percentile male KTH complex, which includes explicit representations of the iliac wing, acetabulum, pubic rami, sacrum, articular cartilage, femoral head, femoral neck, femoral condyles, patella, and patella tendon, has been developed to simulate injuries such as fracture of the patella, femoral neck, acetabulum, and pubic rami of the KTH complex. Model results compared favorably against regional component test data, including a three-point bending test of the femur, axial loading of the isolated knee-patella, axial loading of the KTH complex, axial loading of the femoral head, and lateral loading of the isolated pelvis. The model was further integrated into a Wayne State University upper torso model and validated against data obtained from whole-body sled tests over a range of impact speeds, impactor masses and boundary conditions. Using Design of Experiments (DOE) methods based on Taguchi's approach and the developed FE model of the whole body, including the KTH complex, eight vehicular interior design parameters, namely the load limiter force, seat belt elongation, pretensioner inlet amount, knee-to-knee-bolster distance, knee bolster angle, knee bolster stiffness, toe board angle and impact speed, each with either two or three design levels, were simulated to predict their respective effects on the potential of KTH injury in frontal impacts. The simulation results suggested the best design levels for these vehicular interior parameters to reduce the injury potential of the KTH complex in frontal automotive crashes.
This study is limited by the fact that prediction of bony fracture was based on an element elimination method available in the LS-DYNA code. No validation study was conducted to determine if this method is suitable when simulating fractures of biological tissues. More work is still needed to further validate the FE model of the KTH complex to increase its reliability in the assessment of various impact loading conditions associated with vehicular crash scenarios.
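The Taguchi-style screening described above can be illustrated with a standard L8 orthogonal array and a main-effects analysis. The array is the textbook L8 (seven two-level columns); the response values are synthetic stand-ins for the simulated injury metric, and no factor from the study is modeled:

```python
# Standard L8 orthogonal array: 8 runs x 7 two-level factors (levels coded 0/1).
# Every pair of columns contains each level combination exactly twice (orthogonality).
L8 = [
    (0, 0, 0, 0, 0, 0, 0),
    (0, 0, 0, 1, 1, 1, 1),
    (0, 1, 1, 0, 0, 1, 1),
    (0, 1, 1, 1, 1, 0, 0),
    (1, 0, 1, 0, 1, 0, 1),
    (1, 0, 1, 1, 0, 1, 0),
    (1, 1, 0, 0, 1, 1, 0),
    (1, 1, 0, 1, 0, 0, 1),
]

def main_effects(response):
    """Taguchi main-effect analysis: average response at each level of each factor."""
    effects = []
    for col in range(7):
        by_level = {0: [], 1: []}
        for run, y in zip(L8, response):
            by_level[run[col]].append(y)
        effects.append(tuple(sum(v) / len(v) for v in (by_level[0], by_level[1])))
    return effects

# Synthetic response: only factors 0 and 3 influence the metric
response = [2 * run[0] + run[3] for run in L8]
effects = main_effects(response)
```

Because the array is orthogonal, the level-mean differences recover each factor's true effect even though all seven factors vary simultaneously across only eight runs.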
linkedISA: semantic representation of ISA-Tab experimental metadata.
González-Beltrán, Alejandra; Maguire, Eamonn; Sansone, Susanna-Assunta; Rocca-Serra, Philippe
2014-01-01
Reporting and sharing experimental metadata, such as the experimental design, characteristics of the samples, and procedures applied, along with the analysis results, in a standardised manner ensures that datasets are comprehensible and, in principle, reproducible, comparable and reusable. Furthermore, sharing datasets in formats designed for consumption by both humans and machines will maximize their use. The Investigation/Study/Assay (ISA) open source metadata tracking framework facilitates standards-compliant collection, curation, visualization, storage and sharing of datasets, leveraging other platforms to enable analysis and publication. The ISA software suite includes several components used in an increasingly diverse set of life science and biomedical domains; it is underpinned by a general-purpose format, ISA-Tab, and conversions exist into formats required by public repositories. While ISA-Tab works well mainly as a human-readable format, we have also implemented a linked data approach to semantically define the ISA-Tab syntax. We present a semantic web representation of the ISA-Tab syntax that complements ISA-Tab's syntactic interoperability with semantic interoperability. We introduce the linkedISA conversion tool from ISA-Tab to the Resource Description Framework (RDF), supporting mappings from the ISA syntax to multiple community-defined, open ontologies and capitalising on user-provided ontology annotations in the experimental metadata. We describe insights from the implementation and how annotations can be expanded driven by the metadata. We applied the conversion tool as part of Bio-GraphIIn, a web-based application supporting integration of the semantically-rich experimental descriptions.
Designed in a user-friendly manner, the Bio-GraphIIn interface hides most of the complexities from the users, exposing a familiar tabular view of the experimental description to allow seamless interaction with the RDF representation, and visualising descriptors to drive the query over the semantic representation of the experimental design. In addition, we defined queries over the linkedISA RDF representation and demonstrated its use over the linkedISA conversion of datasets from Nature's Scientific Data online publication. Our linked data approach has allowed us to: 1) make the ISA-Tab semantics explicit and machine-processable; 2) exploit the existing ontology-based annotations in the ISA-Tab experimental descriptions; 3) augment the ISA-Tab syntax with new descriptive elements; and 4) visualise and query elements related to the experimental design. Reasoning over ISA-Tab metadata and associated data will facilitate data integration and knowledge discovery.
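A minimal sketch of the idea behind such a tabular-to-RDF conversion, using plain (subject, predicate, object) tuples, a made-up namespace, and none of the actual linkedISA mappings to community ontologies:

```python
# Hypothetical namespace; the real linkedISA tool maps to community-defined
# open ontologies (e.g. via user-provided annotations), not to example.org.
ISA = "http://example.org/isa/"

def row_to_triples(study_id, row):
    """Map one ISA-Tab-like sample row (a dict of column -> value)
    to a list of RDF-style (subject, predicate, object) triples."""
    sample = ISA + study_id + "/sample/" + row["Sample Name"].replace(" ", "_")
    triples = [(sample, ISA + "type", ISA + "Sample")]
    for key, value in row.items():
        if key != "Sample Name":
            # Each remaining column becomes a predicate on the sample node
            triples.append((sample, ISA + key.lower().replace(" ", "_"), value))
    return triples

row = {"Sample Name": "liver 1", "Organism": "Mus musculus", "Treatment": "control"}
triples = row_to_triples("S1", row)
```

Once the rows are triples, the "augment and query" steps from the abstract become ordinary graph operations (e.g. SPARQL over the resulting RDF store).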
A microfluidic investigation of gas exsolution in glass and shale fracture networks
NASA Astrophysics Data System (ADS)
Porter, M. L.; Jimenez-Martinez, J.; Harrison, A.; Currier, R.; Viswanathan, H. S.
2016-12-01
Microfluidic investigations of pore-scale fluid flow and transport phenomena have steadily increased in recent years. In these investigations fluid flow is restricted to two dimensions, allowing for real-time visualization and quantification of complex flow and reactive transport behavior that is difficult to obtain in other experimental systems. In this work, we describe a unique high-pressure (up to 10.3 MPa) and high-temperature (up to 80 °C) microfluidics experimental system that allows us to investigate fluid flow and transport in geo-material (e.g., shale, Portland cement) micromodels. The use of geo-material micromodels allows us to better represent fluid-rock interactions, including wettability, chemical reactivity, and nano-scale porosity, at conditions representative of natural subsurface environments. Here, we present experimental results in fracture systems with applications to hydrocarbon mobility in fractured rocks. Complex fracture network patterns are derived from 3D x-ray tomography images of actual fractures created in shale rock cores. We use both shale and glass micromodels, allowing for a detailed comparison between flow phenomena in the different materials. We discuss results from two-phase gas (CO2 and N2) injection experiments designed to enhance oil recovery. In these experiments gas was injected into micromodels saturated with oil and allowed to soak for approximately 12 hours at elevated pressures. The pressure in the system was then decreased to atmospheric, causing the gas to expand and/or exsolve from solution, subsequently mobilizing the oil. In addition to the experimental results, we present a relatively simple model designed to quantify the amount of oil mobilized as a function of decreasing system pressure. We will show comparisons between the experiments and the model, and discuss the potential use of the model in field-scale reservoir simulations.
ERIC Educational Resources Information Center
Biesma, R. G.; Pavlova, M.; van Merode, G. G.; Groot, W.
2007-01-01
This paper uses an experimental design to estimate preferences of employers for key competencies during the transition from initial education to the labor market. The study is restricted to employers of entry-level academic graduates entering public health organizations in the Netherlands. Given the changing and complex demands in public health,…
An Experimental and Theoretical Study on Cavitating Propellers.
1982-10-01
Keywords: cascade flow; theoretical; supercavitating flow; performance prediction method; partially cavitating flow; supercavitating flow. ...the present work was to develop an analytical tool for predicting the off-design performance of supercavitating propellers over a wide range of operating conditions. Due to the complex nature of the flow phenomena, a lifting line theory simply combined with the two-dimensional supercavitating...
An approach to achieve progress in spacecraft shielding
NASA Astrophysics Data System (ADS)
Thoma, K.; Schäfer, F.; Hiermaier, S.; Schneider, E.
2004-01-01
Progress in shield design against space debris can be achieved only when a combined approach based on several tools is used. This approach depends on the combined application of advanced numerical methods, specific material models and experimental determination of input parameters for these models. Examples of experimental methods for material characterization are given, covering the range from quasi-static to very high strain rates for materials like Nextel and carbon fiber-reinforced materials. Mesh-free numerical methods have extraordinary capabilities in the simulation of extreme material behaviour, including complete failure with phase changes, combined with shock wave phenomena and the interaction with structural components. In this paper the benefits of combining numerical methods, material modelling and detailed experimental studies for shield design are demonstrated. The following examples are given: (1) Development of a material model for Nextel and Kevlar-Epoxy to enable numerical simulation of hypervelocity impacts on complex heavy protection shields for the International Space Station. (2) The influence of projectile shape on the protection performance of Whipple Shields, and how experimental problems in accelerating such shapes can be overcome by systematic numerical simulation. (3) The benefits of using metallic foams in "sandwich bumper shields" for spacecraft, and how to approach systematic characterization of such materials.
Aranda, Esther Escribano; Matias, Tiago Araújo; Araki, Koiti; Vieira, Adriana Pires; de Mattos, Elaine Andrade; Colepicolo, Pio; Luz, Carolina Portela; Marques, Fábio Luiz Navarro; da Costa Ferreira, Ana Maria
2016-12-01
Herein, the design and syntheses of two new mononuclear oxindolimine-copper(II) complexes (1 and 2) and the corresponding heterobinuclear oxindolimine-Cu(II)Pt(II) complexes (3 and 4) are described. All the isolated complexes were characterized by spectroscopic techniques (UV/Vis, IR, EPR), in addition to elemental analysis and mass spectrometry. Cyclic voltammetry (CV) measurements showed one-electron quasi-reversible waves in all cases, ascribed to the formation of the corresponding copper(I) complexes. Additionally, waves related to reduction of the oxindolimine ligand were observed, and confirmed using analogous oxindolimine-Zn(II) complexes. The Pt(IV/II) reduction, and corresponding oxidation, for complexes 3 and 4 occurred at values very close to those observed for cisplatin. Complementary fluorescence studies showed that glutathione (GSH) cannot reduce any of these complexes under the experimental conditions (room temperature, 50 mM phosphate buffer, pH 7.4), even using a 20-fold excess of [GSH]. All these complexes showed a characteristic EPR spectral profile, with parameter values g∥ > g⊥ suggesting an axially distorted environment around the copper(II) center. Interactions with calf thymus DNA, monitored by circular dichroism (CD), indicated different effects modulated by the ligands. Finally, the cytotoxicity of each complex was tested toward different tumor cells, in comparison to cisplatin, and low IC50 values in the range 0.6 to 4.0 μM were obtained after 24 or 48 h of incubation at 37 °C. The results indicate that such complexes are promising alternative antitumor agents. Copyright © 2016 Elsevier Inc. All rights reserved.
Designing with Protocells: Applications of a Novel Technical Platform
Armstrong, Rachel
2014-01-01
The paper offers a design perspective on protocell applications and presents original research that characterizes the life-like qualities of the Bütschli dynamic droplet system, as a particular “species” of protocell. Specific focus is given to the possibility of protocell species becoming a technical platform for designing and engineering life-like solutions to address design challenges. An alternative framing of the protocell, based on process philosophy, sheds light on its capabilities as a technology that can deal with probability and whose ontology is consistent with complexity, nonlinear dynamics and the flow of energy and matter. However, the proposed technical systems do not yet formally exist as products or mature technologies. Their potential applications are therefore experimentally examined within a design context as architectural “projects”—an established way of considering proposals that have not yet been realized, like an extended hypothesis. Exemplary design-led projects are introduced, such as The Hylozoic Ground and Future Venice, which aim to “discover”, rather than “solve”, challenges to examine a set of possibilities that have not yet been resolved. The value of such exploration in design practice is in opening up a set of potential directions for further assessment before complex challenges are procedurally implemented. PMID:25370381
Shivakumar, H N; Desai, B G; Pandya, Saumyak; Karki, S S
2007-01-01
Glipizide was complexed with beta-cyclodextrin in an attempt to enhance the drug's solubility. The phase solubility diagram was classified as AL type, characterized by an apparent 1:1 stability constant of 413.82 M⁻¹. Fourier transform infrared spectrophotometry, differential scanning calorimetry, powder x-ray diffractometry and proton nuclear magnetic resonance spectral analysis indicated considerable interaction between the drug and beta-cyclodextrin. A 2^3 factorial design was employed to prepare hydroxypropyl methylcellulose (HPMC) matrix tablets containing the drug or its complex. The effects of the total polymer load (X1), the level of HPMC K100LV (X2), and complexation (X3) on release at the first hour (Y1), release at 24 h (Y2), time taken for 50% release (Y3), and the diffusion exponent (Y4) were systematically analyzed using the F test. Mathematical models containing only the significant terms (P < 0.05) were generated for each parameter by multiple linear regression analysis and analysis of variance. Complexation was found to exert a significant effect on Y1, Y2, and Y3, whereas the total polymer load significantly influenced all the responses. The models generated were validated by developing two new formulations with a combination of factors within the experimental domain. The experimental values of the response parameters were in close agreement with the predicted values, thereby proving the validity of the generated mathematical models.
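For a coded 2^3 full factorial (levels -1/+1), orthogonality reduces the main-effect regression coefficients to simple averages. A minimal sketch with synthetic responses; the study's actual factors (X1-X3) and responses (Y1-Y4) are not modeled here:

```python
import itertools

def fit_factorial(y):
    """Main-effect estimates for a coded 2^3 full factorial in standard order.
    Because the +/-1 design columns are orthogonal, ordinary least squares
    collapses to b0 = mean(y) and b_j = sum(x_j * y) / 8."""
    runs = list(itertools.product((-1, 1), repeat=3))
    b0 = sum(y) / 8.0
    coeffs = [sum(x[j] * yi for x, yi in zip(runs, y)) / 8.0 for j in range(3)]
    return b0, coeffs

# Synthetic responses from y = 10 + 3*x1 - 2*x3 (factor x2 is inert)
runs = list(itertools.product((-1, 1), repeat=3))
y = [10 + 3 * x[0] - 2 * x[2] for x in runs]
b0, (b1, b2, b3) = fit_factorial(y)
```

In practice one would also compute interaction terms and an F test on each coefficient, as in the study, to retain only significant terms.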
Exploration on the matching between Optical Comprehensive Design Experiment and Washington Accord
NASA Astrophysics Data System (ADS)
Cao, Yiping; Chen, Wenjing; Zhang, Qican; Liu, Yuankun; Li, Dahai; Zhou, Xinzhi; Wei, Jun
2017-08-01
Common problems in the optical comprehensive design experiment that run against the Washington Accord are pointed out, and an instructional, innovative teaching scheme for the Optics Comprehensive Design Experiment is proposed to resolve them. The scheme aims to improve students' hands-on practical ability, understanding of theoretical knowledge, complex problem solving, engineering application and ability to cooperate. After tracking and surveying students who had attended the Optical Comprehensive Design Experiment class, we found several problems with the course: vague experiment content, too little hands-on time for beginners, a separation between theory and engineering application, and a lack of selectable experiment content. We therefore improved the teaching plan for the Optical Comprehensive Design Experiment with reference to the Washington Accord. The course must connect with the engineering basic courses, professional foundation courses and major courses, and even with students' future study and work, so that it plays a role of inheritance and continuity. The teaching program requires students to have completed, and to comprehensively apply, basic courses such as analog electronics, digital electronics, applied optics and computing. The scheme contains six practical complex engineering problems: optical system design, light-energy meter design, illuminometer design, material refractive-index measuring system design, light-intensity measuring system design and an open design.
Establishing optional and open experiments provides students with greater choice and enhances their creativity; lively experimental teaching and enriched experiment contents make the experiments more interesting; providing more opportunities and longer learning time improves students' practical ability; and working from actual engineering problems puts emphasis on students' understanding of complex engineering problems and of the process of solving them. Applying this scheme to other courses, with appropriate adaptation, can help ensure the quality of engineering education. We hope it offers a useful reference for curriculum system construction in colleges and universities.
Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores
Flores, Lorea; Bailey, R. A.; Elosegi, Arturo; Larrañaga, Aitor; Reiss, Julia
2016-01-01
Habitat complexity can influence predation rates (e.g. by providing refuge) but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants), in microcosms, influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants); and 3. as the spatial configuration of structures (measured as fractal dimension). The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology). We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, ‘habitat complexity’ by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems. PMID:27802267
Castro, G T; Blanco, S E; Arce, S L; Ferretti, F H
2003-10-01
The complexation reaction between AlCl3 and 2,4-dihydroxybenzophenone under varying permittivity and ionic strength of the reaction medium was investigated by theoretical and experimental procedures, namely density functional theory (DFT) and UV-vis spectroscopic methods, respectively. The stoichiometric composition of the complex formed, determined by the molar ratio method, is 1:1. The molar absorptivity and stability constant of the complex were determined using a method designed by the authors. It was observed that the stoichiometric composition of the complex does not change with the solvents used and that the stability constant is higher in methanol than in ethanol. Kinetic experiments in solutions of different ionic strength were also performed. The results permit the conclusion that the complex is formed through a mechanism whose rate-determining step is a reaction between two ions with opposite unitary charges. In the theoretical study, performed at the B3LYP/6-31G(d) level of theory using Tomasi's model, it was proposed that the formation of the complex involves a single covalent bond between the aluminum atom and the oxygen atom of the o-hydroxyl group of the ligand, and a stronger coulombic attraction (or a second covalent bond) between the central atom and the carbonyl oxygen atom of 2,4-dihydroxybenzophenone. Using the calculated magnitudes, it was predicted that the complex has higher thermodynamic stability in methanol than in ethanol. It was also concluded that the planarity of the chelate ring favors a greater planarity of the 4-hydroxybenzoyl group of the complex with respect to the ligand, which agrees with the observed bathochromic shifts. The theoretical conclusions satisfactorily match the experimental determinations.
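The 1:1 stoichiometry underlying the molar ratio method can be sketched by solving the binding equilibrium directly. The stability constant and concentrations below are hypothetical, not the paper's values:

```python
import math

def complex_conc(K, M_tot, L_tot):
    """Equilibrium [ML] for 1:1 binding M + L <-> ML with stability constant K.
    Mass balance gives K = [ML] / (([M]t - [ML]) * ([L]t - [ML])), i.e. a
    quadratic in [ML]; the physically meaningful root is the smaller one."""
    b = M_tot + L_tot + 1.0 / K
    return (b - math.sqrt(b * b - 4.0 * M_tot * L_tot)) / 2.0

K = 500.0            # hypothetical stability constant, M^-1
M_tot = L_tot = 1e-3  # hypothetical total concentrations, M
ml = complex_conc(K, M_tot, L_tot)
```

In a molar ratio experiment, [ML] (via absorbance A = ε·[ML]) is tracked while varying the metal-to-ligand ratio; fitting this expression to the curve yields both ε and K.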
Using X-ray absorption to probe sulfur oxidation states in complex molecules
NASA Astrophysics Data System (ADS)
Vairavamurthy, A.
1998-10-01
X-ray absorption near-edge structure (XANES) spectroscopy offers an important non-destructive tool for determining oxidation states and for characterizing chemical speciation. The technique was used to experimentally verify the oxidation states of sulfur in different types of complex molecules because there are irregularities and uncertainties in assigning the values traditionally. The usual practice of determining oxidation states involves using a set of conventional rules. The oxidation state is an important control in the chemical speciation of sulfur, ranging from -2 to +6 in its different compounds. Experimental oxidation-state values for various types of sulfur compounds, using their XANES peak-energy positions, were assigned from a scale in which elemental sulfur and sulfate are designated as 0 and +6, respectively. Because these XANES-based values differed considerably from conventionally determined oxidation states for most sulfur compounds, a new term 'oxidation index' was coined to describe them. The experimental values were closer to those conventional values obtained by assigning shared electrons to the more electronegative atoms than to those based on other customary rules for assigning them. Because the oxidation index is distinct and characteristic for each different type of sulfur functionality, it becomes an important parameter for characterizing sulfur species, and for experimentally verifying uncertain oxidation states.
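The XANES energy-to-oxidation-index scale described above can be sketched as a linear interpolation between the two anchors (elemental sulfur at 0, sulfate at +6). The anchor peak energies below are illustrative placeholders, not measured values:

```python
def oxidation_index(peak_eV, e_S0=2472.0, e_sulfate=2482.0):
    """Map a sulfur K-edge XANES peak energy to an 'oxidation index' on a
    linear scale anchored at elemental sulfur (0) and sulfate (+6).
    The anchor energies here are hypothetical stand-ins for calibrated values."""
    return 6.0 * (peak_eV - e_S0) / (e_sulfate - e_S0)
```

With calibrated anchors, intermediate sulfur functionalities then fall at distinct, characteristic index values between 0 and +6, which is what makes the index useful for speciation.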
Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie
2016-01-01
The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
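The ASAP loop, selecting the stimulus that best discriminates the candidate models and updating beliefs online, can be sketched with two toy Bernoulli models. The models, stimuli and outcome sequence are invented for illustration, and prediction disagreement is used as a crude proxy for the expected-information-gain criterion:

```python
# Two hypothetical perceptual models: P("detect") at each stimulus level.
MODELS = {
    "M1": {0: 0.2, 1: 0.5, 2: 0.8},
    "M2": {0: 0.2, 1: 0.5, 2: 0.4},  # the models differ only at stimulus 2
}

def next_stimulus():
    """Pick the stimulus where the candidate models disagree most
    (a simple stand-in for maximizing expected information gain)."""
    def disagreement(s):
        preds = [model[s] for model in MODELS.values()]
        return max(preds) - min(preds)
    return max(MODELS["M1"], key=disagreement)

def update(posterior, s, y):
    """One step of Bayes' rule with Bernoulli likelihoods."""
    post = {m: p * (MODELS[m][s] if y else 1.0 - MODELS[m][s])
            for m, p in posterior.items()}
    z = sum(post.values())
    return {m: v / z for m, v in post.items()}

posterior = {"M1": 0.5, "M2": 0.5}
# A fixed outcome sequence with a 40% detection rate, as M2 predicts at stimulus 2
outcomes = [1] * 16 + [0] * 24
for y in outcomes:
    s = next_stimulus()
    posterior = update(posterior, s, y)
```

Because the design rule keeps probing the single stimulus where the models disagree, the posterior concentrates on the data-consistent model far faster than uniform sampling over all three stimuli would.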
Removal of waterborne microorganisms by filtration using clay-polymer complexes.
Undabeytia, Tomas; Posada, Rosa; Nir, Shlomo; Galindo, Irene; Laiz, Leonila; Saiz-Jimenez, Cesareo; Morillo, Esmeralda
2014-08-30
Clay-polymer composites were designed for use in filtration processes for disinfection during water purification. The composites were formed by sorption of polymers, based on starch modified with quaternary ammonium ethers, onto the negatively charged clay mineral bentonite. The performance of the clay-polymer complexes in removing bacteria was strongly dependent on the conformation adopted by the polycation on the clay surface, the charge density of the polycation itself, and the ratio between the concentrations of clay and polymer used during the sorption process. The antimicrobial effect exerted by the clay-polymer system was due to the cationic monomers adsorbed on the clay surface, which resulted in a positive surface potential of the complexes and charge reversal. Clay-polymer complexes were more toxic to bacteria than the polymers alone. Filtration employing our optimal clay-polymer composite yielded 100% removal of bacteria after the passage of 3 L, whereas an equivalent filter with granular activated carbon (GAC) yielded hardly any removal of bacteria after 0.5 L. Regeneration of clay-polymer complexes saturated with bacteria was demonstrated. Modeling of the filtration processes made it possible to optimize filter design and to estimate experimental conditions for purifying large water volumes in short periods. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ahmouda, Somaya
To perform photosynthesis, plants, algae and bacteria possess well organized and closely coupled photosynthetic pigment-protein complexes. Information on energy transfer in photosynthetic complexes is important to understand their functioning and possibly to design new and improved photovoltaic devices. The information on energy transfer processes contained in the narrow zero-phonon lines at low temperatures is hidden under the inhomogeneous broadening. Thus, it has proven difficult to analyze the spectroscopic properties of these complexes in sufficient detail by conventional spectroscopy methods. In this context, high-resolution spectroscopy techniques such as Spectral Hole Burning are powerful tools designed to get around the inhomogeneous broadening. Spectral Hole Burning involves selective excitation by a laser, which removes molecules whose zero-phonon transitions are resonant with this laser. This thesis focuses on the effects of the distributions of the energy transfer rates (homogeneous line widths) on the evolution of spectral holes. These distributions are a consequence of the static disorder in the photosynthetic pigment-protein complexes. The qualitative effects of different types of line width distributions on the evolution of spectral holes have been explored by numerical simulations, and an example analysis of original experimental data is presented as well.
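The qualitative effect of a linewidth distribution can be illustrated with a minimal numerical sketch (not the thesis's simulation): model the hole as a weighted sum of Lorentzians, one per homogeneous width in the disorder-induced distribution, and compare its width against a hole burned into an ensemble with a single homogeneous width. All widths and units below are arbitrary illustrative choices.

```python
import numpy as np

nu = np.linspace(-5.0, 5.0, 1001)   # detuning from the burn laser (arbitrary units)

def hole_profile(widths, weights):
    # A shallow spectral hole as a weighted sum of Lorentzians, one per
    # homogeneous linewidth present in the disorder-induced distribution.
    prof = np.zeros_like(nu)
    for g, w in zip(widths, weights):
        prof += w * (g / 2) ** 2 / (nu ** 2 + (g / 2) ** 2)
    return prof / prof.max()

def fwhm(profile):
    above = nu[profile >= 0.5]
    return above[-1] - above[0]

single = hole_profile([0.5], [1.0])                 # one homogeneous width
widths = np.linspace(0.2, 2.0, 10)                  # broad width distribution
distributed = hole_profile(widths, np.ones_like(widths))

print(fwhm(single) < fwhm(distributed))  # the distribution broadens the hole -> True
```

The mixed profile is also no longer a pure Lorentzian, which is the kind of signature the evolution of spectral holes can expose.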
Structural similitude and design of scaled down laminated models
NASA Technical Reports Server (NTRS)
Simitses, G. J.; Rezaeepazhand, J.
1993-01-01
The excellent mechanical properties of laminated composite structures make them prime candidates for a wide variety of applications in aerospace, mechanical, and other branches of engineering. The enormous design flexibility of advanced composites is obtained at the cost of a large number of design parameters. Due to the complexity of these systems and the lack of complete design-based information, designers tend to be conservative in their designs. Furthermore, any new design is extensively evaluated experimentally until it achieves the necessary reliability, performance, and safety. However, the experimental evaluation of composite structures is costly and time consuming. Consequently, it is extremely useful if a full-scale structure can be replaced by a similar scaled-down model which is much easier to work with. Furthermore, a dramatic reduction in cost and time can be achieved if available experimental data for a specific structure can be used to predict the behavior of a group of similar systems. This study investigates problems associated with the design of scaled models. Such a study is important since it provides the necessary scaling laws and the factors which affect the accuracy of the scale models. Similitude theory is employed to develop the necessary similarity conditions (scaling laws). Scaling laws provide the relationship between a full-scale structure and its scale model, and can be used to extrapolate the experimental data of a small, inexpensive, and testable model into design information for a large prototype. Due to the large number of design parameters, the identification of the principal scaling laws by the conventional method (dimensional analysis) is tedious. Similitude theory based on the governing equations of the structural system is more direct and simpler in execution. The difficulty of making completely similar scale models often leads to accepting a certain type of distortion from exact duplication of the prototype (partial similarity).
Both complete and partial similarity are discussed. The procedure consists of systematically observing the effect of each parameter and the corresponding scaling laws. Acceptable intervals and limitations for these parameters and scaling laws are then discussed. In each case, a set of valid scaling factors and corresponding response scaling laws that accurately predict the response of prototypes from experimental models is introduced. The examples include rectangular laminated plates under individually applied destabilizing loads, the vibrational characteristics of the same plates, and the cylindrical bending of beam-plates.
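The flavor of a response scaling law can be shown with the classical isotropic-plate relation (a simple stand-in for the laminated-plate laws derived in the paper, not their result): the critical buckling load per unit width obeys N_cr ~ E t^3 / b^2, so under complete similarity (same material, all lengths scaled by the same factor) the measured model load extrapolates linearly to the prototype. The numbers are illustrative.

```python
def prototype_buckling_load(n_model, scale):
    """Extrapolate a measured model buckling load per unit width to the
    prototype under complete similarity: same material, every length
    (thickness t and width b) multiplied by `scale`.  Since
    N_cr ~ E t^3 / b^2, the load scales as scale**3 / scale**2 = scale."""
    return n_model * scale

# Illustrative numbers: a 1/5-scale model that buckles at 12 N/mm predicts
print(prototype_buckling_load(12.0, 5.0))  # -> 60.0 (N/mm) for the prototype
```

Partial similarity arises precisely when such a clean one-factor law is unattainable, e.g. when ply thicknesses cannot be scaled with the in-plane dimensions, and the acceptable distortion must then be quantified.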
Model-based metabolism design: constraints for kinetic and stoichiometric models
Stalidzans, Egils; Seiman, Andrus; Peebo, Karl; Komasilovs, Vitalijs; Pentjuss, Agris
2018-01-01
The implementation of model-based designs in metabolic engineering and synthetic biology may fail. One of the reasons for this failure is that only a part of the real-world complexity is included in models. Still, some knowledge can be simplified and taken into account in the form of optimization constraints to improve the feasibility of model-based designs of metabolic pathways in organisms. Some constraints (mass balance, energy balance, and the steady-state assumption) serve as a basis for many modelling approaches. Others (the total enzyme activity constraint and the homeostatic constraint) were proposed decades ago but are frequently ignored in design development. Several new approaches to cellular analysis have made possible the application of constraints like cell size, surface, and resource balance. Constraints for kinetic and stoichiometric models are grouped according to their applicability preconditions into (1) general constraints, (2) organism-level constraints, and (3) experiment-level constraints. General constraints are universal and applicable to any system. Organism-level constraints are applicable to biological systems and are usually organism-specific, but they can be applied without information about experimental conditions. To apply experiment-level constraints, peculiarities of the organism and the experimental set-up have to be taken into account to calculate the values of the constraints. The limitations of the applicability of particular constraints for kinetic and stoichiometric models are addressed. PMID:29472367
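Two of the constraint classes above can be sketched with a toy flux-balance problem (not from the paper): the steady-state mass-balance constraint S v = 0 is a general constraint, while the uptake bound is an organism-level constraint. The three-reaction network below is invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy pathway: R1 (uptake -> A), R2 (A -> B), R3 (B -> product).
S = np.array([[1, -1, 0],    # mass balance for metabolite A
              [0, 1, -1]])   # mass balance for metabolite B

# General constraint: steady state, S v = 0.
# Organism-level constraint: uptake capacity, v1 <= 10.
bounds = [(0, 10), (0, None), (0, None)]
res = linprog(c=[0, 0, -1],              # maximize v3 (linprog minimizes)
              A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(np.round(res.x, 6))  # -> [10. 10. 10.]
```

Adding further constraints (e.g. a total enzyme-activity budget as an extra inequality row) only shrinks the feasible flux space, which is exactly how such knowledge improves the realism of a model-based design.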
NASA Technical Reports Server (NTRS)
Kuo, Kenneth K.; Lu, Y. C.; Chiaverini, Martin J.; Harting, George C.
1994-01-01
An experimental study on the fundamental processes involved in fuel decomposition and boundary layer combustion in hybrid rocket motors is being conducted at the High Pressure Combustion Laboratory of the Pennsylvania State University. This research should provide a useful engineering technology base in the development of hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high pressure slab motor has been designed and manufactured for conducting experimental investigations. Oxidizer (LOX or GOX) supply and control systems have been designed and partly constructed for the head-end injection into the test chamber. Experiments using HTPB fuel, as well as fuels supplied by NASA designated industrial companies will be conducted. Design and construction of fuel casting molds and sample holders have been completed. The portion of these items for industrial company fuel casting will be sent to the McDonnell Douglas Aerospace Corporation in the near future. The study focuses on the following areas: observation of solid fuel burning processes with LOX or GOX, measurement and correlation of solid fuel regression rate with operating conditions, measurement of flame temperature and radical species concentrations, determination of the solid fuel subsurface temperature profile, and utilization of experimental data for validation of a companion theoretical study (Part 2) also being conducted at PSU.
Autism genetics: Methodological issues and experimental design.
Sacco, Roberto; Lintas, Carla; Persico, Antonio M
2015-10-01
Autism is a complex neuropsychiatric disorder of developmental origin, where multiple genetic and environmental factors likely interact resulting in a clinical continuum between "affected" and "unaffected" individuals in the general population. During the last two decades, relevant progress has been made in identifying chromosomal regions and genes in linkage or association with autism, but no single gene has emerged as a major cause of disease in a large number of patients. The purpose of this paper is to discuss specific methodological issues and experimental strategies in autism genetic research, based on fourteen years of experience in patient recruitment and association studies of autism spectrum disorder in Italy.
Single-shot temporal characterization of kilojoule-level, picosecond pulses on OMEGA EP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waxer, Leon; Dorrer, Christophe; Kalb, Adam
To achieve a variety of experimental conditions, the OMEGA EP laser provides kilojoule-level pulses over a pulse-width range of 0.6 to 100 ps. Precise knowledge of the pulse width is important for laser system safety and the interpretation of experimental results. This paper describes the development and implementation of a single-shot, ultrashort-pulse measurement diagnostic, which provides an accurate characterization of the output pulse shape. We also present a brief overview of the measurement algorithm; discuss design considerations necessary for implementation in a complex, user-facility environment; and review the results of the diagnostic commissioning shots, which demonstrated excellent agreement with predictions.
Single-shot temporal characterization of kilojoule-level, picosecond pulses on OMEGA EP
Waxer, Leon; Dorrer, Christophe; Kalb, Adam; ...
2018-02-19
To achieve a variety of experimental conditions, the OMEGA EP laser provides kilojoule-level pulses over a pulse-width range of 0.6 to 100 ps. Precise knowledge of the pulse width is important for laser system safety and the interpretation of experimental results. This paper describes the development and implementation of a single-shot, ultrashort-pulse measurement diagnostic, which provides an accurate characterization of the output pulse shape. We also present a brief overview of the measurement algorithm; discuss design considerations necessary for implementation in a complex, user-facility environment; and review the results of the diagnostic commissioning shots, which demonstrated excellent agreement with predictions.
NASA Astrophysics Data System (ADS)
Wen, Di; Ding, Xiaoqing
2003-12-01
In this paper we propose a general framework for character segmentation in complex multilingual documents, which is an endeavor to combine the traditionally separate segmentation and recognition processes into a cooperative system. The framework contains three basic steps: Dissection, Local Optimization, and Global Optimization, which are designed to fuse various properties of the segmentation hypotheses hierarchically into a composite evaluation to decide the final recognition results. Experimental results show that this framework is general enough to be applied to a variety of documents. Finally, a sample system based on this framework to recognize Chinese, Japanese, and Korean documents is described and its experimental performance is reported.
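The dissection-plus-optimization idea can be sketched with a toy example (a stand-in, not the paper's system): candidate cut points generate segment hypotheses, a hypothetical recognizer scores each segment, and dynamic programming selects the globally best segmentation. Strings stand in for image strips, and the glyph scores are invented.

```python
# Hypothetical recognizer: known "glyphs" get high scores, junk is penalized.
GLYPHS = {"ab": 0.9, "c": 0.8, "de": 0.95, "a": 0.3, "b": 0.3}

def recognizer_score(seg):
    return GLYPHS.get(seg, -1.0)

def best_segmentation(line):
    # best[j]: best total score for a segmentation of line[:j];
    # back[j]: start index of the last segment in that segmentation.
    best = [0.0] + [float("-inf")] * len(line)
    back = [0] * (len(line) + 1)
    for j in range(1, len(line) + 1):
        for i in range(max(0, j - 2), j):      # segments up to 2 units wide
            s = best[i] + recognizer_score(line[i:j])
            if s > best[j]:
                best[j], back[j] = s, i
    segs, j = [], len(line)
    while j > 0:                               # recover the chosen cuts
        segs.append(line[back[j]:j])
        j = back[j]
    return segs[::-1]

print(best_segmentation("abcde"))  # -> ['ab', 'c', 'de']
```

Greedy left-to-right cutting would commit to "a" before seeing that "ab" scores higher in context; the global optimization step is what lets recognition evidence repair dissection errors.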
Parameter Estimation for Viscoplastic Material Modeling
NASA Technical Reports Server (NTRS)
Saleeb, Atef F.; Gendy, Atef S.; Wilt, Thomas E.
1997-01-01
A key ingredient in the design of engineering components and structures under general thermomechanical loading is the use of mathematical constitutive models (e.g. in finite element analysis) capable of accurate representation of short and long term stress/deformation responses. In addition to the ever-increasing complexity of recent viscoplastic models of this type, they often also require a large number of material constants to describe a host of (anticipated) physical phenomena and complicated deformation mechanisms. In turn, the experimental characterization of these material parameters constitutes the major factor in the successful and effective utilization of any given constitutive model; i.e., the problem of constitutive parameter estimation from experimental measurements.
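The parameter-estimation problem stated above can be illustrated with a deliberately simple stand-in for a viscoplastic response (not one of the models discussed): a primary-creep-like strain curve with two material constants, fitted to synthetic "measurements" by nonlinear least squares. The model form, constants, and noise level are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy constitutive response: strain saturating toward a with rate b.
def model(t, a, b):
    return a * (1.0 - np.exp(-b * t))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 50)
measured = model(t, 2.0, 0.7) + rng.normal(0.0, 0.02, t.size)  # synthetic data

# Estimate the material constants from the noisy measurements.
(a_hat, b_hat), _ = curve_fit(model, t, measured, p0=[1.0, 1.0])
print(round(a_hat, 1), round(b_hat, 1))  # close to the true constants 2.0, 0.7
```

Real viscoplastic models differ in scale, not in kind: many more constants, coupled rate equations integrated numerically inside the residual, and experiments (creep, relaxation, cyclic tests) chosen so that each constant is actually identifiable from the data.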
NASA Astrophysics Data System (ADS)
Wei, Qi; Tian, Ye; Zuo, Shu-Yu; Cheng, Ying; Liu, Xiao-Jun
2017-03-01
Acoustic topological states support sound propagation along the boundary in a one-way direction with inherent robustness against defects and disorder, revolutionizing the manipulation of acoustic waves. A variety of acoustic topological states relying on circulating fluid, chiral coupling, or temporal modulation have been proposed theoretically. However, experimental demonstration has so far remained a significant challenge, due to critical limitations such as structural complexity and high losses. Here, we experimentally demonstrate an acoustic anomalous Floquet topological insulator in a waveguide network. The acoustic gapless edge states can be found in the band gap when the waveguides are strongly coupled. The scheme features a simple structure and high energy throughput, leading to the experimental demonstration of efficient and robust topologically protected sound propagation along the boundary. The proposal may offer unique, promising applications in the design of acoustic devices for guiding, switching, isolating, and filtering sound.
Side scanner for supermarkets: a new scanner design standard
NASA Astrophysics Data System (ADS)
Cheng, Charles K.; Cheng, J. K.
1996-09-01
High-speed UPC bar code scanning has become a standard mode of data capture for supermarkets in the US, Europe, and Japan. The influence of the ergonomics community on the design of the scanner is evident. During the past decade, the ergonomic issues of cashiers at check-outs have led to occupational hand-wrist cumulative trauma disorders, in most cases causing carpal tunnel syndrome, a permanent hand injury. In this paper, the design of a side scanner to resolve these issues is discussed. The complex optical module and the sensor for this side scanner are described. The ergonomic advantages over the old counter-mounted vertical scanner have been experimentally proven by an industry-funded study at an independent university.
He, Wei; Yurkevich, Igor V; Canham, Leigh T; Loni, Armando; Kaplan, Andrey
2014-11-03
We develop an analytical model based on the WKB approach to evaluate the experimental results of the femtosecond pump-probe measurements of the transmittance and reflectance obtained on thin membranes of porous silicon. The model allows us to retrieve a pump-induced nonuniform complex dielectric function change along the membrane depth. We show that the model fitting to the experimental data requires a minimal number of fitting parameters while still complying with the restriction imposed by the Kramers-Kronig relation. The developed model has a broad range of applications for experimental data analysis and practical implementation in the design of devices involving a spatially nonuniform dielectric function, such as in biosensing, wave-guiding, solar energy harvesting, photonics and electro-optical devices.
An integrated radiation physics computer code system.
NASA Technical Reports Server (NTRS)
Steyn, J. J.; Harris, D. W.
1972-01-01
An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.
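The spectrum-unfolding step mentioned above reduces to a linear inverse problem: the measured pulse-height spectrum is the detector response matrix applied to the incident spectrum. A minimal numerical sketch (with an invented 4-channel, 3-energy response, not the system's actual response functions) is:

```python
import numpy as np

# Illustrative detector response: each column is the pulse-height spectrum
# produced by one mono-energetic source (3 energy groups, 4 channels).
R = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.5],
              [0.0, 0.0, 0.2]])
true_flux = np.array([5.0, 3.0, 2.0])
measured = R @ true_flux                 # simulated experimental spectrum

# Unfold by least squares (R is tall: more channels than energy groups).
unfolded, *_ = np.linalg.lstsq(R, measured, rcond=None)
print(np.round(unfolded, 6))  # recovers [5. 3. 2.]
```

In practice the measured spectrum carries counting noise and the response matrix is ill-conditioned, so real unfolding codes add regularization or non-negativity constraints rather than a bare least-squares solve.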
Inverse Doppler Effects in Broadband Acoustic Metamaterials
Zhai, S. L.; Zhao, X. P.; Liu, S.; Shen, F. L.; Li, L. L.; Luo, C. R.
2016-01-01
The Doppler effect refers to the change in frequency of a wave source as a consequence of the relative motion between the source and an observer. Veselago theoretically predicted that materials with negative refractions can induce inverse Doppler effects. With the development of metamaterials, inverse Doppler effects have been extensively investigated. However, the ideal material parameters prescribed by these metamaterial design approaches are complex and also challenging to obtain experimentally. Here, we demonstrated a method of designing and experimentally characterising arbitrary broadband acoustic metamaterials. These omni-directional, double-negative, acoustic metamaterials are constructed with ‘flute-like’ acoustic meta-cluster sets with seven double meta-molecules; these metamaterials also overcome the limitations of broadband negative bulk modulus and mass density to provide a region of negative refraction and inverse Doppler effects. It was also shown that inverse Doppler effects can be detected in a flute, which has been popular for thousands of years in Asia and Europe. PMID:27578317
Inverse Doppler Effects in Broadband Acoustic Metamaterials
NASA Astrophysics Data System (ADS)
Zhai, S. L.; Zhao, X. P.; Liu, S.; Shen, F. L.; Li, L. L.; Luo, C. R.
2016-08-01
The Doppler effect refers to the change in frequency of a wave source as a consequence of the relative motion between the source and an observer. Veselago theoretically predicted that materials with negative refractions can induce inverse Doppler effects. With the development of metamaterials, inverse Doppler effects have been extensively investigated. However, the ideal material parameters prescribed by these metamaterial design approaches are complex and also challenging to obtain experimentally. Here, we demonstrated a method of designing and experimentally characterising arbitrary broadband acoustic metamaterials. These omni-directional, double-negative, acoustic metamaterials are constructed with ‘flute-like’ acoustic meta-cluster sets with seven double meta-molecules; these metamaterials also overcome the limitations of broadband negative bulk modulus and mass density to provide a region of negative refraction and inverse Doppler effects. It was also shown that inverse Doppler effects can be detected in a flute, which has been popular for thousands of years in Asia and Europe.
Simulating and assessing boson sampling experiments with phase-space representations
NASA Astrophysics Data System (ADS)
Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.
2018-04-01
The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.
Complexities in Ferret Influenza Virus Pathogenesis and Transmission Models
Eckert, Alissa M.; Tumpey, Terrence M.; Maines, Taronna R.
2016-01-01
Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. PMID:27412880
Complexities in Ferret Influenza Virus Pathogenesis and Transmission Models.
Belser, Jessica A; Eckert, Alissa M; Tumpey, Terrence M; Maines, Taronna R
2016-09-01
Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Martins, Marina C M; Caldana, Camila; Wolf, Lucia Daniela; de Abreu, Luis Guilherme Furlan
2018-01-01
The output of metabolomics relies to a great extent upon the methods and instrumentation to identify, quantify, and access spatial information on as many metabolites as possible. However, the most modern machines and sophisticated tools for data analysis cannot compensate for inappropriate harvesting and/or sample preparation procedures that modify metabolic composition and can lead to erroneous interpretation of results. In addition, plant metabolism has a remarkable degree of complexity, and the number of identified compounds easily surpasses the number of samples in metabolomics analyses, increasing false discovery risk. These aspects pose a large challenge when carrying out plant metabolomics experiments. In this chapter, we address the importance of a proper experimental design taking into consideration preventable complications and unavoidable factors to achieve success in metabolomics analysis. We also focus on quality control and standardized procedures during the metabolomics workflow.
Development and validation of a lateral MREs isolator
NASA Astrophysics Data System (ADS)
Xing, Zhi-Wei; Yu, Miao; Fu, Jie; Zhao, Lu-Jie
2015-02-01
A novel lateral vibration isolator utilizing magnetorheological elastomers (MREs) with field-dependent damping and stiffness was proposed in order to improve adaptive performance. First, soft silicone rubber MREs with a highly adjustable shear storage modulus were fabricated. Then, the lateral MREs isolator was developed with a unique laminated structure of MRE layers and steel plates, which enables it to withstand large vertical loads and adapt to situations of large lateral displacement. The electromagnetic analysis and design employed the electromagnetic finite element method (FEM) to optimize the magnetic circuit inside the proposed device. To evaluate the effectiveness of the lateral MREs isolator, a series of experimental tests were carried out under various applied magnetic fields. Experimental results show that the proposed MREs isolator can change the lateral stiffness and equivalent damping by up to 140% and 125%, respectively. This work demonstrates the performance of the designed lateral MREs isolator and its capacity for vibration mitigation in complex situations.
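The "equivalent damping" quoted above is typically extracted from a force-displacement hysteresis loop. As a sketch under assumed values (a synthetic Kelvin-Voigt loop, not the isolator's measured data), the equivalent stiffness is the slope between the displacement extremes and the equivalent damping follows from the energy dissipated per cycle, E = pi * c * w * X^2:

```python
import numpy as np

X, w = 5e-3, 10.0                       # displacement amplitude (m), rad/s
k_true, c_true = 2.0e4, 50.0            # assumed stiffness (N/m), damping (N s/m)

# One full synthetic loading cycle of a Kelvin-Voigt element.
t = np.linspace(0.0, 2 * np.pi / w, 2001)
x = X * np.sin(w * t)
f = k_true * x + c_true * w * X * np.cos(w * t)

# Equivalent stiffness: slope between the displacement extremes.
k_eq = (f[np.argmax(x)] - f[np.argmin(x)]) / (x.max() - x.min())

# Equivalent damping: loop area (trapezoid rule) = pi * c * w * X^2.
energy = 0.5 * np.sum((f[1:] + f[:-1]) * np.diff(x))
c_eq = energy / (np.pi * w * X ** 2)

print(round(k_eq), round(c_eq))  # -> 20000 50
```

Repeating this reduction at several applied magnetic fields is how the percentage changes in stiffness and damping of such an isolator are quantified.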
Spinks, Jean; Mortimer, Duncan
2016-02-03
The provision of additional information is often assumed to improve consumption decisions, allowing consumers to more accurately weigh the costs and benefits of alternatives. However, increasing the complexity of decision problems may prompt changes in information processing. This is particularly relevant for experimental methods such as discrete choice experiments (DCEs) where the researcher can manipulate the complexity of the decision problem. The primary aims of this study are (i) to test whether consumers actually process additional information in an already complex decision problem, and (ii) consider the implications of any such 'complexity-driven' changes in information processing for design and analysis of DCEs. A discrete choice experiment (DCE) is used to simulate a complex decision problem; here, the choice between complementary and conventional medicine for different health conditions. Eye-tracking technology is used to capture the number of times and the duration that a participant looks at any part of a computer screen during completion of DCE choice sets. From this we can analyse what has become known in the DCE literature as 'attribute non-attendance' (ANA). Using data from 32 participants, we model the likelihood of ANA as a function of choice set complexity and respondent characteristics using fixed and random effects models to account for repeated choice set completion. We also model whether participants are consistent with regard to which characteristics (attributes) they consider across choice sets. We find that complexity is the strongest predictor of ANA when other possible influences, such as time pressure, ordering effects, survey specific effects and socio-demographic variables (including proxies for prior experience with the decision problem) are considered. We also find that most participants do not apply a consistent information processing strategy across choice sets. 
Eye-tracking technology shows promise as a way of obtaining additional information from consumer research, improving DCE design, and informing the design of policy measures. With regards to DCE design, results from the present study suggest that eye-tracking data can identify the point at which adding complexity (and realism) to DCE choice scenarios becomes self-defeating due to unacceptable increases in ANA. Eye-tracking data therefore has clear application in the construction of guidelines for DCE design and during piloting of DCE choice scenarios. With regards to design of policy measures such as labelling requirements for CAM and conventional medicines, the provision of additional information has the potential to make difficult decisions even harder and may not have the desired effect on decision-making.
Pseudo-Random Sequence Modifications for Ion Mobility Orthogonal Time of Flight Mass Spectrometry
Clowers, Brian H.; Belov, Mikhail E.; Prior, David C.; Danielson, William F.; Ibrahim, Yehia; Smith, Richard D.
2008-01-01
Due to the inherently low duty cycle of ion mobility spectrometry (IMS) experiments that sample from continuous ion sources, a range of experimental advances have been developed to maximize ion utilization efficiency. The use of ion trapping mechanisms prior to the ion mobility drift tube has demonstrated significant gains over discrete sampling from continuous sources; however, these technologies have traditionally relied upon signal averaging to attain analytically relevant signal-to-noise ratios (SNR). Multiplexed (MP) techniques based upon the Hadamard transform offer an alternative experimental approach by which ion utilization efficiency can be elevated to ~50%. Recently, our research group demonstrated a unique multiplexed ion mobility time-of-flight (MP-IMS-TOF) approach that incorporates ion trapping and can extend ion utilization efficiency beyond 50%. However, spectral reconstruction of the multiplexed signal using this experimental approach requires the use of sample-specific weighing designs. Though general weighing designs have been shown to significantly enhance ion utilization efficiency using this MP technique, such weighing designs cannot be applied to all samples. By modifying both the ion funnel trap and the pseudo-random sequence (PRS) used for the MP experiment, we have eliminated the need for complex weighing matrices. For both simple and complex mixtures, SNR enhancements of up to 13 were routinely observed as compared to the SA-IMS-TOF experiment. In addition, this new class of PRS provides a twofold enhancement in ion throughput compared to the traditional HT-IMS experiment. PMID:18311942
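The Hadamard-transform multiplexing idea can be sketched numerically (a generic textbook construction, not the modified PRS of this paper): the cyclic shifts of a maximal-length pseudo-random sequence form a simplex "S-matrix", the recorded signal is the S-matrix applied to the spectrum, and demultiplexing is a single linear solve. The spectrum values are invented for illustration.

```python
import numpy as np

# 7-bit maximal-length pseudo-random sequence; its cyclic shifts form an
# S-matrix, the standard construction behind Hadamard-transform multiplexing.
prs = np.array([1, 1, 1, 0, 1, 0, 0])
S = np.array([np.roll(prs, -k) for k in range(prs.size)], dtype=float)

spectrum = np.array([0.0, 5.0, 0.0, 2.0, 0.0, 0.0, 1.0])  # hypothetical drift-time spectrum
multiplexed = S @ spectrum     # ~50% duty cycle: 4 of every 7 gate slots open

# Demultiplexing: one well-conditioned linear solve, no sample-specific
# weighing matrix needed when the PRS yields an invertible S-matrix.
recovered = np.linalg.solve(S, multiplexed)
print(np.round(recovered, 6))  # recovers the original spectrum
```

The SNR gain comes from the fact that roughly half the gate slots are open in every multiplexed acquisition, instead of one narrow gate pulse per drift period as in the signal-averaged experiment.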
NASA Technical Reports Server (NTRS)
Scott, Elaine P.
1996-01-01
A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures which consequently necessitates the need for accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LARC researchers through the design of in-house experimentation procedures and through the use of an existing general purpose finite element software.
Design of a Pressure Sensor Based on Optical Fiber Bragg Grating Lateral Deformation
Urban, Frantisek; Kadlec, Jaroslav; Vlach, Radek; Kuchta, Radek
2010-01-01
This paper describes the steps involved in the design and realization of a new type of pressure sensor based on the optical fiber Bragg grating. A traditional pressure sensor has very limited usage in heavy industrial environments, particularly in explosive or electromagnetically noisy environments. Utilization of optics in these environments eliminates such surrounding influences. An initial motivation for our development was the research, experimental validation, and realization of a complex smart pressure sensor based on the optical principle. The main benefits of this solution are increased sensitivity, resistance to electromagnetic interference, small dimensions, and potentially increased accuracy. PMID:22163521
RAMP: A fault tolerant distributed microcomputer structure for aircraft navigation and control
NASA Technical Reports Server (NTRS)
Dunn, W. R.
1980-01-01
RAMP consists of distributed sets of parallel computers partitioned on the basis of software and packaging constraints. To minimize hardware and software complexity, the processors operate asynchronously. It was shown that through the design of asymptotically stable control laws, data errors due to the asynchronism were minimized. It was further shown that by designing control laws with this property and making minor hardware modifications to the RAMP modules, the system became inherently tolerant to intermittent faults. A laboratory version of RAMP was constructed and is described in the paper along with the experimental results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ievleva, J.I.; Kolesnikov, V.P.; Mezhertisky, G.S.
1996-04-01
The main directions of scientific investigation for the creation of efficient solid oxide fuel cells (SOFC) at IPPE are considered in this work. The development program for planar SOFC with thin-film electrolyte is shown. General design schemes of experimental SOFC units are presented. The flow design schemes of the processes for initial materials and electrode fabrication are shown. The results of investigations into the creation of a thin-film solid oxide electrolyte on a porous cathode by magnetron sputtering from a complex metal target in an oxidative environment are presented.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package called NETS/PROSSS, aimed at reducing the computing time required to solve a complex design problem, is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate the results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process, making it possible to converge to an optimum solution with significantly fewer iterations.
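The surrogate-in-the-loop idea can be illustrated with a toy sketch. Here a radial-basis interpolant stands in for the NETS neural network and a cheap analytic quadratic stands in for the expensive finite element analysis, so every name and number below is an assumption:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

# Illustrative sketch only: an RBF surrogate replaces the NETS neural
# network, and a cheap quadratic replaces the finite-element analysis.
def expensive_analysis(x):             # hypothetical stand-in for FEA
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-4, 4, size=(200, 2))             # sampled design points
y = np.array([expensive_analysis(x) for x in X])  # "analysis" results

surrogate = RBFInterpolator(X, y)                 # fast approximate model

# optimize against the cheap surrogate instead of the real analysis
res = minimize(lambda x: surrogate(x[None, :])[0], x0=np.zeros(2))
```

The surrogate optimum (`res.x`) then serves as a warm start for a final optimization against the real analysis code, which is the role the abstract describes for the near-optimal NETS/PROSSS solution.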
Studies on the Processing Methods for Extraterrestrial Materials
NASA Technical Reports Server (NTRS)
Grimley, R. T.; Lipschutz, M. E.
1984-01-01
The literature was surveyed for high temperature mass spectrometric research on single oxides, complex oxides, and minerals in an effort to develop a means of separating elements and compounds from lunar and other extraterrestrial materials. A data acquisition system for determining vaporization rates as a function of time and temperature, and software for the IEEE-488 Apple-ORTEC interface, are discussed. Experimental design information from a 1000 °C furnace was used with heat transfer calculations to develop the basic design for a 1600 °C furnace. A controller was built for the higher-temperature furnace, and drawings are being made for the furnace.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-31
Research in the initial grant period focused on computational studies relevant to the selective activation of methane, the prime component of natural gas. Reaction coordinates for methane activation by experimental models were delineated, as well as the bonding and structure of complexes that effect this important reaction. This research, highlighted in the following sections, also provided the impetus for the further development and application of methods for modeling metal-containing catalysts. Sections of the report describe the following: methane activation by multiple-bonded transition metal complexes; computational lanthanide chemistry; and methane activation by non-imido, multiple-bonded ligands.
Non-linear molecular pattern classification using molecular beacons with multiple targets.
Lee, In-Hee; Lee, Seung Hwan; Park, Tai Hyun; Zhang, Byoung-Tak
2013-12-01
In vitro pattern classification has been highlighted as an important future application of DNA computing. Previous work has demonstrated the feasibility of linear classifiers using DNA-based molecular computing. However, complex tasks require non-linear classification capability. Here we design a molecular beacon that can interact with multiple targets and experimentally show that its fluorescence signals form a complex radial-basis function, enabling it to be used as a building block for non-linear molecular classification in vitro. The proposed method was successfully applied to solving artificial and real-world classification problems: XOR and microRNA expression patterns. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
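An in-silico caricature of the idea, assuming Gaussian "beacon" responses with invented centers and widths (not measured fluorescence data), shows how summed radial-basis signals separate the XOR classes:

```python
import numpy as np

# Hedged sketch: each feature mimics a beacon's fluorescence falling off
# with distance from its target; centers and width are arbitrary choices.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
labels = np.array([0, 1, 1, 0])                       # XOR

centers = np.array([[0, 1], [1, 0]], dtype=float)     # one "beacon" per class-1 point
width = 0.5

def rbf_features(x):
    # Gaussian radial-basis response of each beacon to input x
    return np.exp(-np.sum((x - centers) ** 2, axis=1) / (2 * width ** 2))

scores = np.array([rbf_features(x).sum() for x in X])  # summed fluorescence
pred = (scores > 0.5).astype(int)                      # threshold classifier
```

A single linear readout of the inputs cannot solve XOR; thresholding the summed radial-basis responses can, which is the classification capability the abstract attributes to the multi-target beacon.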
Cho, Sun-Joo; Goodwin, Amanda P
2016-04-01
When word learning is supported by instruction in experimental studies of adolescents, word knowledge outcomes tend to be collected in complex data structures, involving multiple aspects of word knowledge, multilevel reader data, multilevel item data, longitudinal designs, and multiple groups. This study illustrates how generalized linear mixed models can be used to measure and explain word learning for data with such complexity. Results from this application provide a deeper understanding of word knowledge than could be attained from simpler models and show that word knowledge is multidimensional and depends on word characteristics and instructional contexts.
Versatile fluid-mixing device for cell and tissue microgravity research applications.
Wilfinger, W W; Baker, C S; Kunze, E L; Phillips, A T; Hammerstedt, R H
1996-01-01
Microgravity life-science research requires hardware that can be easily adapted to a variety of experimental designs and working environments. The Biomodule is a patented, computer-controlled fluid-mixing device that can accommodate these diverse requirements. A typical shuttle payload contains eight Biomodules with a total of 64 samples, a sealed containment vessel, and a NASA refrigeration-incubation module. Each Biomodule contains eight gas-permeable Silastic T tubes that are partitioned into three fluid-filled compartments. The fluids can be mixed at any user-specified time. Multiple investigators and complex experimental designs can be easily accommodated with the hardware. During flight, the Biomodules are sealed in a vessel that provides two levels of containment (liquids and gas) and a stable, investigator-controlled experimental environment that includes regulated temperature, internal pressure, humidity, and gas composition. A cell microencapsulation methodology has also been developed to streamline launch-site sample manipulation and accelerate postflight analysis through the use of fluorescence-activated cell sorting. The Biomodule flight hardware and analytical cell encapsulation methodology are ideally suited for temporal, qualitative, or quantitative life-science investigations.
Advanced Methods for Aircraft Engine Thrust and Noise Benefits: Nozzle-Inlet Flow Analysis
NASA Technical Reports Server (NTRS)
Morgan, Morris H.; Gilinsky, Mikhail; Patel, Kaushal; Coston, Calvin; Blankson, Isaiah M.
2003-01-01
The research is focused on a wide regime of problems in the propulsion field as well as in experimental testing and theoretical and numerical simulation analyses for advanced aircraft and rocket engines. Results obtained are based on analytical methods, numerical simulations and experimental tests at the NASA LaRC and Hampton University computer complexes and experimental facilities. The main objective of this research is injection, mixing and combustion enhancement in propulsion systems. The sub-projects in the reporting period are: (A) Aero-performance and acoustics of Telescope-shaped designs. The work included a pylon set application for SCRAMJET. (B) An analysis of sharp-edged nozzle exit designs for effective fuel injection into the flow stream in air-breathing engines: triangular-round and diamond-round nozzles. (C) Measurement technique improvements for the HU Low Speed Wind Tunnel (HU LSWT) including an automatic data acquisition system and a two component (drag-lift) balance system. In addition, a course in the field of aerodynamics was developed for the teaching and training of HU students.
Skill Transfer and Virtual Training for IND Response Decision-Making: Project Summary and Next Steps
2016-04-12
are likely to be very productive partners—independent video-game developers and academic game degree programs—are not familiar with working with...experimental validation. • Independent Video-Game Developers. Small companies and individuals that pursue video-game design and development can be...complexity, such as an improvised nuclear device (IND) detonation. The effort has examined game-based training methods to determine their suitability
JPRS Report, Soviet Union, Economic Affairs
1988-10-18
"Commodities—The Mirror of Cost Accounting"] [Text] A number of large-scale decisions directed toward increasing the production of high-quality...suitable in the sphere of scientific research and experimental design work. It is known, for example, that the number of blueprints, specifications, or...the situation, Yu. Kozyrev, deputy chief of the Department for Problems of the Machine Building Complex of the USSR State Committee for Science and
Eves, E Eugene; Murphy, Ethan K; Yakovlev, Vadim V
2007-01-01
The paper discusses the characteristics of a new modeling-based technique for determining the dielectric properties of materials. Complex permittivity is found with an optimization algorithm designed to match complex S-parameters obtained from measurements and from 3D FDTD simulation. The method is developed on a two-port (waveguide-type) fixture and deals with complex reflection and transmission characteristics at the frequency of interest. The computational part is constructed as an inverse-RBF-network-based procedure that reconstructs the dielectric constant and the loss factor of the sample from the FDTD modeling data sets and the measured reflection and transmission coefficients. As such, it is applicable to samples and cavities of arbitrary configuration provided that the geometry of the experimental setup is adequately represented by the FDTD model. The practical implementation of the method considered in this paper is a section of a WR975 waveguide containing a sample of a liquid in a cylindrical cutout of a rectangular Teflon cup. The method is run in two stages and employs two databases: the first, built for a sparse grid on the complex permittivity plane, locates a domain containing the anticipated solution; the second, a denser grid covering the determined domain, finds the exact location of the complex permittivity point. Numerical tests demonstrate that the computational part of the method is highly accurate even when the modeling data are represented by relatively small data sets. When working with reflection and transmission coefficients measured in an actual experimental fixture and reconstructing a low dielectric constant and loss factor, the technique may be less accurate. It is shown that the employed neural network is capable of finding the complex permittivity of the sample even when experimental data on the reflection and transmission coefficients are numerically dispersive (noise-contaminated).
A special modeling test is proposed for validating the results; it confirms that the values of complex permittivity for several liquids (including salt water, acetone and three types of alcohol) at 915 MHz are reconstructed with satisfactory accuracy.
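The two-stage sparse-then-dense grid search can be sketched with a synthetic closed-form stand-in for the FDTD model; all constants below are invented and do not represent the WR975 fixture:

```python
import numpy as np

# Toy sketch of the two-stage search: a synthetic closed-form "model"
# replaces the FDTD simulation, and the misfit is the distance between
# measured and modeled S21. All numbers are illustrative assumptions.
def model_s21(eps_r, loss):            # hypothetical forward model
    return np.exp(-0.1 * loss) * np.exp(1j * 0.05 * eps_r)

true = (24.0, 1.2)                     # "unknown" permittivity point
measured = model_s21(*true)

def best_on_grid(eps_grid, loss_grid):
    E, L = np.meshgrid(eps_grid, loss_grid, indexing="ij")
    misfit = np.abs(model_s21(E, L) - measured)
    i, j = np.unravel_index(np.argmin(misfit), misfit.shape)
    return eps_grid[i], loss_grid[j]

# stage 1: sparse grid locates the neighbourhood of the solution
e0, l0 = best_on_grid(np.linspace(1, 80, 40), np.linspace(0, 10, 40))
# stage 2: dense grid around the stage-1 point refines it
eps, loss = best_on_grid(np.linspace(e0 - 2, e0 + 2, 81),
                         np.linspace(max(l0 - 0.5, 0.0), l0 + 0.5, 81))
```

The coarse pass keeps the number of expensive forward evaluations small, and the dense pass only covers the small domain the coarse pass identified, mirroring the two-database strategy described above.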
Zhang, Wenrui; Li, Mingtao; Chen, Aiping; ...
2016-06-13
Two-dimensional (2D) nanostructures emerge as one of the leading topics in fundamental materials science and could enable next-generation nanoelectronic devices. Beyond graphene and molybdenum disulphide, layered complex oxides are another large group of promising 2D candidates because of their strong interplay of intrinsic charge, spin, orbital and lattice degrees of freedom. As a fundamental basis of heteroepitaxial thin-film growth, interfacial strain can be used to design materials exhibiting new phenomena beyond their conventional form. Here we report the strain-driven self-assembly of bismuth-based supercells (SC) with a 2D layered structure, and elucidate the fundamental growth mechanism with combined experimental tools and first-principles calculations. The study revealed that the new layered structures were formed by strain-enabled self-assembled atomic-layer stacking, i.e., alternating growth of Bi2O2 and [Fe0.5Mn0.5]O6 layers. The strain-driven approach is further demonstrated in other SC candidate systems with promising room-temperature multiferroic properties. This well-integrated theoretical and experimental study, inspired by the Materials Genome Initiative, opens up a new avenue in searching for and designing novel 2D layered complex oxides with enormous promise.
Device design and signal processing for multiple-input multiple-output multimode fiber links
NASA Astrophysics Data System (ADS)
Appaiah, Kumar; Vishwanath, Sriram; Bank, Seth R.
2012-01-01
Multimode fibers (MMFs) are limited in data rate capabilities owing to modal dispersion. However, their large core diameter simplifies alignment and packaging, and makes them attractive for short and medium length links. Recent research has shown that the use of signal processing and techniques such as multiple-input multiple-output (MIMO) can greatly improve the data rate capabilities of multimode fibers. In this paper, we review recent experimental work using MIMO and signal processing for multimode fibers, and the improvements in data rates achievable with these techniques. We then present models to design as well as simulate the performance benefits obtainable with arrays of lasers and detectors in conjunction with MIMO, using channel capacity as the metric to optimize. We also discuss some aspects related to complexity of the algorithms needed for signal processing and discuss techniques for low complexity implementation.
Landrum, Peter F; Chapman, Peter M; Neff, Jerry; Page, David S
2012-04-01
Experimental designs for evaluating complex mixture toxicity in aquatic environments can be highly variable and, if not appropriate, can produce and have produced data that are difficult or impossible to interpret accurately. We build on and synthesize recent critical reviews of mixture toxicity using lessons learned from 4 case studies, ranging from binary to more complex mixtures of primarily polycyclic aromatic hydrocarbons and petroleum hydrocarbons, to provide guidance for evaluating the aquatic toxicity of complex mixtures of organic chemicals. Two fundamental requirements include establishing a dose-response relationship and determining the causative agent (or agents) of any observed toxicity. Meeting these 2 requirements involves ensuring appropriate exposure conditions and measurement endpoints, considering modifying factors (e.g., test conditions, test organism life stages and feeding behavior, chemical transformations, mixture dilutions, sorbing phases), and correctly interpreting dose-response relationships. Specific recommendations are provided. Copyright © 2011 SETAC.
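The first fundamental requirement named above, establishing a dose-response relationship, is often met by fitting a log-logistic model to toxicity data. A minimal sketch on invented, noise-free survival fractions (the model form and all numbers are illustrative assumptions, not the case-study data):

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-parameter log-logistic dose-response model: response falls from
# 1 (no effect) toward 0 with increasing dose; ec50 is the half-effect dose.
def log_logistic(dose, ec50, slope):
    return 1.0 / (1.0 + (dose / ec50) ** slope)

doses = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])  # e.g. mg/L, invented
resp = log_logistic(doses, 2.0, 1.5)                # toy "observed" data

# non-linear least-squares fit of the dose-response curve
(ec50, slope), _ = curve_fit(log_logistic, doses, resp, p0=[1.0, 1.0])
```

A fit like this only satisfies the first requirement; attributing the fitted effect to a causative agent within the mixture is the separate second requirement the review emphasizes.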
Extending Quantum Chemistry of Bound States to Electronic Resonances
NASA Astrophysics Data System (ADS)
Jagau, Thomas-C.; Bravaya, Ksenia B.; Krylov, Anna I.
2017-05-01
Electronic resonances are metastable states with finite lifetime embedded in the ionization or detachment continuum. They are ubiquitous in chemistry, physics, and biology. Resonances play a central role in processes as diverse as DNA radiolysis, plasmonic catalysis, and attosecond spectroscopy. This review describes novel equation-of-motion coupled-cluster (EOM-CC) methods designed to treat resonances and bound states on an equal footing. Built on complex-variable techniques such as complex scaling and complex absorbing potentials that allow resonances to be associated with a single eigenstate of the molecular Hamiltonian rather than several continuum eigenstates, these methods extend electronic-structure tools developed for bound states to electronic resonances. Selected examples emphasize the formal advantages as well as the numerical accuracy of EOM-CC in the treatment of electronic resonances. Connections to experimental observables such as spectra and cross sections, as well as practical aspects of implementing complex-valued approaches, are also discussed.
PLI: a web-based tool for the comparison of protein-ligand interactions observed on PDB structures.
Gallina, Anna Maria; Bisignano, Paola; Bergamino, Maurizio; Bordo, Domenico
2013-02-01
A large fraction of the entries contained in the Protein Data Bank describe proteins in complex with low molecular weight molecules such as physiological compounds or synthetic drugs. In many cases, the same molecule is found in distinct protein-ligand complexes. There is an increasing interest in Medicinal Chemistry in comparing protein binding sites to gain insight into interactions that modulate binding specificity, as this structural information can be correlated with other experimental data of a biochemical or physiological nature and may help in rational drug design. The protein-ligand interaction (PLI) web service presented here provides a tool to analyse and compare the binding pockets of homologous proteins in complex with a selected ligand. The information is deduced from protein-ligand complexes present in the Protein Data Bank and stored in the underlying database. Freely accessible at http://bioinformatics.istge.it/pli/.
Cala, Antonio; Molinillo, José M G; Fernández-Aparicio, Mónica; Ayuso, Jesús; Álvarez, José A; Rubiales, Diego; Macías, Francisco A
2017-08-09
Allelochemicals are safer, more selective and more active alternatives to synthetic agrochemicals for weed control. However, the low solubility of these compounds in aqueous media limits their use as agrochemicals. Herein, we propose the application of α-, β- and γ-cyclodextrins to improve the physicochemical properties and biological activities of three sesquiterpene lactones: dehydrocostuslactone, costunolide and (-)-α-santonin. Complexation was achieved by kneading and coprecipitation methods. Aqueous solubility was increased in the range 100-4600%, and the solubility-phase diagrams suggested that complex formation had been successful. The results of the PM3 semiempirical calculations were consistent with the experimental results. The activities on etiolated wheat coleoptiles, Standard Target Species and parasitic weeds were improved. Cyclodextrins preserved or enhanced the activity of the three sesquiterpene lactones. Free cyclodextrins did not show significant activity, and therefore the enhancement in activity was due to complexation. These results are promising for applications in agrochemical design.
2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation
NASA Technical Reports Server (NTRS)
Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.
2009-01-01
A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers, where CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.
Complex Water Impact: Validation and Qualification Sciences Experimental Complex. The Validation and Qualification Sciences Experimental Complex (VQSEC) at Sandia
NASA Astrophysics Data System (ADS)
Heon Kim, Tae; Yoon, Jong-Gul; Hyub Baek, Seung; Park, Woong-Kyu; Mo Yang, Sang; Yup Jang, Seung; Min, Taeyuun; Chung, Jin-Seok; Eom, Chang-Beom; Won Noh, Tae
2015-07-01
Fundamental understanding of domain dynamics in ferroic materials has been a longstanding issue because of its relevance to many systems and to the design of nanoscale domain-wall devices. Despite many theoretical and experimental studies, a full understanding of domain dynamics still remains incomplete, partly due to complex interactions between domain walls and disorder. We report domain-shape-preserving deterministic domain-wall motion, which directly confirms microscopic return point memory, by observing domain-wall breathing motion in a ferroelectric BiFeO3 thin film using stroboscopic piezoresponse force microscopy. A spatial energy landscape that provides new insights into domain dynamics is also mapped based on the breathing motion of the domain walls. The evolution of complex domain structures can be understood as the process of occupying the lowest available energy states of polarization in the energy landscape, which is determined by defect-induced internal fields. Our result highlights a pathway for the novel design of ferroelectric domain-wall devices through the engineering of the energy landscape using defect-induced internal fields such as flexoelectric fields.
PMID:26130159
Borglin, Gunilla; Gustafsson, Markus; Krona, Hans
2011-09-23
Pain is one of the most frequent problems among patients diagnosed with cancer. Despite the availability of effective pharmacological treatments, this group of patients often receives less than optimal treatment. Research into nurses' pain management highlights certain factors, such as lack of knowledge and attitudes and inadequate procedures for systematic pain assessment, as common barriers to effective pain management. However, educational interventions targeting nurses' pain management have shown promise. As cancer-related pain is also known to have a negative effect on vital aspects of the patient's life, as well as being commonly associated with problems such as sleep, fatigue, depression and anxiety, further development of knowledge within this area is warranted. A quasi-experimental study design will be used to investigate whether the implementation of guidelines for systematic daily pain assessments following a theory-based educational intervention will result in an improvement in knowledge and attitude among nurses. A further aim is to investigate whether the intervention that targets nurses' behaviour will improve hospital patients' perception of pain. Data regarding nurses' knowledge and attitudes to pain (primary outcome), patient perception regarding pain (secondary outcome), together with socio-demographic variables, will be collected at baseline and at four weeks and 12 weeks following the intervention. Nursing care is nowadays acknowledged as an increasingly complicated activity and "nursing complexity is such that it can be seen as the quintessential complex intervention." To be able to change and improve clinical practice thus requires multiple points of attack appropriate to meet complex challenges. 
Consequently, we expect the theory-based intervention used in our quasi-experimental study to improve care as well as quality of life for this group of patients and we also envisage that evidence-based guidelines targeting this patient group's pain will be implemented more widely. ClinicalTrials.gov NCT01313234.
NASA Astrophysics Data System (ADS)
Vintila, Iuliana; Gavrus, Adinel
2017-10-01
The present research paper proposes the validation of a rigorous computation model used as a numerical tool to identify the rheological behavior of complex water-in-oil (W/O) emulsions. Considering a three-dimensional description of a general viscoplastic flow, the thermo-mechanical equations used to identify the rheological laws of fluids or soft materials from global experimental measurements are detailed. Analyses are conducted for complex W/O emulsions, which generally exhibit Bingham behavior, using the shear stress-shear rate dependency based on a power law and an improved analytical model. Experimental results are investigated for the rheological behavior of crude and refined rapeseed/soybean oils and four types of corresponding W/O emulsions with different physical-chemical compositions. The rheological behavior model was correlated with the thermo-mechanical analysis of a plane-plane rheometer, oil content, chemical composition, particle size and emulsifier concentration. The parameters of the rheological laws describing the behavior of the industrial oils and the concentrated W/O emulsions were computed from estimated shear stresses using a non-linear regression technique and from experimental torques using the inverse analysis tool designed by A. Gavrus (1992-2000).
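The non-linear regression step can be sketched by fitting a Herschel-Bulkley law (which reduces to a Bingham law for n = 1) to an invented, noise-free flow curve; none of the numbers below are the measured oil or emulsion data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: Herschel-Bulkley model tau = tau0 + K * gamma_dot**n,
# fitted by non-linear least squares; all values are illustrative.
def herschel_bulkley(rate, tau0, K, n):
    return tau0 + K * rate ** n        # shear stress [Pa] vs shear rate [1/s]

rates = np.linspace(1, 100, 25)                    # shear rates, 1/s
stress = herschel_bulkley(rates, 12.0, 0.8, 0.9)   # synthetic flow curve

# recover yield stress, consistency and flow index from the "data"
(tau0, K, n), _ = curve_fit(herschel_bulkley, rates, stress,
                            p0=[1.0, 1.0, 1.0])
```

The fitted yield stress tau0 is the quantity that distinguishes a Bingham-type emulsion from a simple power-law fluid such as the base oils.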
Subjective responses of mental workload during real time driving: A pilot field study
NASA Astrophysics Data System (ADS)
Rahman, N. I. A.; Dawal, S. Z. M.; Yusoff, N.
2017-06-01
This study evaluated drivers’ mental workload in real-time driving to identify the influence of driving-situation complexity, as a pilot for the design of a complete experimental study. Three driving settings were prepared: Session A (simple situation), Session B (moderately complex situation), and Session C (very complex situation). To determine mental workload, the NASA Task Load Index (TLX) was administered to four drivers after each experimental driving session. The results showed that Own Performance (OP) was highest for Session A (highway), while Physical Demand (PD) recorded the highest mean workload score across Sessions B (rural road) and C (city road). Based on the overall results of the study, it can be concluded that the highway is less demanding than the rural and city roads. This study highlights that in the rural and city road driving situations, the timing must be set correctly to ensure the relevant traffic density; the sensitivity of the timing must therefore be considered in the future experiment. A larger number of experienced drivers should be used in evaluating the driving situations to provide results that support more realistic experiments and conclusions.
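The NASA-TLX overall score referred to above is conventionally a weighted mean of six subscale ratings, with weights taken from 15 pairwise comparisons; the ratings and tallies below are invented purely for illustration:

```python
# Hypothetical sketch of standard NASA-TLX weighted scoring; the
# ratings (0-100) and pairwise-comparison tallies are invented.
scales = ["MD", "PD", "TD", "OP", "EF", "FR"]   # the six TLX subscales
ratings = {"MD": 55, "PD": 70, "TD": 60, "OP": 40, "EF": 65, "FR": 35}
# tallies of how often each subscale "won" a pairwise comparison;
# the 15 comparisons mean the tallies must sum to 15
weights = {"MD": 3, "PD": 5, "TD": 2, "OP": 1, "EF": 3, "FR": 1}

assert sum(weights.values()) == 15
overall = sum(ratings[s] * weights[s] for s in scales) / 15.0  # weighted mean
```

Comparing this overall score across Sessions A, B and C is what supports conclusions such as the highway being less demanding than the rural and city roads.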
Statistical and sampling issues when using multiple particle tracking
NASA Astrophysics Data System (ADS)
Savin, Thierry; Doyle, Patrick S.
2007-08-01
Video microscopy can be used to simultaneously track several microparticles embedded in a complex material. The trajectories are used to extract a sample of displacements at random locations in the material. From this sample, averaged quantities characterizing the dynamics of the probes are calculated to evaluate structural and/or mechanical properties of the assessed material. However, the sampling of measured displacements in heterogeneous systems is singular because the volume of observation with video microscopy is finite. By carefully characterizing the sampling design in the experimental output of the multiple particle tracking technique, we derive estimators for the mean and variance of the probes’ dynamics that are independent of the peculiar statistical characteristics. We expose stringent tests of these estimators using simulated and experimental complex systems with a known heterogeneous structure. Up to a certain fundamental limitation, which we characterize through a material degree of sampling by the embedded probe tracking, these estimators can be applied to quantify the heterogeneity of a material, providing an original and intelligible kind of information on complex fluid properties. More generally, we show that the precise assessment of the statistics in the multiple particle tracking output sample of observations is essential in order to provide accurate unbiased measurements.
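The basic sample of observations behind such estimators, squared displacements pooled across probes at a fixed lag time, can be sketched as follows. The trajectories here are synthetic pure drift (an assumption for testability), so the expected mean-squared displacement at lag t is exactly (v t)^2:

```python
import numpy as np

# Hedged sketch of the raw sample behind multiple particle tracking
# estimators: squared displacements pooled over probes at one lag.
dt, v = 0.1, 2.0                       # frame interval [s], drift speed
steps = np.arange(100) * dt
trajs = np.array([x0 + v * steps for x0 in np.linspace(0, 5, 8)])  # 8 probes

def squared_displacements(trajs, lag):
    # displacements of every probe over `lag` frames, pooled into one sample
    d = trajs[:, lag:] - trajs[:, :-lag]
    return (d ** 2).ravel()

sd = squared_displacements(trajs, lag=10)   # lag time = 10 * dt = 1.0 s
mean_msd, var_msd = sd.mean(), sd.var()
```

In a real heterogeneous material the variance of this sample is informative rather than zero, and correcting its statistics for the finite observation volume is exactly the problem the estimators above address.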
Wahman, David G; Speitel, Gerald E; Katz, Lynn E
2017-11-21
Chloramine chemistry is complex, with a variety of reactions occurring in series and parallel and many that are acid or base catalyzed, resulting in numerous rate constants. Bromide presence increases system complexity even further with possible bromamine and bromochloramine formation. Therefore, techniques for parameter estimation must address this complexity through thoughtful experimental design and robust data analysis approaches. The current research outlines a rational basis for constrained data fitting using Brønsted theory, application of the microscopic reversibility principle to reversible acid or base catalyzed reactions, and characterization of the relative significance of parallel reactions using fictive product tracking. This holistic approach was used on a comprehensive and well-documented data set for bromamine decomposition, allowing new interpretations of existing data by revealing that a previously published reaction scheme was not robust; it was not able to describe monobromamine or dibromamine decay outside of the conditions for which it was calibrated. The current research's simplified model (3 reactions, 17 constants) represented the experimental data better than the previously published model (4 reactions, 28 constants). A final model evaluation was conducted based on representative drinking water conditions to determine a minimal model (3 reactions, 8 constants) applicable for drinking water conditions.
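The constrained fitting described above ties many rate constants together via Brønsted relations and microscopic reversibility; the elementary building block is nonlinear least-squares estimation of a rate constant from concentration-time data. A minimal, hypothetical illustration (invented values, not the paper's bromamine data set) for a single second-order decay might look like:

```python
import numpy as np
from scipy.optimize import curve_fit

def second_order_decay(t, k, c0=1e-4):
    # Analytic solution of dC/dt = -k*C^2 with C(0) = c0
    return c0 / (1.0 + k * c0 * t)

# Synthetic "measurements" with 1% multiplicative noise (hypothetical)
t = np.linspace(0.0, 3600.0, 20)   # seconds
k_true = 50.0                       # M^-1 s^-1 (invented)
rng = np.random.default_rng(1)
obs = second_order_decay(t, k_true) * (1 + 0.01 * rng.normal(size=t.size))

# Fit only the rate constant, holding c0 fixed
k_fit, _ = curve_fit(lambda t, k: second_order_decay(t, k), t, obs, p0=[10.0])
```

The paper's approach extends this idea by fitting many reactions simultaneously while constraining acid/base-catalyzed rate constants to obey Brønsted theory, which shrinks the effective parameter space.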
Maghsoudi, Amirhossein; Fakharzadeh, Saideh; Hafizi, Maryam; Abbasi, Maryam; Kohram, Fatemeh; Sardab, Shima; Tahzibi, Abbas; Kalanaky, Somayeh; Nazaran, Mohammad Hassan
2015-03-01
Parkinson's disease (PD) is the world's second most common neurodegenerative disease, and the drugs available for its treatment have not had effects beyond slowing the disease process. Recently, nanotechnology has created opportunities for designing and manufacturing new medicines for neurodegenerative disease. It has been demonstrated that by tuning the size of a nanoparticle, its physiological effect can be controlled. Using novel nanochelating technology, three nano complexes, Pas (150 nm), Paf (100 nm) and Pac (40 nm), were designed, and in the present study their neuroprotective effects were evaluated in PC12 cells treated with the 1-methyl-4-phenyl-pyridinium ion (MPP(+)). PC12 cells were pre-treated with the Pas, Paf or Pac nano complexes, then subjected to 10 μM MPP(+). Subsequently, cell viability, intracellular free calcium and reactive oxygen species (ROS) levels, mitochondrial membrane potential, catalase (CAT) and superoxide dismutase (SOD) activity, glutathione (GSH) and malondialdehyde (MDA) levels, and Caspase 3 expression were evaluated. All three nano complexes, especially Pac, were able to increase cell viability and SOD and CAT activity, decrease Caspase 3 expression, and prevent the generation of ROS and the loss of mitochondrial membrane potential caused by MPP(+). Pre-treatment with the Pac and Paf nano complexes led to a decrease of intracellular free calcium, but the Pas nano complex could not decrease it. Only the Pac nano complex decreased MDA levels; the other nano complexes could not change this parameter compared to MPP(+)-treated cells. Hence, according to the results, all nanochelating-based nano complexes induced neuroprotective effects in an experimental model of PD, but the smallest nano complex, Pac, showed the best results.
Optimal cooperative control synthesis of active displays
NASA Technical Reports Server (NTRS)
Garg, S.; Schmidt, D. K.
1985-01-01
A technique is developed that is intended to provide a systematic approach to synthesizing display augmentation for optimal manual control in complex, closed-loop tasks. A cooperative control synthesis technique, previously developed to design pilot-optimal control augmentation for the plant, is extended to incorporate the simultaneous design of performance enhancing displays. The technique utilizes an optimal control model of the man in the loop. It is applied to the design of a quickening control law for a display and a simple K/s(2) plant, and then to an F-15 type aircraft in a multi-channel task. Utilizing the closed loop modeling and analysis procedures, the results from the display design algorithm are evaluated and an analytical validation is performed. Experimental validation is recommended for future efforts.
ICS-II USA research design and methodology.
Rana, H; Andersen, R M; Nakazono, T T; Davidson, P L
1997-05-01
The purpose of the WHO-sponsored International Collaborative Study of Oral Health Outcomes (ICS-II) was to provide policy-makers and researchers with detailed, reliable, and valid data on the oral health situation in their countries or regions, together with comparative data from other dental care delivery systems. ICS-II used a cross-sectional design with no explicit control groups or experimental interventions. A standardized methodology was developed and tested for collecting and analyzing epidemiological, sociocultural, economic, and delivery system data. Respondent information was obtained by household interviews, and clinical examinations were conducted by calibrated oral epidemiologists. Discussed are the sampling design characteristics for the USA research locations, response rates, sample sizes for interview and oral examination data, weighting procedures, and statistical methods. SUDAAN was used to adjust variance calculations, since complex sampling designs were used.
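SUDAAN implements Taylor-series linearization and replication variance estimators for complex survey designs; as a rough illustration of the underlying idea (a simplified with-replacement approximation that ignores stratification and clustering, not SUDAAN's full method; all names are hypothetical), a weighted mean and its linearized standard error can be computed as:

```python
import numpy as np

def weighted_mean_se(y, w):
    """Weighted mean and a with-replacement linearization standard error.

    y: responses; w: sampling weights. This simple Taylor approximation
    ignores stratum and cluster structure, which a package like SUDAAN
    would account for.
    """
    y = np.asarray(y, float)
    w = np.asarray(w, float)
    mean = np.sum(w * y) / np.sum(w)
    u = w * (y - mean) / np.sum(w)   # linearized residuals
    n = len(y)
    var = n / (n - 1) * np.sum(u**2)
    return mean, np.sqrt(var)
```

With equal weights this reduces to the familiar standard error of the mean; with unequal weights and clustering, naive formulas understate the variance, which is why design-adjusted software is needed.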
Modeling of substrate and inhibitor binding to phospholipase A2.
Sessions, R B; Dauber-Osguthorpe, P; Campbell, M M; Osguthorpe, D J
1992-09-01
Molecular graphics and molecular mechanics techniques have been used to study the mode of ligand binding and mechanism of action of the enzyme phospholipase A2. A substrate-enzyme complex was constructed based on the crystal structure of the apoenzyme. The complex was minimized to relieve initial strain, and the structural and energetic features of the resultant complex analyzed in detail, at the molecular and residue level. The minimized complex was then used as a basis for examining the action of the enzyme on modified substrates, binding of inhibitors to the enzyme, and possible reaction intermediate complexes. The model is compatible with the suggested mechanism of hydrolysis and with experimental data about stereoselectivity, efficiency of hydrolysis of modified substrates, and inhibitor potency. In conclusion, the model can be used as a tool in evaluating new ligands as possible substrates and in the rational design of inhibitors, for the therapeutic treatment of diseases such as rheumatoid arthritis, atherosclerosis, and asthma.
Actuator with built-in viscous damping for isolation and structural control
NASA Astrophysics Data System (ADS)
Hyde, T. Tupper; Anderson, Eric H.
1994-05-01
This paper describes the development and experimental application of an actuator with built-in viscous damping. An existing passive damper was modified for use as a novel actuation device for isolation and structural control. The device functions by using the same fluid for viscous damping and as a hydraulic lever for a voice coil actuator. Applications for such an actuator include structural control and active isolation. Lumped parameter models capturing structural and fluid effects are presented. Component tests of free stroke, blocked force, and passive complex stiffness are used to update the assumed model parameters. The structural damping effectiveness of the new actuator is shown to be that of a regular D-strut passively and that of a piezoelectric strut with load cell feedback actively in a complex testbed structure. Open and closed loop results are presented for a force isolation application showing an 8 dB passive and 20 dB active improvement over an undamped mount. An optimized design for a future experimental testbed is developed.
Using Ultrasonic Speckle Velocimetry to Detect Fluid Instabilities in a Surfactant Solution
NASA Astrophysics Data System (ADS)
Bice, Jason E.
Rheometry is a leading technology used to define material properties of multi-phase viscoelastic fluid-like materials, such as the shear modulus and viscosity. However, traditional rheometry relies on a mechanical response from a rotating or oscillating rotor of various geometries which does not allow for any spatial or temporal quantification of the material characteristics. Further, the setup operates under the assumption of a uniform and homogeneous flow. Thus, only qualitative deductions can be realized when a complex fluid displays inhomogeneous behavior, such as wall slip or shear banding. Due to this lack of capability, non-intrusive imaging is required to define and quantify behavior that occurs in a complex fluid under shear conditions. This thesis outlines the design, fabrication, and experimental examples of an adapted ultrasonic speckle velocimetry device, which enables spatial and temporal resolution of inhomogeneous fluid behavior using ultrasound acoustics. For the experimental example, a commercial surfactant mixture (hair shampoo) was tested to show the utility and precision that ultrasonic speckle velocimetry possesses.
Playing evolution in the laboratory: From the first major evolutionary transition to global warming
NASA Astrophysics Data System (ADS)
Fragata, Inês; Simões, Pedro; Matos, Margarida; Szathmáry, Eörs; Santos, Mauro
2018-05-01
Experimental evolution allows testing hypotheses derived from theory or from observed patterns in nature. We have designed a droplet-based microfluidic “evolution machine” to test how transient compartmentalization (“trait-groups”) of independent molecular replicators (likely a critical step in the origin of life) could have prevented the spread of parasitic mutants; that is, inactive RNAs that have been reported to spoil a system of free replicators. In remarkable agreement with the theory, we show that this simple population structure was sufficient to prevent takeover by inactive RNAs. A more complex scenario arises when we use experimental evolution to test field-derived hypotheses; for instance, the idea that temperature is driving genetic spatiotemporal patterns of climate change. In the fly Drosophila subobscura, latitudinal clines in gene arrangement frequencies occur worldwide, and more equatorial gene arrangements are becoming more frequent at higher latitudes as a correlated response to climate change. However, the evolution at different constant temperatures in the laboratory was not consistent with patterns in nature, suggesting some limitations of experimental evolution. Finally, also in D. subobscura, we show that repeatability in experimental evolution is staggeringly consistent for life history traits, making evolution quite predictable and suggesting that laboratory selection can quickly erase differences between populations. Yet, the genetic paths used to attain the same adaptive phenotypes are complex and unpredictable. Contribution to the Focus Issue Evolutionary Modeling and Experimental Evolution edited by José Cuesta, Joachim Krug and Susanna Manrubia.
Dubovi, Ilana; Dagan, Efrat; Sader Mazbar, Ola; Nassar, Laila; Levy, Sharona T
2018-02-01
Pharmacology is a crucial component of medication administration in nursing, yet nursing students generally find it difficult and self-rate their pharmacology skills as low. This study evaluated nursing students learning pharmacology with the Pharmacology Inter-Leaved Learning-Cells environment, a novel approach to modeling biochemical interactions using a multiscale, computer-based model with a complexity perspective based on a small set of entities and simple rules. This environment represents molecules, organelles and cells to enhance the understanding of cellular processes, and combines these cells at a higher scale to obtain whole-body interactions. In a quasi-experimental pre- and post-test design, sophomore nursing students learned the pharmacology of diabetes mellitus either with the Pharmacology Inter-Leaved Learning-Cells environment (experimental group; n=94) or via a lecture-based curriculum (comparison group; n=54). The Pharmacology-Diabetes-Mellitus questionnaire and the course's final exam were used to evaluate students' knowledge of the pharmacology of diabetes mellitus. Conceptual learning was significantly higher for the experimental than for the comparison group on the course final exam scores (unpaired t=-3.8, p<0.001) and on the Pharmacology-Diabetes-Mellitus questionnaire (U=942, p<0.001). The largest effect size for the Pharmacology-Diabetes-Mellitus questionnaire was for the medication-action subscale. Analysis of complex-systems component reasoning revealed a significant difference for micro-macro transitions between the levels (F(1, 82)=6.9, p<0.05). Learning with complexity-based computerized models is highly effective: it enhances the understanding of moving between micro and macro levels of the biochemical phenomena, which in turn is related to better understanding of medication actions.
Moreover, the Pharmacology Inter-Leaved Learning-Cells approach provides a more general reasoning scheme for biochemical processes, which enhances pharmacology learning beyond the specific topic learned. The present study implies that a deeper understanding of pharmacology will support nursing students' clinical decisions and empower their proficiency in medication administration.
Impact of scaffold rigidity on the design and evolution of an artificial Diels-Alderase
Preiswerk, Nathalie; Beck, Tobias; Schulz, Jessica D.; Milovník, Peter; Mayer, Clemens; Siegel, Justin B.; Baker, David; Hilvert, Donald
2014-01-01
By combining targeted mutagenesis, computational refinement, and directed evolution, a modestly active, computationally designed Diels-Alderase was converted into the most proficient biocatalyst for [4+2] cycloadditions known. The high stereoselectivity and minimal product inhibition of the evolved enzyme enabled preparative scale synthesis of a single product diastereomer. X-ray crystallography of the enzyme–product complex shows that the molecular changes introduced over the course of optimization, including addition of a lid structure, gradually reshaped the pocket for more effective substrate preorganization and transition state stabilization. The good overall agreement between the experimental structure and the original design model with respect to the orientations of both the bound product and the catalytic side chains contrasts with other computationally designed enzymes. Because design accuracy appears to correlate with scaffold rigidity, improved control over backbone conformation will likely be the key to future efforts to design more efficient enzymes for diverse chemical reactions. PMID:24847076
Flow in prosthetic heart valves: state-of-the-art and future directions.
Yoganathan, Ajit P; Chandran, K B; Sotiropoulos, Fotis
2005-12-01
Since the first successful implantation of a prosthetic heart valve four decades ago, over 50 different designs have been developed, including both mechanical and bioprosthetic valves. Today, the most widely implanted design is the mechanical bileaflet, with over 170,000 implants worldwide each year. Several different mechanical valves are currently available, and many of them have good bulk forward-flow hemodynamics, with lower transvalvular pressure drops, larger effective orifice areas, and fewer regions of forward-flow stasis than their earlier-generation counterparts such as the ball-and-cage and tilting-disc valves. However, mechanical valve implants suffer from complications resulting from thrombus deposition, and patients implanted with these valves need to be under long-term anticoagulant therapy. In general, blood thinners are not needed with bioprosthetic implants, but tissue valves suffer from structural failure, with an average lifetime of 10-12 years before replacement is needed. Flow-induced stresses on the formed elements in blood have been implicated in thrombus initiation within mechanical valve prostheses. Regions of stress concentration on the leaflets during their complex motion have been implicated in structural failure of the leaflets of bioprosthetic valves. In vivo and in vitro experimental studies have yielded valuable information on the relationship between hemodynamic stresses and the problems associated with the implants. More recently, Computational Fluid Dynamics (CFD) has emerged as a promising tool, which, alongside experimentation, can yield insights of unprecedented detail into the hemodynamics of prosthetic heart valves. For CFD to realize its full potential, however, it must rely on numerical techniques that can handle the enormous geometrical complexities of prosthetic devices with spatial and temporal resolution sufficiently high to accurately capture all hemodynamically relevant scales of motion.
Such algorithms do not exist today and their development should be a major research priority. For CFD to further gain the confidence of valve designers and medical practitioners it must also undergo comprehensive validation with experimental data. Such validation requires the use of high-resolution flow measuring tools and techniques and the integration of experimental studies with CFD modeling.
Pozhitkov, Alex E; Noble, Peter A; Bryk, Jarosław; Tautz, Diethard
2014-01-01
Although microarrays are widely used analysis tools in biomedical research, they are known to yield noisy output that usually requires experimental confirmation. To tackle this problem, many studies have developed rules for optimizing probe design and devised complex statistical tools to analyze the output. However, less emphasis has been placed on systematically identifying the noise component as part of the experimental procedure. One source of noise is the variance in probe binding, which can be assessed by replicating array probes. The second source is poor probe performance, which can be assessed by calibrating the array based on a dilution series of target molecules. Using model experiments for copy number variation and gene expression measurements, we investigate here a revised design for microarray experiments that addresses both of these sources of variance. Two custom arrays were used to evaluate the revised design: one based on 25-mer probes from an Affymetrix design and the other based on 60-mer probes from an Agilent design. To assess experimental variance in probe binding, all probes were replicated ten times. To assess probe performance, the probes were calibrated using a dilution series of target molecules and the signal response was fitted to an adsorption model. We found that significant variance of the signal could be controlled by averaging across probes and removing probes that are nonresponsive or poorly responsive in the calibration experiment. Taking this into account, one can obtain a more reliable signal, with the added option of obtaining absolute rather than relative measurements. The assessment of technical variance within the experiments, combined with the calibration of probes, allows the removal of poorly responding probes and yields more reliable signals for the remaining ones. Once an array is properly calibrated, absolute quantification of signals becomes straightforward, alleviating the need for normalization and reference hybridizations.
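The calibration step described above fits each probe's dilution-series response to an adsorption model and discards nonresponsive probes. A minimal sketch of that idea, using a Langmuir-type saturation curve (the function names, responsiveness criterion, and numbers are illustrative assumptions, not the paper's actual model or thresholds):

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, smax, kd):
    """Adsorption (Langmuir-type) model: probe signal vs. target concentration."""
    return smax * c / (kd + c)

def calibrate_probe(conc, signal, min_dynamic_range=2.0):
    """Fit the adsorption model to one probe's dilution series.

    Returns (params, True) for a responsive probe, or (None, False) for
    probes whose signal barely changes across the dilution series.
    """
    if signal.max() < min_dynamic_range * max(signal.min(), 1e-12):
        return None, False          # nonresponsive probe: exclude it
    params, _ = curve_fit(langmuir, conc, signal,
                          p0=[signal.max(), np.median(conc)])
    return params, True
```

Once smax and kd are known for a probe, a measured signal can be inverted through the fitted curve to give an absolute target concentration, which is the route to the absolute quantification the abstract mentions.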
Simon, S; Higginson, I J
2009-01-01
Hospital palliative care teams (HPCTs) are well established as multi-professional services providing palliative care in an acute hospital setting, and are increasing in number. However, there is still limited evaluation of them in terms of efficacy and effectiveness. The gold-standard method of evaluation is a randomised controlled trial, but because of methodological (e.g., randomisation), ethical and practical difficulties, such trials are often not possible. The HPCT is a complex intervention, and the specific situation in palliative care makes it challenging to evaluate (e.g., distress and cognitive impairment of patients). The quasi-experimental before-after study design has the advantage of enabling an experimental character without randomisation, but it has other weaknesses and is prone to bias, for example, temporal trends and selection bias. As for every study design, avoidance and minimisation of bias are important to improve validity. Therefore, strategies of selecting an appropriate control group or time series and applying valid outcomes and measurement tools help reduce bias and strengthen the methods. Special attention is needed to plan and define the design and applied method.
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.
Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C
2011-11-27
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.
NASA Astrophysics Data System (ADS)
Ehlmann, Bryon K.
Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.
NASA Astrophysics Data System (ADS)
Dewalque, Florence; Schwartz, Cédric; Denoël, Vincent; Croisier, Jean-Louis; Forthomme, Bénédicte; Brüls, Olivier
2018-02-01
This paper studies the dynamics of tape springs which are characterised by a highly geometrical nonlinear behaviour including buckling, the formation of folds and hysteresis. An experimental set-up is designed to capture these complex nonlinear phenomena. The experimental data are acquired by the means of a 3D motion analysis system combined with a synchronised force plate. Deployment tests show that the motion can be divided into three phases characterised by different types of folds, frequencies of oscillation and damping behaviours. Furthermore, the reproducibility quality of the dynamic and quasi-static results is validated by performing a large number of tests. In parallel, a nonlinear finite element model is developed. The required model parameters are identified based on simple experimental tests such as static deformed configurations and small amplitude vibration tests. In the end, the model proves to be well correlated with the experimental results in opposite sense bending, while in equal sense, both the experimental set-up and the numerical model are particularly sensitive to the initial conditions.
Ecologically Enhancing Coastal Infrastructure
NASA Astrophysics Data System (ADS)
Mac Arthur, Mairi; Naylor, Larissa; Hansom, Jim; Burrows, Mike; Boyd, Ian
2017-04-01
Hard engineering structures continue to proliferate in the coastal zone globally in response to increasing pressures associated with rising sea levels, coastal flooding and erosion. These structures are typically plain-cast by design and function as poor ecological surrogates for natural rocky shores which are highly topographically complex and host a range of available microhabitats for intertidal species. Ecological enhancement mitigates some of these negative impacts by integrating components of nature into the construction and design of these structures to improve their sustainability, resilience and multifunctionality. In the largest UK ecological enhancement trial to date, 184 tiles (15x15cm) of up to nine potential designs were deployed on vertical concrete coastal infrastructure in 2016 at three sites across the UK (Saltcoats, Blackness and Isle of Wight). The surface texture and complexity of the tiles were varied to test the effect of settlement surface texture at the mm-cm scale of enhancement on the success of colonisation and biodiversity in the mid-upper intertidal zone in order to answer the following experimental hypotheses: • Tiles with mm-scale geomorphic complexity will have greater barnacle abundances • Tiles with cm-scale geomorphic complexity will have greater species richness than mm-scale tiles. A range of methods were used in creating the tile designs including terrestrial laser scanning of creviced rock surfaces to mimic natural rocky shore complexity as well as artificially generated complexity using computer software. The designs replicated the topographic features of high ecological importance found on natural rocky shores and promoted species recruitment and community composition on artificial surfaces; thus enabling us to evaluate biological responses to geomorphic complexity in a controlled field trial. 
At two of the sites, the roughest tile designs (cm scale) did not have the highest numbers of barnacle recruits; these were instead counted on tiles of intermediate roughness, such as the grooved concrete, with 257 recruits on average (n=8) at four months post-installation (Saltcoats) and 1291 recruits at two months post-installation (Isle of Wight). This indicates that a higher level of complexity does not always reflect the most appropriate roughness scale for some colonisers. On average, tiles with mm-scale texture were more successful in terms of barnacle colonisation compared to plain-cast control tiles (n=8 per site). The poor performance of the control tiles (9 recruits, Saltcoats; 147 recruits, Isle of Wight after 4 and 2 months, respectively) further highlights that artificial, hard substrates are poor ecological surrogates for natural rocky shores. One of the sites, Blackness, was an observed outlier to the general trend of colonisation, likely due to its estuarine location; this factor may explain why every design, including the control tile, had high abundances of barnacles there. Artificially designed tiles with cm-scale complexity had higher levels of species richness, with periwinkles and topshells frequently observed to utilise the tile microhabitats in greater numbers than found on other tile designs. These results show that the scale of geomorphic complexity influences early-stage colonisation. Data analysis is ongoing and these further analyses will be presented.
Coelho, Pedro G; Hollister, Scott J; Flanagan, Colleen L; Fernandes, Paulo R
2015-03-01
Bone scaffolds for tissue regeneration require an optimal trade-off between biological and mechanical criteria. Optimal designs may be obtained using topology optimization (homogenization approach) and prototypes produced using additive manufacturing techniques. However, the process from design to manufacture remains a research challenge and will be a requirement of FDA design controls for engineered scaffolds. This work investigates how the design-to-manufacture chain affects the reproducibility of complex optimized design characteristics in the manufactured product. The design and prototypes are analyzed taking into account the computational assumptions and the final mechanical properties determined through mechanical tests. The scaffold is an assembly of unit cells, so scale-size effects on the mechanical response under finite periodicity are investigated and compared with predictions from the homogenization method, which assumes in the limit infinitely repeated unit cells. Results show that a limited number of unit cells (3-5 repeated on a side) introduces some scale effects, but the discrepancies are below 10%. Higher discrepancies are found when comparing the experimental data to numerical simulations, due to differences between the manufactured and designed scaffold feature shapes and sizes as well as micro-porosities introduced by the manufacturing process. However, good regression correlations (R(2) > 0.85) were found between numerical and experimental values, with slopes close to 1 for 2 out of 3 designs.
Experimental Charging Behavior of Orion UltraFlex Array Designs
NASA Technical Reports Server (NTRS)
Golofaro, Joel T.; Vayner, Boris V.; Hillard, Grover B.
2010-01-01
The present ground-based investigations give the first definitive look at the charging behavior of Orion UltraFlex arrays in both the Low Earth Orbit (LEO) and geosynchronous (GEO) environments. Note that the LEO charging environment also applies to the International Space Station (ISS). The GEO charging environment includes the bounding case for all lunar mission environments. The UltraFlex photovoltaic array technology is targeted to become the sole power system for life support and on-orbit power for the manned Orion Crew Exploration Vehicle (CEV). The purpose of the experimental tests is to gain an understanding of the complex charging behavior, to answer basic performance and survivability questions, and to ascertain whether a single UltraFlex array design will be able to cope with the projected worst-case LEO and GEO charging environments. Stage 1 LEO plasma testing revealed that all four arrays successfully passed arc-threshold bias tests down to -240 V. Stage 2 GEO electron-gun charging tests revealed that only the front-side area of indium-tin-oxide-coated array designs successfully passed the arc frequency tests.
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.
1982-02-01
This paper describes the need for non-raytracing schemes in the optical design and analysis of large carbon-dioxide lasers like the Gigawatt, Gemini, and Helios lasers currently operational at Los Alamos, and the Antares laser fusion system under construction. The scheme currently used at Los Alamos involves characterizing the various optical components with Zernike polynomial sets obtained by digitization of experimentally produced interferograms of the components. A Fast Fourier Transform code then propagates the complex amplitude and phase of the beam through the whole system and computes the optical parameters of interest. The analysis scheme is illustrated through examples of the Gigawatt, Gemini, and Helios systems. A possible way of using the Zernike polynomials in optical design problems of this type is discussed. Comparisons between the computed values and experimentally obtained results are made, and it is concluded that this appears to be a valid approach. As this is a review article, some previously published results are also used where relevant.
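The propagation step described here is the standard angular-spectrum approach: Fourier transform the complex field, multiply by a frequency-domain transfer function, and transform back. Below is a minimal sketch of that idea; all function names and parameter values are illustrative, not those of the Los Alamos code.

```python
import numpy as np

def propagate_angular_spectrum(field, wavelength, dx, dz):
    """Propagate a sampled complex field a distance dz using the
    angular-spectrum method. Evanescent components are simply clipped
    in this sketch. All names and values are illustrative."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz_sq = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    kz = 2j * np.pi * np.sqrt(np.maximum(kz_sq, 0.0))
    transfer = np.exp(kz * dz)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# A uniform plane wave only acquires a global phase under propagation,
# so its magnitude is unchanged.
n = 64
plane = np.ones((n, n), dtype=complex)
out = propagate_angular_spectrum(plane, wavelength=10.6e-6, dx=1e-4, dz=0.1)
print(round(abs(out[0, 0]), 6))  # magnitude stays 1.0
```

The same loop, applied component by component with aberrations expressed as Zernike phase maps, is the essence of a non-raytracing system analysis.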
Experimental Design for Multi-drug Combination Studies Using Signaling Networks
Huang, Hengzhen; Fang, Hong-Bin; Tan, Ming T.
2017-01-01
Combinations of multiple drugs are an important approach to maximize the chance of therapeutic success by inhibiting multiple pathways/targets. Analytic methods for studying drug combinations have received increasing attention because major advances in biomedical research have made available a large number of potential agents for testing. Preclinical experiments on multi-drug combinations play a key role in (especially cancer) drug development because of the complex nature of the disease and the need to reduce development time and costs. Despite recent progress in statistical methods for assessing drug interaction, there is an acute lack of methods for designing experiments on multi-drug combinations. The number of combinations grows exponentially with the number of drugs and dose levels, quickly precluding laboratory testing. Utilizing experimental dose-response data of single drugs and a few combinations, along with pathway/network information, to obtain an estimate of the functional structure of the dose-response relationship in silico, we propose in this paper an optimal design that allows exploration of the dose-effect surface with the smallest possible sample size. The simulation studies show that our proposed methods perform well. PMID:28960231
Patel, Disha; Bauman, Joseph D.; Arnold, Eddy
2015-01-01
X-ray crystallography has been an under-appreciated screening tool for fragment-based drug discovery due to the perception of low throughput and technical difficulty. Investigators in industry and academia have overcome these challenges by taking advantage of key factors that contribute to a successful crystallographic screening campaign. Efficient cocktail design and soaking methodologies have evolved to maximize throughput while minimizing false positives/negatives. In addition, technical improvements at synchrotron beamlines have dramatically increased data collection rates thus enabling screening on a timescale comparable to other techniques. The combination of available resources and efficient experimental design has resulted in many successful crystallographic screening campaigns. The three-dimensional crystal structure of the bound fragment complexed to its target, a direct result of the screening effort, enables structure-based drug design while revealing insights regarding protein dynamics and function not readily obtained through other experimental approaches. Furthermore, this “chemical interrogation” of the target protein crystals can lead to the identification of useful reagents for improving diffraction resolution or compound solubility. PMID:25117499
An integrated ball projection technology for the study of dynamic interceptive actions.
Stone, J A; Panchuk, D; Davids, K; North, J S; Fairweather, I; Maynard, I W
2014-12-01
Dynamic interceptive actions, such as catching or hitting a ball, are important task vehicles for investigating the complex relationship between cognition, perception, and action in performance environments. Representative experimental designs have become more important recently, highlighting the need for research methods to ensure that the coupling of information and movement is faithfully maintained. However, retaining representative design while ensuring systematic control of experimental variables is challenging, due to the traditional tendency to employ reductionist motor responses such as button-pressing or micromovements. Here, we outline the methodology behind a custom-built, integrated ball projection technology that allows images of advanced visual information to be synchronized with ball projection. This integrated technology supports the controlled presentation of visual information to participants while they perform dynamic interceptive actions. We discuss theoretical ideas behind the integration of hardware and software, along with practical issues resolved in technological design, and emphasize how the system can be integrated with emerging developments such as mixed-reality environments. We conclude by considering future developments and applications of the integrated projection technology for research in human movement behaviors.
Patel, Disha; Bauman, Joseph D; Arnold, Eddy
2014-01-01
X-ray crystallography has been an under-appreciated screening tool for fragment-based drug discovery due to the perception of low throughput and technical difficulty. Investigators in industry and academia have overcome these challenges by taking advantage of key factors that contribute to a successful crystallographic screening campaign. Efficient cocktail design and soaking methodologies have evolved to maximize throughput while minimizing false positives/negatives. In addition, technical improvements at synchrotron beamlines have dramatically increased data collection rates thus enabling screening on a timescale comparable to other techniques. The combination of available resources and efficient experimental design has resulted in many successful crystallographic screening campaigns. The three-dimensional crystal structure of the bound fragment complexed to its target, a direct result of the screening effort, enables structure-based drug design while revealing insights regarding protein dynamics and function not readily obtained through other experimental approaches. Furthermore, this "chemical interrogation" of the target protein crystals can lead to the identification of useful reagents for improving diffraction resolution or compound solubility. Copyright © 2014. Published by Elsevier Ltd.
Coexisting multiple attractors and riddled basins of a memristive system.
Wang, Guangyi; Yuan, Fang; Chen, Guanrong; Zhang, Yu
2018-01-01
In this paper, a new memristor-based chaotic system is designed, analyzed, and implemented. Multistability, multiple attractors, and complex riddled basins are observed from the system, which are investigated along with other dynamical behaviors such as equilibrium points and their stabilities, symmetrical bifurcation diagrams, and sustained chaotic states. With different sets of system parameters, the system can also generate various multi-scroll attractors. Finally, the system is realized by experimental circuits.
Improved model for the angular dependence of excimer laser ablation rates in polymer materials
NASA Astrophysics Data System (ADS)
Pedder, J. E. A.; Holmes, A. S.; Dyer, P. E.
2009-10-01
Measurements of the angle-dependent ablation rates of polymers that have applications in microdevice fabrication are reported. A simple model based on Beer's law, including plume absorption, is shown to give good agreement with the experimental findings for polycarbonate and SU8, ablated using the 193 and 248 nm excimer lasers, respectively. The modeling forms a useful tool for designing masks needed to fabricate complex surface relief by ablation.
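The Beer's-law (blow-off) model referred to above predicts an etch depth per pulse of d = (1/α)·ln(F/F_th), with the effective fluence reduced by cos θ at oblique incidence. A minimal sketch of that relation, ignoring plume absorption; the parameter values are made up for illustration, not measured:

```python
import math

def etch_depth_per_pulse(fluence, alpha, f_threshold, angle_deg=0.0):
    """Blow-off model: d = (1/alpha) * ln(F_eff / F_th), where the
    effective fluence falls off as cos(theta) for oblique incidence.
    Parameter values used below are illustrative, not measured."""
    f_eff = fluence * math.cos(math.radians(angle_deg))
    if f_eff <= f_threshold:
        return 0.0  # below the ablation threshold: nothing removed
    return math.log(f_eff / f_threshold) / alpha

# Normal incidence removes more material per pulse than 60 degrees.
d0 = etch_depth_per_pulse(fluence=2.0, alpha=1e5, f_threshold=0.1)
d60 = etch_depth_per_pulse(fluence=2.0, alpha=1e5, f_threshold=0.1,
                           angle_deg=60.0)
print(d0 > d60 > 0.0)  # True
```

This angle dependence is what makes the model useful for predicting the surface relief produced through a mask.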
Computational Design of Self-Assembling Cyclic Protein Homo-oligomers
Fallas, Jorge A.; Ueda, George; Sheffler, William; Nguyen, Vanessa; McNamara, Dan E.; Sankaran, Banumathi; Pereira, Jose Henrique; Parmeggiani, Fabio; Brunette, TJ; Cascio, Duilio; Yeates, Todd R.; Zwart, Peter; Baker, David
2016-01-01
Self-assembling cyclic protein homo-oligomers play important roles in biology, and the ability to generate custom homo-oligomeric structures could enable new approaches to probe biological function. Here we report a general approach to design cyclic homo-oligomers that employs a new residue-pair-transform method for assessing the designability of a protein-protein interface. This method is sufficiently rapid to enable systematic enumeration of cyclically docked arrangements of a monomer followed by sequence design of the newly formed interfaces. We use this method to design interfaces onto idealized repeat proteins that direct their assembly into complexes that possess cyclic symmetry. Of 96 designs that were experimentally characterized, 21 were found to form stable monodisperse homo-oligomers in solution, and 15 (4 homodimers, 6 homotrimers, 6 homotetramers and 1 homopentamer) had solution small-angle X-ray scattering data consistent with the design models. X-ray crystal structures were obtained for five of the designs, and each was shown to be very close to its design model. PMID:28338692
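The cyclic docking underlying this approach amounts to placing rotated copies of a monomer about a common symmetry axis. A toy sketch of that geometric operation follows; the coordinates and function names are illustrative, not part of the published method:

```python
import math

def cyclic_assembly(monomer_xyz, n_subunits):
    """Place n_subunits copies of a monomer point cloud around the z axis,
    the geometric operation behind cyclic (Cn) docking. The coordinates
    below are toy values, not a real protein."""
    assembly = []
    for i in range(n_subunits):
        theta = 2.0 * math.pi * i / n_subunits
        c, s = math.cos(theta), math.sin(theta)
        for x, y, z in monomer_xyz:
            # Rotate each point about the z axis by i * (360/n) degrees.
            assembly.append((c * x - s * y, s * x + c * y, z))
    return assembly

monomer = [(5.0, 0.0, 0.0), (6.0, 1.0, 0.5)]
trimer = cyclic_assembly(monomer, 3)
print(len(trimer))  # 6 points: 2 atoms x C3 symmetry
```

In the actual pipeline, each candidate placement generated this way is then scored for interface designability before sequence design.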
What do we mean by Human-Centered Design of Life-Critical Systems?
Boy, Guy A
2012-01-01
Human-centered design is not a new approach to design. Aerospace is a good example of a life-critical systems domain where participatory design was fully integrated, involving experimental test pilots and design engineers as well as many other actors of the aerospace engineering community. This paper provides six topics that are currently part of the requirements of the Ph.D. Program in Human-Centered Design of the Florida Institute of Technology (FIT). This Human-Centered Design program offers principles, methods and tools that support human-centered sustainable products such as mission or process control environments, cockpits and hospital operating rooms. It supports education and training of design thinkers who are natural leaders and understand complex relationships among technology, organizations and people. We all need to understand what we want to do with technology, how we should organize ourselves for a better life, and finally find out who we are and have become. Human-centered design is being developed for all these reasons and issues.
Evolutionary multiobjective design of a flexible caudal fin for robotic fish.
Clark, Anthony J; Tan, Xiaobo; McKinley, Philip K
2015-11-25
Robotic fish accomplish swimming by deforming their bodies or other fin-like appendages. As an emerging class of embedded computing system, robotic fish are anticipated to play an important role in environmental monitoring, inspection of underwater structures, tracking of hazardous wastes and oil spills, and the study of live fish behaviors. While integration of flexible materials (into the fins and/or body) holds the promise of improved swimming performance (in terms of both speed and maneuverability) for these robots, such components also introduce significant design challenges due to the complex material mechanics and hydrodynamic interactions. The problem is further exacerbated by the need for the robots to meet multiple objectives (e.g., both speed and energy efficiency). In this paper, we propose an evolutionary multiobjective optimization approach to the design and control of a robotic fish with a flexible caudal fin. Specifically, we use the NSGA-II algorithm to investigate morphological and control parameter values that optimize swimming speed and power usage. Several evolved fin designs are validated experimentally with a small robotic fish, where fins of different stiffness values and sizes are printed with a multi-material 3D printer. Experimental results confirm the effectiveness of the proposed design approach in balancing the two competing objectives.
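At the core of the NSGA-II algorithm used here is non-dominated sorting: candidate designs are ranked by Pareto dominance over the competing objectives. A minimal sketch of extracting the first non-dominated front, with made-up candidate designs (both objectives converted to minimization, e.g. negative speed and power usage):

```python
def pareto_front(points):
    """Return indices of non-dominated points, all objectives minimized.
    This is the building block of NSGA-II's non-dominated sort; the
    candidate designs below are made-up illustrative values."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# (negative speed, power): minimizing both favors fast, efficient fins.
designs = [(-0.30, 2.0), (-0.25, 1.0), (-0.30, 2.5), (-0.10, 0.8)]
print(pareto_front(designs))  # [0, 1, 3]
```

Design 2 is dominated (same speed as design 0 but higher power); the remaining three form the trade-off front between swimming speed and energy efficiency.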
A numerical study of mixing in supersonic combustors with hypermixing injectors
NASA Technical Reports Server (NTRS)
Lee, J.
1993-01-01
A numerical study was conducted to evaluate the performance of wall mounted fuel-injectors designed for potential Supersonic Combustion Ramjet (SCRAM-jet) engine applications. The focus of this investigation was to numerically simulate existing combustor designs for the purpose of validating the numerical technique and the physical models developed. Three different injector designs of varying complexity were studied to fully understand the computational implications involved in accurate predictions. A dual transverse injection system and two streamwise injector designs were studied. The streamwise injectors were designed with swept ramps to enhance fuel-air mixing and combustion characteristics at supersonic speeds without the large flow blockage and drag contribution of the transverse injection system. For this study, the Mass-Averaged Navier-Stokes equations and the chemical species continuity equations were solved. The computations were performed using a finite-volume implicit numerical technique and multiple block structured grid system. The interfaces of the multiple block structured grid systems were numerically resolved using the flux-conservative technique. Detailed comparisons between the computations and existing experimental data are presented. These comparisons show that numerical predictions are in agreement with the experimental data. These comparisons also show that a number of turbulence model improvements are needed for accurate combustor flowfield predictions.
A numerical study of mixing in supersonic combustors with hypermixing injectors
NASA Technical Reports Server (NTRS)
Lee, J.
1992-01-01
A numerical study was conducted to evaluate the performance of wall mounted fuel-injectors designed for potential Supersonic Combustion Ramjet (SCRAM-jet) engine applications. The focus of this investigation was to numerically simulate existing combustor designs for the purpose of validating the numerical technique and the physical models developed. Three different injector designs of varying complexity were studied to fully understand the computational implications involved in accurate predictions. A dual transverse injection system and two streamwise injector designs were studied. The streamwise injectors were designed with swept ramps to enhance fuel-air mixing and combustion characteristics at supersonic speeds without the large flow blockage and drag contribution of the transverse injection system. For this study, the Mass-Averaged Navier-Stokes equations and the chemical species continuity equations were solved. The computations were performed using a finite-volume implicit numerical technique and multiple block structured grid system. The interfaces of the multiple block structured grid systems were numerically resolved using the flux-conservative technique. Detailed comparisons between the computations and existing experimental data are presented. These comparisons show that numerical predictions are in agreement with the experimental data. These comparisons also show that a number of turbulence model improvements are needed for accurate combustor flowfield predictions.
Commercial turbofan engine exhaust nozzle flow analyses using PAB3D
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.; Uenishi, K.; Carlson, John R.; Keith, B. D.
1992-01-01
Recent developments of a three-dimensional (PAB3D) code have paved the way for a computational investigation of complex aircraft aerodynamic components. The PAB3D code was developed for solving the simplified Reynolds Averaged Navier-Stokes equations in a three-dimensional multiblock/multizone structured mesh domain. The present analysis was applied to commercial turbofan exhaust flow systems. Solution sensitivity to grid density is presented. Laminar flow solutions were developed for all grids and two-equation k-epsilon solutions were developed for selected grids. Static pressure distributions, mass flow and thrust quantities were calculated for on-design engine operating conditions. Good agreement between predicted surface static pressures and experimental data was observed at different locations. Mass flow was predicted within 0.2 percent of experimental data. Thrust forces were typically within 0.4 percent of experimental data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
Experimental analysis of Nd-YAG laser cutting of sheet materials - A review
NASA Astrophysics Data System (ADS)
Sharma, Amit; Yadava, Vinod
2018-01-01
Cutting of sheet material is considered an important process due to its relevance to products of everyday life such as aircraft, ships, cars and furniture. Among the various sheet cutting processes (ASCPs), laser beam cutting is one of the most capable ASCPs for creating complex geometries with stringent design requirements in difficult-to-cut sheet materials. Based on recent research work in the area of sheet cutting, it is found that the Nd-YAG laser is used for cutting of sheet material in general and reflective sheet material in particular. This paper reviews the experimental analysis of the Nd-YAG laser cutting process, carried out to study the influence of laser cutting parameters on process performance indices. The significance of experimental modeling and the different optimization approaches employed by various researchers are also discussed in this study.
Experimental investigation of a four-qubit linear-optical quantum logic circuit
NASA Astrophysics Data System (ADS)
Stárek, R.; Mičuda, M.; Miková, M.; Straka, I.; Dušek, M.; Ježek, M.; Fiurášek, J.
2016-09-01
We experimentally demonstrate and characterize a four-qubit linear-optical quantum logic circuit. Our robust and versatile scheme exploits encoding of two qubits into polarization and path degrees of single photons and involves two crossed inherently stable interferometers. This approach allows us to design a complex quantum logic circuit that combines a genuine four-qubit C3Z gate and several two-qubit and single-qubit gates. The C3Z gate introduces a sign flip if and only if all four qubits are in the computational state |1>. We verify high-fidelity performance of this central four-qubit gate using Hofmann bounds on quantum gate fidelity and Monte Carlo fidelity sampling. We also experimentally demonstrate that the quantum logic circuit can generate genuine multipartite entanglement and we certify the entanglement with the use of suitably tailored entanglement witnesses.
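As the abstract states, the C3Z gate flips the sign of the amplitude only when all four qubits are in state |1>. In matrix form this is a 16x16 diagonal operator, which is easy to write down and check numerically (a sketch for illustration, unrelated to the experimental photonic encoding):

```python
import numpy as np

# C^3Z on four qubits: diagonal, with -1 only on the |1111> basis state.
c3z = np.diag([1.0] * 15 + [-1.0])

ket_1111 = np.zeros(16); ket_1111[-1] = 1.0
ket_0000 = np.zeros(16); ket_0000[0] = 1.0

print((c3z @ ket_1111)[-1])  # -1.0: sign flip on |1111>
print((c3z @ ket_0000)[0])   # 1.0: every other basis state untouched
```

Because the gate is diagonal and self-inverse (C3Z squared is the identity), verifying its action reduces to checking these phases, which is what the Hofmann-bound and Monte Carlo fidelity estimates probe experimentally.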
Surface Traps in Colloidal Quantum Dots: A Combined Experimental and Theoretical Perspective.
Giansante, Carlo; Infante, Ivan
2017-10-19
Surface traps are ubiquitous in nanoscopic semiconductor materials. Understanding their atomistic origin and manipulating them chemically are of capital importance for designing defect-free colloidal quantum dots and making a leap forward in the development of efficient optoelectronic devices. Recent advances in computing power have established computational chemistry as a powerful tool to describe complex chemical species accurately, and it has now become conceivable to model colloidal quantum dots with realistic sizes and shapes. In this Perspective, we combine the knowledge gathered in recent experimental findings with the computation of quantum dot electronic structures. We analyze three different systems, namely CdSe, PbS, and CsPbI3, as benchmark semiconductor nanocrystals, showing how different types of trap states can form at their surfaces. In addition, we suggest experimental healing of such traps according to their chemical origin and nanocrystal composition.
Experimental investigation of a four-qubit linear-optical quantum logic circuit.
Stárek, R; Mičuda, M; Miková, M; Straka, I; Dušek, M; Ježek, M; Fiurášek, J
2016-09-20
We experimentally demonstrate and characterize a four-qubit linear-optical quantum logic circuit. Our robust and versatile scheme exploits encoding of two qubits into polarization and path degrees of single photons and involves two crossed inherently stable interferometers. This approach allows us to design a complex quantum logic circuit that combines a genuine four-qubit C(3)Z gate and several two-qubit and single-qubit gates. The C(3)Z gate introduces a sign flip if and only if all four qubits are in the computational state |1〉. We verify high-fidelity performance of this central four-qubit gate using Hofmann bounds on quantum gate fidelity and Monte Carlo fidelity sampling. We also experimentally demonstrate that the quantum logic circuit can generate genuine multipartite entanglement and we certify the entanglement with the use of suitably tailored entanglement witnesses.
Quantitative analyses of bifunctional molecules.
Braun, Patrick D; Wandless, Thomas J
2004-05-11
Small molecules can be discovered or engineered to bind tightly to biologically relevant proteins, and these molecules have proven to be powerful tools for both basic research and therapeutic applications. In many cases, detailed biophysical analyses of the intermolecular binding events are essential for improving the activity of the small molecules. These interactions can often be characterized as straightforward bimolecular binding events, and a variety of experimental and analytical techniques have been developed and refined to facilitate these analyses. Several investigators have recently synthesized heterodimeric molecules that are designed to bind simultaneously to two different proteins to form ternary complexes. These heterodimeric molecules often display compelling biological activity; however, they are difficult to characterize. The bimolecular interaction between one protein and the heterodimeric ligand (primary dissociation constant) can be determined by a number of methods. However, the interaction between that protein-ligand complex and the second protein (secondary dissociation constant) is more difficult to measure due to the noncovalent nature of the original protein-ligand complex. Consequently, these heterodimeric compounds are often characterized in terms of their activity, which is an experiment-dependent metric. We have developed a general quantitative mathematical model that can be used to measure both the primary (protein + ligand) and secondary (protein-ligand + protein) dissociation constants for heterodimeric small molecules. These values are largely independent of the experimental technique used and furthermore provide a direct measure of the thermodynamic stability of the ternary complexes that are formed.
Fluorescence polarization and this model were used to characterize the heterodimeric molecule, SLFpYEEI, which binds to both FKBP12 and the Fyn SH2 domain, demonstrating that the model is useful for both predictive as well as ex post facto analytical applications.
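One way to see why the secondary dissociation constant is hard to measure is to write out the coupled equilibria explicitly. The sketch below solves the two mass-action equations, P1 + L <-> P1.L (Kd1) and P1.L + P2 <-> ternary (Kd2), by damped fixed-point iteration; the constants and total concentrations are made-up illustrative values, not the FKBP12/Fyn SH2 measurements:

```python
def ternary_complex(p1_tot, l_tot, p2_tot, kd1, kd2, iters=2000, damp=0.1):
    """Damped fixed-point solve of two coupled binding equilibria:
    P1 + L <-> P1.L (kd1) and P1.L + P2 <-> ternary (kd2).
    Returns the ternary complex concentration. Illustrative sketch."""
    p1, l, p2 = p1_tot, l_tot, p2_tot  # start from everything free
    for _ in range(iters):
        p1l = p1 * l / kd1           # binary complex from mass action
        t = p1l * p2 / kd2           # ternary complex from mass action
        # Update free concentrations toward the conservation laws.
        p1 += damp * (max(p1_tot - p1l - t, 0.0) - p1)
        l += damp * (max(l_tot - p1l - t, 0.0) - l)
        p2 += damp * (max(p2_tot - t, 0.0) - p2)
    p1l = p1 * l / kd1
    return p1l * p2 / kd2

# Weak-binding regime: ternary concentration is roughly
# P1 * L * P2 / (kd1 * kd2), about 1e-10 M for these toy values.
t = ternary_complex(1e-6, 1e-6, 1e-6, kd1=1e-4, kd2=1e-4)
print(0.0 < t < 1e-6)  # True
```

The ternary concentration depends nonlinearly on both dissociation constants, which is why fitting activity data alone cannot cleanly separate them.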
Laser ion source for heavy ion inertial fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okamura, Masahiro
The proposed heavy ion inertial fusion (HIF) scenarios require ampere-class, low-charge-state ion beams of heavy species. A laser ion source (LIS) is recognized as one of the promising candidate ion beam providers, since it can deliver high-brightness heavy ion beams to accelerators. A design of a LIS for HIF depends on the accelerator structure and the accelerator complex following the source. In this article, we discuss the specifications and design of an appropriate LIS assuming two major types of accelerators: radio frequency (RF) high-quality-factor cavity type and non-resonant induction core type. We believe that a properly designed LIS satisfies the requirements of both types; however, some issues need to be verified experimentally.
Simple and inexpensive microfluidic devices for the generation of monodisperse multiple emulsions
NASA Astrophysics Data System (ADS)
Li, Er Qiang; Zhang, Jia Ming; Thoroddsen, Sigurdur T.
2014-01-01
Droplet-based microfluidic devices have become a preferred versatile platform for various fields in physics, chemistry and biology. Polydimethylsiloxane soft lithography, the mainstay for fabricating microfluidic devices, usually requires expensive apparatus and a complex manufacturing procedure. Here, we report the design and fabrication of simple and inexpensive microfluidic devices, based on microscope glass slides and pulled glass capillaries, for generating monodisperse multiple emulsions. The advantages of our method lie in a simple manufacturing procedure, inexpensive processing equipment and flexibility in the surface modification of the designed microfluidic devices. Different types of devices have been designed and tested, and the experimental results demonstrate their robustness for preparing monodisperse single, double, triple and multi-component emulsions.
Laser ion source for heavy ion inertial fusion
Okamura, Masahiro
2018-01-10
The proposed heavy ion inertial fusion (HIF) scenarios require ampere-class, low-charge-state ion beams of heavy species. A laser ion source (LIS) is recognized as one of the promising candidate ion beam providers, since it can deliver high-brightness heavy ion beams to accelerators. A design of a LIS for HIF depends on the accelerator structure and the accelerator complex following the source. In this article, we discuss the specifications and design of an appropriate LIS assuming two major types of accelerators: radio frequency (RF) high-quality-factor cavity type and non-resonant induction core type. We believe that a properly designed LIS satisfies the requirements of both types; however, some issues need to be verified experimentally.
NASA Astrophysics Data System (ADS)
Degenhardt, Richard
2014-06-01
The space industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite space and aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis. Currently, the potential of lightweight composite structures, which are prone to buckling, is not fully exploited, as appropriate guidelines in the field of space applications do not exist. This paper deals with the state-of-the-art advances and challenges related to coupled stability analysis of composite structures, which show very complex stability behaviour. Improved design guidelines for composite structures are still under development. This paper gives a short state-of-the-art overview and presents a proposal for a future design guideline.
Design and performance frameworks for constructing problem-solving simulations.
Stevens, Ron; Palacio-Cayetano, Joycelin
2003-01-01
Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks ideally would be guided less by the strengths/limitations of the presentation media and more by cognitive analyses detailing the goals of the tasks, the needs and abilities of students, and the resulting decision outcomes needed by different audiences. This article describes a problem-solving environment and associated theoretical framework for investigating how students select and use strategies as they solve complex science problems. A framework is first described for designing on-line problem spaces that highlights issues of content, scale, cognitive complexity, and constraints. While this framework was originally designed for medical education, it has proven robust and has been successfully applied to learning environments from elementary school through medical school. Next, a similar framework is detailed for collecting student performance and progress data that can provide evidence of students' strategic thinking and that could potentially be used to accelerate student progress. Finally, experimental validation data are presented that link strategy selection and use with other metrics of scientific reasoning and student achievement.
Design and Performance Frameworks for Constructing Problem-Solving Simulations
Stevens, Ron; Palacio-Cayetano, Joycelin
2003-01-01
Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks ideally would be guided less by the strengths/limitations of the presentation media and more by cognitive analyses detailing the goals of the tasks, the needs and abilities of students, and the resulting decision outcomes needed by different audiences. This article describes a problem-solving environment and associated theoretical framework for investigating how students select and use strategies as they solve complex science problems. A framework is first described for designing on-line problem spaces that highlights issues of content, scale, cognitive complexity, and constraints. While this framework was originally designed for medical education, it has proven robust and has been successfully applied to learning environments from elementary school through medical school. Next, a similar framework is detailed for collecting student performance and progress data that can provide evidence of students' strategic thinking and that could potentially be used to accelerate student progress. Finally, experimental validation data are presented that link strategy selection and use with other metrics of scientific reasoning and student achievement. PMID:14506505
Moorthy, Arun S; Eberl, Hermann J
2014-04-01
Fermentation reactor systems are a key platform in studying intestinal microflora, specifically with respect to questions surrounding the effects of diet. In this study, we develop computational representations of colon fermentation reactor systems as a way to assess the influence of three design elements (number of reactors, emptying mechanism, and inclusion of microbial immobilization) on three performance measures (total biomass density, biomass composition, and fibre digestion efficiency) using a fractional-factorial experimental design. It was determined that the choice of emptying mechanism had no effect on any of the performance measures. Additionally, it was determined that none of the design criteria had any measurable effect on reactor performance with respect to biomass composition. It is recommended that model fermentation systems used in experiments on dietary effects on intestinal biomass composition be streamlined to include only the necessary design complexities, as the measured performance does not benefit from the addition of microbial immobilization mechanisms or a semi-continuous emptying scheme. Additionally, the added complexities significantly increase computational time during simulation experiments. It was also noted that the same factorial experiment could be directly adapted to in vitro colon fermentation systems. Copyright © 2013 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
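The fractional-factorial approach described above can be illustrated with a short sketch. The factor names and the choice of a 2^(3-1) half fraction with generator C = A*B are illustrative assumptions, not details taken from the paper:

```python
from itertools import product

# Hypothetical sketch: a 2^(3-1) fractional factorial design for the three
# reactor design elements. Factor names and the generator C = A*B are
# illustrative assumptions.
def fractional_factorial_2_3_1():
    """Half fraction of a 2^3 design: 4 runs instead of 8."""
    runs = []
    for a, b in product([-1, 1], repeat=2):
        runs.append({
            "n_reactors": a,
            "emptying_mechanism": b,
            "immobilization": a * b,  # aliased with the A*B interaction
        })
    return runs

design = fractional_factorial_2_3_1()
for run in design:
    print(run)
```

The half fraction trades resolution (main effects are aliased with two-factor interactions) for halving the number of simulation runs, which matters when each run is computationally expensive.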
Ikeda-like chaos on a dynamically filtered supercontinuum light source
NASA Astrophysics Data System (ADS)
Chembo, Yanne K.; Jacquot, Maxime; Dudley, John M.; Larger, Laurent
2016-08-01
We demonstrate temporal chaos in a color-selection mechanism from the visible spectrum of a supercontinuum light source. The color-selection mechanism is governed by an acousto-optoelectronic nonlinear delayed-feedback scheme modeled by an Ikeda-like equation. Initially motivated by the design of a broad audience live demonstrator in the framework of the International Year of Light 2015, the setup also provides a different experimental tool to investigate the dynamical complexity of delayed-feedback dynamics. Deterministic hyperchaos is analyzed here from the experimental time series. A projection method identifies the delay parameter, for which the chaotic strange attractor originally evolving in an infinite-dimensional phase space can be revealed in a two-dimensional subspace.
Stochastic estimation of human shoulder impedance with robots: an experimental design.
Park, Kyungbin; Chang, Pyung Hun
2011-01-01
Previous studies assumed the shoulder to be a hinge joint during human arm impedance measurement. This is obviously a vast simplification, since the shoulder is a complex of several joints with multiple degrees of freedom. In the present work, a practical methodology for more general and realistic estimation of human shoulder impedance is proposed and validated with a spring array. It includes a gravity compensation scheme, which was developed and used for the experiments with a spatial three-degrees-of-freedom PUMA-type robot. The experimental results were accurate and reliable, thus showing the strong potential of the proposed methodology for estimating human shoulder impedance. © 2011 IEEE
Improvement of heat transfer by means of ultrasound: Application to a double-tube heat exchanger.
Legay, M; Simony, B; Boldo, P; Gondrexon, N; Le Person, S; Bontemps, A
2012-11-01
A new kind of ultrasonically-assisted heat exchanger has been designed, built and studied. It can be seen as a vibrating heat exchanger. A comprehensive description of the overall experimental set-up is provided, i.e. of the test rig and the acquisition system. Data acquisition and processing are explained step by step, with a detailed example of the graphs obtained and of how the energy balance on the heat exchanger is calculated from these experimental data. It is demonstrated that ultrasound can be used efficiently as a heat transfer enhancement technique, even in systems as complex as heat exchangers. Copyright © 2012 Elsevier B.V. All rights reserved.
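The energy balance mentioned above reduces, for each stream, to Q = ṁ·cp·ΔT. A minimal sketch, with a made-up operating point (not the authors' data):

```python
# Illustrative sketch (not the authors' code): the per-stream energy balance
# on a double-tube heat exchanger is Q = m_dot * cp * (T_out - T_in).
# The operating point below is a made-up example.
def stream_heat_duty(m_dot, cp, t_in, t_out):
    """Heat duty in W for one stream (kg/s, J/(kg*K), temperatures in K)."""
    return m_dot * cp * (t_out - t_in)

q_cold = stream_heat_duty(0.05, 4180, 293.15, 303.15)  # cold water warms up
q_hot = stream_heat_duty(0.05, 4180, 323.15, 313.15)   # hot water cools down
imbalance = q_cold + q_hot  # near zero if heat losses are negligible
print(q_cold, q_hot, imbalance)
```

Comparing the two duties quantifies heat losses; with ultrasound on, the useful comparison is the change in overall heat transfer coefficient at matched flow conditions.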
Simple method for experimentally testing any form of quantum contextuality
NASA Astrophysics Data System (ADS)
Cabello, Adán
2016-03-01
Contextuality provides a unifying paradigm for nonclassical aspects of quantum probabilities and resources of quantum information. Unfortunately, most forms of quantum contextuality remain experimentally unexplored due to the difficulty of performing sequences of projective measurements on individual quantum systems. Here we show that two-point correlations between binary compatible observables are sufficient to reveal any form of contextuality. This allows us to design simple experiments that are more robust against imperfections and easier to analyze, thus opening the door for observing interesting forms of contextuality, including those requiring quantum systems of high dimensions. In addition, it allows us to connect contextuality to communication complexity scenarios and reformulate a recent result relating contextuality and quantum computation.
Multivariate analysis techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bendavid, Josh; Fisher, Wade C.; Junk, Thomas R.
2016-01-01
The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
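The gain from higher efficiency and purity can be sketched with a common approximation of counting-experiment significance. The figure of merit and the event counts below are illustrative assumptions, not this chapter's actual statistic:

```python
import math

# Illustrative sketch of why purity matters: approximate the expected
# significance of a counting experiment as s / sqrt(s + b + sigma_b**2),
# where sigma_b is the systematic uncertainty on the background.
# The event counts below are made-up numbers, not from any real analysis.
def significance(s, b, sigma_b=0.0):
    return s / math.sqrt(s + b + sigma_b ** 2)

loose = significance(s=100, b=10_000, sigma_b=500)  # high efficiency, low purity
tight = significance(s=60, b=400, sigma_b=20)       # fewer events, far purer
print(loose, tight)
```

Even though the tighter selection keeps only 60% of the signal, suppressing the background and its systematic uncertainty raises the expected significance, which is the motivation for multivariate classifiers.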
Open-Source 3D-Printable Optics Equipment
Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.
2013-01-01
Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated for the control of optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104
Envelope: interactive software for modeling and fitting complex isotope distributions.
Sykes, Michael T; Williamson, James R
2008-10-20
An important aspect of proteomic mass spectrometry involves quantifying and interpreting the isotope distributions arising from mixtures of macromolecules with different isotope labeling patterns. These patterns can be quite complex, in particular with in vivo metabolic labeling experiments producing fractional atomic labeling or fractional residue labeling of peptides or other macromolecules. In general, it can be difficult to distinguish the contributions of species with different labeling patterns to an experimental spectrum and difficult to calculate a theoretical isotope distribution to fit such data. There is a need for interactive and user-friendly software that can calculate and fit the entire isotope distribution of a complex mixture while comparing these calculations with experimental data and extracting the contributions from the differently labeled species. Envelope has been developed to be user-friendly while still being as flexible and powerful as possible. Envelope can simultaneously calculate the isotope distributions for any number of different labeling patterns for a given peptide or oligonucleotide, while automatically summing these into a single overall isotope distribution. Envelope can handle fractional or complete atom or residue-based labeling, and the contribution from each different user-defined labeling pattern is clearly illustrated in the interactive display and is individually adjustable. At present, Envelope supports labeling with 2H, 13C, and 15N, and supports adjustments for baseline correction, an instrument accuracy offset in the m/z domain, and peak width. Furthermore, Envelope can display experimental data superimposed on calculated isotope distributions, and calculate a least-squares goodness of fit between the two. All of this information is displayed on the screen in a single graphical user interface. Envelope supports high-quality output of experimental and calculated distributions in PNG or PDF format. 
Beyond simply comparing calculated distributions to experimental data, Envelope is useful for planning or designing metabolic labeling experiments, by visualizing hypothetical isotope distributions in order to evaluate the feasibility of a labeling strategy. Envelope is also useful as a teaching tool, with its real-time display capabilities providing a straightforward way to illustrate the key variable factors that contribute to an observed isotope distribution. Envelope is a powerful tool for the interactive calculation and visualization of complex isotope distributions for comparison to experimental data. It is available under the GNU General Public License from http://williamson.scripps.edu/envelope/.
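The core calculation behind a tool like Envelope can be sketched briefly: a molecule's isotope distribution is the repeated convolution of its atoms' distributions. This is an illustrative sketch, not Envelope's code; the carbon abundances are the standard natural values:

```python
# Minimal sketch of the core calculation behind tools like Envelope (this is
# not Envelope's implementation): a molecule's isotope distribution is the
# repeated convolution of its atoms' distributions. Carbon abundances are the
# standard natural values (~98.93% 12C, ~1.07% 13C).
def convolve(d1, d2):
    out = [0.0] * (len(d1) + len(d2) - 1)
    for i, p in enumerate(d1):
        for j, q in enumerate(d2):
            out[i + j] += p * q
    return out

CARBON = [0.9893, 0.0107]

def carbon_cluster_distribution(n):
    """Isotope distribution (mass offsets 0, 1, 2, ...) of n carbon atoms."""
    dist = [1.0]
    for _ in range(n):
        dist = convolve(dist, CARBON)
    return dist

dist = carbon_cluster_distribution(60)  # e.g. the carbon envelope of C60
print(dist[:3])
```

Fractional metabolic labeling amounts to replacing the natural abundance vector with an enrichment-dependent one, which is why mixtures of labeling patterns produce the complex envelopes the tool is designed to fit.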
Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven
2018-01-16
Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation on analysis of quantitative proteomics data. We provide key statistical concepts that help researchers to design proteomics experiments and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide easy inspection and also detection of anomalies in the data and flaws in the data analysis, allowing deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adoption of advanced peptide-based models, resulting in higher quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users that aim at automating MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.
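The ridge regression idea at the heart of the package can be sketched in a toy form. MSqRob is an R package; the Python fragment below is a hypothetical stand-in showing only the bare mechanics (two-coefficient model, fold-change coefficient penalized), not its actual peptide-level, robust implementation:

```python
# Hypothetical sketch, not MSqRob itself: ridge regression solves the normal
# equations (X^T X + lambda*P) beta = X^T y. Here a two-coefficient model
# (intercept + log2 fold change) is solved by hand, penalizing only the
# fold-change coefficient.
def fit(X, y, lam):
    a = sum(x0 * x0 for x0, _ in X)
    b = sum(x0 * x1 for x0, x1 in X)
    d = sum(x1 * x1 for _, x1 in X) + lam  # penalize only the fold change
    r0 = sum(x0 * yi for (x0, _), yi in zip(X, y))
    r1 = sum(x1 * yi for (_, x1), yi in zip(X, y))
    det = a * d - b * b
    return (d * r0 - b * r1) / det, (a * r1 - b * r0) / det

# Illustrative log2 peptide intensities: 3 control runs, then 3 treatment runs.
X = [(1, 0)] * 3 + [(1, 1)] * 3
y = [10.1, 9.9, 10.0, 11.2, 10.8, 11.0]
_, ols_fc = fit(X, y, lam=0.0)    # ordinary least squares estimate
_, ridge_fc = fit(X, y, lam=1.0)  # ridge shrinks the estimate toward zero
print(ols_fc, ridge_fc)
```

The shrinkage toward zero is what stabilizes fold-change estimates when few peptides or runs are available, at the cost of a small bias.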
Effect of β-Cyclodextrin Complexation on Solubility and Enzymatic Conversion of Naringin
Cui, Li; Zhang, Zhen-Hai; Sun, E; Jia, Xiao-Bin
2012-01-01
In the present paper, the effect of β-cyclodextrin (β-CD) inclusion complexation on the solubility and enzymatic hydrolysis of naringin was investigated. The inclusion complex of naringin/β-CD at the molar ratio of 1:1 was obtained by the dropping method and was characterized by differential scanning calorimetry. The solubility of naringin complexes in water at 37 ± 0.1 °C was 15 times greater than that of free naringin. Conditions for snailase-catalyzed hydrolysis were tested for the bioconversion of naringin into naringenin using a univariate experimental design. Naringin can be transformed into naringenin by snailase-catalyzed hydrolysis. The optimum conditions for enzymatic hydrolysis were determined as follows: pH 5.0, temperature 37 °C, ratio of snailase/substrate 0.8, substrate concentration 20 mg·mL−1, and reaction time 12 h. Under the optimum conditions, the conversion rate of naringin to naringenin was 98.7% for the inclusion complexes and 56.2% for free naringin, suggesting that β-CD complexation can improve the aqueous solubility and consequently the enzymatic hydrolysis rate of naringin. PMID:23203062
Extracting physics through deep data analysis
Strelcov, Evgheni; Belianinov, Alex; Sumpter, Bobby G.; ...
2014-10-31
In recent decades humankind has become very apt at generating and recording enormous amounts of data, ranging from tweets and selfies on social networks, to financial transactions in banks and stores. The scientific community has not shunned this popular trend and now routinely produces hundreds of petabytes of data per year [1]. This is because materials and phenomena in the world around us exist in an interweaved, entangled form, which gives rise to the complexity of the Universe and determines the size and complexity of the data that describes it. Science and technology endeavor to unravel this convolution and extract pure components from the mixtures, be it in ore mining and metal smelting or separation of thermal conductivity into the electronic and phononic contributions. Decomposition of complex behavior is the key to understanding manifestations of Nature. However, tools to carry out this task are not readily available, and therefore, intricate systems often remain well-characterized experimentally, but still not well understood due to intricacy of the collected data. Lastly, in materials science, understanding and ultimately designing new materials with complex properties will require the ability to integrate and analyze data from multiple instruments, including computational models, designed to probe complementary ranges of space, time, and energy.
Tlatli, Rym; Nozach, Hervé; Collet, Guillaume; Beau, Fabrice; Vera, Laura; Stura, Enrico; Dive, Vincent; Cuniasse, Philippe
2013-01-01
Artificial miniproteins that are able to target catalytic sites of matrix metalloproteinases (MMPs) were designed using a functional motif-grafting approach. The motif corresponded to the four N-terminal residues of TIMP-2, a broad-spectrum protein inhibitor of MMPs. Scaffolds that are able to reproduce the functional topology of this motif were obtained by exhaustive screening of the Protein Data Bank (PDB) using STAMPS software (search for three-dimensional atom motifs in protein structures). Ten artificial protein binders were produced. The designed proteins bind catalytic sites of MMPs with affinities ranging from 450 nM to 450 μM prior to optimization. The crystal structure of one artificial binder in complex with the catalytic domain of MMP-12 showed that the inter-molecular interactions established by the functional motif in the artificial binder corresponded to those found in the MMP-14-TIMP-2 complex, albeit with some differences in geometry. Molecular dynamics simulations of the ten binders in complex with MMP-14 suggested that these scaffolds may allow partial reproduction of native inter-molecular interactions, but differences in geometry and stability may contribute to the lower affinity of the artificial protein binders compared to the natural protein binder. Nevertheless, these results show that the in silico design method used provides sets of protein binders that target a specific binding site with a good rate of success. This approach may constitute the first step of an efficient hybrid computational/experimental approach to protein binder design. © 2012 The Authors Journal compilation © 2012 FEBS.
NASA Astrophysics Data System (ADS)
Bellini, Anna
Customer-driven product customization and continued demand for cost and time savings have generated a renewed interest in agile manufacturing based on improvements on Rapid Prototyping (RP) technologies. The advantages of RP technologies are: (1) ability to shorten the product design and development time, (2) suitability for automation and decrease in the level of human intervention, (3) ability to build many geometrically complex shapes. A shift from "prototyping" to "manufacturing" necessitates the following improvements: (1) Flexibility in choice of materials; (2) Part integrity and built-in characteristics to meet performance requirements; (3) Dimensional stability and tolerances; (4) Improved surface finish. A project funded by ONR has been undertaken to develop an agile manufacturing technology for fabrication of ceramic and multi-component parts to meet various needs of the Navy, such as transducers. The project is based on adaptation of a layered manufacturing concept, since the program required that the new technology be developed based on a commercially available RP technology. Among the various RP technologies available today, Fused Deposition Modeling (FDM) has been identified as the focus of this research because of its potential versatility in the choice of materials and deposition configuration. This innovative approach allows for designing and implementing highly complex internal architectures into parts through deposition of different materials in a variety of configurations, in such a way that the finished product exhibits characteristics meeting the performance requirements. This implies that, in principle, one can tailor-make the assembly of materials and structures per the specifications of an optimum design. The program objectives can be achieved only through accurate process modeling and modeling of material behavior.
Oftentimes, process modeling is based on some type of computational approach, whereas modeling of material behavior is based on extensive experimental investigations. Studies are conducted in the following categories: (1) Flow modeling during extrusion and deposition; (2) Thermal modeling; (3) Flow control during deposition; (4) Product characterization and property determination for dimensional analysis; (5) Development of a novel technology based on a mini-extrusion system. Studies in each of these stages have involved experimental as well as analytical approaches to develop comprehensive models.
NASA Astrophysics Data System (ADS)
Zobnina, V. G.; Kosevich, M. V.; Chagovets, V. V.; Boryak, O. A.
The problem of elucidating the structure of nanomaterials based on combinations of proteins and polyether polymers is addressed at the monomeric level of single amino acids and oligomers of the polyethers PEG-400 and OEG-5. The efficiency of a combined approach involving experimental electrospray mass spectrometry and computer modeling by molecular dynamics simulation is demonstrated. It is shown that oligomers of the polyethers form stable complexes with the amino acids valine, proline, histidine, and glutamic and aspartic acids. Molecular dynamics simulation has shown that stabilization of the amino acid-polyether complexes is achieved through winding of the polymeric chain around the charged groups of the amino acids. The structural motifs revealed for complexes of single amino acids with polyethers may also be realized in the structures of protein-polyether nanoparticles currently being designed for drug delivery.
Adjoint equations and analysis of complex systems: Application to virus infection modelling
NASA Astrophysics Data System (ADS)
Marchuk, G. I.; Shutyaev, V.; Bocharov, G.
2005-12-01
Recent development of applied mathematics is characterized by ever increasing attempts to apply modelling and computational approaches across various areas of the life sciences. The need for a rigorous analysis of complex system dynamics in immunology has been recognized for more than three decades. The aim of the present paper is to draw attention to the method of adjoint equations. The methodology makes it possible to obtain information about physical processes and to examine the sensitivity of complex dynamical systems. This provides a basis for a better understanding of the causal relationships between the immune system's performance and its parameters, and helps to improve the experimental design in the solution of applied problems. We show how the adjoint equations can be used to explain the changes in hepatitis B virus infection dynamics between individual patients.
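The adjoint machinery can be illustrated on a toy problem far simpler than the paper's virus-infection models. For the scalar model dx/dt = -a·x with objective J = x(T), the adjoint λ solves dλ/dt = a·λ backward in time with λ(T) = 1, and the sensitivity is dJ/da = -∫λx dt, which here has the closed form -T·x0·e^(-aT). The parameter values below are arbitrary:

```python
import math

# Toy sketch of the adjoint method (illustrative only): scalar model
# dx/dt = -a*x, x(0) = x0, objective J = x(T). The adjoint lam solves
# dlam/dt = a*lam backward in time with lam(T) = 1, and the parameter
# sensitivity is dJ/da = -integral(lam * x) dt.
a, x0, T, n = 0.5, 2.0, 1.0, 100_000
dt = T / n

# forward solve (explicit Euler), storing the full trajectory
xs = [x0]
for _ in range(n):
    xs.append(xs[-1] - dt * a * xs[-1])

# backward adjoint solve
lam = [0.0] * (n + 1)
lam[n] = 1.0
for k in range(n, 0, -1):
    lam[k - 1] = lam[k] - dt * a * lam[k]

# adjoint sensitivity vs. the analytic value -T * x0 * exp(-a*T)
dJ_da = -sum(lam[k] * xs[k] for k in range(n)) * dt
analytic = -T * x0 * math.exp(-a * T)
print(dJ_da, analytic)
```

The payoff of the adjoint formulation is that one backward solve yields the sensitivity of J with respect to every parameter at once, which is what makes sensitivity analysis of large immunological models tractable.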
Transformations of software design and code may lead to reduced errors
NASA Technical Reports Server (NTRS)
Connelly, E. M.
1983-01-01
The capability of programmers and non-programmers to specify problem solutions by developing example solutions (and, for the programmers, also by writing computer programs) was investigated; each method of specification was tested at various levels of problem complexity. The level of difficulty of each problem was reflected by the number of steps needed by the user to develop a solution. Machine processing of the user inputs permitted inferences to be developed about the algorithms required to solve a particular problem. The interactive feedback of processing results led users to a more precise definition of the desired solution. Two participant groups (programmers and bookkeepers/accountants) working with three levels of problem complexity and three levels of processor complexity were used. The experimental task required the specification of a logic for solving a Navy task force problem.
Digital Signal Processing and Control for the Study of Gene Networks
NASA Astrophysics Data System (ADS)
Shin, Yong-Jun
2016-04-01
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircrafts. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
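The viewpoint the article advocates can be illustrated with a minimal discrete-time model; the model form and all numbers below are hypothetical, not taken from the article:

```python
# Illustrative sketch: mRNA level as a first-order discrete-time (digital)
# system, m[n+1] = (1 - d) * m[n] + k * u[n], with degradation rate d per
# sample and transcription input u[n]. Parameter values are made up.
def simulate(d, k, u, m0=0.0):
    m = [m0]
    for un in u:
        m.append((1 - d) * m[-1] + k * un)
    return m

# Step input: transcription switched on and held; the level settles at k / d.
trace = simulate(d=0.1, k=2.0, u=[1.0] * 200)
print(trace[-1])  # approaches the steady state 2.0 / 0.1 = 20.0
```

Once a gene network is written in this difference-equation form, the standard digital control toolbox (transfer functions, filters, feedback design) applies directly, which is the article's central point.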
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, R.G.; Siergiej, J.M.
1962-12-28
In a program to develop a complete manufacturing process for the production of beryllium channels, techniques are being sought for drawing to obtain a final product meeting specifications more rigorous than are obtainable by direct extrusion. Progress in designing and procuring the special tooling required to draw complex shapes at elevated temperature is described, and the first set of draw dies is evaluated with respect to design and quality. Three experimental draw attempts have been made on U-channels, in addition to draw tests on flats. (auth)
Computation-Guided Backbone Grafting of a Discontinuous Motif onto a Protein Scaffold
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azoitei, Mihai L.; Correia, Bruno E.; Ban, Yih-En Andrew
2012-02-07
The manipulation of protein backbone structure to control interaction and function is a challenge for protein engineering. We integrated computational design with experimental selection for grafting the backbone and side chains of a two-segment HIV gp120 epitope, targeted by the cross-neutralizing antibody b12, onto an unrelated scaffold protein. The final scaffolds bound b12 with high specificity and with affinity similar to that of gp120, and crystallographic analysis of a scaffold bound to b12 revealed high structural mimicry of the gp120-b12 complex structure. The method can be generalized to design other functional proteins through backbone grafting.
Curvature by design and on demand in liquid crystal elastomers
NASA Astrophysics Data System (ADS)
Kowalski, B. A.; Mostajeran, C.; Godman, N. P.; Warner, M.; White, T. J.
2018-01-01
The shape of liquid crystalline elastomers (LCEs) with spatial variation in the director orientation can be transformed by exposure to a stimulus. Here, informed by previously reported analytical treatments, we prepare complex spiral patterns imprinted into LCEs and quantify the resulting shape transformation. Quantification of the stimuli-induced shapes reveals good agreement between predicted and experimentally observed curvatures. We conclude this communication by reporting a design strategy to allow LCE films to be anchored at their external boundaries onto rigid substrates without incurring internal, mechanical-mismatch stresses upon actuation, a critical advance to the realization of shape transformation of LCEs in practical device applications.
Design and Analysis of Single-Cell Sequencing Experiments.
Grün, Dominic; van Oudenaarden, Alexander
2015-11-05
Recent advances in single-cell sequencing hold great potential for exploring biological systems with unprecedented resolution. Sequencing the genome of individual cells can reveal somatic mutations and allows the investigation of clonal dynamics. Single-cell transcriptome sequencing can elucidate the cell type composition of a sample. However, single-cell sequencing comes with major technical challenges and yields complex data output. In this Primer, we provide an overview of available methods and discuss experimental design and single-cell data analysis. We hope that these guidelines will enable a growing number of researchers to leverage the power of single-cell sequencing. Copyright © 2015 Elsevier Inc. All rights reserved.