ERIC Educational Resources Information Center
Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju
2014-01-01
The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…
Experimental design methods for bioengineering applications.
Keskin Gündoğdu, Tuğba; Deniz, İrem; Çalışkan, Gülizar; Şahin, Erdem Sefa; Azbar, Nuri
2016-01-01
Experimental design is a form of process analysis in which certain factors are selected to obtain the desired responses of interest. It may also be used to determine the effects of various independent factors on a dependent factor. The bioengineering discipline includes many different areas of scientific interest, and each study area is affected and governed by many different factors. Briefly analyzing the important factors and selecting an experimental design for optimization are very effective tools for the design of any bioprocess in question. This review summarizes experimental design methods that can be used to investigate various factors relating to bioengineering processes. The experimental methods generally used in bioengineering are as follows: full factorial design, fractional factorial design, Plackett-Burman design, Taguchi design, Box-Behnken design and central composite design. These design methods are briefly introduced, and then the application of these design methods to study different bioengineering processes is analyzed.
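The first method in that list, the full factorial design, simply enumerates every combination of factor levels. A minimal sketch, where the factor names and levels are hypothetical examples rather than anything taken from the review:

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every run of a full factorial design.

    `levels` maps each factor name to its candidate settings;
    the design size is the product of the level counts.
    """
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

# Hypothetical two-level factors for a fermentation study.
design = full_factorial({"pH": [6.0, 7.0],
                         "temp_C": [30, 37],
                         "agitation_rpm": [150, 250]})
print(len(design))  # 2^3 = 8 runs
```

Fractional factorial, Plackett-Burman and the other designs named above run only a structured subset of these combinations, trading resolution for fewer experiments.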
Providing Guidance in Virtual Lab Experimentation: The Case of an Experiment Design Tool
ERIC Educational Resources Information Center
Efstathiou, Charalampos; Hovardas, Tasos; Xenofontos, Nikoletta A.; Zacharia, Zacharias C.; deJong, Ton; Anjewierden, Anjo; van Riesen, Siswa A. N.
2018-01-01
The present study employed a quasi-experimental design to assess a computer-based tool, which was intended to scaffold the task of designing experiments when using a virtual lab for the process of experimentation. In particular, we assessed the impact of this tool on primary school students' cognitive processes and inquiry skills before and after…
RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS
The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...
Experimental Learning Enhancing Improvisation Skills
ERIC Educational Resources Information Center
Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa
2016-01-01
Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…
Adaptive design of visual perception experiments
NASA Astrophysics Data System (ADS)
O'Connor, John D.; Hixson, Jonathan; Thomas, James M., Jr.; Peterson, Matthew S.; Parasuraman, Raja
2010-04-01
Meticulous experimental design may not always prevent confounds from affecting experimental data acquired during visual perception experiments. Although experimental controls reduce the potential effects of foreseen sources of interference, interaction, or noise, they are not always adequate for preventing the confounding effects of unforeseen forces. Visual perception experimentation is vulnerable to unforeseen confounds because of the nature of the associated cognitive processes involved in the decision task. Some confounds are beyond the control of experimentation, such as what a participant does immediately prior to experimental participation, or the participant's attitude or emotional state. Other confounds may occur through ignorance of practical control methods on the part of the experiment's designer. The authors conducted experiments related to experimental fatigue and initially achieved significant results that were, upon re-examination, attributable to a lack of adequate controls. Re-examination of the original results and the processes and events that led to them yielded a second experimental design with more experimental controls and significantly different results. The authors propose that designers of visual perception experiments can benefit from planning to use a test-fix-test or adaptive experimental design cycle, so that unforeseen confounds in the initial design can be remedied.
van Oostrom, Conny T.; Jonker, Martijs J.; de Jong, Mark; Dekker, Rob J.; Rauwerda, Han; Ensink, Wim A.; de Vries, Annemieke; Breit, Timo M.
2014-01-01
In transcriptomics research, design for experimentation by carefully considering biological, technological, practical and statistical aspects is very important, because the experimental design space is essentially limitless. Usually, the ranges of variable biological parameters of the design space are based on common practices and in turn on phenotypic endpoints. However, specific sub-cellular processes might only be partially reflected by phenotypic endpoints or outside the associated parameter range. Here, we provide a generic protocol for range finding in design for transcriptomics experimentation based on small-scale gene-expression experiments to help in the search for the right location in the design space by analyzing the activity of already known genes of relevant molecular mechanisms. Two examples illustrate the applicability: in-vitro UV-C exposure of mouse embryonic fibroblasts and in-vivo UV-B exposure of mouse skin. Our pragmatic approach is based on: framing a specific biological question and associated gene-set, performing a wide-ranged experiment without replication, eliminating potentially non-relevant genes, and determining the experimental ‘sweet spot’ by gene-set enrichment plus dose-response correlation analysis. Examination of many cellular processes that are related to UV response, such as DNA repair and cell-cycle arrest, revealed that basically each cellular (sub-) process is active at its own specific spot(s) in the experimental design space. Hence, the use of range finding, based on an affordable protocol like this, enables researchers to conveniently identify the ‘sweet spot’ for their cellular process of interest in an experimental design space and might have far-reaching implications for experimental standardization. PMID:24823911
How scientific experiments are designed: Problem solving in a knowledge-rich, error-rich environment
NASA Astrophysics Data System (ADS)
Baker, Lisa M.
While theory formation and the relation between theory and data has been investigated in many studies of scientific reasoning, researchers have focused less attention on reasoning about experimental design, even though the experimental design process makes up a large part of real-world scientists' reasoning. The goal of this thesis was to provide a cognitive account of the scientific experimental design process by analyzing experimental design as problem-solving behavior (Newell & Simon, 1972). Three specific issues were addressed: the effect of potential error on experimental design strategies, the role of prior knowledge in experimental design, and the effect of characteristics of the space of alternate hypotheses on alternate hypothesis testing. A two-pronged in vivo/in vitro research methodology was employed, in which transcripts of real-world scientific laboratory meetings were analyzed as well as undergraduate science and non-science majors' design of biology experiments in the psychology laboratory. It was found that scientists use a specific strategy to deal with the possibility of error in experimental findings: they include "known" control conditions in their experimental designs both to determine whether error is occurring and to identify sources of error. The known controls strategy had not been reported in earlier studies with science-like tasks, in which participants' responses to error had consisted of replicating experiments and discounting results. With respect to prior knowledge: scientists and undergraduate students drew on several types of knowledge when designing experiments, including theoretical knowledge, domain-specific knowledge of experimental techniques, and domain-general knowledge of experimental design strategies. Finally, undergraduate science students generated and tested alternates to their favored hypotheses when the space of alternate hypotheses was constrained and searchable. 
This result may help explain findings of confirmation bias in earlier studies using science-like tasks, in which characteristics of the alternate hypothesis space may have made it unfeasible for participants to generate and test alternate hypotheses. In general, scientists and science undergraduates were found to engage in a systematic experimental design process that responded to salient features of the problem environment, including the constant potential for experimental error, availability of alternate hypotheses, and access to both theoretical knowledge and knowledge of experimental techniques.
NASA Technical Reports Server (NTRS)
1981-01-01
The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of purchased equipment was received. The draft of the operations manual was about 50% complete and the design of the free-space system continued. The system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.
ERIC Educational Resources Information Center
d'Ham, Cedric; de Vries, Erica; Girault, Isabelle; Marzin, Patricia
2004-01-01
This paper deals with the design process of a remote laboratory for labwork in chemistry. In particular, it focuses on the mutual dependency of theoretical conjectures about learning in the experimental sciences and technological opportunities in creating learning environments. The design process involves a detailed analysis of the expert task and…
Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda
2002-01-01
A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.
Ng, Candy K S; Osuna-Sanchez, Hector; Valéry, Eric; Sørensen, Eva; Bracewell, Daniel G
2012-06-15
An integrated experimental and modeling approach for the design of high productivity protein A chromatography is presented to maximize productivity in bioproduct manufacture. The approach consists of four steps: (1) small-scale experimentation, (2) model parameter estimation, (3) productivity optimization and (4) model validation with process verification. The integrated use of process experimentation and modeling enables fewer experiments to be performed, and thus minimizes the time and materials required in order to gain process understanding, which is of key importance during process development. The application of the approach is demonstrated for the capture of antibody by a novel silica-based high performance protein A adsorbent named AbSolute. In the example, a series of pulse injections and breakthrough experiments were performed to develop a lumped parameter model, which was then used to find the best design that optimizes the productivity of a batch protein A chromatographic process for human IgG capture. An optimum productivity of 2.9 kg L⁻¹ day⁻¹ for a column of 5 mm diameter and 8.5 cm length was predicted, and subsequently verified experimentally, completing the whole process design approach in only 75 person-hours (or approximately 2 weeks). Copyright © 2012 Elsevier B.V. All rights reserved.
Dasgupta, Annwesa P.; Anderson, Trevor R.; Pelaez, Nancy J.
2016-01-01
Researchers, instructors, and funding bodies in biology education are unanimous about the importance of developing students’ competence in experimental design. Despite this, only limited measures are available for assessing such competence development, especially in the areas of molecular and cellular biology. Also, existing assessments do not measure how well students use standard symbolism to visualize biological experiments. We propose an assessment-design process that 1) provides background knowledge and questions for developers of new “experimentation assessments,” 2) elicits practices of representing experiments with conventional symbol systems, 3) determines how well the assessment reveals expert knowledge, and 4) determines how well the instrument exposes student knowledge and difficulties. To illustrate this process, we developed the Neuron Assessment and coded responses from a scientist and four undergraduate students using the Rubric for Experimental Design and the Concept-Reasoning Mode of representation (CRM) model. Some students demonstrated sound knowledge of concepts and representations. Other students demonstrated difficulty with depicting treatment and control group data or variability in experimental outcomes. Our process, which incorporates an authentic research situation that discriminates levels of visualization and experimentation abilities, shows potential for informing assessment design in other disciplines. PMID:27146159
Teaching Experimental Design to Elementary School Pupils in Greece
ERIC Educational Resources Information Center
Karampelas, Konstantinos
2016-01-01
This research is a study about the possibility to promote experimental design skills to elementary school pupils. Experimental design and the experiment process are foundational elements in current approaches to Science Teaching, as they provide learners with profound understanding about knowledge construction and science inquiry. The research was…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nabeel A. Riza
The goals of the first six months of this project were to lay the foundations for both the SiC front-end optical chip fabrication as well as the free-space laser beam interferometer designs and preliminary tests. In addition, a Phase I goal was to design and experimentally build the high temperature and pressure infrastructure and test systems that will be used in the next 6 months for proposed sensor experimentation and data processing. All these goals have been achieved and are described in detail in the report. Both design process and diagrams for the mechanical elements as well as the optical systems are provided. In addition, photographs of the fabricated SiC optical chips, the high temperature & pressure test chamber instrument, the optical interferometer, the SiC sample chip holder, and signal processing data are provided. The design and experimentation results are summarized to give positive conclusions on the proposed novel high temperature optical sensor technology. The goals of the second six months of this project were to conduct high temperature sensing tests using the test chamber and optical sensing instrument designs developed in the first part of the project. In addition, a Phase I goal was to develop the basic processing theory and physics for the proposed first sensor experimentation and data processing. All these goals have been achieved and are described in detail. Both optical experimental design process and sensed temperature are provided. In addition, photographs of the fabricated SiC optical chips after deployment in the high temperature test chamber are shown from a material study point-of-view.
Robust parameter design for automatically controlled systems and nanostructure synthesis
NASA Astrophysics Data System (ADS)
Dasgupta, Tirthankar
2007-12-01
This research focuses on developing comprehensive frameworks for developing robust parameter design methodology for dynamic systems with automatic control and for synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, at large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a Multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations. 
The second part of the research deals with the development of an experimental design methodology tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design called Sequential Minimum Energy Design (SMED) is proposed for exploring the best process conditions for the synthesis of nanowires. SMED is a novel approach to generating sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
Inverse problems in the design, modeling and testing of engineering systems
NASA Technical Reports Server (NTRS)
Alifanov, Oleg M.
1991-01-01
Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.
Aviation Human-in-the-Loop Simulation Studies: Experimental Planning, Design, and Data Management
2014-01-01
Kevin W. Williams, Bonny Christopher, Gena… We describe the process by which we designed our human-in-the-loop (HITL) simulation study and the methodology used to collect and analyze the results.
Liu, Huolong; Galbraith, S C; Ricart, Brendon; Stanton, Courtney; Smith-Goettler, Brandye; Verdi, Luke; O'Connor, Thomas; Lee, Sau; Yoon, Seongkyu
2017-06-15
In this study, the influence of key process variables (screw speed, throughput and liquid to solid (L/S) ratio) of a continuous twin screw wet granulation (TSWG) was investigated using a central composite face-centered (CCF) experimental design method. Regression models were developed to predict the process responses (motor torque, granule residence time), granule properties (size distribution, volume average diameter, yield, relative width, flowability) and tablet properties (tensile strength). The effects of the three key process variables were analyzed via contour and interaction plots. The experimental results demonstrated that all the process responses, granule properties and tablet properties are influenced by changing the screw speed, throughput and L/S ratio. The TSWG process was optimized to produce granules with a specific volume average diameter of 150 μm and a yield of 95% based on the developed regression models. A design space (DS) was built based on a volume average granule diameter between 90 and 200 μm and a granule yield greater than 75%, with a failure probability analysis using Monte Carlo simulations. Validation experiments successfully confirmed the robustness and accuracy of the DS generated using the CCF experimental design in optimizing a continuous TSWG process. Copyright © 2017 Elsevier B.V. All rights reserved.
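A face-centered central composite (CCF) design like the one used in that study is a standard construction in coded units: 2^k factorial corners, 2k axial points pulled in to the faces of the cube (α = 1), and a center point. A minimal sketch (replicated center runs, which CCF designs normally include, are omitted for brevity):

```python
from itertools import product

def ccf_design(k):
    """Coded points of a face-centered central composite design:
    2^k factorial corners, 2k axial (face-center) points, one center."""
    corners = [list(p) for p in product([-1, 1], repeat=k)]
    axial = []
    for i in range(k):
        for a in (-1, 1):
            pt = [0] * k
            pt[i] = a  # axial point on the face of factor i
            axial.append(pt)
    return corners + axial + [[0] * k]

# Three factors (e.g., screw speed, throughput, L/S ratio): 8 + 6 + 1 points.
print(len(ccf_design(3)))  # 15 distinct design points
```

The coded ±1/0 levels are then mapped to the actual ranges of the process variables before running the experiments.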
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
A Surrogate Approach to the Experimental Optimization of Multielement Airfoils
NASA Technical Reports Server (NTRS)
Otto, John C.; Landman, Drew; Patera, Anthony T.
1996-01-01
The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality, and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing edge flap position to achieve a design lift coefficient for a three-element airfoil.
Distributed digital signal processors for multi-body structures
NASA Technical Reports Server (NTRS)
Lee, Gordon K.
1990-01-01
Several digital filter designs were investigated which may be used to process sensor data from large space structures and to design digital hardware to implement the distributed signal processing architecture. Several experimental test articles are available at NASA Langley Research Center to evaluate these designs. A summary of some of the digital filter designs is presented, an evaluation of their characteristics relative to control design is discussed, and candidate hardware microcontroller/microcomputer components are given. Future activities include software evaluation of the digital filter designs and actual hardware implementation of some of the signal processor algorithms on an experimental testbed at NASA Langley.
NASA Technical Reports Server (NTRS)
1980-01-01
The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for technological design and economical analysis for potential scale-up of the process. Iterative economic analyses were conducted for the estimated product cost for the production of semiconductor grade silicon in a facility capable of producing 1000-MT/Yr.
The Experimental Design Assistant.
Percie du Sert, Nathalie; Bamsey, Ian; Bate, Simon T; Berdoy, Manuel; Clark, Robin A; Cuthill, Innes; Fry, Derek; Karp, Natasha A; Macleod, Malcolm; Moon, Lawrence; Stanford, S Clare; Lings, Brian
2017-09-01
Addressing the common problems that researchers encounter when designing and analysing animal experiments will improve the reliability of in vivo research. In this article, the Experimental Design Assistant (EDA) is introduced. The EDA is a web-based tool that guides the in vivo researcher through the experimental design and analysis process, providing automated feedback on the proposed design and generating a graphical summary that aids communication with colleagues, funders, regulatory authorities, and the wider scientific community. It will have an important role in addressing causes of irreproducibility.
The Effect of Multispectral Image Fusion Enhancement on Human Efficiency
2017-03-20
performance of the ideal observer is indicative of the relative amount of information across various experimental manipulations. In our experimental design ... registration and fusion processes, and contributed strongly to the statistical analyses. LMB contributed to the experimental design and writing structure. All ... designed to be innovative, low-cost, and (relatively) easy-to-implement, and to provide support across the spectrum of possible users including
1981-01-01
per-rev, ring weighting factor, etc.) and with compression system design. A detailed description of the SAE methodology is provided in Ref. 1 ... offers insights into the practical application of experimental aeromechanical procedures and establishes the process of valid design assessment, avoiding ... considerations given to the total engine system. Design Verification in the Experimental Laboratory: Certain key parameters are influencing the design of modern
Conducting Research in Technical Communication: The Application of True Experimental Designs.
ERIC Educational Resources Information Center
Spyridakis, Jan H.
1992-01-01
Explains the use of true experimental designs in technical communication research, lists the eight steps in the research process, and concludes that, with further reading, practitioners should be able to read research studies critically and perhaps design empirical studies of their own. (SR)
Pant, Apourv; Rai, J P N
2018-04-15
A two-phase bioreactor was constructed, designed and developed to evaluate chlorpyrifos remediation. Six biotic and abiotic factors (substrate-loading rate, slurry phase pH, slurry phase dissolved oxygen (DO), soil-water ratio, temperature and soil microflora load) were evaluated by design of experiments (DOE) methodology employing Taguchi's orthogonal array (OA). The six selected factors were considered at two levels in an L-8 array (2^7) in the experimental design. The optimum operating conditions obtained from the methodology enhanced chlorpyrifos degradation from 283.86 µg/g to 955.364 µg/g, an overall enhancement of 70.34%. In the present study, with the help of a few well-defined experimental parameters, a mathematical model was constructed to understand the complex bioremediation process and optimize the approximate parameters to great accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
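The L8(2^7) orthogonal array used in that study is a standard object: 8 runs, 7 two-level columns, every column balanced. One common way to build it is from three base columns, taking each array column as the XOR of a nonempty subset of the base bits; the construction code below is an illustrative sketch, not the paper's procedure:

```python
from itertools import product

def taguchi_L8():
    """Construct the L8(2^7) orthogonal array from three base columns.
    Each of the 7 columns is the XOR of a nonempty subset of the base bits."""
    rows = []
    for base in product([0, 1], repeat=3):
        row = []
        for subset in range(1, 8):  # nonempty subsets 001 .. 111
            col = 0
            for i in range(3):
                if subset >> i & 1:
                    col ^= base[i]
            row.append(col)
        rows.append(row)
    return rows

L8 = taguchi_L8()
# Balance check: each column has four 0s and four 1s across the 8 runs.
print(all(sum(row[c] for row in L8) == 4 for c in range(7)))  # True
```

With six factors assigned to six of the seven columns, each factor's main effect can be estimated from only 8 of the 2^6 = 64 possible full-factorial runs.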
Optimal Experimental Design for Model Discrimination
Myung, Jay I.; Pitt, Mark A.
2009-01-01
Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983
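The core idea above, choosing design-variable values that best differentiate competing models, can be illustrated with a toy version of the retention example: pick the lag at which two candidate forgetting curves disagree most. The models, parameter values, and simple grid criterion below are illustrative assumptions, not the paper's actual sampling-based design-optimization method:

```python
import math

# Two hypothetical retention models with fixed, assumed parameters.
def exp_model(t, a=0.9, b=0.3):
    """Exponential forgetting: recall probability decays as exp(-b*t)."""
    return a * math.exp(-b * t)

def pow_model(t, a=0.9, b=0.5):
    """Power-law forgetting: recall probability decays as (1+t)^(-b)."""
    return a * (1 + t) ** -b

# Crude design criterion: test at the retention interval where the two
# models' predictions differ most, since data there discriminate best.
lags = [t / 10 for t in range(1, 301)]  # candidate lags 0.1 .. 30.0
best_lag = max(lags, key=lambda t: abs(exp_model(t) - pow_model(t)))
print(best_lag)
```

Real design optimization also accounts for parameter uncertainty and measurement noise, which is why sampling-based search is needed rather than a one-dimensional grid.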
1983-02-01
blow-off stability and fractional conversion was evaluated for design of an experimental study of these phenomena. The apparatus designed will be ... the development of an array of experimental methods and test strategies designed to unravel a complex process that is very difficult to observe directly ... this effort of lead field theoretic analysis as a design basis has made that possible. The experimental phase of the effort has three major
2016-03-18
SPONSORED REPORT SERIES Understanding Complexity and Self-Organization in a Defense Program Management Organization (Experimental Design ... experiment will examine the decision-making process within the program office and the self-organization of key program office personnel based upon formal ... and informal communications links. Additionally, we are interested in the effects of this self-organizing process on the organization's shared
A Reverse Osmosis System for an Advanced Separation Process Laboratory.
ERIC Educational Resources Information Center
Slater, C. S.; Paccione, J. D.
1987-01-01
Focuses on the development of a pilot unit for use in an advanced separations process laboratory in an effort to develop experiments on such processes as reverse osmosis, ultrafiltration, adsorption, and chromatography. Discusses reverse osmosis principles, the experimental system design, and some experimental studies. (TW)
Assessing the Effectiveness of a Computer Simulation for Teaching Ecological Experimental Design
ERIC Educational Resources Information Center
Stafford, Richard; Goodenough, Anne E.; Davies, Mark S.
2010-01-01
Designing manipulative ecological experiments is a complex and time-consuming process that is problematic to teach in traditional undergraduate classes. This study investigates the effectiveness of using a computer simulation--the Virtual Rocky Shore (VRS)--to facilitate rapid, student-centred learning of experimental design. We gave a series of…
Space-filling designs for computer experiments: A review
Joseph, V. Roshan
2016-01-29
Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given for a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
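A simple way to build a space-filling design of the kind reviewed above is a maximin Latin hypercube search. The sketch below is a crude stand-in (random candidates scored by smallest pairwise distance), not the maximum projection construction from the article:

```python
import random
from itertools import combinations

def latin_hypercube(n, d, rng):
    """One random Latin hypercube: n points in [0,1]^d, each axis stratified
    into n equal bins containing exactly one point."""
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return list(zip(*cols))

def maximin_lhs(n, d, tries=200, seed=0):
    """Keep the candidate hypercube whose smallest pairwise distance is
    largest (a maximin space-filling criterion)."""
    rng = random.Random(seed)
    best, best_score = None, -1.0
    for _ in range(tries):
        pts = latin_hypercube(n, d, rng)
        score = min(
            sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
            for p, q in combinations(pts, 2)
        )
        if score > best_score:
            best, best_score = pts, score
    return best

design = maximin_lhs(8, 2)  # 8 simulator runs over 2 input variables
```

Each returned point can then be scaled from the unit cube to the physical ranges of the simulator inputs.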
Theoretical and experimental researches of the liquid evaporation during thermal vacuum influences
NASA Astrophysics Data System (ADS)
Trushlyakov, V.; Panichkin, A.; Prusova, O.; Zharikov, K.; Dron, M.
2018-01-01
A mathematical model of the evaporation process of a model liquid, with free-surface boundary conditions of the "mirror" type under thermal vacuum influence, and numerical estimates of the evaporation process parameters are developed. An experimental stand comprising a vacuum chamber and an experimental model tank with a heating element was designed, and experimental data were obtained. A comparative analysis of the numerical and experimental results showed close agreement.
Prediction of Cutting Force in Turning Process-an Experimental Approach
NASA Astrophysics Data System (ADS)
Thangarasu, S. K.; Shankar, S.; Thomas, A. Tony; Sridhar, G.
2018-02-01
This paper deals with the prediction of cutting forces in a turning process. Turning with an advanced cutting tool has several advantages over grinding, such as short cycle time, process flexibility, comparable surface roughness, high material removal rate and fewer environmental problems through elimination of cutting fluid. A full-bridge dynamometer was used to measure the cutting forces on a mild steel workpiece machined with a cemented carbide insert tool for different combinations of cutting speed, feed rate and depth of cut. The experiments were planned based on a Taguchi design, and the measured cutting forces were compared with the predicted forces to validate the feasibility of the proposed design. The percentage contribution of each process parameter was analyzed using Analysis of Variance (ANOVA). Both the experimental results from the lathe tool dynamometer and those from the designed full-bridge dynamometer were analyzed using the Taguchi design of experiments and ANOVA.
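The "percentage contribution" from ANOVA mentioned above has a closed form for balanced two-level factors: SS_factor = (T_high - T_low)^2 / N, divided by the total sum of squares. A minimal sketch with hypothetical force data (a 2^2 factorial; these are not the study's measurements):

```python
def ss_two_level(levels, y):
    """Sum of squares for a balanced two-level factor:
    SS = (T_high - T_low)^2 / N, where T are response totals per level."""
    n = len(y)
    t_low = sum(yi for li, yi in zip(levels, y) if li == 0)
    t_high = sum(yi for li, yi in zip(levels, y) if li == 1)
    return (t_high - t_low) ** 2 / n

# Hypothetical 2^2 full factorial: factors A, B and measured cutting force (N)
A = [0, 0, 1, 1]
B = [0, 1, 0, 1]
y = [220.0, 240.0, 300.0, 330.0]

mean = sum(y) / len(y)
ss_total = sum((yi - mean) ** 2 for yi in y)

# Percentage contribution of each factor to the total variation
contrib = {name: ss_two_level(f, y) / ss_total * 100
           for name, f in (("A", A), ("B", B))}
```

Whatever is left over (100% minus the factor contributions) is attributed to interactions and experimental error.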
Morales-Pérez, Ariadna A; Maravilla, Pablo; Solís-López, Myriam; Schouwenaars, Rafael; Durán-Moreno, Alfonso; Ramírez-Zamora, Rosa-María
2016-01-01
An experimental design methodology was used to optimize the synthesis of an iron-supported nanocatalyst as well as the inactivation process of Ascaris eggs (Ae) using this material. A factor screening design was used to identify the significant experimental factors for the nanocatalyst support (supported %Fe (w/w), temperature and time of calcination) and for the inactivation process, a heterogeneous Fenton-like reaction (H2O2 dose, Fe/H2O2 mass ratio, pH and reaction time). The optimization of the significant factors was carried out using a face-centered central composite design. The optimal operating conditions for both processes were estimated with a statistical model and implemented experimentally with five replicates. The predicted value of the Ae inactivation rate was close to the laboratory results. At the optimal operating conditions of the nanocatalyst production and Ae inactivation process, the Ascaris ova showed genomic damage to the point that no cellular repair was possible, showing that this advanced oxidation process was highly efficient for inactivating this pathogen.
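A face-centered central composite design like the one used above is easy to enumerate in coded units. A sketch under the usual convention (2^k factorial corners, 2k face-centered axial points, replicated center points; the factor count and number of center replicates here are illustrative):

```python
from itertools import product

def face_centered_ccd(k, n_center=3):
    """Face-centered central composite design in coded units (-1, 0, +1):
    2^k factorial corners, 2k axial points on the cube faces, plus
    n_center replicated center points."""
    corners = [list(p) for p in product((-1, 1), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-1, 1):
            pt = [0] * k
            pt[i] = s
            axial.append(pt)
    centers = [[0] * k for _ in range(n_center)]
    return corners + axial + centers

design = face_centered_ccd(3)  # 8 corners + 6 axial + 3 centers = 17 runs
```

Each coded coordinate is then mapped linearly onto the physical low/mid/high settings of the corresponding factor.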
Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M
2009-10-15
A licensed pharmaceutical process is required to be executed within its validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for processes involving biological products, usually require the manufacturer to demonstrate through new or additional clinical testing that the safety and efficacy of the product remain unchanged. Recent changes in the regulations for pharmaceutical processing allow broader ranges of process settings to be submitted for regulatory approval, the so-called process design space. This means that a manufacturer can optimize the process within the submitted ranges after the product has entered the market, allowing flexible processes. In this article, the applicability of the process design space concept is investigated for the cultivation process step of a vaccine against whooping cough. A design of experiments (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data of all processes are integrated into a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and the process design space can be applied to an undefined biological product such as a whole-cell vaccine. The approach to model development described here allows on-line monitoring and control of cultivation batches to assure in real time that a process is running within the process design space.
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed, frequency-doubled Nd:YAG laser (λ = 532 nm, τ_pulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (outputs), using Design of Experiments (DOE) methods: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed, and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
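The RSM step above amounts to fitting a low-order polynomial to the measured responses. As a hedged, single-factor sketch (the data below are fabricated to lie on a known parabola; the actual study fits a multi-factor model in dedicated software), least squares via the normal equations:

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal equations
    (X^T X) b = X^T y."""
    X = [[1.0, x, x * x] for x in xs]
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(3)]
           for i in range(3)]
    xty = [sum(X[r][i] * ys[r] for r in range(len(X))) for i in range(3)]
    return solve(xtx, xty)

# Hypothetical coded scan-speed settings vs. a roughness response that
# lies exactly on 1 + x^2, so the fit should recover (1, 0, 1)
speeds = [-1.0, -0.5, 0.0, 0.5, 1.0]
rough = [2.0, 1.25, 1.0, 1.25, 2.0]
b0, b1, b2 = fit_quadratic(speeds, rough)
```

With real, noisy measurements the fitted surface is then optimized (or its stationary point located) over the coded factor ranges.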
RESEARCH DESIGNS IN SPORTS PHYSICAL THERAPY
2012-01-01
Research is designed to answer a question or to describe a phenomenon in a scientific process. Sports physical therapists must understand the different research methods, types, and designs in order to implement evidence‐based practice. The purpose of this article is to describe the most common research designs used in sports physical therapy research and practice. Both experimental and non‐experimental methods will be discussed. PMID:23091780
Development and Evaluation of an Intuitive Operations Planning Process
2006-03-01
designed to be iterative and also prescribes the way in which iterations should occur. On the other hand, participants' perceived level of trust and... 4. DESIGN AND METHOD OF THE EXPERIMENTAL EVALUATION OF THE INTUITIVE PLANNING PROCESS... 4.1.3 Design
Sakkas, Vasilios A; Islam, Md Azharul; Stalikas, Constantine; Albanis, Triantafyllos A
2010-03-15
The use of chemometric methods such as response surface methodology (RSM) based on statistical design of experiments (DOE) is becoming increasingly widespread in several sciences such as analytical chemistry, engineering and environmental chemistry. Applied catalysis is certainly no exception. It is clear that photocatalytic processes combined with chemometric experimental design play a crucial role in the ability to reach the optimum of the catalytic reactions. The present article reviews the major applications of RSM in modern experimental design combined with photocatalytic degradation processes. Moreover, the theoretical principles and designs that enable obtaining a polynomial regression equation, which expresses the influence of process parameters on the response, are thoroughly discussed. An original experimental work, the photocatalytic degradation of the dye Congo red (CR) using TiO(2) suspensions and H(2)O(2) in natural surface water (river water), is comprehensively described as a case study, in order to provide sufficient guidelines to deal with this subject in a rational and integrated way. (c) 2009 Elsevier B.V. All rights reserved.
Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng
2017-12-01
Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
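The dilution-series logic described above can be made concrete with a small sketch: fit a zero-intercept proportional model, report the replicate coefficient of variation (precision), and an R² for proportionality. The counts below are fabricated for illustration and are not data from the study:

```python
from statistics import mean, stdev

# Hypothetical cell counts (cells/mL scale) for a dilution series,
# three replicate observations at each dilution fraction
dilution = [1.0, 0.5, 0.25]
counts = {1.0: [980, 1020, 1005], 0.5: [495, 510, 500], 0.25: [245, 255, 250]}

# Proportional model through the origin: count = slope * dilution_fraction
num = sum(f * c for f in dilution for c in counts[f])
den = sum(f * f * len(counts[f]) for f in dilution)
slope = num / den

# Precision: coefficient of variation (%) of replicates at each level
cv = {f: 100 * stdev(counts[f]) / mean(counts[f]) for f in dilution}

# Proportionality: R^2 of the zero-intercept fit over all observations
all_obs = [(f, c) for f in dilution for c in counts[f]]
ybar = mean(c for _, c in all_obs)
ss_res = sum((c - slope * f) ** 2 for f, c in all_obs)
ss_tot = sum((c - ybar) ** 2 for _, c in all_obs)
r2 = 1 - ss_res / ss_tot
```

Comparing two counting methods then reduces to comparing their CVs and proportionality indices on the same dilution series, with no reference material required.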
Politis, Stavros N; Rekkas, Dimitrios M
2017-04-01
A novel hot melt direct pelletization method was developed, characterized and optimized using statistical thinking and experimental design tools. Mixtures of carnauba wax (CW) and HPMC K100M were spheronized using melted Gelucire 50/13 as a binding material (BM). Experimentation was performed sequentially: a fractional factorial design was set up initially to screen the factors affecting the process, namely spray rate, quantity of BM, rotor speed, type of rotor disk, presence of lubricant-glidant, additional spheronization time, powder feeding rate and quantity. Of the eight factors assessed, three were further studied during process optimization (spray rate, quantity of BM and powder feeding rate), at different ratios of the solid mixture of CW and HPMC K100M. The study demonstrated that the novel hot melt process is fast, efficient, reproducible and predictable. Therefore, it can be adopted in a lean and agile manufacturing setting for the production of flexible pellet dosage forms with various release rates easily customized between immediate and modified delivery.
Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy J
2016-01-01
Researchers, instructors, and funding bodies in biology education are unanimous about the importance of developing students' competence in experimental design. Despite this, only limited measures are available for assessing such competence development, especially in the areas of molecular and cellular biology. Also, existing assessments do not measure how well students use standard symbolism to visualize biological experiments. We propose an assessment-design process that 1) provides background knowledge and questions for developers of new "experimentation assessments," 2) elicits practices of representing experiments with conventional symbol systems, 3) determines how well the assessment reveals expert knowledge, and 4) determines how well the instrument exposes student knowledge and difficulties. To illustrate this process, we developed the Neuron Assessment and coded responses from a scientist and four undergraduate students using the Rubric for Experimental Design and the Concept-Reasoning Mode of representation (CRM) model. Some students demonstrated sound knowledge of concepts and representations. Other students demonstrated difficulty with depicting treatment and control group data or variability in experimental outcomes. Our process, which incorporates an authentic research situation that discriminates levels of visualization and experimentation abilities, shows potential for informing assessment design in other disciplines. © 2016 A. P. Dasgupta et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
NASA Astrophysics Data System (ADS)
Rozhaeva, K.
2018-01-01
The aim of the research is to improve the quality of the design process, at the research stage, for the development of an active on-board descent system for spent launch-vehicle stages with liquid-propellant rocket engines, by simulating the gasification of unused fuel residues in the tanks. A design technique for the gasification process of liquid rocket propellant residues in the tank is proposed that finds and fixes errors in the calculation algorithm to increase the accuracy of the results. Experimental modelling of model-liquid evaporation in a limited reservoir of the experimental stand allows the rejection of false measurements based on given criteria and detected faults, enhancing the reliability of the experimental studies and reducing the cost of the experiments.
ERIC Educational Resources Information Center
Sen, Ceylan; Sezen Vekli, Gülsah
2016-01-01
The aim of this study is to determine the influence of inquiry-based teaching approach on pre-service science teachers' laboratory self-efficacy perceptions and scientific process skills. The quasi experimental model with pre-test-post-test control group design was used as an experimental design in this research. The sample of this study included…
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.
1994-01-01
This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.
A Human-Centered Command and Control (C2) Assessment of an Experimental Campaign Planning Tool
2014-04-01
and control (team without the CPT) groups. The two groups were designed to have an equal number of members; however, one member of the experimental... the researchers to analyze the planning process and outcomes. 3.3 Design and Procedure An experimental versus control group design was implemented... the post-PFnet (figure 16b). Within the PFnets, a concept can be focused on in order to identify how the individual or group is defining or
Analysis of the influence of manufacturing and alignment related errors on an optical tweezer system
NASA Astrophysics Data System (ADS)
Kampmann, R.; Sinzinger, S.
2014-12-01
In this work we present the design process as well as experimental results of an optical system for trapping particles in air. For positioning applications of micro-sized objects onto a glass wafer, we developed a highly efficient optical tweezer. The focus of this paper is the iterative design process, in which we combine classical optics design software with a ray-optics-based force simulation tool. Thus we can find the best compromise that matches the optical system's restrictions with stable trapping conditions. Furthermore, we analyze the influence of manufacturing-related tolerances and errors in the alignment process of the optical elements on the optical forces. We present the design procedure for the necessary optical elements as well as experimental results for the aligned system.
Reed Johnson, F; Lancsar, Emily; Marshall, Deborah; Kilambi, Vikram; Mühlbacher, Axel; Regier, Dean A; Bresnahan, Brian W; Kanninen, Barbara; Bridges, John F P
2013-01-01
Stated-preference methods are a class of evaluation techniques for studying the preferences of patients and other stakeholders. While these methods span a variety of techniques, conjoint-analysis methods, and particularly discrete-choice experiments (DCEs), have become the most frequently applied approach in health care in recent years. Experimental design is an important stage in the development of such methods, but establishing a consensus on standards is hampered by lack of understanding of available techniques and software. This report builds on the previous ISPOR Conjoint Analysis Task Force Report: Conjoint Analysis Applications in Health-A Checklist: A Report of the ISPOR Good Research Practices for Conjoint Analysis Task Force. This report aims to assist researchers specifically in evaluating alternative approaches to experimental design, a difficult and important element of successful DCEs. While this report does not endorse any specific approach, it does provide a guide for choosing an approach that is appropriate for a particular study. In particular, it provides an overview of the role of experimental designs for the successful implementation of the DCE approach in health care studies, and it provides researchers with an introduction to constructing experimental designs on the basis of study objectives and the statistical model researchers have selected for the study. The report outlines the theoretical requirements for designs that identify choice-model preference parameters and summarizes and compares a number of available approaches for constructing experimental designs. The task-force leadership group met via bimonthly teleconferences and in person at ISPOR meetings in the United States and Europe. An international group of experimental-design experts was consulted during this process to discuss existing approaches for experimental design and to review the task force's draft reports.
In addition, ISPOR members contributed to developing a consensus report by submitting written comments during the review process and oral comments during two forum presentations at the ISPOR 16th and 17th Annual International Meetings held in Baltimore (2011) and Washington, DC (2012). Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
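The starting point for most DCE designs discussed above is a candidate set of profiles built from attribute levels. A toy sketch with hypothetical attributes (the naive pairing shown is for illustration only; as the report notes, practical DCEs use D-efficient or similar optimized designs):

```python
from itertools import product

# Hypothetical DCE attributes and levels (illustrative, not from the report)
attributes = {
    "efficacy": ["low", "medium", "high"],
    "risk": ["1%", "5%"],
    "cost": ["$10", "$50"],
}

# Full factorial candidate set: every combination of attribute levels
names = list(attributes)
candidates = [dict(zip(names, combo))
              for combo in product(*attributes.values())]

# A deliberately simple (non-optimal) choice-set construction: pair each
# profile with the next one, cycling through the candidate set
choice_sets = [(candidates[i], candidates[(i + 1) % len(candidates)])
               for i in range(len(candidates))]
```

Replacing the naive pairing with an algorithm that maximizes design efficiency for the chosen choice model is exactly the step the task force report compares approaches for.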
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Considering the complex content of virtual simulation experiments and their lack of guidance, the key-event technique from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event-processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging," based on biological morphology, was taken as an example, and many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and keep the experimental process complete and effective.
Yoon, Hyejin; Leitner, Thomas
2014-12-17
Analyses of entire viral genomes or mtDNA require comprehensive design of many primers across the genome. In addition, simultaneous optimization of several DNA primer design criteria may improve overall experimental efficiency and downstream bioinformatic processing. To achieve these goals, we developed PrimerDesign-M. It includes several options for multiple-primer design, allowing researchers to efficiently design walking primers that cover long DNA targets, such as entire HIV-1 genomes, and that are optimized simultaneously with respect to genetic diversity in multiple alignments and experimental design constraints given by the user. PrimerDesign-M can also design primers that include DNA barcodes and minimize primer dimerization. PrimerDesign-M finds optimal primers for highly variable DNA targets and facilitates design flexibility by suggesting alternative designs to adapt to experimental conditions.
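Primer design criteria of the kind PrimerDesign-M optimizes typically include GC content and melting temperature. A toy sketch of two such metrics (this is not PrimerDesign-M's algorithm; the Wallace rule below applies only to short oligos, and the sequence is hypothetical):

```python
def gc_content(seq):
    """Fraction of G/C bases in a primer sequence."""
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s)

def wallace_tm(seq):
    """Wallace-rule melting temperature (deg C) for short (~<14 nt) oligos:
    Tm = 2*(A+T) + 4*(G+C)."""
    s = seq.upper()
    at = s.count("A") + s.count("T")
    gc = s.count("G") + s.count("C")
    return 2 * at + 4 * gc

primer = "ATGCGTACGTAC"  # hypothetical 12-mer
gc = gc_content(primer)
tm = wallace_tm(primer)
```

A multi-primer tool additionally scores candidates against an alignment (coverage of variable positions) and against each other (dimerization), which is where the combinatorial optimization lies.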
Dalwadi, Chintan; Patel, Gayatri
2016-01-01
The purpose of this study was to investigate the Quality by Design (QbD) principle for the preparation of hydrogel products, to prove both the practicability and the utility of applying the QbD concept to hydrogel-based controlled release systems. Product and process understanding helps decrease the variability of critical material and process parameters, which yields quality product output and reduces risk. This study includes the identification of the Quality Target Product Profile (QTPP) and Critical Quality Attributes (CQAs) from the literature or preliminary studies. To identify and control the variability in process and material attributes, two QbD tools were utilized: Quality Risk Management (QRM) and experimental design. These also help to identify the effect of these attributes on the CQAs. Potential risk factors were identified from a fishbone diagram, screened by risk assessment and optimized by a 3-level, 2-factor experimental design with center points in triplicate, to analyze the precision of the target process. The optimized formulation was further characterized by gelling time, gelling temperature, rheological parameters, in-vitro biodegradation and in-vitro drug release. A design space was created using the experimental design tool; working within this control space keeps all failure modes below the risk level. In conclusion, the QbD approach with the QRM tool provides a potent and effective framework for building quality into the hydrogel.
Choi, Du Hyung; Shin, Sangmun; Khoa Viet Truong, Nguyen; Jeong, Seong Hoon
2012-09-01
A robust experimental design method was developed with the well-established response surface methodology and time-series modeling to facilitate the formulation development process for hydrophilic matrix tablets incorporating magnesium stearate. Two directional analyses and a time-oriented model were utilized to optimize the experimental responses. Evaluations of tablet gelation and drug release were conducted with two factors, x₁ and x₂: a formulation factor (the amount of magnesium stearate) and a processing factor (mixing time). Moreover, different batch sizes (100 and 500 tablet batches) were also evaluated to investigate the effect of batch size. The selected input control factors were arranged in a mixture simplex lattice design with 13 experimental runs. The obtained optimal settings for gelation were 0.46 g magnesium stearate with 2.76 min mixing time for a 100 tablet batch, and 1.54 g with 6.51 min for a 500 tablet batch. The optimal settings for drug release were 0.33 g with 7.99 min for a 100 tablet batch, and 1.54 g with 6.51 min for a 500 tablet batch. The exact ratio and mixing time of magnesium stearate could be formulated according to the resulting hydrophilic matrix tablet properties. The newly designed experimental method provided very useful information for characterizing significant factors and hence obtaining optimum formulations, allowing for a systematic and reliable experimental design method.
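A simplex-lattice mixture design of the type mentioned above enumerates all blends whose component proportions are multiples of 1/m and sum to one. A generic sketch (the {3, 2} parameters below are illustrative, not the study's 13-run layout):

```python
from itertools import combinations_with_replacement
from fractions import Fraction

def simplex_lattice(q, m):
    """{q, m} simplex-lattice design: all q-component mixtures whose
    proportions are multiples of 1/m and sum to 1."""
    pts = set()
    for combo in combinations_with_replacement(range(q), m):
        pt = [0] * q
        for i in combo:
            pt[i] += 1  # distribute m units of 1/m among the q components
        pts.add(tuple(Fraction(c, m) for c in pt))
    return sorted(pts)

design = simplex_lattice(3, 2)  # {3,2} lattice: 6 candidate blends
```

Augmenting such a lattice with center and axial blends (and replicates) is how run counts like 13 typically arise in practice.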
Improving knowledge of garlic paste greening through the design of an experimental strategy.
Aguilar, Miguel; Rincón, Francisco
2007-12-12
The furthering of scientific knowledge depends in part upon the reproducibility of experimental results. When experimental conditions are not set with sufficient precision, the resulting background noise often leads to poorly reproduced and even faulty experiments. An example of the catastrophic consequences of this background noise can be found in the design of strategies for the development of solutions aimed at preventing garlic paste greening, where reported results are contradictory. To avoid such consequences, this paper presents a two-step strategy based on the concept of experimental design. In the first step, the critical factors inherent to the problem are identified from a list of seven apparent critical factors (ACF), using a 2^(7-4) resolution III Plackett-Burman experimental design; subsequently, the critical factors thus identified are treated as the factors to be optimized (FO), and optimization is performed using a Box and Wilson experimental design to identify the stationary point of the system. Optimal conditions for preventing garlic greening are examined after analysis of the complex process of green-pigment development, which involves both chemical and enzymatic reactions and is strongly influenced by pH, with an overall pH optimum of 4.5. The critical step in the greening process is the synthesis of thiosulfinates (allicin) from cysteine sulfoxides (alliin). Cysteine inhibits the greening process at this critical stage; no greening precursors are formed in the presence of around 1% cysteine. However, the optimal conditions for greening prevention are very sensitive both to the type of garlic and to manufacturing conditions. This suggests that optimal solutions for garlic greening prevention should be sought on a case-by-case basis, using the strategy presented here.
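Plackett-Burman screening designs like the one used in the first step above can be built by cyclically shifting a known generating row. A sketch of the standard 12-run construction (this is the generic PB12 array for up to 11 two-level factors, not the authors' specific 7-factor layout):

```python
# Standard Plackett-Burman N=12 generating row (11 entries, +1/-1 coded)
ROW = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def plackett_burman_12():
    """12-run Plackett-Burman screening design: the first 11 rows are
    cyclic right-shifts of the generating row; the last row is all -1."""
    rows = []
    for shift in range(11):
        rows.append([ROW[(j - shift) % 11] for j in range(11)])
    rows.append([-1] * 11)
    return rows

design = plackett_burman_12()
# Columns are mutually orthogonal, so main effects of up to 11 factors
# can be screened in only 12 runs (interactions are aliased).
```

For a 7-factor screen, seven columns are assigned to factors and the remaining columns serve as dummy columns for estimating error.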
14 CFR § 1240.102 - Definitions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... experimental or beta phase of development, that performs in accordance with its specifications, and includes... mathematical, engineering or scientific concept, idea, design, process, or product. (h) Innovator means any..., method, process, machine, manufacture, design, or composition of matter, or any new and useful...
Evaluation of selected chemical processes for production of low-cost silicon, phase 3
NASA Technical Reports Server (NTRS)
Blocher, J. M., Jr.; Browning, M. F.; Seifert, D. A.
1981-01-01
A Process Development Unit (PDU), consisting of the four major units of the process, was designed, installed, and experimentally operated. The PDU was sized to 50 MT/yr. Deposition took place in a fluidized bed reactor. As a consequence of the experiments, improvements in the design and operation of these units were undertaken and their experimental limitations were partially established. A parallel program of experimental work demonstrated that zinc can be vaporized for introduction into the fluidized bed reactor by direct induction-coupled r.f. energy. Residual zinc in the product can be removed by heat treatment below the melting point of silicon. Current efficiencies of 94 percent and above, and power efficiencies around 40 percent, are achievable in the laboratory-scale electrolysis of ZnCl2.
NASA Technical Reports Server (NTRS)
1981-01-01
Technical readiness for the production of photovoltaic modules using single crystal silicon dendritic web sheet material is demonstrated by: (1) selection, design and implementation of solar cell and photovoltaic module process sequence in a Module Experimental Process System Development Unit; (2) demonstration runs; (3) passing of acceptance and qualification tests; and (4) achievement of a cost effective module.
Propellant injection systems and processes
NASA Technical Reports Server (NTRS)
Ito, Jackson I.
1995-01-01
The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and, ultimately, analytical modeling grounded in basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'trial and error' or blindly 'copying past successes.' The benefit of such an approach is the ability to rank candidate design concepts by relative probability of success, or technical risk, across all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if no single design concept can be identified that is predicted to satisfy all requirements simultaneously, a series of risk-mitigating key enabling technologies can be identified for early resolution. Lower-cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design-discriminating test plans can be developed based on the physical insight provided by these analyses.
Liu, Yu; Zhang, Zhongkai; Lei, Jiuhou; Cao, Jinxiang; Yu, Pengcheng; Zhang, Xiao; Xu, Liang; Zhao, Yaodong
2016-09-01
In this work, the design and construction of the Keda Space Plasma EXperiment (KSPEX), which aims to study the boundary-layer processes of ionospheric depletions, are described in detail. The device is composed of three stainless-steel sections: two source chambers at the ends and an experimental chamber in the center. KSPEX is a steady-state experimental device in which hot-filament arrays are used to produce plasmas in the two sources. A Macor-mesh design is adopted to adjust the plasma density and the potential difference between the two plasmas, which creates a boundary layer with a controllable electron density gradient and an inhomogeneous radial electric field. In addition, attachment chemicals can be released into the plasmas through a tailor-made needle valve, which leads to the generation of negative ion plasmas. Ionospheric depletions can be modeled and simulated using KSPEX, and many micro-physical processes of the formation and evolution of an ionospheric depletion can be experimentally studied.
Djuris, Jelena; Medarevic, Djordje; Krstic, Marko; Djuric, Zorica; Ibric, Svetlana
2013-06-01
This study illustrates the application of experimental design and multivariate data analysis in defining the design space for granulation and tableting processes. According to quality-by-design concepts, critical quality attributes (CQAs) of granules and tablets, as well as critical parameters of the granulation and tableting processes, were identified and evaluated. Acetaminophen was used as the model drug, and one of the study aims was to investigate the possibility of developing immediate- or extended-release acetaminophen tablets. Granulation experiments were performed in a fluid bed processor using polyethylene oxide polymer as a binder in the direct granulation method. Tablets were compressed in a laboratory eccentric tablet press. The first set of experiments was organized according to a Plackett-Burman design, followed by a full factorial experimental design. Principal component analysis and partial least squares regression were applied as the multivariate analysis techniques. By using these different methods, CQAs and process parameters were identified and quantified. Furthermore, an in-line method was developed to monitor the temperature during the fluidized bed granulation process, to foresee possible defects in granule CQAs. Various control strategies that are based on process understanding and assure the desired quality attributes of the product are proposed. Copyright © 2013 Wiley Periodicals, Inc.
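A full factorial layout like the one that followed the Plackett-Burman screen above can be enumerated directly from the factor levels. A minimal sketch, where the factor names and levels are invented for illustration (the abstract does not list the study's actual factors):

```python
from itertools import product

# Hypothetical two-level factors for a 2^3 full factorial design; the
# real granulation/tableting factors of the study are not given in the
# abstract, so these names and levels are illustrative assumptions only.
factors = {
    "binder_pct":   [2.0, 6.0],    # binder level, % w/w
    "inlet_temp_C": [50, 70],      # fluid bed inlet air temperature
    "spray_rate":   [5, 15],       # binder spray rate, g/min
}

# One run per combination of levels: 2 * 2 * 2 = 8 runs in total.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for r in runs:
    print(r)
```

Unlike a Plackett-Burman screen, this full factorial grid supports estimation of all factor interactions, at the cost of run count growing as 2^k.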
Experimental research of solid waste drying in the process of thermal processing
NASA Astrophysics Data System (ADS)
Bukhmirov, V. V.; Kolibaba, O. B.; Gabitov, R. N.
2015-10-01
The convective drying of a municipal solid waste layer, treated as a polydisperse multicomponent porous structure, is studied. On the basis of the experimental data, criterial equations are obtained for calculating heat transfer and mass transfer processes in the layer as functions of the humidity of the material, the speed of the drying agent, and the layer height. These solutions are used in the thermal design of reactors for the thermal processing of multicomponent organic waste.
ERIC Educational Resources Information Center
Cheng, Ming-Chang; Chou, Pei-I; Wang, Ya-Ting; Lin, Chih-Ho
2015-01-01
This study investigates how the illustrations in a science textbook, with their design modified according to cognitive process principles, affected students' learning performance. The quasi-experimental design recruited two Grade 5 groups (N = 58) as the research participants. The treatment group (n = 30) used the modified version of the textbook,…
NASA Astrophysics Data System (ADS)
Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa
2015-10-01
A fully automated optimization process is provided for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and proved that the optimized ducted propeller improves hydrodynamic performance as predicted.
NASA Technical Reports Server (NTRS)
1981-01-01
This phase consists of the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU). The mechanical bid package was issued and the bid responses are under evaluation. Similarly, the electrical bid package was issued; however, responses are not yet due. The majority of all equipment is on order or has been received at the EPSDU site. The pyrolysis/consolidation process design package was issued. Preparation of the process and instrumentation diagram for the free-space reactor was started. In the area of melting/consolidation, Kayex successfully melted chunk silicon and has produced silicon shot. The free-space reactor powder was successfully transported pneumatically from a storage bin to the auger feeder twenty-five feet up, and was melted. The fluid-bed PDU has operated successfully at silane feed concentrations up to 21%. The writing of the operating manual has started. Overall, the design phase is nearing completion.
14 CFR 1240.102 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Contributions Board. (d) Commercial quality refers to computer software that is not in an experimental or beta..., engineering or scientific concept, idea, design, process, or product, reported as new technology on NASA Form...) Invention includes any act, method, process, machine, manufacture, design, or composition of matter, or any...
14 CFR 1240.102 - Definitions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Contributions Board. (d) Commercial quality refers to computer software that is not in an experimental or beta..., engineering or scientific concept, idea, design, process, or product, reported as new technology on NASA Form...) Invention includes any act, method, process, machine, manufacture, design, or composition of matter, or any...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seapan, M.; Crynes, B.L.; Dale, S.
The objectives of this study were to analyze kinetic hydrotreatment data for alternate crudes in the literature, develop a mathematical model for interpretation of these data, develop an experimental procedure and apparatus to collect accurate kinetic data, and finally, to combine the model and experimental data to develop a general model which, with a few experimental parameters, could be used in the design of future hydrotreatment processes. These objectives were to cover a four-year program (1980 to 1984) and were subject to sufficient funding; only partial funding has been available thus far, covering activities for two years. A hydrotreatment data base has been developed which contains over 2000 citations, stored in a microcomputer. About 50% of these have been reviewed and classified and can be identified by feedstock, catalyst, reactor type and other process characteristics. Tests of published hydrodesulfurization data indicate the problems with simple n-th order, global kinetic models, and point to the value of developing intrinsic reaction kinetic models to describe the reaction processes. A Langmuir-Hinshelwood kinetic model coupled with a plug flow reactor design equation has been developed and used for published data evaluation. An experimental system and procedure have been designed and constructed, which can be used for kinetic studies. 30 references, 4 tables.
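The Langmuir-Hinshelwood rate form coupled with a plug-flow design equation, as mentioned above, can be sketched as a simple numerical integration. The single-species rate law and all parameter values below are generic illustrations, not the report's fitted model:

```python
# Sketch: a single-species Langmuir-Hinshelwood rate law integrated
# along an ideal plug-flow reactor by explicit Euler steps over the
# space time tau:  -dC/dtau = k*C / (1 + K*C).
# k, K, c0 and tau are hypothetical values chosen for illustration.

def pfr_outlet(c0, tau, k, big_k, steps=10000):
    """Outlet concentration after space time tau in a plug-flow reactor."""
    dt = tau / steps
    c = c0
    for _ in range(steps):
        rate = k * c / (1.0 + big_k * c)   # Langmuir-Hinshelwood rate
        c -= rate * dt                     # march down the reactor
    return c

c_out = pfr_outlet(c0=1.0, tau=2.0, k=1.5, big_k=0.8)
print("conversion:", round(1.0 - c_out / 1.0, 3))
```

At low concentration the rate reduces to first order (k*C); at high concentration it saturates toward k/K, which is the qualitative behaviour that distinguishes such intrinsic models from simple n-th order global fits.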
AMPS data management concepts. [Atmospheric, Magnetospheric and Plasma in Space experiment
NASA Technical Reports Server (NTRS)
Metzelaar, P. N.
1975-01-01
Five typical AMPS experiments were formulated to allow simulation studies to verify data management concepts. Design studies were conducted to analyze these experiments in terms of the applicable procedures, data processing and displaying functions. Design concepts for AMPS data management system are presented which permit both automatic repetitive measurement sequences and experimenter-controlled step-by-step procedures. Extensive use is made of a cathode ray tube display, the experimenters' alphanumeric keyboard, and the computer. The types of computer software required by the system and the possible choices of control and display procedures available to the experimenter are described for several examples. An electromagnetic wave transmission experiment illustrates the methods used to analyze data processing requirements.
GTE blade injection moulding modeling and verification of models during process approbation
NASA Astrophysics Data System (ADS)
Stepanenko, I. S.; Khaimovich, A. I.
2017-02-01
A simulation model for filling the mould was developed using Moldex3D and experimentally verified in order to perform further optimization calculations of the moulding process conditions. The method described in the article allows adjusting the finite-element model by minimizing the difference between the design and experimental melt motion fronts through differentiated changes in the power supplied to the heating elements that heat the injection mould in the simulation. As a result of calibrating the injection mould for the gas-turbine engine blade, a mean difference between the design melt motion profile and the experimental airfoil profile of no more than 4% was achieved.
ERIC Educational Resources Information Center
Rösch, Esther S.; Helmerdig, Silke
2017-01-01
Early photography processes were predestined to combine chemistry and art. William Henry Fox Talbot is one of the early photography pioneers. In 2-3 day workshops, design students without a major background in chemistry are able to define a reproducible protocol for Talbot's gallic acid containing calotype process. With the experimental concept…
The Effect of Task Characteristics on the Availability Heuristic for Judgments under Uncertainty.
1983-05-01
Rice University, Houston, Texas 77001, Department of Psychology, Research Report Series. … The experimental design involved the manipulation of event characteristics in order to induce a heuristic processing strategy for designated available… References … Appendix A. Experimental Questionnaires
Code of Federal Regulations, 2010 CFR
2010-10-01
... designed for use in a succession of experimental programs over a longer period of time. Examples of loop...) Experimental development of equipment, processes, or devices, including assembly, fitting, installation... for the purpose of conducting a test or experiment. The design may be only conceptual in character...
Code of Federal Regulations, 2013 CFR
2013-10-01
... designed for use in a succession of experimental programs over a longer period of time. Examples of loop...) Experimental development of equipment, processes, or devices, including assembly, fitting, installation... for the purpose of conducting a test or experiment. The design may be only conceptual in character...
Code of Federal Regulations, 2012 CFR
2012-10-01
... designed for use in a succession of experimental programs over a longer period of time. Examples of loop...) Experimental development of equipment, processes, or devices, including assembly, fitting, installation... for the purpose of conducting a test or experiment. The design may be only conceptual in character...
Code of Federal Regulations, 2011 CFR
2011-10-01
... designed for use in a succession of experimental programs over a longer period of time. Examples of loop...) Experimental development of equipment, processes, or devices, including assembly, fitting, installation... for the purpose of conducting a test or experiment. The design may be only conceptual in character...
Code of Federal Regulations, 2014 CFR
2014-10-01
... designed for use in a succession of experimental programs over a longer period of time. Examples of loop...) Experimental development of equipment, processes, or devices, including assembly, fitting, installation... for the purpose of conducting a test or experiment. The design may be only conceptual in character...
Considering RNAi experimental design in parasitic helminths.
Dalzell, Johnathan J; Warnock, Neil D; McVeigh, Paul; Marks, Nikki J; Mousley, Angela; Atkinson, Louise; Maule, Aaron G
2012-04-01
Almost a decade has passed since the first report of RNA interference (RNAi) in a parasitic helminth. Whilst much progress has been made with RNAi informing gene function studies in disparate nematode and flatworm parasites, substantial and seemingly prohibitive difficulties have been encountered in some species, hindering progress. An appraisal of current practices, trends and ideals of RNAi experimental design in parasitic helminths is both timely and necessary for a number of reasons: firstly, the increasing availability of parasitic helminth genome/transcriptome resources means there is a growing need for gene function tools such as RNAi; secondly, fundamental differences and unique challenges exist for parasite species which do not apply to model organisms; thirdly, the inherent variation in experimental design, and reported difficulties with reproducibility undermine confidence. Ideally, RNAi studies of gene function should adopt standardised experimental design to aid reproducibility, interpretation and comparative analyses. Although the huge variations in parasite biology and experimental endpoints make RNAi experimental design standardization difficult or impractical, we must strive to validate RNAi experimentation in helminth parasites. To aid this process we identify multiple approaches to RNAi experimental validation and highlight those which we deem to be critical for gene function studies in helminth parasites.
Downstream processing from hot-melt extrusion towards tablets: A quality by design approach.
Grymonpré, W; Bostijn, N; Herck, S Van; Verstraete, G; Vanhoorne, V; Nuhn, L; Rombouts, P; Beer, T De; Remon, J P; Vervaet, C
2017-10-05
Since the concept of continuous processing is gaining momentum in pharmaceutical manufacturing, a thorough understanding of how process and formulation parameters can impact the critical quality attributes (CQA) of the end product is more than ever required. This study was designed to screen the influence of process parameters and drug load during HME on both extrudate properties and tableting behaviour of an amorphous solid dispersion formulation using a quality-by-design (QbD) approach. A full factorial experimental design with 19 experiments was used to evaluate the effect of several process variables (barrel temperature: 160-200°C, screw speed: 50-200 rpm, throughput: 0.2-0.5 kg/h) and drug load (0-20%) as formulation parameter on the hot-melt extrusion (HME) process, extrudate and tablet quality of Soluplus®-Celecoxib amorphous solid dispersions. A prominent impact of the formulation parameter on the CQA of the extrudates (i.e. solid state properties, moisture content, particle size distribution) and tablets (i.e. tabletability, compactibility, fragmentary behaviour, elastic recovery) was discovered. The resistance of the polymer matrix to thermo-mechanical stress during HME was confirmed throughout the experimental design space. In addition, the suitability of Raman spectroscopy as a verification method for the active pharmaceutical ingredient (API) concentration in solid dispersions was evaluated. Incorporation of the Raman spectroscopy data in a PLS model enabled API quantification in the extrudate powders, with none of the DOE experiments resulting in extrudates with a CEL content deviating >3% from the label claim. This research paper emphasized that HME is a robust process throughout the experimental design space for obtaining amorphous glassy solutions and for tableting of such formulations, since only minimal impact of the process parameters was detected on the extrudate and tablet properties.
However, the quality of extrudates and tablets can be optimized by adjusting specific formulation parameters (e.g. drug load). Copyright © 2017 Elsevier B.V. All rights reserved.
Design and Analysis of a Static Aeroelastic Experiment
NASA Astrophysics Data System (ADS)
Hou, Ying-Yu; Yuan, Kai-Hua; Lv, Ji-Nan; Liu, Zi-Qiang
2016-06-01
Static aeroelastic experiments are very common in the United States and Russia. Their objective is to investigate the deformation and loads of an elastic structure in a flow field. Generally speaking, the prerequisite for such an experiment is that the stiffness distribution of the structure is known. This paper describes a method for designing experimental models in the case where the stiffness distribution and boundary conditions of the real aircraft are both uncertain. The stiffness distribution of the structure was calculated via finite element modeling and simulation, and F141 steel and rigid foam were used to make the elastic model. The design and manufacturing process of static aeroelastic models is presented: a set of experimental models was designed to simulate the stiffness of the designed wings, and a set of experiments was designed to check the results. The test results show that the experimental method can effectively accomplish the design of the elastic model. The paper introduces the whole process of the static aeroelastic experiment and analyzes the experimental results, developing a static aeroelasticity experiment technique and establishing an experiment model targeting the swept wing of a large-aspect-ratio aircraft.
Practicing Engineering While Building with Blocks: Identifying Engineering Thinking
ERIC Educational Resources Information Center
Bagiati, Aikaterini; Evangelou, Demetra
2016-01-01
Children's free play naturally enhances skills of observation, communication, experimentation, as well as development of rationale and construction skills. These domains, while synthesised, can lead to the development of certain process models regarding the way constructions could be designed, built and improved. The Design Process model…
NASA Technical Reports Server (NTRS)
1983-01-01
The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price less than $14 per kilogram of silicon (in 1975 dollars), is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development Unit are discussed. Quality control in scaling up the process and an economic analysis of product and production costs are also discussed.
Didier, Caroline; Forno, Guillermina; Etcheverrigaray, Marina; Kratje, Ricardo; Goicoechea, Héctor
2009-09-21
The optimal blends of six compounds that should be present in culture media used in recombinant protein production were determined by means of artificial neural networks (ANN) coupled with crossed mixture experimental design. This combination constitutes a novel approach to develop a medium for cultivating genetically engineered mammalian cells. The compounds were collected in two mixtures of three elements each, and the experimental space was determined by a crossed mixture design. Empirical data from 51 experimental units were used in a multiresponse analysis to train artificial neural networks which satisfy different requirements, in order to define two new culture media (Medium 1 and Medium 2) to be used in a continuous biopharmaceutical production process. These media were tested in a bioreactor to produce a recombinant protein in CHO cells. Remarkably, for both predicted media all responses satisfied the predefined goals pursued during the analysis, except in the case of the specific growth rate (mu) observed for Medium 1. ANN analysis proved to be a suitable methodology to be used when dealing with complex experimental designs, as frequently occurs in the optimization of production processes in the biotechnology area. The present work is a new example of the use of ANN for the resolution of a complex, real life system, successfully employed in the context of a biopharmaceutical production process.
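The crossed mixture design described above, two three-component mixtures whose blends are crossed against each other, can be enumerated directly. A minimal sketch, where the simplex-lattice levels {0, 1/2, 1} and the mixture roles are illustrative assumptions rather than the study's actual design points:

```python
from itertools import product

# Sketch of a crossed mixture design: candidate blends for two
# three-component mixtures (proportions summing to 1) are crossed so
# that every blend of mixture A meets every blend of mixture B. The
# {0, 1/2, 1} simplex-lattice levels and the mixture roles below are
# illustrative assumptions, not the paper's design.

def simplex_lattice_3(m=2):
    """All 3-component blends whose proportions are multiples of 1/m."""
    pts = []
    for i in range(m + 1):
        for j in range(m + 1 - i):
            k = m - i - j
            pts.append((i / m, j / m, k / m))
    return pts

mix_a = simplex_lattice_3()              # hypothetical mixture of 3 compounds
mix_b = simplex_lattice_3()              # hypothetical second mixture
crossed = list(product(mix_a, mix_b))    # 6 blends x 6 blends = 36 runs
print(len(mix_a), len(crossed))
```

Crossing the two simplex lattices is what generates the experimental units (here 36; the study used 51) on which a multiresponse model such as an ANN can then be trained.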
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
An Intelligent Automation Platform for Rapid Bioprocess Design
Wu, Tianyi
2014-01-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579
Material, process, and product design of thermoplastic composite materials
NASA Astrophysics Data System (ADS)
Dai, Heming
Thermoplastic composites made of polypropylene (PP) and E-glass fibers were investigated experimentally as well as theoretically for two new classes of product designs. The first application was for reinforcement of wood. Commingled PP/glass yarn was consolidated and bonded on wood panel using a tie layer. The processing parameters, including temperature, pressure, heating time, cooling time, bonding strength, and bending strength were tested experimentally and evaluated analytically. The thermoplastic adhesive interface was investigated with environmental scanning electron microscopy. The wood/composite structural design was optimized and evaluated using a Graphic Method. In the second application, we evaluated use of thermoplastic composites for explosion containment in an arrester. PP/glass yarn was fabricated in a sleeve form and wrapped around the arrester. After consolidation, the flexible composite sleeve forms a solid composite shell. The composite shell acts as a protection layer in a surge test to contain the fragments of the arrester. The manufacturing process for forming the composite shell was designed. Woven, knitted, and braided textile composite shells made of commingled PP/glass yarn were tested and evaluated. Mechanical performance of the woven, knitted, and braided composite shells was examined analytically. The theoretical predictions were used to verify the experimental results.
Double jeopardy in inferring cognitive processes
Fific, Mario
2014-01-01
Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy of cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity to learn about n cognitive processes should be equal to 2^n. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) the saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross-combined the levels of the jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual-subject analyses and the utilization of full-factorial (SFT) experimental designs. PMID:25374545
ERIC Educational Resources Information Center
Ting, Kan Lin; Siew, Nyet Moi
2014-01-01
The purpose of this study was to investigate the effects of outdoor school ground lessons on Year Five students' science process skills and scientific curiosity. A quasi-experimental design was employed in this study. The participants in the study were divided into two groups, one subjected to the experimental treatment, defined as…
Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalá, Manel
2011-10-01
This work was conducted in the framework of a quality by design project involving the production of a pharmaceutical gel. Preliminary work included the identification of the quality target product profiles (QTPPs) from historical values for previously manufactured batches, as well as the critical quality attributes for the process (viscosity and pH), which were used to construct a D-optimal experimental design. The experimental design comprised 13 gel batches, three of which were replicates at the domain center intended to assess the reproducibility of the target process. The viscosity and pH models established exhibited very high linearity and negligible lack of fit (LOF). Thus, R(2) was 0.996 for viscosity and 0.975 for pH, and LOF was 0.53 for the former parameter and 0.84 for the latter. The process proved reproducible at the domain center. Water content and temperature were the most influential factors for viscosity, and water content and acid neutralized fraction were the most influential factors for pH. A desirability function was used to find the best compromise to optimize the QTPPs. The body of information was used to identify and define the design space for the process. A model capable of combining the two response variables into a single one was constructed to facilitate monitoring of the process. Copyright © 2011 Wiley-Liss, Inc.
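The desirability function mentioned above, used to combine viscosity and pH into a single response, is commonly implemented Derringer-style as a geometric mean of per-response desirabilities. A minimal sketch, where the limits, targets and measured values are invented for illustration, not the study's figures:

```python
# Sketch of a two-sided (target-is-best) desirability function combining
# viscosity and pH into a single response via a Derringer-style
# geometric mean. All limits, targets and sample values are invented.

def desirability(y, low, target, high):
    """1.0 at the target, falling linearly to 0.0 at the low/high limits."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return (y - low) / (target - low)
    return (high - y) / (high - target)

def overall(d_values):
    """Geometric mean: any single zero desirability zeroes the blend."""
    prod = 1.0
    for d in d_values:
        prod *= d
    return prod ** (1.0 / len(d_values))

d_visc = desirability(5200.0, low=4000.0, target=5000.0, high=6000.0)
d_ph = desirability(5.4, low=5.0, target=5.5, high=6.0)
print(round(overall([d_visc, d_ph]), 3))
```

The geometric mean is the conventional choice here because it forces any fully unacceptable response (desirability 0) to reject the whole factor setting, which is the behaviour wanted when locating a design space.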
Patwardhan, Ketaki; Asgarzadeh, Firouz; Dassinger, Thomas; Albers, Jessica; Repka, Michael A
2015-05-01
In this study, the principles of quality by design (QbD) have been uniquely applied to a pharmaceutical melt extrusion process for an immediate-release formulation with a low-melting model drug, ibuprofen. Two qualitative risk assessment tools, a fishbone diagram and failure mode effect analysis, were utilized to strategically narrow down the most influential parameters. Selected variables were further assessed using a Plackett-Burman screening study, which was upgraded to a response surface design consisting of the critical factors, to study the interactions between the study variables. In-process torque, glass transition temperature (Tg) of the extrudates, assay, dissolution and phase change were measured as responses to evaluate the critical quality attributes (CQAs) of the extrudates. The effect of each study variable on the measured responses was analysed using multiple regression for the screening design and partial least squares for the optimization design. Experimental limits for formulation and process parameters to attain optimum processing have been outlined. A design space plot describing the domain of experimental variables within which the CQAs remained unchanged was developed. A comprehensive approach for melt extrusion product development based on the QbD methodology has been demonstrated. Drug loading concentrations between 40-48% w/w and extrusion temperatures in the range of 90-130°C were found to be the most optimum. © 2015 Royal Pharmaceutical Society.
Efficient experimental design for uncertainty reduction in gene regulatory networks.
Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R
2015-01-01
An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.
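The MOCU quantity at the heart of the method above can be illustrated on a toy discrete example: MOCU is the gap between the expected cost of the best single (robust) intervention and the expected cost achievable if the true model were known. The models, priors and costs below are invented for the sketch; real gene regulatory networks are far larger:

```python
# Toy illustration of mean objective cost of uncertainty (MOCU).
# Two candidate network models form the uncertainty class, each with a
# prior probability; cost[m][a] is the cost of intervention a under
# model m. All numbers are invented purely for illustration.

priors = [0.6, 0.4]
cost = [
    [1.0, 3.0],   # costs of interventions A, B under model 1
    [4.0, 2.0],   # costs of interventions A, B under model 2
]

# Expected cost of each intervention, averaged over the uncertainty class.
expected = [sum(p * cost[m][a] for m, p in enumerate(priors))
            for a in range(2)]
robust_cost = min(expected)                               # best single intervention
ideal_cost = sum(p * min(cost[m]) for m, p in enumerate(priors))
mocu = robust_cost - ideal_cost                           # cost attributable to uncertainty
print(mocu)
```

An experiment is then prioritized by how much it is expected to shrink this quantity: the optimal experiment is the one minimizing the expected remaining MOCU after its outcome is observed.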
High School Student Information Access and Engineering Design Performance
ERIC Educational Resources Information Center
Mentzer, Nathan
2014-01-01
Developing solutions to engineering design problems requires access to information. Research has shown that appropriately accessing and using information in the design process improves solution quality. This quasi-experimental study provides two groups of high school students with a design problem in a three hour design experience. One group has…
Using experimental design to define boundary manikins.
Bertilsson, Erik; Högberg, Dan; Hanson, Lars
2012-01-01
When evaluating human-machine interaction, it is essential to consider anthropometric diversity to ensure intended accommodation levels. A well-known method is the use of boundary cases, where manikins with extreme but likely measurement combinations are derived by mathematical treatment of anthropometric data. The supposition behind that method is that the use of these manikins will facilitate accommodation of the expected, less extreme, portion of the total population. Literature sources differ on how many manikins should be defined and in what way. A field similar to the boundary case method is experimental design, in which relationships between the factors affecting a process are studied by a systematic approach. This paper examines the possibilities of adopting methodology used in experimental design to define a group of manikins. Different experimental designs were adapted to be used together with a confidence region and its axes. The results of the study show that it is possible to adapt the methodology of experimental design when creating groups of manikins. The size of these groups of manikins depends heavily on the number of key measurements, but also on the type of experimental design chosen.
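The idea of combining a factorial layout with a confidence region's axes can be sketched as follows. The anthropometric mean and covariance are invented for illustration, and the scaling that places each manikin exactly on the confidence boundary is one possible choice, not necessarily the authors' formulation:

```python
import numpy as np
from itertools import product

def boundary_manikins(mean, cov, radius=1.96):
    """Place one manikin at each 2^k factorial combination of the
    confidence ellipsoid's principal axes (eigenvectors of cov),
    scaled so every manikin lies on the ellipsoid boundary."""
    eigvals, eigvecs = np.linalg.eigh(cov)
    axes = eigvecs * np.sqrt(eigvals)          # columns = semi-axis directions
    k = len(mean)
    # +/-1 corners, normalised so each corner vector has unit length
    corners = np.array(list(product([-1.0, 1.0], repeat=k))) / np.sqrt(k)
    return mean + radius * corners @ axes.T

# hypothetical stature / shoulder-breadth statistics (mm)
mean = np.array([1755.0, 465.0])
cov = np.array([[4900.0, 900.0],
                [900.0, 625.0]])
manikins = boundary_manikins(mean, cov)        # 2^2 = 4 boundary cases
```

Each returned manikin sits at Mahalanobis distance `radius` from the mean, so the group size grows as 2^k with the number of key measurements, matching the paper's observation.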
DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology
Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng
2015-01-01
Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437
Experimental demonstration of the anti-maser
NASA Astrophysics Data System (ADS)
Mazzocco, Anthony; Aviles, Michael; Andrews, Jim; Dawson, Nathan; Crescimanno, Michael
2012-10-01
We denote by ``anti-maser'' a coherent perfect absorption (CPA) process in the radio-frequency domain. We demonstrate several experimental realizations of the anti-maser suitable for an advanced undergraduate laboratory. Students designed, assembled, and tested these devices, as well as the inexpensive laboratory setup and experimental protocol for displaying various CPA phenomena.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nabeel A. Riza
The goals of the first six months of this project were to begin laying the foundations for both the SiC front-end optical chip fabrication techniques for high pressure gas species sensing as well as the design, assembly, and test of a portable high pressure high temperature calibration test cell chamber for introducing gas species. This calibration cell will be used in the remaining months for proposed first stage high pressure high temperature gas species sensor experimentation and data processing. All these goals have been achieved and are described in detail in the report. Both design process and diagrams for the mechanical elements as well as the optical systems are provided. Photographs of the fabricated calibration test chamber cell, the optical sensor setup with the calibration cell, the SiC sample chip holder, and relevant signal processing mathematics are provided. Initial experimental data from both the optical sensor and fabricated test gas species SiC chips is provided. The design and experimentation results are summarized to give positive conclusions on the proposed novel high temperature high pressure gas species detection optical sensor technology.
NASA Astrophysics Data System (ADS)
Zheng, Jigui; Huang, Yuping; Wu, Hongxing; Zheng, Ping
2016-07-01
Transverse-flux machines offer high efficiency and have been applied in Stirling engine and permanent-magnet synchronous linear generator systems; however, large-scale application is restricted by low power factor and a complex manufacturing process. A novel cylindrical, non-overlapping, transverse-flux, permanent-magnet linear motor (TFPLM) is investigated, and a structure with high power factor and reduced process complexity is developed. The impact of the magnetic leakage factor on power factor is discussed and, using a Finite Element Analysis (FEA) model of the Stirling engine and TFPLM, an optimization method for the electromagnetic design of the TFPLM is proposed based on the magnetic leakage factor. The relation between power factor and structure parameters is investigated, and a structure-parameter optimization method is proposed that takes maximum power factor as its goal. Finally, a test bench was built, starting and generating experiments were performed, and good agreement between simulation and experiment was achieved. The power factor is improved and the process complexity is decreased. This research provides guidance for the design of high-power-factor permanent-magnet linear generators.
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...
NASA Technical Reports Server (NTRS)
Kuo, Kenneth K.; Lu, Y. C.; Chiaverini, Martin J.; Harting, George C.
1994-01-01
An experimental study on the fundamental processes involved in fuel decomposition and boundary layer combustion in hybrid rocket motors is being conducted at the High Pressure Combustion Laboratory of the Pennsylvania State University. This research should provide a useful engineering technology base in the development of hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high pressure slab motor has been designed and manufactured for conducting experimental investigations. Oxidizer (LOX or GOX) supply and control systems have been designed and partly constructed for the head-end injection into the test chamber. Experiments using HTPB fuel, as well as fuels supplied by NASA designated industrial companies will be conducted. Design and construction of fuel casting molds and sample holders have been completed. The portion of these items for industrial company fuel casting will be sent to the McDonnell Douglas Aerospace Corporation in the near future. The study focuses on the following areas: observation of solid fuel burning processes with LOX or GOX, measurement and correlation of solid fuel regression rate with operating conditions, measurement of flame temperature and radical species concentrations, determination of the solid fuel subsurface temperature profile, and utilization of experimental data for validation of a companion theoretical study (Part 2) also being conducted at PSU.
NASA Technical Reports Server (NTRS)
Ankenman, Bruce; Ermer, Donald; Clum, James A.
1994-01-01
Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses: a manufacturing processes course in Mechanical Engineering and a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course, with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. The modules' effectiveness has been evaluated by both survey questionnaires and inclusion of the module methodology in course examination questions, and the results have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and providing training in quality for improved engineering practice.
Implications of groundwater hydrology to buffer design in the southeastern U.S.
Ge Sun; James M. Vose; Devendra M. Amatya; Carl Trettin; Steven G. McNulty
2008-01-01
The objective of this study was to examine the hydrologic processes of shallow groundwater to better define and design forest riparian management zones in headwater streams of two contrasting terrains in the southeastern U.S. We employed two long-term experimental watersheds, WS80 (206 ha) and WS77 (151 ha) at the Santee Experimental Forests in South Carolina, and WS2...
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
This workbook is part of a Mexican series of instructional materials designed for Spanish speaking adults who are in the process of becoming literate or have recently become literate in their native language. This workbook is designed to orient people who are only recently literate to the world of work and business. Topics covered include worker…
Field-Induced Texturing of Ceramic Materials for Unparalleled Properties
2017-03-01
research for materials-by-design and advanced processing. Invited talk; 17th International Conference on Experimental Mechanics; 2016 Jul; Rhodes...material that could potentially be textured despite its diamagnetic nature. Predictive DFT modeling and experimental testing methods were designed...presented at the Mater Science Forum; 2007 (unpublished). 71. Sugiyama T, Tahashi M, Sassa K, Asai S. The control of crystal orientation in non-magnetic
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
This textbook is part of a Mexican series of instructional materials designed for Spanish speaking adults who are in the process of becoming literate or have recently become literate in their native language. It is designed to teach people with developing literacy skills to participate in a meaningful way in the life of their community. Topics…
System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling
Bacelli, Giorgio; Coe, Ryan; Patterson, David; ...
2017-04-01
Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical, and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data are then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.
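A minimal sketch of the kind of system identification described above: fitting a linear discrete-time model to input-output records by least squares. The ARX model structure and the synthetic "device" dynamics are assumptions for illustration only, not the study's wave-tank data or its chosen model family:

```python
import numpy as np

def fit_arx(u, y, na=2, nb=1):
    """Least-squares fit of an ARX model:
    y[t] = a1*y[t-1] + ... + a_na*y[t-na] + b1*u[t-1] + ... + b_nb*u[t-nb]."""
    n = max(na, nb)
    rows, targets = [], []
    for t in range(n, len(y)):
        row = [y[t - i] for i in range(1, na + 1)]
        row += [u[t - j] for j in range(1, nb + 1)]
        rows.append(row)
        targets.append(y[t])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta

# synthetic device with known (stable) dynamics, random excitation input
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 1.5 * y[t-1] - 0.7 * y[t-2] + 0.5 * u[t-1]

theta = fit_arx(u, y)   # noise-free case: recovers the true coefficients
```

With real tank data the residual would not vanish, and the validation step the abstract mentions (comparing several candidate models on held-out records) becomes the essential part.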
NASA Astrophysics Data System (ADS)
Chatwin, Christopher R.; McDonald, Donald W.; Scott, Brian F.
1989-07-01
The absence of an applications-led design philosophy has compromised both the development of laser source technology and its effective implementation into manufacturing technology in particular. For example, CO2 lasers are still incapable of processing certain classes of refractory and non-ferrous metals. Whilst the scope of this paper is restricted to high-power CO2 lasers, the design methodology reported herein is applicable to source technology in general, which, when exploited, will effect an expansion of applications. The CO2 laser operational envelope should not only be expanded to incorporate high-damage-threshold materials but also offer a greater degree of controllability. By a combination of modelling and experimentation, the requisite beam characteristics at the workpiece were determined and then utilised to design the Laser Manufacturing System. The design of sub-system elements was achieved by a combination of experimentation and simulation that benefited from a comprehensive set of software tools. By linking these tools, the physical processes in the laser - electron processes in the plasma, the history of photons in the resonator, etc. - can be related, in a detailed model, to the heating mechanisms in the workpiece.
NASA Astrophysics Data System (ADS)
Kehoe, S.; Stokes, J.
2011-03-01
Physicochemical properties of hydroxyapatite (HAp) synthesized by the chemical precipitation method are heavily dependent on the chosen process parameters. A Box-Behnken three-level experimental design was therefore chosen to determine the optimum set of process parameters and their effect on various HAp characteristics. These effects were quantified using design of experiments (DoE) to develop mathematical models, in terms of the chemical precipitation process parameters, using the Box-Behnken design. Findings from this research show that HAp possessing optimum powder characteristics for orthopedic application via a thermal spray technique can be prepared using the following chemical precipitation process parameters: reaction temperature 60 °C, ripening time 48 h, and stirring speed 1500 rpm with high reagent concentrations. Ripening time and stirring speed significantly affected the final phase purity under the experimental conditions of the Box-Behnken design. An increase in both the ripening time (36-48 h) and stirring speed (1200-1500 rpm) was found to result in an increase of phase purity from 47(±2)% to 85(±2)%. Crystallinity, crystallite size, lattice parameters, and mean particle size were also optimized within the research to find settings that achieve results suitable for FDA regulations.
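The Box-Behnken fitting step can be illustrated with a small sketch: generate the three-factor design (12 edge midpoints plus centre replicates), then fit the full second-order polynomial by least squares. The coefficients below are invented for the demonstration; they are not the HAp response model from the study:

```python
import numpy as np
from itertools import combinations

def box_behnken_3():
    """Box-Behnken design for 3 factors: midpoints of the cube edges
    (each pair of factors at +/-1, the third at 0) plus 3 centre runs."""
    pts = []
    for i, j in combinations(range(3), 2):
        for a in (-1.0, 1.0):
            for b in (-1.0, 1.0):
                p = [0.0, 0.0, 0.0]
                p[i], p[j] = a, b
                pts.append(p)
    pts += [[0.0, 0.0, 0.0]] * 3
    return np.array(pts)

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared, and two-factor interaction
    terms, i.e. a full second-order response-surface model."""
    n, k = X.shape
    cols = ([np.ones(n)]
            + [X[:, i] for i in range(k)]
            + [X[:, i] ** 2 for i in range(k)]
            + [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)])
    return np.column_stack(cols)

# coded levels for, e.g., temperature, ripening time, stirring speed
X = box_behnken_3()                        # 15 runs for 3 factors
true_beta = np.array([80.0, 5, 3, -2, -4, -1, -3, 1.5, 0.5, -0.5])
y = quadratic_design_matrix(X) @ true_beta # hypothetical noise-free response
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
```

The 15-run design supports all 10 coefficients of the quadratic model, which is exactly why Box-Behnken designs are used for this kind of three-level optimization.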
An experimental investigation of the effects of alarm processing and display on operator performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
O`Hara, J.; Brown, W.; Hallbert, B.
1998-03-01
This paper describes a research program sponsored by the US Nuclear Regulatory Commission to address the human factors engineering (HFE) aspects of nuclear power plant alarm systems. The overall objective of the program is to develop HFE review guidance for advanced alarm systems. As part of this program, guidance has been developed based on a broad base of technical and research literature. In the course of guidance development, aspects of alarm system design for which the technical basis was insufficient to support complete guidance development were identified. The primary purpose of the research reported in this paper was to evaluate the effects of three of these alarm system design characteristics on operator performance, in order to contribute to the understanding of potential safety issues and to provide data to support the development of design review guidance in these areas. The three alarm system design characteristics studied were (1) alarm processing (degree of alarm reduction), (2) alarm availability (dynamic prioritization and suppression), and (3) alarm display (a dedicated tile format, a mixed tile and message list format, and a format in which alarm information is integrated into the process displays). A secondary purpose was to provide confirmatory evidence of selected alarm system guidance developed in an earlier phase of the project. The alarm characteristics were combined into eight separate experimental conditions. Six two-person crews of professional nuclear power plant operators participated in the study. Following training, each crew completed 16 test trials, consisting of two trials in each of the eight experimental conditions (one with a low-complexity scenario and one with a high-complexity scenario). Measures of process performance, operator task performance, situation awareness, and workload were obtained. In addition, operator opinions and evaluations of the alarm processing and display conditions were collected.
No deficient performance was observed in any of the experimental conditions, providing confirmatory support for many design review guidelines. The operators identified numerous strengths and weaknesses associated with individual alarm design characteristics.
ERIC Educational Resources Information Center
Kraaijvanger, Richard G.; Veldkamp, Tom
2017-01-01
Purpose: This paper analyses research strategies followed by farmer groups in Tigray, that were involved in participatory experimentation. Understanding choices made by farmers in such experimentation processes is important to understand reasons why farmers in Tigray often hesitated to adopt recommended practices. Design/Methodology/Approach: A…
Selecting Models for Measuring Change When True Experimental Conditions Do Not Exist.
ERIC Educational Resources Information Center
Fortune, Jim C.; Hutson, Barbara A.
1984-01-01
Measuring change when true experimental conditions do not exist is a difficult process. This article reviews the artifacts of change measurement in evaluations and quasi-experimental designs, delineates considerations in choosing a model to measure change under nonideal conditions, and suggests ways to organize models to facilitate selection.…
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
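The control-chart construction that SPC relies on can be sketched minimally with an individuals (X) chart, whose 3-sigma limits are estimated from the average moving range. The data are hypothetical weekly measurements, not from the nursing study, and real applications usually estimate the limits from a baseline period rather than from all points as done here for brevity:

```python
import numpy as np

def individuals_chart(x):
    """Individuals (X) control chart: centre line at the mean, 3-sigma
    limits with sigma estimated as mean moving range / d2 (d2 = 1.128
    for subgroups of size 2). Returns which points signal special cause."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))              # moving ranges of successive points
    sigma = mr.mean() / 1.128
    centre = x.mean()
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
    special = (x > ucl) | (x < lcl)      # common cause vs. special cause
    return centre, lcl, ucl, special

# hypothetical weekly unit-level outcome scores; the last week follows
# a practice change and jumps well outside the usual variation
data = [5.1, 4.9, 5.2, 5.0, 4.8, 5.1, 5.0, 4.9, 8.6]
centre, lcl, ucl, special = individuals_chart(data)
```

Points inside the limits are treated as common cause variation; only the flagged point would be attributed to the intervention, which is the hypothesis-testing logic the article describes.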
PRACTICAL STRATEGIES FOR PROCESSING AND ANALYZING SPOTTED OLIGONUCLEOTIDE MICROARRAY DATA
Thoughtful data analysis is as important as experimental design, biological sample quality, and appropriate experimental procedures for making microarrays a useful supplement to traditional toxicology. In the present study, spotted oligonucleotide microarrays were used to profile...
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2005-07-01
The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 concerns validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.
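The screening step ("identify the variables that have the most impact") can be sketched with a two-level factorial main-effects calculation. The factor names, coded design, and response values below are invented stand-ins, not UTCHEM simulation outputs:

```python
import numpy as np
from itertools import product

def main_effects(X, y):
    """Main effect of each factor in a two-level (+/-1 coded) design:
    mean response at the high level minus mean at the low level."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    return np.array([y[X[:, j] > 0].mean() - y[X[:, j] < 0].mean()
                     for j in range(X.shape[1])])

# hypothetical screening factors: surfactant conc., polymer conc., salinity
X = np.array(list(product([-1.0, 1.0], repeat=3)))   # 2^3 full factorial
# hypothetical oil-recovery response with known half-effects 3, 0.5, -2
y = 40 + 3 * X[:, 0] + 0.5 * X[:, 1] - 2 * X[:, 2]
effects = main_effects(X, y)   # surfactant dominates; salinity hurts recovery
```

Ranking the absolute effects tells the framework which variables deserve a follow-up response-surface study and which can be fixed at nominal values.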
Huang, Dao-sheng; Shi, Wei; Han, Lei; Sun, Ke; Chen, Guang-bo; Wu, Jian-xiong; Xu, Gui-hong; Bi, Yu-an; Wang, Zhen-zhong; Xiao, Wei
2015-06-01
To optimize the belt drying process conditions for Gardeniae Fructus extract from Reduning injection by Box-Behnken design and response surface methodology, a three-factor, three-level Box-Behnken experimental design was employed on the basis of single-factor experiments. With drying temperature, drying time, and feeding speed as independent variables and the content of geniposide as the dependent variable, the experimental data were fitted to a second-order polynomial equation, establishing the mathematical relationship between the content of geniposide and the respective variables. With the experimental data analyzed by Design-Expert 8.0.6, the optimal drying parameters were as follows: drying temperature 98.5 degrees C, drying time 89 min, and feeding speed 99.8 r x min(-1). Three verification experiments were conducted under these conditions, and the measured average content of geniposide was 564.108 mg x g(-1), close to the model prediction of 563.307 mg x g(-1). According to the verification tests, the Gardeniae Fructus belt drying process is stable and feasible, and single-factor experiments combined with response surface methodology (RSM) can be used to optimize the drying technology of the Reduning injection Gardeniae Fructus extract.
Application of additive laser technologies in the gas turbine blades design process
NASA Astrophysics Data System (ADS)
Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.
2017-11-01
The emergence of modern innovative technologies requires developing new, and modernizing existing, design and production processes. This is especially relevant for designing the high-temperature turbines of gas turbine engines, whose development is characterized by a transition to higher working-medium parameters in order to improve efficiency. This article presents a design technique for gas turbine blades based on predictive verification of the thermal and hydraulic models of their cooling systems, by testing a blade prototype fabricated using selective laser melting. The technique was proven during development of the first-stage blade cooling system for the high-pressure turbine. An experimental procedure for verification of a thermal model of blades with convective cooling systems, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat, was developed. The technique makes it possible to obtain an experimentally tested blade version and to exclude its experimental adjustment after the start of mass production.
Experimental studies of two-stage centrifugal dust concentrator
NASA Astrophysics Data System (ADS)
Vechkanova, M. V.; Fadin, Yu M.; Ovsyannikov, Yu G.
2018-03-01
The article presents experimental results for a two-stage centrifugal dust concentrator, describes its design, and shows the development of an engineering calculation method and laboratory investigations. For the experiments, the authors used quartz, ceramic dust, and slag. Experimental dispersion analysis of the dust particles was obtained by the sedimentation method. To build a mathematical model of the dust-collection process, a central composite rotatable design of a four-factor experiment was used. The sequence of experiments was conducted in accordance with a table of random numbers, and conclusions were drawn.
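A central composite rotatable design of a four-factor experiment, as used in the abstract above, can be generated as follows; the number of centre runs is an assumption for illustration (the abstract does not state it):

```python
import numpy as np
from itertools import product

def ccd_rotatable(k, n_center=4):
    """Central composite rotatable design: 2^k factorial corners,
    2k axial (star) points at alpha = (2**k)**0.25 (the rotatability
    condition), plus centre replicates."""
    factorial = np.array(list(product([-1.0, 1.0], repeat=k)))
    alpha = (2 ** k) ** 0.25
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    centre = np.zeros((n_center, k))
    return np.vstack([factorial, axial, centre])

design = ccd_rotatable(4)   # 16 factorial + 8 axial + 4 centre = 28 runs
```

For four factors the rotatable axial distance works out to alpha = 2, so each coded factor is exercised at five levels (-2, -1, 0, +1, +2), enough to fit a full second-order model of the dust-collection process.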
NASA Technical Reports Server (NTRS)
1979-01-01
The feasibility of Union Carbide's silane process for commercial application was established. Integrated process designs for an experimental process system development unit and for a commercial facility were developed. The corresponding commercial plant's economic performance was then estimated.
Experimental research of radio-frequency ion thruster
NASA Astrophysics Data System (ADS)
Antropov, N. N.; Akhmetzhanov, R. V.; Bogatyy, A. V.; Grishin, R. A.; Kozhevnikov, V. V.; Plokhikh, A. P.; Popov, G. A.; Khartov, S. A.
2016-12-01
The article is devoted to research on a low-power (300 W) radio-frequency ion thruster designed at the Moscow Aviation Institute. The main results of experimental research on the thruster, using the test facility's power supplies and a power processing unit of the institute's own design, are presented. The dependence of the working fluid ionization cost on its mass flow rate at constant ion beam current was investigated experimentally. The influence of the shape and material of the discharge chamber on the integral characteristics of the thruster was studied. Recommendations on the optimization of the thruster's primary performance were developed based on the results of the experimental studies.
Isailović, Tanja; Ðorđević, Sanela; Marković, Bojan; Ranđelović, Danijela; Cekić, Nebojša; Lukić, Milica; Pantelić, Ivana; Daniels, Rolf; Savić, Snežana
2016-01-01
We aimed to develop lecithin-based nanoemulsions intended for effective aceclofenac (ACF) skin delivery, utilizing sucrose esters [sucrose palmitate (SP) and sucrose stearate (SS)] as additional stabilizers and penetration enhancers. To find the suitable surfactant mixtures and levels of process variables (homogenization pressure and number of cycles in the high-pressure homogenization manufacturing method) that result in drug-loaded nanoemulsions with minimal droplet size and narrow size distribution, a combined mixture-process experimental design was employed. Based on the optimization data, selected nanoemulsions were evaluated regarding morphology, surface charge, drug-excipient interactions, physical stability, and in vivo skin performance (skin penetration and irritation potential). The predicted physicochemical properties and storage stability proved satisfactory for ACF-loaded nanoemulsions containing 2% of SP in a blend with 0%-1% of SS and 1%-2% of egg lecithin (produced at 50°C/20 cycles/800 bar). Additionally, in vivo tape stripping demonstrated superior ACF skin absorption from these nanoemulsions, particularly from those containing 2% of SP, 0.5% of SS, and 1.5% of egg lecithin, compared with the sample costabilized by a conventional surfactant, polysorbate 80. In summary, the combined mixture-process experimental design was shown to be a feasible tool for formulation development of multisurfactant-based nanosized delivery systems with potentially improved overall product performance.
Title I preliminary engineering for: A. S. E. F. solid waste to methane gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1976-01-01
An assignment to provide preliminary engineering of an Advanced System Experimental Facility for production of methane gas from urban solid waste by anaerobic digestion is documented. The experimental facility will be constructed on an existing solid waste shredding and landfill facility in Pompano Beach, Florida. Information is included on: general description of the project; justification of basic need; process design; preliminary drawings; outline specifications; preliminary estimate of cost; and time schedules for accomplishment of design and construction. The preliminary cost estimate for the design and construction phases of the experimental program is $2,960,000, based on Dec. 1975 and Jan. 1976 costs. A time schedule of eight months to complete the detailed design, equipment procurement, and the award of subcontracts is given.
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
These workbooks are part of a Mexican series of instructional materials designed for Spanish speaking adults who are in the process of becoming literate or have recently become literate in their native language. These workbooks, designed to continue developing literacy skills, include pictures, dialogues, crossword puzzles, and fill-in-the blank…
Considerations for the Systematic Analysis and Use of Single-Case Research
ERIC Educational Resources Information Center
Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith
2012-01-01
Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…
Life Design Counseling Group Intervention with Portuguese Adolescents: A Process and Outcome Study
ERIC Educational Resources Information Center
Cardoso, Paulo; Janeiro, Isabel Nunes; Duarte, Maria Eduarda
2018-01-01
This article examines the process and outcome of a life design counseling group intervention with students in Grades 9 and 12. First, we applied a quasi-experimental methodology to analyze the intervention's effectiveness in promoting career certainty, career decision-making, self-efficacy, and career adaptability in a sample of 236 students.…
The Impact of WhatsApp Use on Success in Education Process
ERIC Educational Resources Information Center
Cetinkaya, Levent
2017-01-01
The purpose of this study is to explore the effects of WhatsApp use for education and determine the opinions of students towards the process. The study was designed in mixed research model which combines both qualitative and quantitative data. In the quantitative aspect of the study, quasi-experimental design, with a pretest-posttest control…
2011-09-01
NASA Technical Reports Server (NTRS)
1981-01-01
The engineering design, fabrication, assembly, operation, economic analysis, and process support R and D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of purchased equipment has been received and will be reshipped to the West Coast location. The data collection system is completed. In the area of melting/consolidation, melting and shotting on a pseudocontinuous basis is demonstrated for a system using silicon powder transfer. It is proposed to continue the very promising fluid-bed work.
Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes
NASA Astrophysics Data System (ADS)
Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.
2018-04-01
To provide a better understanding of pultrusion processes with or without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time-integration scheme and the nodal control volumes method has been developed. In the present study, its experimental validation is carried out with the developed cure sensors, which measure electrical resistivity and temperature on the profile surface. Through this verification process, the set of initial data used for simulating the pultrusion of a rod profile has been successfully corrected and finally defined.
Thermomechanical simulations and experimental validation for high speed incremental forming
NASA Astrophysics Data System (ADS)
Ambrogio, Giuseppina; Gagliardi, Francesco; Filice, Luigino; Romero, Natalia
2016-10-01
Incremental sheet forming (ISF) consists in deforming only a small region of the workpiece through a punch driven by an NC machine. The drawback of this process is its slowness. In this study, a high-speed variant has been investigated from both numerical and experimental points of view. The aim has been the design of an FEM model able to reproduce the material behavior during the high-speed process by defining a thermomechanical model. An experimental campaign has been performed on a CNC lathe at high speed to test process feasibility. The first results have shown that the material presents the same performance as in conventional-speed ISF and, in some cases, better material behavior due to the temperature increment. An accurate numerical simulation has been performed to investigate the material behavior during the high-speed process, substantially confirming the experimental evidence.
Using Learning Analytics to Characterize Student Experimentation Strategies in Engineering Design
ERIC Educational Resources Information Center
Vieira, Camilo; Goldstein, Molly Hathaway; Purzer, Senay; Magana, Alejandra J.
2016-01-01
Engineering design is a complex process both for students to participate in and for instructors to assess. Informed designers use the key strategy of conducting experiments as they test ideas to inform next steps. Conversely, beginning designers experiment less, often with confounding variables. These behaviours are not easy to assess in…
What are you trying to learn? Study designs and the appropriate analysis for your research question
USDA-ARS?s Scientific Manuscript database
One fundamental necessity in the entire process of a well-performed study is the experimental design. A well-designed study can help researchers understand and have confidence in their results and analyses, and additionally the agreement or disagreement with the stated hypothesis. This well-designed...
Tsipa, Argyro; Koutinas, Michalis; Usaku, Chonlatep; Mantalaris, Athanasios
2018-05-02
Currently, design and optimisation of biotechnological bioprocesses is performed either through exhaustive experimentation or with the use of empirical, unstructured growth kinetics models. Although elaborate systems biology approaches have recently been explored, mixed-substrate utilisation is predominantly ignored despite its significance in enhancing bioprocess performance. Herein, bioprocess optimisation for an industrially relevant bioremediation process involving a mixture of highly toxic substrates, m-xylene and toluene, was achieved through application of a novel experimental-modelling gene regulatory network - growth kinetic (GRN-GK) hybrid framework. The GRN model described the TOL and ortho-cleavage pathways in Pseudomonas putida mt-2 and captured the transcriptional kinetics expression patterns of the promoters. The GRN model informed the formulation of the growth kinetics model, replacing the empirical and unstructured Monod kinetics. The GRN-GK framework's predictive capability and potential as a systematic optimal bioprocess design tool were demonstrated by effectively predicting bioprocess performance in agreement with experimental values, whereas four commonly used models deviated significantly from the experimental values. Significantly, a fed-batch biodegradation process was designed and optimised through model-based control of TOL Pr promoter expression, resulting in 61% and 60% enhancements in pollutant removal and biomass formation, respectively, compared to the batch process. This provides strong evidence of model-based bioprocess optimisation at the gene level, rendering the GRN-GK framework a novel and applicable approach to optimal bioprocess design. Finally, model analysis using global sensitivity analysis (GSA) suggests an alternative, systematic approach to model-driven strain modification for synthetic biology and metabolic engineering applications. Copyright © 2018. Published by Elsevier Inc.
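The unstructured Monod kinetics that the GRN-informed model replaces can be sketched in a few lines. This is a generic illustration of batch Monod growth with hypothetical parameter values, not the paper's calibrated model:

```python
def monod_batch(mu_max, Ks, Y, X0, S0, t_end, dt=0.01):
    """Integrate unstructured Monod batch growth by forward Euler.

    dX/dt = mu(S) * X,   dS/dt = -mu(S) * X / Y,
    with specific growth rate mu(S) = mu_max * S / (Ks + S).
    Returns final biomass X and substrate S concentrations.
    """
    X, S, t = X0, S0, 0.0
    while t < t_end:
        mu = mu_max * S / (Ks + S)
        dX = mu * X * dt
        X, S = X + dX, max(S - dX / Y, 0.0)  # clamp S at exhaustion
        t += dt
    return X, S

# Hypothetical values: mu_max = 0.5 1/h, Ks = 0.2 g/L, Y = 0.5 g/g
X_end, S_end = monod_batch(0.5, 0.2, 0.5, X0=0.05, S0=10.0, t_end=5.0)
```

As long as the substrate is not exhausted, the Euler scheme conserves X + Y*S step by step, which gives a quick sanity check on the integration.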
Experimental Design and Interpretation of Functional Neuroimaging Studies of Cognitive Processes
Caplan, David
2008-01-01
This article discusses how the relation between experimental and baseline conditions in functional neuroimaging studies affects the conclusions that can be drawn from a study about the neural correlates of components of the cognitive system and about the nature and organization of those components. I argue that certain designs in common use—in particular the contrast of qualitatively different representations that are processed at parallel stages of a functional architecture—can never identify the neural basis of a cognitive operation and have limited use in providing information about the nature of cognitive systems. Other types of designs—such as ones that contrast representations that are computed in immediately sequential processing steps and ones that contrast qualitatively similar representations that are parametrically related within a single processing stage—are more easily interpreted. PMID:17979122
Design and validation of the eyesafe ladar testbed (ELT) using the LadarSIM system simulator
NASA Astrophysics Data System (ADS)
Neilsen, Kevin D.; Budge, Scott E.; Pack, Robert T.; Fullmer, R. Rees; Cook, T. Dean
2009-05-01
The development of an experimental full-waveform LADAR system has been enhanced with the assistance of the LadarSIM system simulation software. The Eyesafe LADAR Test-bed (ELT) was designed as a raster scanning, single-beam, energy-detection LADAR with the capability of digitizing and recording the return pulse waveform at up to 2 GHz for 3D off-line image formation research in the laboratory. To assist in the design phase, the full-waveform LADAR simulation in LadarSIM was used to simulate the expected return waveforms for various system design parameters, target characteristics, and target ranges. Once the design was finalized and the ELT constructed, the measured specifications of the system and experimental data captured from the operational sensor were used to validate the behavior of the system as predicted during the design phase. This paper presents the methodology used, and lessons learned from this "design, build, validate" process. Simulated results from the design phase are presented, and these are compared to simulated results using measured system parameters and operational sensor data. The advantages of this simulation-based process are also presented.
Exploring Experimental Design: An Excel-Based Simulation Using Steller Sea Lion Behavior
ERIC Educational Resources Information Center
Ryan, Wendy L.; St. Iago-McRae, Ezry
2016-01-01
Experimentation is the foundation of science and an important process for students to understand and experience. However, it can be difficult to teach some aspects of experimentation within the time and resource constraints of an academic semester. Interactive models can be a useful tool in bridging this gap. This freely accessible simulation…
Model-based high-throughput design of ion exchange protein chromatography.
Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo
2016-08-12
This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction, high throughput experimentation using experiments in (2) diluted conditions and (3) robotic automated liquid handling workstations (robotic workstation), and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine operating parameter space that allows for satisfactory purification of the protein of interest on the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work. 
We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate it has the ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.
Manufacturing of tailored tubes with a process integrated heat treatment
NASA Astrophysics Data System (ADS)
Hordych, Illia; Boiarkin, Viacheslav; Rodman, Dmytro; Nürnberger, Florian
2017-10-01
The usage of workpieces with tailored properties allows for reducing costs and materials. One example is tailored tubes, which can be used as end parts, e.g. in the automotive industry or in domestic applications, as well as semi-finished products for subsequent controlled deformation processes. An innovative technology to manufacture such tubes is roll forming with subsequent inductive heating and adapted quenching to obtain tailored properties in the longitudinal direction. This processing offers great potential for the production of tubes with a wide range of properties, although the novel approach still requires a suitable process design. Based on experimental data, a process simulation is being developed. The simulation shall be suitable for a virtual design of the tubes and allow for gaining a deeper understanding of the required processing. The proposed model shall predict microstructural and mechanical tube properties by considering process parameters, different geometries, batch-related influences, etc. A validation is carried out using experimental data from tubes manufactured from various steel grades.
Southern Regional Center for Lightweight Innovative Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Paul T.
The Southern Regional Center for Lightweight Innovative Design (SRCLID) has developed an experimentally validated cradle-to-grave modeling and simulation effort to optimize automotive components in order to decrease weight and cost, yet increase performance and safety in crash scenarios. In summary, the three major objectives of this project are accomplished: To develop experimentally validated cradle-to-grave modeling and simulation tools to optimize automotive and truck components for lightweighting materials (aluminum, steel, and Mg alloys and polymer-based composites) with consideration of uncertainty to decrease weight and cost, yet increase the performance and safety in impact scenarios; To develop multiscale computational models that quantify microstructure-property relations by evaluating various length scales, from the atomic through component levels, for each step of the manufacturing process for vehicles; and To develop an integrated K-12 educational program to educate students on lightweighting designs and impact scenarios. In this final report, we divided the content into two parts: the first part contains the development of building blocks for the project, including materials and process models, process-structure-property (PSP) relationship, and experimental validation capabilities; the second part presents the demonstration task for Mg front-end work associated with USAMP projects.
Design and experimental study on desulphurization process of ship exhaust
NASA Astrophysics Data System (ADS)
Han, Mingyang; Hao, Shan; Zhou, Junbo; Gao, Liping
2018-02-01
This desulfurization process involves removing sulfur oxides with seawater or alkaline aqueous solutions and then treating the effluent by aeration and pH adjustment before discharging it into the ocean. In the desulfurization system, the spray tower is the key equipment and the venturi tubes are the pretreatment device. The two stages of plates are designed to fully absorb sulfur oxides in exhaust gases. The spiral nozzles atomize and evenly spray the desulfurizers into the tower. This study experimentally investigated the effectiveness of this desulfurization process and the factors influencing it under laboratory conditions, with a diesel engine exhaust used to represent ship exhaust. The experimental results show that this process can effectively absorb the SO2 in the exhaust. When the exhaust flow rate was 25 m3/h and the desulfurizer flow rate was 4 L/min, the sulfur removal efficiency (SRE) reached 99.7%. The flow rate, alkalinity, and temperature of seawater were found to have significant effects on the SRE. Adjusting seawater flow rate (SWR) and alkalinity within certain ranges can substantially improve the SRE.
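The quoted sulfur removal efficiency follows directly from the inlet and outlet SO2 concentrations. A minimal sketch, with hypothetical concentrations chosen only to reproduce the reported 99.7% figure:

```python
def sulfur_removal_efficiency(c_in, c_out):
    """Sulfur removal efficiency (SRE): fraction of inlet SO2
    removed by the scrubber, expressed in percent."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical example: 600 ppm SO2 at the inlet, 1.8 ppm at the outlet
print(round(sulfur_removal_efficiency(600.0, 1.8), 1))  # 99.7
```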
Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen
2015-01-01
The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870
NASA Astrophysics Data System (ADS)
Boravelli, Sai Chandra Teja
This thesis mainly focuses on the design and process development of a downdraft biomass gasification process. The objective is to develop a gasifier and a gasification process for continuous steady-state operation. A lab-scale downdraft gasifier was designed to develop the process and obtain an optimum operating procedure. Sustainable and dependable sources such as biomass are potential sources of renewable energy, and there is reasonable motivation to use them in developing small-scale energy production plants in countries such as Canada, where wood stocks are a more reliable resource than fossil fuels. This thesis addresses the thermal conversion of biomass in a downdraft gasification reactor. Downdraft biomass gasifiers are relatively cheap and easy to operate because of their design. We constructed a simple biomass gasifier to study the steady-state process for different reactor sizes. The experimental part of this investigation looks at how operating conditions such as feed rate, air flow, bed length, reactor vibration, and the height and density of the syngas flame in the combustion flare change with reactor size. The experimental results also compare the trends of tar, char, and syngas production for wood pellets in a steady-state process. The study further examines the gasification of different wood feedstocks, comparing how the shape, size, and moisture content of each feedstock affect the operating conditions of the gasification process. For this, Six Sigma DMAIC techniques were used to analyze and understand how each feedstock makes a significant impact on the process.
Bayesian experimental design for models with intractable likelihoods.
Drovandi, Christopher C; Pettitt, Anthony N
2013-12-01
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables. © 2013, The International Biometric Society.
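The ABC rejection step described above can be sketched generically: draw parameters from the prior, simulate data, and keep only draws whose summary statistic lands close to the observed one. The toy exponential-waiting-time model, uniform prior, and tolerance below are illustrative assumptions, not the paper's epidemic or macroparasite models:

```python
import random

def abc_rejection(observed_stat, simulate, prior_sample, eps, n_draws=20000):
    """ABC rejection sampler: keep prior draws whose simulated summary
    statistic falls within eps of the observed one. The accepted draws
    approximate the posterior without any likelihood evaluation."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed_stat) < eps:
            accepted.append(theta)
    return accepted

random.seed(1)
# Toy model: observations are exponential waiting times with rate theta;
# the summary statistic is the sample mean of 50 waiting times.
true_theta = 2.0
obs_mean = sum(random.expovariate(true_theta) for _ in range(50)) / 50

posterior = abc_rejection(
    observed_stat=obs_mean,
    simulate=lambda th: sum(random.expovariate(th) for _ in range(50)) / 50,
    prior_sample=lambda: random.uniform(0.1, 10.0),  # vague uniform prior
    eps=0.05,
)
estimate = sum(posterior) / len(posterior)
```

In the paper's setting the same accepted draws would feed the precision-based utility used to rank candidate designs; here they simply yield a posterior mean near the true rate.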
NASA Technical Reports Server (NTRS)
Kuo, Kenneth K.; Lu, Y. C.; Chiaverini, Martin J.; Harting, George C.
1994-01-01
An experimental study on the fundamental processes involved in fuel decomposition and boundary layer combustion in hybrid rocket motors is being conducted at the High Pressure Combustion Laboratory of the Pennsylvania State University. This research should provide an engineering technology base for development of large scale hybrid rocket motors as well as a fundamental understanding of the complex processes involved in hybrid propulsion. A high pressure slab motor has been designed for conducting experimental investigations. Oxidizer (LOX or GOX) is injected through the head-end over a solid fuel (HTPB) surface. Experiments using fuels supplied by NASA designated industrial companies will also be conducted. The study focuses on the following areas: measurement and observation of solid fuel burning with LOX or GOX, correlation of solid fuel regression rate with operating conditions, measurement of flame temperature and radical species concentrations, determination of the solid fuel subsurface temperature profile, and utilization of experimental data for validation of a companion theoretical study also being conducted at PSU.
NASA Technical Reports Server (NTRS)
1992-01-01
The Space Station Furnace Facility (SSFF) is a modular facility for materials research in the microgravity environment of the Space Station Freedom (SSF). The SSFF is designed for crystal growth and solidification research in the fields of electronic and photonic materials, metals and alloys, and glasses and ceramics and will allow for experimental determination of the role of gravitational forces in the solidification process. The facility will provide a capability for basic scientific research and will evaluate the commercial viability of low-gravity processing of selected technologically important materials. The facility is designed to support a complement of furnace modules as outlined in the Science Capabilities Requirements Document (SCRD). The SSFF is a three rack facility that provides the functions, interfaces, and equipment necessary for the processing of the furnaces and consists of two main parts: the SSFF Core Rack and the two Experiment Racks. The facility is designed to accommodate two experimenter-provided furnace modules housed within the two experiment racks, and is designed to operate these two furnace modules simultaneously. The SCRD specifies a wide range of furnace requirements and serves as the basis for the SSFF conceptual design. SSFF will support automated processing during the man-tended operations and is also designed for crew interface during the permanently manned configuration. The facility is modular in design and facilitates changes as required, so the SSFF is adept to modifications, maintenance, reconfiguration, and technology evolution.
NASA Astrophysics Data System (ADS)
Koksal, Ela Ayse; Berberoglu, Giray
2014-01-01
The purpose of this study is to investigate the effectiveness of a guided-inquiry approach in science classes, relative to the existing science and technology curriculum, in developing content-based science achievement, science process skills, and attitudes toward science of grade 6 students in Turkey. A non-equivalent control group quasi-experimental design was used to investigate the treatment effect. There were 162 students in the experimental group and 142 students in the control group. Both the experimental and control group students took the Achievement Test in Reproduction, Development, and Growth in Living Things (RDGLT), the Science Process Skills Test, and the Attitudes Toward Science Questionnaire as pre-test and post-test. A repeated-measures analysis of variance design was used in analyzing the data. Both the experimental and control group students were taught the RDGLT units for 22 class hours. The results indicated a positive effect of the guided-inquiry approach on the Turkish students' cognitive as well as affective characteristics. The guided inquiry enhanced the experimental group students' understanding of the science concepts, as well as their inquiry skills, more than the control group students'. Similarly, the experimental group students improved their attitudes toward science more than the control group students as a result of the treatment. Guided inquiry appears to be a transition between traditional teaching methods and student-centred activities in Turkish schools.
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
These workbooks are part of a Mexican series of instructional materials designed for Spanish speaking adults who are in the process of becoming literate or have recently become literate in their native language. The workbooks are designed to teach skills needed to manage ordinary financial transactions and daily tasks requiring a knowledge of…
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
These workbooks are part of a Mexican series of instructional materials designed for Spanish speaking adults who are in the process of becoming literate or have recently become literate in their native language. The workbooks, divided in two volumes, are designed to teach skills required in managing ordinary financial transactions and daily tasks…
EXPERIMENTAL MOLTEN-SALT-FUELED 30-Mw POWER REACTOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexander, L.G.; Kinyon, B.W.; Lackey, M.E.
1960-03-24
A preliminary design study was made of an experimental molten-salt- fueled power reactor. The reactor considered is a single-region homogeneous burner coupled with a Loeffler steam-generating cycle. Conceptual plant layouts, basic information on the major fuel circuit components, a process flowsheet, and the nuclear characteristics of the core are presented. The design plant electrical output is 10 Mw, and the total construction cost is estimated to be approximately ,000,000. (auth)
The Evolution of a Connectionist Model of Situated Human Language Understanding
NASA Astrophysics Data System (ADS)
Mayberry, Marshall R.; Crocker, Matthew W.
The Adaptive Mechanisms in Human Language Processing (ALPHA) project features both experimental and computational tracks designed to complement each other in the investigation of the cognitive mechanisms that underlie situated human utterance processing. The models developed in the computational track replicate results obtained in the experimental track and, in turn, suggest further experiments by virtue of behavior that arises as a by-product of their operation.
Gronau, Greta; Krishnaji, Sreevidhya T; Kinahan, Michelle E; Giesa, Tristan; Wong, Joyce Y; Kaplan, David L; Buehler, Markus J
2012-11-01
Tailored biomaterials with tunable functional properties are desirable for many applications ranging from drug delivery to regenerative medicine. To improve the predictability of biopolymer materials functionality, multiple design parameters need to be considered, along with appropriate models. In this article we review the state of the art of synthesis and processing related to the design of biopolymers, with an emphasis on the integration of bottom-up computational modeling in the design process. We consider three prominent examples of well-studied biopolymer materials - elastin, silk, and collagen - and assess their hierarchical structure, intriguing functional properties and categorize existing approaches to study these materials. We find that an integrated design approach in which both experiments and computational modeling are used has rarely been applied for these materials due to difficulties in relating insights gained on different length- and time-scales. In this context, multiscale engineering offers a powerful means to accelerate the biomaterials design process for the development of tailored materials that suit the needs posed by the various applications. The combined use of experimental and computational tools has a very broad applicability not only in the field of biopolymers, but can be exploited to tailor the properties of other polymers and composite materials in general. Copyright © 2012 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Gosper, Maree
2014-01-01
This paper introduces the MAPLET framework that was developed to map and link teaching aims, learning processes, learner expertise and technologies. An experimental study with 65 participants is reported to test the effectiveness of the framework as a guide to the design of lessons embedded within larger units of study. The findings indicate the…
Evaluating two process scale chromatography column header designs using CFD.
Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris
2014-01-01
Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs. © 2014 American Institute of Chemical Engineers.
Can Reflection Boost Competences Development in Organizations?
ERIC Educational Resources Information Center
Nansubuga, Florence; Munene, John C.; Ntayi, Joseph M.
2015-01-01
Purpose: The purpose of this paper is to examine the gaps in some existing competence frameworks and investigate the power of reflection on one's behavior to improve the process of the competences development. Design/methodology/approach: The authors used a correlational design and a quasi-experimental non-equivalent group design involving a…
Strategic Teaching: Student Learning through Working the Process
ERIC Educational Resources Information Center
Spanbroek, Nancy
2010-01-01
The designers of our future built environment must possess intellectual tools that allow them to be disciplined, flexible and analytical thinkers, able to address and resolve new and complex problems. In response, an experimental and collaborative design studio was created to inspire and build on students' knowledge and their creative…
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Process factors of enzyme concentration, time, power and frequency were investigated for ultrasound-enhanced bioscouring of greige cotton. A fractional factorial experimental design and subsequent regression analysis of the process factors were employed to determine the significance of each factor a...
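As a hedged illustration of the approach this abstract describes (not the study's actual data or factor levels), a 2^(4-1) fractional factorial for four coded factors and a least-squares fit of the main effects can be sketched as:

```python
import itertools
import numpy as np

# Hypothetical sketch (not the study's data): a 2^(4-1) fractional
# factorial for four factors -- enzyme concentration, time, power,
# frequency -- built with the generator D = ABC, followed by an
# ordinary-least-squares fit of the main effects.
base = np.array(list(itertools.product([-1, 1], repeat=3)))  # A, B, C
D = (base[:, 0] * base[:, 1] * base[:, 2]).reshape(-1, 1)    # D = ABC
X = np.hstack([base, D])                                     # 8 runs

# Synthetic response: strong effect of factor A, weak effect of B,
# none of C or D, plus small measurement noise.
rng = np.random.default_rng(0)
y = 10 + 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 8)

# Fit y = b0 + b1*A + b2*B + b3*C + b4*D.
design = np.hstack([np.ones((8, 1)), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
effects = dict(zip(["intercept", "A", "B", "C", "D"], coef.round(2)))
print(effects)
```

Because the coded design columns are orthogonal, each coefficient is estimated independently of the others; with the generator D = ABC, main effects are aliased only with three-factor interactions.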
Van Daele, Timothy; Gernaey, Krist V; Ringborg, Rolf H; Börner, Tim; Heintz, Søren; Van Hauwermeiren, Daan; Grey, Carl; Krühne, Ulrich; Adlercreutz, Patrick; Nopens, Ingmar
2017-09-01
The aim of model calibration is to estimate unique parameter values from available experimental data, here applied to a biocatalytic process. The traditional approach of first gathering data followed by performing a model calibration is inefficient, since the information gathered during experimentation is not actively used to optimize the experimental design. By applying an iterative robust model-based optimal experimental design, the limited amount of data collected is used to design additional informative experiments. The algorithm is used here to calibrate the initial reaction rate of an ω-transaminase catalyzed reaction in a more accurate way. The parameter confidence region estimated from the Fisher Information Matrix is compared with the likelihood confidence region, which is not only more accurate but also a computationally more expensive method. As a result, an important deviation between both approaches is found, confirming that linearization methods should be applied with care for nonlinear models. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:1278-1293, 2017. © 2017 American Institute of Chemical Engineers.
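A minimal sketch of the Fisher-Information route to parameter confidence that the abstract contrasts with likelihood regions, using a generic Michaelis-Menten initial-rate model rather than the paper's ω-transaminase kinetics (all values illustrative):

```python
import numpy as np

# Hedged sketch (illustrative, not the paper's model): asymptotic
# parameter confidence for v = Vmax * S / (Km + S) from the Fisher
# Information Matrix, linearized around the parameter estimate.
S = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])  # substrate levels
Vmax, Km, sigma = 2.0, 3.0, 0.05                # estimate + noise sd

# Sensitivity Jacobian: dv/dVmax and dv/dKm at each substrate level.
J = np.column_stack([S / (Km + S),                # dv/dVmax
                     -Vmax * S / (Km + S) ** 2])  # dv/dKm

FIM = J.T @ J / sigma**2   # Fisher Information Matrix
cov = np.linalg.inv(FIM)   # asymptotic parameter covariance
se = np.sqrt(np.diag(cov)) # standard errors for (Vmax, Km)
print(se)
```

The caveat in the abstract applies here too: `cov` is only the linearized, asymptotic covariance and can deviate substantially from likelihood-based confidence regions when the model is strongly nonlinear in its parameters.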
NASA Technical Reports Server (NTRS)
Agnone, A. M.
1972-01-01
The factors affecting a tangential fuel injector design for scramjet operation are reviewed and their effect on the efficiency of the supersonic combustion process is evaluated using both experimental data and theoretical predictions. A description of the physical problem of supersonic combustion and method of analysis is followed by a presentation and evaluation of some standard and exotic types of fuel injectors. Engineering fuel injector design criteria and hydrogen ignition schemes are presented along with a cursory review of available experimental data. A two-dimensional tangential fuel injector design is developed using analyses as a guide in evaluating the effects on the combustion process of various initial and boundary conditions including splitter plate thickness, injector wall temperature, pressure gradients, etc. The fuel injector wall geometry is shaped so as to maintain approximately constant pressure at the flame as required by a cycle analysis. A viscous characteristics program which accounts for lateral as well as axial pressure variations due to the mixing and combustion process is used in determining the wall geometry.
NASA Astrophysics Data System (ADS)
Miyajima, Hiroyuki; Yuhara, Naohiro
Regenerative Life Support Systems (RLSS), which maintain human lives by recycling the substances essential for living, comprise humans, plants, and material circulation systems. The plants supply food to the humans and regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate the substances discarded by humans and plants. RLSS has attracted attention as manned space activities shift from short trips to long-term stays at bases such as a space station, a lunar base, or a Mars base. The typical present-day space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where the RLSS recycles only water and air. To accommodate prolonged and extended manned activity at future space bases, an RLSS that uses plants to provide both food production and resource regeneration is expected. The configuration of an RLSS should be designed to suit its particular mission, which may give rise to design requirements for RLSS configurations without precedent. Accordingly, it is necessary to establish a conceptual design method for a generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground-experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) in Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) in the U.S., and BIOS-3 in Russia. For these reasons, a conceptual design method that does not rely on previous design examples is required for a generalized RLSS. This study formalizes such a conceptual design process and develops a conceptual design support tool for RLSS based on it.
Materials-by-design: computation, synthesis, and characterization from atoms to structures
NASA Astrophysics Data System (ADS)
Yeo, Jingjie; Jung, Gang Seob; Martín-Martínez, Francisco J.; Ling, Shengjie; Gu, Grace X.; Qin, Zhao; Buehler, Markus J.
2018-05-01
In the 50 years since Richard Feynman’s exposition of the idea that there is ‘plenty of room at the bottom’ for manipulating individual atoms in the synthesis and manufacture of materials, the materials-by-design paradigm has been developed gradually through the synergistic integration of experimental material synthesis and characterization with predictive computational modeling and optimization. This paper reviews how this paradigm makes it possible to develop materials according to specific, rational designs from the molecular to the macroscopic scale. We discuss promising techniques in experimental small-scale material synthesis and large-scale fabrication methods for manipulating atomistic or macroscale structures, which can be designed by computational modeling. These include recombinant protein technology to produce peptides and proteins with tailored sequences encoded by recombinant DNA, self-assembly processes induced by conformational transitions of proteins, additive manufacturing for designing complex structures, and qualitative and quantitative characterization of materials at different length scales. We describe important material characterization techniques using numerous methods of spectroscopy and microscopy. We detail the multiscale computational modeling techniques that complement these experimental techniques: DFT at the atomistic scale; fully atomistic and coarse-grained molecular dynamics from the molecular to the mesoscale; and continuum modeling at the macroscale. Additionally, we present case studies that use experimental and computational approaches in an integrated manner to broaden our understanding of the properties of two-dimensional materials and materials based on silk and silk-elastin-like proteins.
Using system dynamics for collaborative design: a case study
Elf, Marie; Putilova, Mariya; von Koch, Lena; Öhrn, Kerstin
2007-01-01
Background: In order to facilitate collaborative design, system dynamics (SD) with a group modelling approach was used in the early stages of planning a new stroke unit. During six workshops an SD model was created by a multiprofessional group. Aim: To explore to what extent, and how, the use of system dynamics contributed to the collaborative design process. Method: A case study was conducted using several data sources. Results: SD supported collaborative design by facilitating an explicit description of the stroke care process, a dialogue, and a joint understanding. The construction of the model obliged the group to conceptualise stroke care, and experimentation with the model gave the opportunity to reflect on care. Conclusion: SD facilitated the collaborative design process and should be integrated into the early stages of the design process as a quality improvement tool. PMID:17683519
Seat pressure measurement technologies: considerations for their evaluation.
Gyi, D E; Porter, J M; Robertson, N K
1998-04-01
Interface pressure measurement has generated interest in the automotive industry as a technique which could be used in the prediction of driver discomfort for various car seat designs, and provide designers and manufacturers with rapid information early on in the design process. It is therefore essential that the data obtained are of the highest quality, relevant and have some quantitative meaning. Exploratory experimental work carried out with the commercially available Talley Pressure Monitor is outlined. This led to a better understanding of the strengths and weaknesses of this system and the re-design of the sensor matrix. Such evaluation, in the context of the actual experimental environment, is considered essential.
A processing centre for the CNES CE-GPS experimentation
NASA Technical Reports Server (NTRS)
Suard, Norbert; Durand, Jean-Claude
1994-01-01
CNES is involved in a GPS (Global Positioning System) geostationary overlay experiment. The purpose of this experiment is to test various new techniques in order to select the optimal station synchronization method, as well as the geostationary spacecraft orbitography method. These new techniques are needed to develop Ranging GPS Integrity Channel services. The CNES experiment includes three transmitting/receiving ground stations (manufactured by IN-SNEC), one INMARSAT 2 C/L-band transponder and a processing centre named STE (Station de Traitements de l'Experimentation). Not all the techniques to be tested are implemented, but the experimental system has to include several functions: some of the future system's simulation functions, such as a servo-loop function, and in particular a data collection function providing rapid monitoring of system operation, analysis of existing ground station processes, and several weeks of data coverage for other scientific studies. This paper discusses the system architecture and some criteria used in its design, the monitoring function, the approach used to develop a low-cost, short-life processing centre in collaboration with a CNES subcontractor (ATTDATAID), and some results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, B; Keall, P; Holloway, L
Purpose: MRI-guided radiation therapy (MRIgRT) is a rapidly growing field; however, Linac operation in MRI fringe fields represents an ongoing challenge. We have previously shown in-silico that Linacs could be redesigned to function in the in-line orientation with no magnetic shielding by adopting an RF-gun configuration. Other authors have also published in-silico studies of Linac operation in magnetic fields; however, to date no experimental validation data have been published. This work details the design, construction, and installation of an experimental beam line to validate our in-silico results. Methods: An RF gun comprising 1.5 accelerating cells and capable of generating electron energies up to 3.2 MeV is used. The experimental apparatus was designed to monitor beam current (toroid current monitor) and spot size (two phosphor screens with viewports), and to generate peak magnetic fields of at least 1000 G (three variable-current electromagnetic coils). Thermal FEM simulations were developed to ensure the coil temperature remained below 100 °C. Other design considerations included beam disposal, vacuum maintenance, radiation shielding, earthquake safety, and machine protection interlocks. Results: The beam line has been designed, built, and installed in a radiation-shielded bunker. Water cooling, power supplies, thermocouples, cameras, and radiation shielding have been successfully connected and tested. Interlock testing, vacuum processing, and RF processing have been successfully completed. First beam is expected within weeks. The coil heating simulations show that, with care, peak fields of up to 1200 G (320 G at the cathode) can be produced using 40 A of current, which is well within the fields expected for MRI-Linac systems. The maximum coil temperature at this current was 84 °C after 6 minutes.
Conclusion: An experimental beam line has been constructed and installed at SLAC in order to experimentally characterise RF-gun performance in in-line magnetic fields, validate in-silico design work, and provide the first published experimental data relating to accelerator functionality for MRIgRT.
Impact of Process Protocol Design on Virtual Team Effectiveness
ERIC Educational Resources Information Center
Cordes, Christofer Sean
2013-01-01
This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…
Making Mentoring Stick: A Case Study
ERIC Educational Resources Information Center
Karallis, Takis; Sandelands, Eric
2009-01-01
Purpose: This paper seeks to provide a case study of the mentoring process within Kentz Engineers & Constructors. Design/methodology/approach: The paper reflects the experiences of those leading the mentoring process within Kentz with insights extracted from a process of action, reflection and live experimentation. Findings: The paper…
Biochemical Process Development and Integration | Bioenergy | NREL
Process Development: We develop and scale fermentation processes that produce fuels and chemicals from … guide experimental designs. Our newly updated fermentation laboratory houses 38 bench-scale fermentors; current projects cover the fermentation spectrum, including anaerobic, micro-aerobic, aerobic, and gas-to…
The effects of DRIE operational parameters on vertically aligned micropillar arrays
NASA Astrophysics Data System (ADS)
Miller, Kane; Li, Mingxiao; Walsh, Kevin M.; Fu, Xiao-An
2013-03-01
Vertically aligned silicon micropillar arrays have been created by deep reactive ion etching (DRIE) and used for a number of microfabricated devices including microfluidic devices, micropreconcentrators and photovoltaic cells. This paper delineates an experimental design performed on the Bosch process of DRIE of micropillar arrays. The arrays are fabricated with direct-write optical lithography without photomask, and the effects of DRIE process parameters, including etch cycle time, passivation cycle time, platen power and coil power on profile angle, scallop depth and scallop peak-to-peak distance are studied by statistical design of experiments. Scanning electron microscope images are used for measuring the resultant profile angles and characterizing the scalloping effect on the pillar sidewalls. The experimental results indicate the effects of the determining factors, etch cycle time, passivation cycle time and platen power, on the micropillar profile angles and scallop depths. An optimized DRIE process recipe for creating nearly 90° and smooth surface (invisible scalloping) has been obtained as a result of the statistical design of experiments.
Thomas, Philipp; Rammsayer, Thomas; Schweizer, Karl; Troche, Stefan
2015-01-01
Numerous studies reported a strong link between working memory capacity (WMC) and fluid intelligence (Gf), although views differ with respect to how closely these two constructs are related to each other. In the present study, we used a WMC task with five levels of task demands to assess the relationship between WMC and Gf by means of a new methodological approach referred to as fixed-links modeling. Fixed-links models belong to the family of confirmatory factor analysis (CFA) models and are of particular interest for experimental, repeated-measures designs. With this technique, processes systematically varying across task conditions can be disentangled from processes unaffected by the experimental manipulation. Proceeding from the assumption that experimental manipulation in a WMC task leads to increasing demands on WMC, the processes systematically varying across task conditions can be assumed to be WMC-specific. Processes not varying across task conditions, on the other hand, are probably independent of WMC. Fixed-links models allow for representing these two kinds of processes by two independent latent variables. In contrast to traditional CFA, where a common latent variable is derived from the different task conditions, fixed-links models facilitate a more precise or purified representation of the WMC-related processes of interest. By using fixed-links modeling to analyze data from 200 participants, we identified a non-experimental latent variable, representing processes that remained constant irrespective of the WMC task conditions, and an experimental latent variable which reflected processes that varied as a function of experimental manipulation. This latter variable represents the increasing demands on WMC and, hence, was considered a purified measure of WMC controlled for the constant processes. Fixed-links modeling showed that both the purified measure of WMC (β = .48) and the constant processes involved in the task (β = .45) were related to Gf.
Taken together, these two latent variables explained the same portion of variance in Gf as a single latent variable obtained by traditional CFA (β = .65), indicating that traditional CFA overestimates the effective relationship between WMC and Gf. Thus, fixed-links modeling provides a feasible method for a more valid investigation of the functional relationship between specific constructs.
Research on an autonomous vision-guided helicopter
NASA Technical Reports Server (NTRS)
Amidi, Omead; Mesaki, Yuji; Kanade, Takeo
1994-01-01
Integration of computer vision with on-board sensors to autonomously fly helicopters was researched. The key components developed were custom designed vision processing hardware and an indoor testbed. The custom designed hardware provided flexible integration of on-board sensors with real-time image processing resulting in a significant improvement in vision-based state estimation. The indoor testbed provided convenient calibrated experimentation in constructing real autonomous systems.
Phenomena induced by charged particle beams. [experimental design for Spacelab
NASA Technical Reports Server (NTRS)
Beghin, C.
1981-01-01
The injection of energetic particles along the Earth's magnetic field lines is a possible remote sensing method for measuring the electric fields parallel to the magnetic field with good time resolution over the entire magnetic field. Neutralization processes, return-current effects, dynamics of the beams, triggered instabilities, and waves must be investigated before the fundamental question about proper experimental conditions, such as energy, intensity and divergence of the beams, pitch-angle injection, ion species, proper probes and detectors and their location, and rendezvous conditions, can be resolved. An experiment designed to provide a better understanding of these special physical processes and to provide some answers to questions concerning beam injection techniques is described.
Execution Of Systems Integration Principles During Systems Engineering Design
2016-09-01
This thesis discusses integration failures observed in DOD and non-DOD systems, such as inadequate stakeholder analysis, incomplete problem space and design … design, development, test and deployment of a system. A lifecycle structure consists of phases within a methodology or process model. There are many … investigate design decisions without the need to commit to physical forms; “experimental investigation using a model yields design or operational…”
Large-scale Rectangular Ruler Automated Verification Device
NASA Astrophysics Data System (ADS)
Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie
2018-03-01
This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive, the fixture and the wheels. The control system design covers hardware and software: the hardware is mainly a single-chip microcomputer system, and the software implements the photoelectric autocollimator measurement process and the automatic data acquisition process. The device can acquire verticality measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.
Chanona, J; Ribes, J; Seco, A; Ferrer, J
2006-01-01
This paper presents a model- and knowledge-based algorithm for optimising the design and operation of the primary sludge fermentation process, a recently adopted method for obtaining the volatile fatty acids (VFA) needed to improve biological nutrient removal processes directly from raw wastewater. The proposed algorithm is a heuristic reasoning algorithm based on expert knowledge of the process. Only the effluent VFA and the sludge blanket height (SBH) have to be set as design criteria, and the optimisation algorithm obtains the minimum return-sludge and waste-sludge flow rates that fulfil those criteria. A pilot plant fed with municipal raw wastewater was operated to obtain experimental results supporting the developed algorithm's groundwork. The experimental results indicate that when the SBH was increased, a higher solids retention time was obtained in the settler and VFA production increased; higher recirculation flow rates also resulted in higher VFA production. Finally, the developed algorithm was tested by simulating different design conditions, with very good results: it was able to find the optimal operating conditions in all cases in which the preset design criteria could be achieved. Furthermore, this is a general algorithm that can be applied to any fermentation-elutriation scheme, with or without a fermentation reactor.
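A heavily simplified sketch of the kind of criteria-driven heuristic the abstract describes. The process model below is an invented monotone stand-in, not the paper's fermentation-elutriation model, and the flow-rate names, responses and step size are illustrative only:

```python
# Toy heuristic: find the minimum return-sludge and waste-sludge flow
# rates that satisfy preset design criteria (effluent VFA and sludge
# blanket height). The process model is a made-up monotone stand-in.
def toy_process(q_return, q_waste):
    # Fictitious monotone responses: more recirculation -> more VFA;
    # more wasting -> lower sludge blanket height (SBH).
    vfa = 20.0 + 0.5 * q_return
    sbh = 3.0 - 0.02 * q_waste
    return vfa, sbh

def heuristic_design(vfa_target, sbh_max, step=1.0):
    q_return, q_waste = 0.0, 0.0
    vfa, sbh = toy_process(q_return, q_waste)
    while vfa < vfa_target:   # raise recirculation until VFA criterion met
        q_return += step
        vfa, sbh = toy_process(q_return, q_waste)
    while sbh > sbh_max:      # raise wasting until SBH criterion met
        q_waste += step
        vfa, sbh = toy_process(q_return, q_waste)
    return q_return, q_waste

print(heuristic_design(vfa_target=30.0, sbh_max=2.0))
```

The real algorithm reasons over expert rules rather than a fixed step search, but the structure is the same: adjust the two flow rates until both design criteria are just satisfied, yielding the minimum flows.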
Experimental verification of an indefinite causal order
Rubino, Giulia; Rozema, Lee A.; Feix, Adrien; Araújo, Mateus; Zeuner, Jonas M.; Procopio, Lorenzo M.; Brukner, Časlav; Walther, Philip
2017-01-01
Investigating the role of causal order in quantum mechanics has recently revealed that the causal relations of events may not be a priori well defined in quantum theory. Although this has triggered a growing interest on the theoretical side, creating processes without a causal order is an experimental task. We report the first decisive demonstration of a process with an indefinite causal order. To do this, we quantify how incompatible our setup is with a definite causal order by measuring a “causal witness.” This mathematical object incorporates a series of measurements that are designed to yield a certain outcome only if the process under examination is not consistent with any well-defined causal order. In our experiment, we perform a measurement in a superposition of causal orders—without destroying the coherence—to acquire information both inside and outside of a “causally nonordered process.” Using this information, we experimentally determine a causal witness, demonstrating by almost 7 SDs that the experimentally implemented process does not have a definite causal order. PMID:28378018
The Rocket Equation Improvement under ICF Implosion Experiment
NASA Astrophysics Data System (ADS)
Wang, Yanbin; Zheng, Zhijian
2013-10-01
The ICF implosion process has been studied in detail, and the rocket equation has been improved for the implosion process by introducing the fuel pressure as a parameter. Several design guidelines can be drawn from the improved rocket equation and used to improve ICF target design, driving-pulse design and experimental design: first, increase the ablation pressure; second, decrease the fuel pressure; third, use a larger target sphere diameter; and fourth, shorten the driving pulse.
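For context, the classical ablative rocket equation that the abstract starts from can be written as follows; the paper's fuel-pressure correction term is not given in the abstract, so only the standard form is sketched here:

```latex
% Classical ablative rocket equation (standard starting point; the
% paper's fuel-pressure correction is not reproduced here):
u(t) = V_{\mathrm{ex}} \,\ln\frac{m_0}{m(t)}
```

where \(m_0\) is the initial shell mass, \(m(t)\) the remaining (unablated) mass, and \(V_{\mathrm{ex}}\) the effective exhaust velocity of the ablated material.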
Experimental system design for the integration of trapped-ion and superconducting qubit systems
NASA Astrophysics Data System (ADS)
De Motte, D.; Grounds, A. R.; Rehák, M.; Rodriguez Blanco, A.; Lekitsch, B.; Giri, G. S.; Neilinger, P.; Oelsner, G.; Il'ichev, E.; Grajcar, M.; Hensinger, W. K.
2016-12-01
We present a design for the experimental integration of ion trapping and superconducting qubit systems as a step towards the realization of a quantum hybrid system. The scheme addresses two key difficulties in realizing such a system: a combined microfabricated ion trap and superconducting qubit architecture, and the experimental infrastructure to facilitate both technologies. Developing upon work by Kielpinski et al. (Phys Rev Lett 108(13):130504, 2012. doi: 10.1103/PhysRevLett.108.130504), we describe the design, simulation and fabrication process for a microfabricated ion trap capable of coupling an ion to a superconducting microwave LC circuit with a coupling strength in the tens of kHz. We also describe existing difficulties in combining the experimental infrastructure of an ion trapping set-up into a dilution refrigerator with superconducting qubits and present solutions that can be immediately implemented using current technology.
An analytical method for designing low noise helicopter transmissions
NASA Technical Reports Server (NTRS)
Bossler, R. B., Jr.; Bowes, M. A.; Royal, A. C.
1978-01-01
The development and experimental validation of a method for analytically modeling the noise mechanism in the helicopter geared power transmission systems is described. This method can be used within the design process to predict interior noise levels and to investigate the noise reducing potential of alternative transmission design details. Examples are discussed.
Zhang, Xiaoming; Liu, Chang; Chen, Jinxiang; Zhang, Jiandong; Gu, Yueyan; Zhao, Yong
2016-12-01
The influence mechanism of processing holes on the flexural properties of fully integrated honeycomb plates (FIHPs) was analyzed using the finite element method (FEM), and the results were compared with experimental data, yielding the following findings: 1) Processing holes under tensile stress have a significant impact on the mechanical properties of FIHPs, which is particularly obvious when initial imperfections are formed during sample preparation. 2) A proposed design technique based on changing the shape of the processing holes from circular to elliptical effectively reduces the stress concentration when such holes must exist in skin or components under tension, and this method motivates a design concept for experimental tests of FIHPs bearing dynamic or fatigue loads. 3) The flexural failure modes of FIHPs were confirmed via FEM analysis, and the mechanism by which trabeculae in FIHPs can effectively prevent cracks from emerging and cause cracks to develop along certain paths was ascertained. Therefore, this paper provides a theoretical basis for the design of processing holes in bionic honeycomb plates and other similar components in practical engineering applications. Copyright © 2016 Elsevier B.V. All rights reserved.
Technique for experimental determination of radiation interchange factors in solar wavelengths
NASA Technical Reports Server (NTRS)
Bobco, R. P.; Nolte, L. J.; Wensley, J. R.
1971-01-01
Process obtains solar heating data which support analytical design. Process yields quantitative information on local solar exposure of models which are geometrically and reflectively similar to prototypes under study. Models are tested in a shirtsleeve environment.
NASA Astrophysics Data System (ADS)
Patole, Pralhad B.; Kulkarni, Vivek V.
2018-06-01
This paper presents an investigation into minimum quantity lubrication with nanofluid during turning of AISI 4340 alloy steel, with the objective of building an experimental model to predict surface roughness and cutting force and to analyze the effect of process parameters on machinability. A full factorial design matrix was used for the experimental plan, and surface roughness and cutting force were measured according to the design of experiments. The relationship between the response variables and the process parameters is determined through response surface methodology, using a quadratic regression model. Results show that surface roughness is mainly influenced by feed rate and cutting speed, while depth of cut exhibits the maximum influence on the cutting force components compared to feed rate and cutting speed. The values predicted by the model and the experimental values are in close agreement.
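The quadratic regression model named in the abstract is the standard second-order response surface; with coded factors \(x_i\) (here presumably cutting speed, feed rate and depth of cut), it takes the form:

```latex
y = \beta_0 + \sum_{i} \beta_i x_i + \sum_{i} \beta_{ii} x_i^2
    + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon
```

whose coefficients are estimated by least squares from the factorial runs; the linear, squared and interaction terms let the fitted surface capture curvature and factor interactions, not just main effects.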
Enzymatic catalysis treatment method of meat industry wastewater using laccase.
Thirugnanasambandham, K; Sivakumar, V
2015-01-01
Meat processing produces a large amount of wastewater containing high levels of colour and chemical oxygen demand (COD), which must be pretreated before discharge into the environment. In this paper, enzymatic catalysis (EC) was adopted to treat meat wastewater. Box-Behnken design (BBD), an experimental design for response surface methodology (RSM), was used to create the set of 29 experimental runs needed to optimize the operating conditions. Quadratic regression models with estimated coefficients were developed to describe the colour and COD removals. The experimental results show that EC could effectively reduce colour (95%) and COD (86%) at the optimum conditions of an enzyme dose of 110 U/L, incubation time of 100 min, pH of 7 and temperature of 40 °C. RSM can thus be effectively adopted to optimize the multiple operating factors in a complex EC process.
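The Box-Behnken runs mentioned above can be generated from the design's defining pattern: each pair of factors takes the four +/-1 combinations while all other factors sit at the centre, plus replicated centre points. A minimal sketch, with the centre-point count chosen so that the four-factor case reproduces the 29 runs cited; the mapping of columns to enzyme dose, time, pH and temperature is an assumption:

```python
from itertools import combinations, product

def box_behnken(k, n_center):
    """Box-Behnken design in coded units: for each pair of the k factors,
    run the 4 combinations of +/-1 with the other factors at 0, then
    append n_center replicated centre points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k for _ in range(n_center)]
    return runs

# 4 factors (e.g. enzyme dose, time, pH, temperature), 5 centre points
design = box_behnken(4, 5)
print(len(design))  # 6 pairs x 4 corners + 5 centres = 29 runs
```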
Reform of experimental teaching based on quality cultivation
NASA Astrophysics Data System (ADS)
Wang, Wei; Yan, Xingwei; Liu, Wei; Yao, Tianfu; Shi, Jianhua; Lei, Bing; Hu, Haojun
2017-08-01
Experimental teaching plays an important part in quality education, which aims to cultivate students with an innovative spirit, strong technical skills and practical ability. However, in the traditional experimental teaching mode, experiments are treated as an appendage or supplementary means of theoretical teaching, and students prefer to focus on theory rather than practice. The traditional experimental teaching mode is therefore difficult to reconcile with the requirements of quality education. To address this issue, a reform of experimental teaching is introduced in this paper, taking the photoelectric detector experiment as an example. The new experimental teaching mode is designed around experimental content, teaching method and experimental evaluation. With the purpose of cultivating students' practical ability, experimental content is designed at two levels: basic experiments that verify the theory are set to consolidate students' theoretical knowledge, and comprehensive experiments are designed to encourage students to apply that knowledge to solve practical problems. In the teaching process, a heuristic approach is adopted and the traditional `teacher-centered' form is replaced by a `student-centered' form, which encourages students to design the experimental systems on their own under the teacher's guidance. In addition to stimulating students' interest in scientific research, experimental evaluation is necessary to urge students to complete the experiments efficiently. A multifaceted evaluation method is proposed to comprehensively test students' mastery of theoretical knowledge, practical ability, troubleshooting and problem-solving skills, and innovation capability. Practice has demonstrated the satisfying effect of this experimental teaching mode.
NASA Technical Reports Server (NTRS)
Vickers, John
2015-01-01
The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials in existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.
Anisotropy of Photopolymer Parts Made by Digital Light Processing
Monzón, Mario; Ortega, Zaida; Hernández, Alba; Paz, Rubén; Ortega, Fernando
2017-01-01
Digital light processing (DLP) is an accurate additive manufacturing (AM) technology suitable for producing micro-parts by photopolymerization. As with most AM technologies, anisotropy of parts made by DLP is a key issue to deal with, taking into account that several operational factors modify this characteristic. Design for this technology and its photopolymers becomes a challenge because the manufacturing process and post-processing strongly influence the mechanical properties of the part. This paper shows experimental work demonstrating the particular behavior of parts made using DLP. Because DLP differs from other AM technologies, design rules need to be adapted. The influence of build direction and the post-curing process on final mechanical properties and anisotropy is reported and justified based on experimental data and theoretical simulation of bi-material parts formed by fully-cured resin and partially-cured resin. Three photopolymers were tested under different working conditions, concluding that post-curing can, in some cases, correct the anisotropy, depending mainly on the nature of the photopolymer. PMID:28772426
Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang
2016-01-01
It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
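A Gaussian Process regressor of the kind described conditions a prior over functions on the calibration data to obtain a posterior mean and an uncertainty estimate. The sketch below is a generic GP with a squared-exponential kernel on toy data, not the authors' model; the inputs, responses and hyperparameters are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(A, B, length=1.0, var=1.0):
    """Squared-exponential covariance between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_predict(X, y, Xs, noise=1e-4):
    """GP posterior mean and pointwise variance at test inputs Xs."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    alpha = np.linalg.solve(K, y)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

# Toy calibration data: each input is a (concentration, temperature) pair
# and y holds hypothetical sensor responses.
X = np.array([[0.1, 20.0], [0.5, 25.0], [0.9, 30.0], [0.3, 22.0]])
y = np.array([1.0, 2.1, 3.2, 1.5])
mean, var = gp_predict(X, y, X)  # at the training inputs, mean ~ y
```

The posterior variance is what a batch-sequential design procedure like the one in the paper would exploit: new calibration points are placed where the predictive uncertainty is largest.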
Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen
2015-02-28
The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Ladd Effio, Christopher; Hahn, Tobias; Seiler, Julia; Oelmeier, Stefan A; Asen, Iris; Silberer, Christine; Villain, Louis; Hubbuch, Jürgen
2016-01-15
Recombinant protein-based virus-like particles (VLPs) are steadily gaining in importance as innovative vaccines against cancer and infectious diseases. Multiple VLPs are currently evaluated in clinical phases requiring a straightforward and rational process design. To date, there is no generic platform process available for the purification of VLPs. In order to accelerate and simplify VLP downstream processing, there is a demand for novel development approaches, technologies, and purification tools. Membrane adsorbers have been identified as promising stationary phases for the processing of bionanoparticles due to their large pore sizes. In this work, we present the potential of two strategies for designing VLP processes following the basic tenet of 'quality by design': High-throughput experimentation and process modeling of an anion-exchange membrane capture step. Automated membrane screenings allowed the identification of optimal VLP binding conditions yielding a dynamic binding capacity of 5.7 mg/mL for human B19 parvovirus-like particles derived from Spodoptera frugiperda Sf9 insect cells. A mechanistic approach was implemented for radial ion-exchange membrane chromatography using the lumped-rate model and stoichiometric displacement model for the in silico optimization of a VLP capture step. For the first time, process modeling enabled the in silico design of a selective, robust and scalable process with minimal experimental effort for a complex VLP feedstock. The optimized anion-exchange membrane chromatography process resulted in a protein purity of 81.5%, a DNA clearance of 99.2%, and a VLP recovery of 59%. Copyright © 2015 Elsevier B.V. All rights reserved.
[Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].
Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin
2017-07-01
In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential process critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space parameters. The recommended operation ranges for the critical dropping process parameters of Ginkgo biloba dropping pills were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills. Copyright© by the Chinese Pharmaceutical Association.
Scientific, statistical, practical, and regulatory considerations in design space development.
Debevec, Veronika; Srčič, Stanko; Horvat, Matej
2018-03-01
The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.
An Example of Process Evaluation.
ERIC Educational Resources Information Center
Karl, Marion C.
The inappropriateness of standard experimental research design, which can stifle innovations, is discussed in connection with the problems of designing practical techniques for evaluating a Title III curriculum development project. The project, involving 12 school districts and 2,500 students, teaches concept understanding, critical thinking, and…
Influence of Punch Geometry on Process Parameters in Cold Backward Extrusion
NASA Astrophysics Data System (ADS)
Plančak, M.; Barišić, B.; Car, Z.; Movrin, D.
2011-01-01
In the cold extrusion of steel, tools make direct contact with the metal to be extruded. These tools are exposed to high contact stresses which, in certain cases, may be a limiting factor in applying this technology. The present paper examines the influence of punch head design on radial stress at the container wall in the cold backward extrusion process. Five different punch head geometries were investigated. Radial stress on the container wall was measured by the pin load cell technique, and special tooling for the experimental investigation was designed and built. The process was also analyzed by the FE method: 2D models of the tools were obtained in UGS NX, and Simufact Forming GP software was used for the FE analysis. The experimental and FE results were compared and analyzed, and an optimal punch head geometry is suggested.
Plasma contactor research, 1989
NASA Technical Reports Server (NTRS)
Williams, John D.
1990-01-01
The characteristics of double layers observed by researchers investigating magnetospheric phenomena are contrasted with those observed in plasma contacting experiments. Experiments in the electron collection mode of the plasma contacting process were performed, and the results confirm a simple model of this process for current levels up to 3 A. Experimental results were also obtained in a study of the process of electron emission from a hollow cathode plasma contactor. High-energy ions are observed coming from the cathode in addition to the electrons, and a phenomenological model suggesting a mechanism by which this could occur is presented. Experimental results showing the effects of the design parameters of the ambient plasma simulator on the plasma potential, electron temperature, electron density and plasma noise levels induced in plasma contacting experiments are presented. A preferred simulator design is selected on the basis of these results.
Pollock, James; Bolton, Glen; Coffman, Jon; Ho, Sa V; Bracewell, Daniel G; Farid, Suzanne S
2013-04-05
This paper presents an integrated experimental and modelling approach to evaluate the potential of semi-continuous chromatography for the capture of monoclonal antibodies (mAb) in clinical and commercial manufacture. Small-scale single-column experimental breakthrough studies were used to derive design equations for the semi-continuous affinity chromatography system. Verification runs with the semi-continuous 3-column and 4-column periodic counter current (PCC) chromatography system indicated the robustness of the design approach. The product quality profiles and step yields (after wash step optimisation) achieved were comparable to the standard batch process. The experimentally-derived design equations were incorporated into a decisional tool comprising dynamic simulation, process economics and sizing optimisation. The decisional tool was used to evaluate the economic and operational feasibility of whole mAb bioprocesses employing PCC affinity capture chromatography versus standard batch chromatography across a product's lifecycle from clinical to commercial manufacture. The tool predicted that PCC capture chromatography would offer more significant savings in direct costs for early-stage clinical manufacture (proof-of-concept) (∼30%) than for late-stage clinical (∼10-15%) or commercial (∼5%) manufacture. The evaluation also highlighted the potential facility fit issues that could arise with a capture resin (MabSelect) that experiences losses in binding capacity when operated in continuous mode over lengthy commercial campaigns. Consequently, the analysis explored the scenario of adopting the PCC system for clinical manufacture and switching to the standard batch process following product launch. The tool determined the PCC system design required to operate at commercial scale without facility fit issues and with similar costs to the standard batch process whilst pursuing a process change application. 
A retrofitting analysis established that the direct cost savings obtained by 8 proof-of-concept batches would be sufficient to pay back the investment cost of the pilot-scale semi-continuous chromatography system. Copyright © 2013 Elsevier B.V. All rights reserved.
Teaching fractional factorial experiments via course delegate designed experiments.
Coleman, S; Antony, J
1999-01-01
Industrial experiments are fundamental in enhancing the understanding and knowledge of a process and product behavior. Designed industrial experiments assist people in understanding, investigating, and improving their processes. The purpose of a designed experiment is to understand which factors might influence the process output and then to determine those factor settings that optimize the process output. Teaching "design of experiments" using textbook examples does not fully shed light on how to identify and formulate the problem, identify factors, and determine the performance of the physical experiment. Presented here is an example of how to teach fractional factorial experiments in a course on designed experiments. Also presented is a practical, hands-on experiment that has been found to be extremely successful in instilling confidence and motivation in course delegates. The experiment provides a great stimulus to the delegates for the application of experimental design in their own work environment.
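A fractional factorial of the kind taught here trades runs for aliasing: a 2^(4-1) half fraction, for instance, runs the full two-level design in three factors and derives the fourth from a generator. A minimal sketch; the generator D = ABC is a standard textbook choice, not taken from the course described:

```python
from itertools import product

def fractional_factorial_2_4_1():
    """2^(4-1) half fraction: full 2^3 design in factors A, B, C,
    with the fourth factor aliased as D = A*B*C (resolution IV)."""
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        runs.append((a, b, c, a * b * c))
    return runs

runs = fractional_factorial_2_4_1()
print(len(runs))  # 8 runs instead of 16 for the full 2^4 design
```

The cost of halving the run count is that main effects become aliased with three-factor interactions, which is exactly the trade-off a course delegate would need to reason about when choosing a design.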
NASA Technical Reports Server (NTRS)
1981-01-01
Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSON was refined and equipment design and specification work was completed. SAMICS cost analysis work accelerated, format A's were prepared and computer simulations completed. Design work on the automated cell interconnect station was focused on bond technique selection experiments.
The Design and Analysis of Transposon-Insertion Sequencing Experiments
Chao, Michael C.; Abel, Sören; Davis, Brigid M.; Waldor, Matthew K.
2016-01-01
Transposon-insertion sequencing (TIS) is a powerful approach that can be widely applied to genome-wide definition of loci that are required for growth in diverse conditions. However, experimental design choices and stochastic biological processes can heavily influence the results of TIS experiments and affect downstream statistical analysis. Here, we discuss TIS experimental parameters and how these factors relate to the benefits and limitations of the various statistical frameworks that can be applied to computational analysis of TIS data. PMID:26775926
Definition of smolder experiments for Spacelab
NASA Technical Reports Server (NTRS)
Summerfield, M.; Messina, N. A.; Ingram, L. S.
1979-01-01
The feasibility of conducting experiments in space on smoldering combustion was studied to conceptually design specific smoldering experiments to be conducted in the Shuttle/Spacelab System. Design information for identified experiment critical components is provided. The analytical and experimental basis for conducting research on smoldering phenomena in space was established. Physical descriptions of the various competing processes pertaining to smoldering combustion were identified. The need for space research was defined based on limitations of existing knowledge and limitations of ground-based reduced-gravity experimental facilities.
Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.
Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà
2017-10-01
Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
Supercritical water oxidation of products of human metabolism
NASA Technical Reports Server (NTRS)
Tester, Jefferson W.; Orge A. achelling, Richard K. ADTHOMASSON
1986-01-01
Although the efficient destruction of organic material was demonstrated in the supercritical water oxidation process, the reaction kinetics and mechanisms are unknown. The kinetics and mechanisms of carbon monoxide and ammonia oxidation in, and reaction with, supercritical water were studied experimentally. Experimental oxidation of urine and feces in a microprocessor-controlled system was performed. A miniaturized supercritical water oxidation process for space applications was designed, including preliminary mass and energy balances and power, space and weight requirements.
Numerical and experimental modelling of the radial compressor stage
NASA Astrophysics Data System (ADS)
Syka, Tomáš; Matas, Richard; Luňáček, Ondřej
2016-06-01
This article describes the numerical and experimental model of a new compressor stage designed for process centrifugal compressors. It is the first member of a new family of stages developed to achieve state-of-the-art thermodynamic parameters. The stage (named RTK01) is designed for a high flow coefficient with 3D-shaped impeller blades, and some interesting findings were gained during its development. The article focuses mainly on aspects of the development methodology and the improvement of numerical simulations, not on the specific stage properties. The experimental equipment and conditions, the measured results, and their comparison with ANSYS CFX and NUMECA FINE/Turbo CFD simulations are described.
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Ultrasound-enhanced bioscouring process factors for greige cotton fabric are examined using custom experimental design utilizing statistical principles. An equation is presented which predicts bioscouring performance based upon percent reflectance values obtained from UV-Vis measurements of rutheniu...
NASA Astrophysics Data System (ADS)
Thubagere, Anupama J.; Thachuk, Chris; Berleant, Joseph; Johnson, Robert F.; Ardelean, Diana A.; Cherry, Kevin M.; Qian, Lulu
2017-02-01
Biochemical circuits made of rationally designed DNA molecules are proofs of concept for embedding control within complex molecular environments. They hold promise for transforming the current technologies in chemistry, biology, medicine and material science by introducing programmable and responsive behaviour to diverse molecular systems. As the transformative power of a technology depends on its accessibility, two main challenges are an automated design process and simple experimental procedures. Here we demonstrate the use of circuit design software, combined with the use of unpurified strands and simplified experimental procedures, for creating a complex DNA strand displacement circuit that consists of 78 distinct species. We develop a systematic procedure for overcoming the challenges involved in using unpurified DNA strands. We also develop a model that takes synthesis errors into consideration and semi-quantitatively reproduces the experimental data. Our methods now enable even novice researchers to successfully design and construct complex DNA strand displacement circuits.
NASA Technical Reports Server (NTRS)
Kuo, K. K.; Hsieh, W. H.; Cheung, F. B.; Yang, A. S.; Brown, J. J.; Woodward, R. D.; Kline, M. C.; Burch, R. L.
1992-01-01
The objective was to achieve a better understanding of the combustion processes of liquid oxygen and gaseous hydrogen under a broad range of pressures covering subcritical, critical, and supercritical conditions. The scope of the experimental work falls into the following areas: (1) design of the overall experimental setup; (2) modification of an existing windowed high-pressure chamber; (3) design of the LOX feeding system; (4) provision of the safety features in the test rig design; (5) LOX cleanliness requirements; (6) cold shock testing; (7) implementation of data acquisition systems; (8) preliminary tests for system checkout; (9) modification of the LOX feeding system; and (10) evaporation tests. Progress in each area is discussed.
Morschett, Holger; Freier, Lars; Rohde, Jannis; Wiechert, Wolfgang; von Lieres, Eric; Oldiges, Marco
2017-01-01
Even though microalgae-derived biodiesel has regained interest within the last decade, industrial production is still challenging for economic reasons. Besides reactor design, as well as value chain and strain engineering, laborious and slow early-stage parameter optimization represents a major drawback. The present study introduces a framework for the accelerated development of phototrophic bioprocesses. A state-of-the-art micro-photobioreactor supported by a liquid-handling robot for automated medium preparation and product quantification was used. To take full advantage of the technology's experimental capacity, Kriging-assisted experimental design was integrated to enable highly efficient execution of screening applications. The resulting platform was used for medium optimization of a lipid production process using Chlorella vulgaris toward maximum volumetric productivity. Within only four experimental rounds, lipid production was increased approximately threefold to 212 ± 11 mg L-1 d-1. Besides nitrogen availability as a key parameter, magnesium, calcium and various trace elements were shown to be of crucial importance. Here, synergistic multi-parameter interactions as revealed by the experimental design introduced significant further optimization potential. The integration of parallelized microscale cultivation, laboratory automation and Kriging-assisted experimental design proved to be a fruitful tool for the accelerated development of phototrophic bioprocesses. By means of the proposed technology, the targeted optimization task was conducted in a very timely and material-efficient manner.
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
NASA Astrophysics Data System (ADS)
Petrosyan, V. G.; Hovakimyan, T. H.; Yeghoyan, E. A.; Hovhannisyan, H. T.; Mayilyan, D. G.; Petrosyan, A. P.
2017-01-01
This paper is dedicated to the creation of a facility for the experimental study of a phenomenon of background acoustic emission (AE), which is detected in the main circulation loop (MCL) of WWER power units. The analysis of the operating principle and the design of a primary feed-and-blow down system (FB) deaerator of NPP as the most likely source of continuous acoustic emission is carried out. The experimental facility for the systematic study of a phenomenon of continuous AE is developed. A physical model of a thermal deaerator is designed and constructed. A thermal monitoring system is introduced. An automatic system providing acoustic signal registration in a low frequency (0.03-30 kHz) and high frequency (30-300 kHz) bands and study of its spectral characteristics is designed. Special software for recording and processing of digitized electrical sensor signals is developed. A separate and independent principle of study of the most probable processes responsible for the generation of acoustic emission signals in the deaerator is applied. Trial series of experiments and prechecks of acoustic signals in different modes of the deaerator model are conducted. Compliance of basic technological parameters with operating range of the real deaerator was provided. It is shown that the acoustic signal time-intensity curve has several typical regions. The pilot research showed an impact of various processes that come about during the operation of the deaerator physical model on the intensity of the AE signal. The experimental results suggest that the main sources of generation of the AE signals are the processes of steam condensation, turbulent flow of gas-vapor medium, and water boiling.
NASA Astrophysics Data System (ADS)
Khan, M. M. A.; Romoli, L.; Fiaschi, M.; Dini, G.; Sarri, F.
2011-02-01
This paper presents an experimental design approach to process parameter optimization for the laser welding of martensitic AISI 416 and AISI 440FSe stainless steels in a constrained overlap configuration in which the outer shell was 0.55 mm thick. To determine the optimal laser-welding parameters, a set of mathematical models was developed relating the welding parameters to each of the weld characteristics; these were validated both statistically and experimentally. The quality criteria set for the weld to determine the optimal parameters were the minimization of weld width and the maximization of weld penetration depth, resistance length and shearing force. A laser power and welding speed in the range 855-930 W and 4.50-4.65 m/min, respectively, with a fiber diameter of 300 μm were identified as the optimal set of process parameters. However, the laser power can be reduced to 800-840 W and the welding speed increased to 4.75-5.37 m/min to obtain stronger and better welds.
Wang, Xun; Sun, Beibei; Liu, Boyang; Fu, Yaping; Zheng, Pan
2017-01-01
Experimental design focuses on describing or explaining the multifactorial interactions that are hypothesized to reflect the variation. The design introduces conditions that may directly affect the variation, where particular conditions are purposely selected for observation. Combinatorial design theory deals with the existence, construction and properties of systems of finite sets whose arrangements satisfy generalized concepts of balance and/or symmetry. In this work, borrowing the concept of "balance" from combinatorial design theory, a novel method for designing multifactorial biochemical experiments is proposed, in which balanced templates from combinatorial design are used to select the conditions for observation. Balanced experimental data covering all the influencing factors of the experiments can thus be obtained for further processing, for example as a training set for machine-learning models. Finally, software based on the proposed method is developed for designing experiments in which each influencing factor is covered a certain number of times.
Experimental design data for the biosynthesis of citric acid using Central Composite Design method.
Kola, Anand Kishore; Mekala, Mallaiah; Goli, Venkat Reddy
2017-06-01
In the present investigation, we report the statistical design and optimization of significant variables for the microbial production of citric acid from sucrose in the presence of the filamentous fungus A. niger NCIM 705. Combinations of experiments were designed with the Central Composite Design (CCD) of Response Surface Methodology (RSM) for the production of citric acid as a function of six variables: initial sucrose concentration, initial pH of the medium, fermentation temperature, incubation time, stirrer rotational speed, and oxygen flow rate. From the experimental data, a statistical model for this process was developed. The optimum conditions reported in the present article are an initial sucrose concentration of 163.6 g/L, an initial medium pH of 5.26, a stirrer rotational speed of 247.78 rpm, an incubation time of 8.18 days, a fermentation temperature of 30.06 °C and an oxygen flow rate of 1.35 lpm. Under these optimum conditions the predicted maximum citric acid concentration is 86.42 g/L; experimental validation under the optimal values yielded 82.0 g/L. The model represents the experimental data well, with good agreement between model and experiment.
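For illustration, a coded central composite design of the kind used in this study can be generated in a few lines of plain Python. This is a generic sketch: the rotatable alpha and the center-point count below are assumed choices, not values taken from the paper.

```python
from itertools import product

def central_composite(k, alpha=None, n_center=1):
    """Coded central composite design for k factors: 2^k factorial
    corners at +/-1, 2k axial (star) points at +/-alpha, and
    replicated center points. Rotatable alpha = (2**k) ** 0.25."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            point = [0.0] * k
            point[i] = a
            axial.append(point)
    center = [[0.0] * k for _ in range(n_center)]
    return corners + axial + center

# Six factors as in the citric acid study; 8 center runs is an
# assumed choice, not taken from the paper.
design = central_composite(6, n_center=8)
print(len(design))  # 2**6 corners + 12 axial + 8 center = 84 runs
```

Each coded level x is then mapped to a real setting via actual = center + x × half-range for that factor, so a coded +1 on sucrose would correspond to its upper bound in g/L.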
Masoumi, Hamid Reza Fard; Basri, Mahiran; Kassim, Anuar; Abdullah, Dzulkefly Kuang; Abdollahi, Yadollah; Abd Gani, Siti Salwa; Rezaee, Malahat
2013-01-01
Lipase-catalyzed production of a triethanolamine-based esterquat by esterification of oleic acid (OA) with triethanolamine (TEA) in n-hexane was performed in a 2 L stirred-tank reactor. A set of experiments was designed by central composite design for process modeling and statistical evaluation of the findings. Five independent process variables, including enzyme amount, reaction time, reaction temperature, substrate molar ratio of OA to TEA, and agitation speed, were studied under the conditions designed by the Design Expert software. Experimental data were examined for normality before the data-processing stage, and skewness and kurtosis indices were determined. The mathematical model developed was found to be adequate and statistically accurate for predicting the optimum conversion of product. Response surface methodology with central composite design gave the best performance in this study, and the methodology as a whole has proven adequate for the design and optimization of the enzymatic process.
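The response-surface modeling step that studies like this rely on is an ordinary least-squares fit of a full quadratic model to the designed runs. A self-contained sketch with synthetic data (the factor count, true coefficients, and noise level are all assumptions for illustration, not the study's data):

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared, and pairwise interaction
    terms of the coded factors in X (shape: runs x factors)."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))   # 30 synthetic runs, 2 coded factors
y = 5 + 2 * X[:, 0] - 3 * X[:, 1] ** 2 + rng.normal(0, 0.1, 30)
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print(beta.round(2))  # estimates should approach [5, 2, 0, 0, -3, 0]
```

Once the quadratic surface is fitted, the "optimum conditions" reported in such papers are simply the stationary point (or constrained maximum) of this fitted polynomial.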
Conceptual design of ACB-CP for ITER cryogenic system
NASA Astrophysics Data System (ADS)
Jiang, Yongcheng; Xiong, Lianyou; Peng, Nan; Tang, Jiancheng; Liu, Liqiang; Zhang, Liang
2012-06-01
ACB-CP (Auxiliary Cold Box for Cryopumps) supplies the cryopump system with the necessary cryogen in the ITER (International Thermonuclear Experimental Reactor) cryogenic distribution system. The conceptual design of the ACB-CP comprises thermo-hydraulic analysis, 3D structure design and strength checking. Through the thermo-hydraulic analysis, the main specifications of the process valves, pressure safety valves, pipes and heat exchangers were determined. During the 3D structure design process, vacuum, adiabatic, assembly and maintenance requirements were considered in arranging the pipes, valves and other components. Strength checking was performed to verify that the 3D design meets the strength requirements for the ACB-CP.
ERIC Educational Resources Information Center
Alias, Norlidah; Siraj, Saedah; Daud, Mohd Khairul Azman Md; Hussin, Zaharah
2013-01-01
The study examines the effectiveness of Facebook-based learning for enhancing creativity among Islamic Studies students in the secondary educational setting in Malaysia. It describes the design process, which employed the Isman Instructional Design Model. A quantitative study was carried out using an experimental method and a background survey. The…
Impact of care pathways for in-hospital management of COPD exacerbation: a systematic review.
Lodewijckx, C; Sermeus, W; Panella, M; Deneckere, S; Leigheb, F; Decramer, M; Vanhaecht, K
2011-11-01
In-hospital management of COPD exacerbation is suboptimal, and outcomes are poor. Care pathways are a possible strategy for optimizing care processes and outcomes. The aim of the literature review was to explore characteristics of existing care pathways for in-hospital management of COPD exacerbations and to address their impact on performance of care processes, clinical outcomes, and team functioning. A literature search was conducted for articles published between 1990 and 2010 in the electronic databases Medline, CINAHL, EMBASE, and the Cochrane Library. The main inclusion criteria were (I) patients hospitalized for a COPD exacerbation; (II) implementation and evaluation of a care pathway; (III) report of original research, including experimental and quasi-experimental designs, variance analysis, and interviews of professionals and patients about their perceptions of pathway effectiveness. Four studies with a quasi-experimental design were included. Three studies used a pre-post-test design; the fourth was a non-randomized controlled trial comparing an experimental group, in which patients were treated according to a care pathway, with a control group receiving usual care. The four studied care pathways were multidisciplinary structured care plans outlining time-specific clinical interventions and responsibilities by discipline. Statistical analyses were rarely performed, and the trials used very divergent indicators to evaluate the impact of the care pathways. The studies described positive effects on blood sampling, daily weight measurement, arterial blood gas measurement, referral to rehabilitation, feelings of anxiety, length of stay, readmission, and in-hospital mortality. Research on COPD care pathways is very limited. The studies described few positive effects of the care pathways on diagnostic processes and on clinical outcomes. However, due to limited statistical analysis and the weak designs of the studies, the internal validity of the results is limited.
Therefore, based on these studies the impact of care pathways on COPD exacerbation is inconclusive. These findings indicate the need for properly designed research like a cluster randomized controlled trial to evaluate the impact of COPD care pathways on performance of care processes, clinical outcomes, and teamwork. Copyright © 2011 Elsevier Ltd. All rights reserved.
Roush, W B; Boykin, D; Branton, S L
2004-08-01
A mixture experiment, a variant of response surface methodology, was designed to determine the proportions of time to feed broiler starter (23% protein), grower (20% protein), and finisher (18% protein) diets to optimize production and processing variables based on a total production time of 48 d. Mixture designs are useful for proportion problems in which the components of the experiment (i.e., the lengths of time the diets were fed) sum to unity (48 d). The experiment was conducted with day-old male Ross x Ross broiler chicks. The birds were placed 50 per pen in each of 60 pens. The experimental design was a 10-point augmented simplex-centroid (ASC) design with 6 replicates of each point. Each design point represented the portion(s) of the 48 d during which each of the diets was fed. Formulation of the diets was based on NRC standards. At 49 d, each pen of birds was evaluated for production data including BW, feed conversion, and cost of feed consumed. Then, 6 birds were randomly selected from each pen for processing data. Processing variables included live weight, hot carcass weight, dressing percentage, fat pad percentage, and breast yield (pectoralis major and pectoralis minor weights). Production and processing data were fit to simplex regression models. Model terms determined not to be significant (P > 0.05) were removed. The models were found to be statistically adequate for analysis of the response surfaces. A compromise solution was calculated based on the optimal constraints designated for the production and processing data. The results indicated that broilers fed a starter and finisher diet for 30 and 18 d, respectively, would meet the production and processing constraints. Trace plots showed that the production and processing variables were not very sensitive to the grower diet.
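The 10-point augmented simplex-centroid design for three mixture components has a standard construction: vertices, binary midpoints, the overall centroid, and three interior check blends. A sketch of those points, scaled to the 48-d feeding total (the scaling step is an illustration of how a design point becomes a feeding schedule, not the study's exact software):

```python
from fractions import Fraction as F

def augmented_simplex_centroid():
    """10-point augmented simplex-centroid design for 3 mixture
    components: 3 vertices, 3 binary midpoints, the overall
    centroid, and 3 interior (axial) check blends."""
    v = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
    half = F(1, 2)
    mid = [(half, half, 0), (half, 0, half), (0, half, half)]
    centroid = [(F(1, 3),) * 3]
    s = F(1, 6)
    axial = [(F(2, 3), s, s), (s, F(2, 3), s), (s, s, F(2, 3))]
    return v + mid + centroid + axial

TOTAL_DAYS = 48  # starter + grower + finisher feeding days
schedule = [[float(x) * TOTAL_DAYS for x in blend]
            for blend in augmented_simplex_centroid()]
print(schedule[0])  # e.g. feed the starter diet for all 48 d
```

Because every blend sums to 1, every feeding schedule sums to exactly 48 d, which is the defining constraint of a mixture design.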
Diagnostic techniques in deflagration and detonation studies.
Proud, William G; Williamson, David M; Field, John E; Walley, Stephen M
2015-12-01
Advances in experimental, high-speed techniques can be used to explore the processes occurring within energetic materials. This review describes techniques used to study a wide range of processes: hot-spot formation, ignition thresholds, deflagration, sensitivity and finally the detonation process. As this is a wide field, the focus will be on small-scale experiments and quantitative studies. It is important that such studies are linked to predictive models, which inform the experimental design process. The range of stimuli includes thermal ignition, drop-weight, Hopkinson bar and plate impact studies. Studies made with inert simulants are also included, as these are important in differentiating between reactive response and purely mechanical behaviour.
NASA Astrophysics Data System (ADS)
Novak, Joseph
Optical biological sensors are widely used in the fields of medical testing, water treatment and safety, gene identification, and many others due to advances in nanofabrication technology. This work focuses on the design of fiber-coupled Mach-Zehnder Interferometer (MZI) based biosensors fabricated on silicon-on-insulator (SOI) wafers. Silicon waveguide sensors are designed with multimode and single-mode dimensions. Input coupling efficiency is investigated through the design of various taper structures. Integration processing and packaging are performed for fiber attachment and enhancement of input coupling efficiency. Optical guided-wave sensors rely on single-mode operation to extract an induced phase shift from the output signal. A silicon waveguide MZI sensor was designed and fabricated with both multimode and single-mode dimensions. The sensitivity of the sensors is analyzed for different waveguide dimensions and materials. An s-bend structure is designed for the multimode waveguide to eliminate higher-order mode power as an alternative to single-mode confinement. Single-mode confinement is experimentally demonstrated through near-field imaging of the waveguide output. Y-junctions are designed for 3 dB power splitting to the MZI arms and for power recombination after sensing, to exploit the interferometric function of the MZI. Ultra-short 10 μm taper structures with curved geometries are designed to improve fiber-to-chip insertion loss without significantly increasing device area, and show potential for applications requiring misalignment tolerance. A novel v-groove process is developed for self-aligned integration of fiber grooves for attachment to sensor chips. Thermal oxidation at temperatures from 1050-1150 °C during groove processing creates an SiO2 layer on the waveguide end facet, protecting it during integration etch processing without additional e-beam lithography.
Experimental results show improvement of insertion loss compared to dicing preparation and Focused Ion Beam methods using the thermal oxidation process.
A review of blood sample handling and pre-processing for metabolomics studies.
Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta
2017-09-01
Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly in untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exists a fundamental need to consider pre-analytical variability that can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers focus mainly on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and the reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-22
... Indian scientists critically examined the techniques, the appropriateness of experimental design, the... are, even if true, legally insufficient to alter the decision, the Agency need not grant a hearing... objection fails to note that many of the findings cited in the experimental report were observed both in...
ERIC Educational Resources Information Center
Wilcox, Bethany R.; Lewandowski, H. J.
2017-01-01
Laboratory courses represent a unique and potentially important component of the undergraduate physics curriculum, which can be designed to allow students to authentically engage with the process of experimental physics. Among other possible benefits, participation in these courses throughout the undergraduate physics curriculum presents an…
Revision Vodcast Influence on Assessment Scores and Study Processes in Secondary Physics
ERIC Educational Resources Information Center
Marencik, Joseph J.
2012-01-01
A quasi-experimental switching replications design with matched participants was employed to determine the influence of revision vodcasts, or video podcasts, on students' assessment scores and study processes in secondary physics. This study satisfied a need for quantitative results in the area of vodcast influence on students' learning processes.…
The TRIPSE: A Process-Oriented Exam for Large Undergraduate Classes
ERIC Educational Resources Information Center
Nastos, Stash; Rangachari, P. K.
2013-01-01
The TRIPSE (tri-partite problem solving exercise), a process-oriented exam that mimics the scientific process, was used previously in small classes (15-25). Provided limited data, students frame explanations and design experimental tests that they later revise with additional information. Our 6-year experience using it with larger numbers…
NASA Astrophysics Data System (ADS)
Blume, Theresa; Weiler, Markus; Angermann, Lisa; Beiter, Daniel; Hassler, Sibylle; Kaplan, Nils; Lieder, Ernestine; Sprenger, Matthias
2017-04-01
Sustainable water resources management needs to be based on sound process understanding. This is especially true in a changing world, where boundary conditions change and models calibrated to the status quo are no longer helpful. There is general agreement in the hydrologic community that we need a better process understanding and that one of the most promising ways to achieve this is by using nested experimental designs that cover a range of scales. In the study presented here, we argue that while we may be able to investigate a certain process at a plot or hillslope in detail, the real power for advancing our understanding lies in site intercomparison and, where possible, knowledge transfer and generalization. The experimental design of the CAOS observatory is based on sensor clusters measuring groundwater, soil water and stream water, sap flow and climate variables in 45 hydrological functional units, chosen from a matrix of site characteristics (geology, land use, hillslope aspect, and topographic position). This design allows for site intercomparisons based on more than one member per class, and thus not only characterizes between-class differences but also attempts to identify within-class variability. These distributed plot-scale investigations offer a large amount of information on plot-scale processes and their variability in space and time (e.g. water storage dynamics and patterns, vertical flow processes and vadose zone transit times, transpiration dynamics and patterns). However, if we want to improve our understanding of runoff generation (and thus also of nutrient and contaminant transport and export to the stream), we also need to understand how these plots link up within hillslopes and how and when these hillslopes are connected to the stream. And again, this is most helpful if we do not focus on single sites but pursue experimental designs that aim at intercomparison and generalization.
At the same time, the investigation of hillslope-stream connectivity is extremely challenging because of the high four-dimensional variability of the involved processes, most of which are hidden from view in the subsurface. To tackle this challenge we employed a number of different field methods, ranging from hillslope-scale irrigation and flow-through experiments, to in-depth analyses of near-stream piezometer responses and stream-reach tracer experiments, to mesoscale, network-wide investigations of spatial patterns of stream temperature and electrical conductivity as well as of the expansion and shrinkage of the stream network itself. In this presentation we provide an overview of the rationale, approach, experimental design and ongoing work, the challenges we encountered, and a synthesis of exemplary results.
NASA Astrophysics Data System (ADS)
Dittmar, N.; Haberstroh, Ch.; Hesse, U.; Krzyzowski, M.
2016-04-01
The transfer of liquid helium (LHe) into mobile dewars or transport vessels is a common and unavoidable process at LHe decant stations. During this transfer, considerable amounts of LHe evaporate due to heat leak and pressure drop. The helium gas thus generated needs to be collected and reliquefied, which requires a large amount of electrical energy. Therefore, the design of the transfer lines used at LHe decant stations has been optimised to establish LHe transfer with minor evaporation losses, which increases the overall efficiency and capacity of LHe decant stations. This paper presents the experimental results achieved during the thermohydraulic optimisation of a flexible LHe transfer line. An extensive measurement campaign with a set of dedicated transfer lines equipped with pressure and temperature sensors yielded unique experimental data on this specific transfer process. The experimental results cover the heat leak, the pressure drop, the transfer rate, the outlet quality, and the cool-down and warm-up behaviour of the examined transfer lines. Based on the obtained results, the design of the flexible transfer line has been optimised, featuring reduced heat leak and pressure drop.
Web-Based Integrated Research Environment for Aerodynamic Analyses and Design
NASA Astrophysics Data System (ADS)
Ahn, Jae Wan; Kim, Jin-Ho; Kim, Chongam; Cho, Jung-Hyun; Hur, Cinyoung; Kim, Yoonhee; Kang, Sang-Hyun; Kim, Byungsoo; Moon, Jong Bae; Cho, Kum Won
e-AIRS[1,2], an abbreviation of 'e-Science Aerospace Integrated Research System', is a virtual organization designed to support aerodynamic flow analyses in aerospace engineering using the e-Science environment. As the first step toward a virtual aerospace engineering organization, e-AIRS intends to give full support to the aerodynamic research process. Currently, e-AIRS can handle both computational and experimental aerodynamic research on the e-Science infrastructure. In detail, users can conduct a full CFD (Computational Fluid Dynamics) research process, request wind tunnel experiments, perform comparative analysis between computational predictions and experimental measurements, and finally, collaborate with other researchers using the web portal. The present paper describes those services and the internal architecture of the e-AIRS system.
Evaluating the Process of Generating a Clinical Trial Protocol
Franciosi, Lui G.; Butterfield, Noam N.; MacLeod, Bernard A.
2002-01-01
The research protocol is the principal document in the conduct of a clinical trial. Its generation requires knowledge about the research problem, the potential experimental confounders, and the relevant Good Clinical Practices for conducting the trial. However, such information is not always available to authors during the writing process. A checklist of over 80 items has been developed to better understand the considerations made by authors in generating a protocol. It is based on the most cited requirements for designing and implementing the randomised controlled trial. Items are categorised according to the trial's research question, experimental design, statistics, ethics, and standard operating procedures. This quality assessment tool evaluates the extent that a generated protocol deviates from the best-planned clinical trial.
Research on the use of space resources
NASA Technical Reports Server (NTRS)
Carroll, W. F. (Editor)
1983-01-01
The second year of a multiyear research program on the processing and use of extraterrestrial resources is covered. The research tasks included: (1) silicate processing, (2) magma electrolysis, (3) vapor phase reduction, and (4) metals separation. Concomitant studies included: (1) energy systems, (2) transportation systems, (3) utilization analysis, and (4) resource exploration missions. Emphasis in fiscal year 1982 was placed on the magma electrolysis and vapor phase reduction processes (both analytical and experimental) for separation of oxygen and metals from lunar regolith. The early experimental work on magma electrolysis resulted in gram quantities of iron (mixed metals) and the identification of significant anode, cathode, and container problems. In the vapor phase reduction tasks a detailed analysis of various process concepts led to the selection of two specific processes designated as "Vapor Separation" and "Selective Ionization". Experimental work was deferred to fiscal year 1983. In the Silicate Processing task a thermophysical model of the casting process was developed and used to study the effect of variations in material properties on the cooling behavior of lunar basalt.
Experimental analysis of Nd-YAG laser cutting of sheet materials - A review
NASA Astrophysics Data System (ADS)
Sharma, Amit; Yadava, Vinod
2018-01-01
Cutting of sheet material is considered an important process due to its relevance to products of everyday life such as aircraft, ships, cars, furniture, etc. Among the various sheet cutting processes (ASCPs), laser beam cutting is one of the most capable of creating complex geometries with stringent design requirements in difficult-to-cut sheet materials. Based on recent research work in the area of sheet cutting, it is found that the Nd-YAG laser is used for cutting sheet materials in general and reflective sheet materials in particular. This paper reviews experimental analyses of the Nd-YAG laser cutting process carried out to study the influence of laser cutting parameters on process performance indices. The significance of experimental modeling and the different optimization approaches employed by various researchers are also discussed in this study.
NASA Astrophysics Data System (ADS)
Kingswell, R.; Scott, K. T.; Wassell, L. L.
1993-06-01
The vacuum plasma spray (VPS) deposition of metal, ceramic, and cermet coatings has been investigated using designed statistical experiments. Processing conditions that were considered likely to have a significant influence on the melting characteristics of the precursor powders and hence deposition efficiency were incorporated into full and fractional factorial experimental designs. The processing of an alumina powder was very sensitive to variations in the deposition conditions, particularly the injection velocity of the powder into the plasma flame, the plasma gas composition, and the power supplied to the gun. Using a combination of full and fractional factorial experimental designs, it was possible to rapidly identify the important spraying variables and adjust these to produce a deposition efficiency approaching 80 percent. The deposition of a nickel-base alloy metal powder was less sensitive to processing conditions. Generally, however, a high degree of particle melting was achieved for a wide range of spray conditions. Preliminary experiments performed using a tungsten carbide/cobalt cermet powder indicated that spray efficiency was not sensitive to deposition conditions. However, microstructural analysis revealed considerable variations in the degree of tungsten carbide dissolution. The structure and properties of the optimized coatings produced in the factorial experiments are also discussed.
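The factorial screening logic used in this plasma-spray study, ranking which spraying variables matter most, reduces to a main-effects calculation over the coded runs. A toy sketch (the factor names and response values below are invented for illustration, not the study's data):

```python
from itertools import product

# Coded 2^3 full factorial; factor names are illustrative stand-ins
# for e.g. powder injection velocity, plasma gas composition, power.
runs = list(product((-1, 1), repeat=3))

# Hypothetical deposition-efficiency responses (%), one per run,
# in the same order as `runs` -- invented data, not the study's.
y = [52, 61, 55, 66, 58, 71, 60, 78]

def main_effect(factor):
    """Mean response at the high level minus at the low level."""
    hi = [r for run, r in zip(runs, y) if run[factor] == 1]
    lo = [r for run, r in zip(runs, y) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for f, name in enumerate(("velocity", "gas", "power")):
    print(name, main_effect(f))  # the largest effect flags the
                                 # variable to tune first
```

Ranking the effects this way is how a full or fractional factorial rapidly identifies the important spraying variables before a finer optimization run.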
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.
Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C
2011-11-27
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.
Gronau, Greta; Krishnaji, Sreevidhya T.; Kinahan, Michelle E.; Giesa, Tristan; Wong, Joyce Y.; Kaplan, David L.; Buehler, Markus J.
2013-01-01
Tailored biomaterials with tunable functional properties are desirable for many applications ranging from drug delivery to regenerative medicine. To improve the predictability of biopolymer materials functionality, multiple design parameters need to be considered, along with appropriate models. In this article we review the state of the art of synthesis and processing related to the design of biopolymers, with an emphasis on the integration of bottom-up computational modeling in the design process. We consider three prominent examples of well-studied biopolymer materials – elastin, silk, and collagen – and assess their hierarchical structure, intriguing functional properties and categorize existing approaches to study these materials. We find that an integrated design approach in which both experiments and computational modeling are used has rarely been applied for these materials due to difficulties in relating insights gained on different length- and time-scales. In this context, multiscale engineering offers a powerful means to accelerate the biomaterials design process for the development of tailored materials that suit the needs posed by the various applications. The combined use of experimental and computational tools has a very broad applicability not only in the field of biopolymers, but can be exploited to tailor the properties of other polymers and composite materials in general. PMID:22938765
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Junhwan; Hwang, Sungui; Park, Kyihwan, E-mail: khpark@gist.ac.kr
To utilize a time-of-flight-based laser scanner as a distance measurement sensor, the measurable distance and accuracy are the most important performance parameters to consider. For these purposes, the optical system and electronic signal processing of the laser scanner should be optimally designed to reduce the distance error caused by optical crosstalk and the wide dynamic range of the input. An optical system design for removing the optical crosstalk problem is proposed in this work. Intensity control is also considered to solve the problem of phase-shift variation in the signal processing circuit caused by object reflectivity. Experiments on the optical system and signal processing design are performed using 3D measurements.
The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems.
White, Andrew; Tolman, Malachi; Thames, Howard D; Withers, Hubert Rodney; Mason, Kathy A; Transtrum, Mark K
2016-12-01
We explore the relationship among experimental design, parameter estimation, and systematic error in sloppy models. We show that the approximate nature of mathematical models poses challenges for experimental design in sloppy models. In many models of complex biological processes it is unknown which physical mechanisms must be included to explain system behaviors. As a consequence, models are often overly complex, with many practically unidentifiable parameters. Furthermore, which mechanisms are relevant or irrelevant varies among experiments. By selecting complementary experiments, experimental design may inadvertently make details that were omitted from the model become relevant. When this occurs, the model will have a large systematic error and fail to give a good fit to the data. We use a simple hyper-model of model error to quantify a model's discrepancy and apply it to two models of complex biological processes (EGFR signaling and DNA repair) with optimally selected experiments. We find that although parameters may be accurately estimated, the discrepancy in the model renders it less predictive than it was in the sloppy regime where systematic error is small. We introduce the concept of a sloppy system: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. We explore the limits of accurate parameter estimation in sloppy systems and argue that identifying the underlying mechanisms controlling system behavior is better approached by considering a hierarchy of models of varying detail rather than focusing on parameter estimation in a single model.
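The "sloppiness" described here shows up directly in the eigenvalue spectrum of the Fisher information matrix J^T J, whose eigenvalues span many decades. A small numerical sketch with an assumed toy model (a sum of near-identical exponential decays, not the EGFR or DNA-repair models from the paper):

```python
import numpy as np

# A sum of near-identical exponential decays: a textbook sloppy
# model (parameters are log-rates; data are 20 time points).
t = np.linspace(0, 5, 20)

def model(log_rates):
    return sum(np.exp(-np.exp(lk) * t) for lk in log_rates)

def jacobian(log_rates, h=1e-6):
    """Forward-difference sensitivities of the model output
    with respect to each log-rate."""
    J = np.empty((t.size, len(log_rates)))
    base = model(log_rates)
    for i in range(len(log_rates)):
        p = np.array(log_rates, dtype=float)
        p[i] += h
        J[:, i] = (model(p) - base) / h
    return J

J = jacobian([0.0, 0.1, 0.2])        # three nearly redundant rates
eig = np.linalg.eigvalsh(J.T @ J)    # Fisher-information spectrum
print(eig.max() / eig.min())         # large spread = sloppy directions
```

The stiff directions (large eigenvalues) are well constrained by data, while the soft directions (tiny eigenvalues) correspond to the practically unidentifiable parameter combinations the abstract refers to.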
The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems
Tolman, Malachi; Thames, Howard D.; Mason, Kathy A.
2016-01-01
We explore the relationship among experimental design, parameter estimation, and systematic error in sloppy models. We show that the approximate nature of mathematical models poses challenges for experimental design in sloppy models. In many models of complex biological processes, it is unknown which physical mechanisms must be included to explain system behaviors. As a consequence, models are often overly complex, with many practically unidentifiable parameters. Furthermore, which mechanisms are relevant or irrelevant varies among experiments. By selecting complementary experiments, experimental design may inadvertently make details that were omitted from the model become relevant. When this occurs, the model will have a large systematic error and fail to give a good fit to the data. We use a simple hyper-model of model error to quantify a model's discrepancy and apply it to two models of complex biological processes (EGFR signaling and DNA repair) with optimally selected experiments. We find that although parameters may be accurately estimated, the discrepancy in the model renders it less predictive than it was in the sloppy regime where systematic error is small. We introduce the concept of a sloppy system: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. We explore the limits of accurate parameter estimation in sloppy systems and argue that identifying the underlying mechanisms controlling system behavior is better approached by considering a hierarchy of models of varying detail rather than focusing on parameter estimation in a single model. PMID:27923060
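The "sloppiness" discussed above can be illustrated numerically: the eigenvalues of the Fisher Information Matrix of a multi-exponential model typically span orders of magnitude, leaving some parameter combinations practically unidentifiable. A minimal sketch with a toy model; the decay rates and time grid are arbitrary choices, not from the paper:

```python
import numpy as np

# Toy "sloppy" model: a sum of decaying exponentials, a classic case whose
# Fisher Information Matrix (FIM) eigenvalues span many orders of magnitude.
def model(theta, t):
    return sum(np.exp(-rate * t) for rate in theta)

def fim(theta, t_obs, eps=1e-6):
    # Numerical Jacobian of the model w.r.t. parameters (unit noise assumed);
    # the FIM is then J^T J.
    J = np.empty((len(t_obs), len(theta)))
    for k in range(len(theta)):
        d = np.zeros(len(theta))
        d[k] = eps
        J[:, k] = (model(theta + d, t_obs) - model(theta - d, t_obs)) / (2 * eps)
    return J.T @ J

theta = np.array([1.0, 1.2, 1.5, 2.0])  # closely spaced decay rates
t_obs = np.linspace(0.1, 5.0, 50)
eigs = np.linalg.eigvalsh(fim(theta, t_obs))
print(eigs.max() / eigs.min())          # eigenvalue spread across decades
```

The large eigenvalue ratio is the signature of sloppiness: data constrain a few "stiff" parameter combinations tightly while leaving the rest nearly free.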
Aesthetic Pleasure versus Aesthetic Interest: The Two Routes to Aesthetic Liking
Graf, Laura K. M.; Landwehr, Jan R.
2017-01-01
Although existing research has established that aesthetic pleasure and aesthetic interest are two distinct positive aesthetic responses, empirical research on aesthetic preferences usually considers only aesthetic liking to capture participants’ aesthetic response. This causes some fundamental contradictions in the literature; some studies find a positive relationship between easy-to-process stimulus characteristics and aesthetic liking, while others suggest a negative relationship. The present research addresses these empirical contradictions by investigating the dual character of aesthetic liking as manifested in both the pleasure and interest components. Based on the Pleasure-Interest Model of Aesthetic Liking (PIA Model; Graf and Landwehr, 2015), two studies investigated the formation of pleasure and interest and their relationship with aesthetic liking responses. Using abstract art as the stimuli, Study 1 employed a 3 (stimulus fluency: low, medium, high) × 2 (processing style: automatic, controlled) × 2 (aesthetic response: pleasure, interest) experimental design to examine the processing dynamics responsible for experiencing aesthetic pleasure versus aesthetic interest. We find that the effect of stimulus fluency on pleasure is mediated by a gut-level fluency experience. Stimulus fluency and interest, by contrast, are related through a process of disfluency reduction, such that disfluent stimuli that grow more fluent due to processing efforts become interesting. The second study employed product designs (bikes, chairs, and lamps) as stimuli and a 2 (fluency: low, high) × 2 (processing style: automatic, controlled) × 3 (product type: bike, chair, lamp) experimental design to examine pleasure and interest as mediators of the relationship between stimulus fluency and design attractiveness. 
With respect to lamps and chairs, the results suggest that the effect of stimulus fluency on attractiveness is fully mediated by aesthetic pleasure, especially in the automatic processing style. Conversely, disfluent product designs can enhance design attractiveness judgments due to interest when a controlled processing style is adopted. PMID:28194119
Research on animation design of growing plant based on 3D MAX technology
NASA Astrophysics Data System (ADS)
Chen, Yineng; Fang, Kui; Bu, Weiqiong; Zhang, Xiaoling; Lei, Menglong
Given that virtual plants place practical demands on the quality, imagery, and realism of animations of the plant growth process, this work designs such animations based on the mechanisms and regularities of plant growth and proposes a design method based on 3D MAX technology. Repeated analysis and testing show that modeling, rendering, animation fabrication, and other key technologies are involved in the animation design process. On this basis, designers can subdivide the animation into five stages: seed germination, early plant growth, catagen, later growth, and blossom. This paper composites the animations of these five stages in the VP window to realize the complete 3D animation. Experimental results show that the animation achieves a rapid, visual, and realistic simulation of the plant growth process.
14 CFR 1240.102 - Definitions.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Contributions Board. (d) Commercial quality refers to computer software that is not in an experimental or beta..., engineering or scientific concept, idea, design, process, or product. (h) Innovator means any person listed as..., machine, manufacture, design, or composition of matter, or any new and useful improvement thereof, or any...
NASA Technical Reports Server (NTRS)
Post, E. J.
1970-01-01
An experiment designed to determine the difference between the fields, magnetic and electric, surrounding a uniformly moving charge, as contrasted with the fields surrounding an accelerated charge, is presented. A thought experiment is presented to illustrate the process.
An efficient planar accordion-shaped micromixer: from biochemical mixing to biological application
Cosentino, Armando; Madadi, Hojjat; Vergara, Paola; Vecchione, Raffaele; Causa, Filippo; Netti, Paolo Antonio
2015-01-01
Micromixers are the key components that allow lab-on-a-chip and micro total analysis systems to reach the correct level of mixing for any given process. This paper proposes a novel, simple, passive micromixer design characterized by a planar accordion-shape geometry. The geometrical characteristics of the presented design were analyzed numerically in the range of 0.01 < Re < 100 based on the micromixer performance. The performance of the most efficient design was experimentally investigated by means of fluorescence microscopy for a range of low diffusion coefficients, 10−12 < D < 10−11 m2/s. The micromixer structure was fabricated in a simple single-step process using maskless lithography and soft lithography. The experimental results showed very good agreement with the predicted numerical results. This micromixer design including a single serpentine unit (1-SERP) displayed an efficiency higher than 90% (mixing length = 6.4 mm) creating a pressure drop of about 500 Pa at Re = 0.1 and 60 kPa at Re = 10. A mixing efficiency of almost 100% was readily reached when three serpentine units were included (3-SERP). Finally, the potential diagnostic value of the presented microdevice was validated experimentally for Red Blood Cell (RBC) lysis. PMID:26658848
An efficient planar accordion-shaped micromixer: from biochemical mixing to biological application
NASA Astrophysics Data System (ADS)
Cosentino, Armando; Madadi, Hojjat; Vergara, Paola; Vecchione, Raffaele; Causa, Filippo; Netti, Paolo Antonio
2015-12-01
Micromixers are the key components that allow lab-on-a-chip and micro total analysis systems to reach the correct level of mixing for any given process. This paper proposes a novel, simple, passive micromixer design characterized by a planar accordion-shape geometry. The geometrical characteristics of the presented design were analyzed numerically in the range of 0.01 < Re < 100 based on the micromixer performance. The performance of the most efficient design was experimentally investigated by means of fluorescence microscopy for a range of low diffusion coefficients, 10-12 < D < 10-11 m2/s. The micromixer structure was fabricated in a simple single-step process using maskless lithography and soft lithography. The experimental results showed very good agreement with the predicted numerical results. This micromixer design including a single serpentine unit (1-SERP) displayed an efficiency higher than 90% (mixing length = 6.4 mm) creating a pressure drop of about 500 Pa at Re = 0.1 and 60 kPa at Re = 10. A mixing efficiency of almost 100% was readily reached when three serpentine units were included (3-SERP). Finally, the potential diagnostic value of the presented microdevice was validated experimentally for Red Blood Cell (RBC) lysis.
An efficient planar accordion-shaped micromixer: from biochemical mixing to biological application.
Cosentino, Armando; Madadi, Hojjat; Vergara, Paola; Vecchione, Raffaele; Causa, Filippo; Netti, Paolo Antonio
2015-12-14
Micromixers are the key components that allow lab-on-a-chip and micro total analysis systems to reach the correct level of mixing for any given process. This paper proposes a novel, simple, passive micromixer design characterized by a planar accordion-shape geometry. The geometrical characteristics of the presented design were analyzed numerically in the range of 0.01 < Re < 100 based on the micromixer performance. The performance of the most efficient design was experimentally investigated by means of fluorescence microscopy for a range of low diffusion coefficients, 10(-12) < D < 10(-11) m(2)/s. The micromixer structure was fabricated in a simple single-step process using maskless lithography and soft lithography. The experimental results showed very good agreement with the predicted numerical results. This micromixer design including a single serpentine unit (1-SERP) displayed an efficiency higher than 90% (mixing length = 6.4 mm) creating a pressure drop of about 500 Pa at Re = 0.1 and 60 kPa at Re = 10. A mixing efficiency of almost 100% was readily reached when three serpentine units were included (3-SERP). Finally, the potential diagnostic value of the presented microdevice was validated experimentally for Red Blood Cell (RBC) lysis.
PLACE: an open-source python package for laboratory automation, control, and experimentation.
Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper
2015-02-01
In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
NASA Astrophysics Data System (ADS)
Pitts, James Daniel
Rotary ultrasonic machining (RUM), a hybrid process combining ultrasonic machining and diamond grinding, was created to increase material removal rates in the fabrication of hard and brittle workpieces. The objective of this research was to experimentally derive empirical equations for predicting multiple machined surface roughness parameters for helically pocketed, rotary ultrasonic machined Zerodur glass-ceramic workpieces by means of a systematic statistical experimental approach. A Taguchi parametric screening design of experiments was employed to systematically determine the RUM process parameters with the largest effect on mean surface roughness. Next, empirical equations for seven common surface quality metrics were developed via Box-Behnken surface response experimental trials. Validation trials were conducted, with predicted and experimental surface roughness in varying levels of agreement. The reductions in cutting force and tool wear associated with RUM, reported by previous researchers, were experimentally verified to extend to helical pocketing of Zerodur glass-ceramic.
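The Box-Behnken designs used in surface-response trials such as those above place runs at the midpoints of the edges of the coded factor cube, plus replicated center points. A minimal generator sketch; the three-factor case and the number of center runs are illustrative choices, not the study's actual run plan:

```python
from itertools import combinations, product

# Box-Behnken design generator: for each pair of factors, run the 2^2
# factorial at coded levels +/-1 while holding all other factors at the
# 0 midpoint, then append replicated center points.
def box_behnken(k, center_runs=3):
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k for _ in range(center_runs)]
    return runs

design = box_behnken(3)
print(len(design))  # → 15 (12 edge-midpoint runs + 3 center points)
```

Because no run sits at a cube corner, Box-Behnken designs avoid simultaneous extremes of all factors, which is often desirable when corner conditions are physically risky or infeasible.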
Evaluation of Selected Chemical Processes for Production of Low-cost Silicon, Phase 3
NASA Technical Reports Server (NTRS)
Blocher, J. M., Jr.; Browning, M. F.
1979-01-01
The construction of the 50 MT Si/year experimental process system development unit was deferred until FY 1980, and the fluidized bed, zinc vaporizer, by-product condenser, and electrolytic cell were combined with auxiliary units, capable of supporting 8-hour batchwise operation, to form the process development unit (PDU), which is scheduled to be in operation by October 1, 1979. The design of the PDU and the objectives of its operation are discussed. Experimental program support activities described relate to: (1) a wetted-wall condenser; (2) fluidized-bed modeling; (3) zinc chloride electrolysis; and (4) zinc vaporizer.
NASA Technical Reports Server (NTRS)
Goldman, A. M., Jr.
1980-01-01
An experimental 20/30 GHz communications satellite conceptual design is described which employs multiple-beam paraboloid reflector antennas coupled to a TDMA transponder. It is shown that the satellite employs solid state GaAs FET power amplifiers and low noise amplifiers while signal processing and switching takes place on-board the spacecraft. The proposed areas to be served by this satellite would be the continental U.S. plus Alaska, Hawaii, Puerto Rico, and the Virgin Islands, as well as southern Canada and Mexico City. Finally, attention is given to the earth stations which are designed to be low cost.
Participative Budgeting as a Communication Process: A Model and Experiment.
1978-01-01
Control Group Design; Methodology; Setting; Condition; Measurements; Summary ... criticized the experimental design at length, particularly the experimental variation of the control dimension. In his view, the interpretation of ... Thus, this approach is adopted to avoid the possibility of the testing effect. The Post-Test Control Group Design. Among the ...
Design of simulated moving bed for separation of fumaric acid with a little fronting phenomenon.
Choi, Jae-Hwan; Kang, Mun-Seok; Lee, Chung-Gi; Wang, Nien-Hwa Linda; Mun, Sungyong
2017-03-31
The production of fumaric acid through a biotechnological pathway has grown in importance because of its potential value in related industries. This has sparked interest in developing an economically efficient process for separating fumaric acid (the product of interest) from acetic acid (a by-product). This study aimed to develop a simulated moving bed (SMB) chromatographic process for this separation in a systematic way. As a first step, commercially available adsorbents were screened for their applicability to the considered separation, which revealed that an Amberchrom-CG71C resin had sufficient potential to serve as the adsorbent of the targeted SMB. Using this adsorbent, the intrinsic parameters of fumaric and acetic acids were determined and then applied to optimizing the SMB process under consideration. The optimized SMB process was tested experimentally, from which the yield of the fumaric-acid product was found to be lower than expected from the design. An investigation revealed that this was attributable to a fronting phenomenon occurring in the solute band of fumaric acid. To resolve this issue, the extent of the fronting was evaluated quantitatively using an experimental axial dispersion coefficient for fumaric acid, which was then considered in the design of the SMB of interest. The SMB experimental results showed that a design accounting for the fumaric-acid fronting could guarantee both high purity (>99%) and high yield (>99%) for the fumaric-acid product at a desorbent consumption of 2.6 and a throughput of 0.36 L/L/h. Copyright © 2017 Elsevier B.V. All rights reserved.
Experimental demonstration of photon upconversion via cooperative energy pooling
Weingarten, Daniel H.; LaCount, Michael D.; van de Lagemaat, Jao; ...
2017-03-15
Photon upconversion is a fundamental interaction of light and matter that has applications in fields ranging from bioimaging to microfabrication. However, all photon upconversion methods demonstrated thus far involve challenging aspects, including requirements of high excitation intensities, degradation in ambient air, requirements of exotic materials or phases, or involvement of inherent energy loss processes. Here we experimentally demonstrate a mechanism of photon upconversion in a thin film, binary mixture of organic chromophores that provides a pathway to overcoming the aforementioned disadvantages. This singlet-based process, called Cooperative Energy Pooling (CEP), utilizes a sensitizer-acceptor design in which multiple photoexcited sensitizers resonantly and simultaneously transfer their energies to a higher-energy state on a single acceptor. Data from this proof-of-concept implementation is fit by a proposed model of the CEP process. As a result, design guidelines are presented to facilitate further research and development of more optimized CEP systems.
Experimental demonstration of photon upconversion via cooperative energy pooling
Weingarten, Daniel H.; LaCount, Michael D.; van de Lagemaat, Jao; Rumbles, Garry; Lusk, Mark T.; Shaheen, Sean E.
2017-01-01
Photon upconversion is a fundamental interaction of light and matter that has applications in fields ranging from bioimaging to microfabrication. However, all photon upconversion methods demonstrated thus far involve challenging aspects, including requirements of high excitation intensities, degradation in ambient air, requirements of exotic materials or phases, or involvement of inherent energy loss processes. Here we experimentally demonstrate a mechanism of photon upconversion in a thin film, binary mixture of organic chromophores that provides a pathway to overcoming the aforementioned disadvantages. This singlet-based process, called Cooperative Energy Pooling (CEP), utilizes a sensitizer-acceptor design in which multiple photoexcited sensitizers resonantly and simultaneously transfer their energies to a higher-energy state on a single acceptor. Data from this proof-of-concept implementation is fit by a proposed model of the CEP process. Design guidelines are presented to facilitate further research and development of more optimized CEP systems. PMID:28294129
NASA Astrophysics Data System (ADS)
Wang, Xiao; Zhang, Cheng; Li, Pin; Wang, Kai; Hu, Yang; Zhang, Peng; Liu, Huixia
2012-11-01
A central composite rotatable design (CCRD) is used to plan experiments for laser transmission joining of a thermoplastic, polycarbonate (PC). An artificial neural network was used to establish the relationships between the laser transmission joining process parameters (laser power, velocity, clamp pressure, and scanning number) and the joint strength and joint seam width. The developed mathematical models are tested by the analysis of variance (ANOVA) method to check their adequacy, and the effects of the process parameters on the responses, as well as the interaction effects of key process parameters on quality, are analyzed and discussed. Finally, a desirability function coupled with a genetic algorithm is used to optimize the joint strength and joint width. The results show that the predicted optima are in good agreement with the experimental results, so this study provides an effective method to enhance joint quality.
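The desirability-function approach used above (in the style of Derringer and Suich) rescales each response to [0, 1] and combines them by a geometric mean, which an optimizer such as a genetic algorithm can then maximize. A minimal sketch with hypothetical response values and limits, not the paper's data:

```python
import math

# Derringer-Suich style desirability for "larger is better" responses:
# 0 below `low`, 1 at or above `target`, a power ramp in between.
def desirability_larger_is_better(y, low, target, weight=1.0):
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** weight

def overall_desirability(ds):
    # Geometric mean: a single zero drives the overall desirability to zero
    return math.prod(ds) ** (1.0 / len(ds))

D = overall_desirability([
    desirability_larger_is_better(45.0, 20.0, 50.0),  # joint strength (made-up units)
    desirability_larger_is_better(0.8, 0.0, 1.0),     # normalized seam-width score
])
print(round(D, 3))  # → 0.816
```

The geometric mean is what makes the composite useful for multi-response optimization: any single unacceptable response zeroes the overall score, so the optimizer cannot trade one response away entirely.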
Experimental demonstration of photon upconversion via cooperative energy pooling
NASA Astrophysics Data System (ADS)
Weingarten, Daniel H.; Lacount, Michael D.; van de Lagemaat, Jao; Rumbles, Garry; Lusk, Mark T.; Shaheen, Sean E.
2017-03-01
Photon upconversion is a fundamental interaction of light and matter that has applications in fields ranging from bioimaging to microfabrication. However, all photon upconversion methods demonstrated thus far involve challenging aspects, including requirements of high excitation intensities, degradation in ambient air, requirements of exotic materials or phases, or involvement of inherent energy loss processes. Here we experimentally demonstrate a mechanism of photon upconversion in a thin film, binary mixture of organic chromophores that provides a pathway to overcoming the aforementioned disadvantages. This singlet-based process, called Cooperative Energy Pooling (CEP), utilizes a sensitizer-acceptor design in which multiple photoexcited sensitizers resonantly and simultaneously transfer their energies to a higher-energy state on a single acceptor. Data from this proof-of-concept implementation is fit by a proposed model of the CEP process. Design guidelines are presented to facilitate further research and development of more optimized CEP systems.
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using discrete event techniques. Conveniently, the tool is organized in four modules: library design, model construction, simulation, and experimentation and analysis. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and then executes events, with selective inheritance of characteristics through a time and event schema, until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics and includes the ability to compare log files.
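The event-queue behavior described above (pop time-ordered events until the queue empties) can be sketched with a minimal discrete-event loop. The `Simulator` class and the valve events below are illustrative only, not the patented tool's API:

```python
import heapq

# Minimal discrete-event loop: events are (time, seq, action) tuples popped
# in time order until the queue empties.
class Simulator:
    def __init__(self):
        self.queue = []
        self.now = 0.0
        self._seq = 0   # tie-breaker so equal-time events pop in FIFO order

    def schedule(self, delay, action):
        self._seq += 1
        heapq.heappush(self.queue, (self.now + delay, self._seq, action))

    def run(self):
        log = []
        while self.queue:               # run until the event queue is emptied
            self.now, _, action = heapq.heappop(self.queue)
            log.append((self.now, action(self)))
        return log

sim = Simulator()
sim.schedule(1.0, lambda s: "valve open")     # hypothetical mode transition
sim.schedule(3.0, lambda s: "valve closed")
print(sim.run())  # [(1.0, 'valve open'), (3.0, 'valve closed')]
```

Actions receive the simulator, so they can schedule follow-up events, which is how continuous behavior gets approximated by chains of discrete time-delayed effects.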
Development of single shot 1D-Raman scattering measurements for flames
NASA Astrophysics Data System (ADS)
Biase, Amelia; Uddi, Mruthunjaya
2017-11-01
The majority of energy consumption in the US comes from burning fossil fuels, which increases the concentration of carbon dioxide in the atmosphere and thereby harms the environment. One approach to this problem is oxy-combustion, in which a pure oxygen stream is used instead of air, so the products contain only carbon dioxide and water. Water is easily separated from carbon dioxide by condensation, allowing the carbon dioxide to be captured readily, and the lower gas volume allows easier removal of pollutants from the flue gas. The design of a system that studies the oxy-combustion process using advanced laser diagnostic techniques, specifically spontaneous Raman scattering measurements, is presented. Spontaneous Raman scattering is one of the few techniques that can provide quantitative measurements of the concentrations and temperatures of different chemical species in a turbulent flow. The experimental design, and the process of validating it to ensure the data are accurate, is described. The Raman data collected form an experimental database used for the validation of spontaneous Raman scattering in high-pressure environments for the oxy-combustion process. NSF EEC 1659710.
NASA Technical Reports Server (NTRS)
Fatyga, M.; Norbury, John W.
1992-01-01
An experimental program at the Relativistic Heavy Ion Collider (RHIC) which is designed to study nonperturbative aspects of electrodynamics is outlined. Additional possibilities for new studies of electrodynamics via multiple electromagnetic processes are also described.
Computer Design Technology of the Small Thrust Rocket Engines Using CAE / CAD Systems
NASA Astrophysics Data System (ADS)
Ryzhkov, V.; Lapshin, E.
2018-01-01
The paper presents an algorithm for designing a liquid-propellant small thrust rocket engine, consisting of five aggregated stages with feedback. Three stages of the algorithm provide engineering support for the design, and two stages cover the actual engine design. A distinctive feature of the proposed approach is a deep study of the main technical solutions at the engineering analysis stage and interaction with the created knowledge (data) base, which accelerates the process and provides enhanced design quality. Using the multifunctional graphics package Siemens NX allows the final product, the rocket engine, and a set of design documentation to be obtained in a fairly short time; the engine design does not require lengthy experimental development.
GilPavas, Edison; Dobrosz-Gómez, Izabela; Gómez-García, Miguel Ángel
2012-01-01
The response surface methodology (RSM) was applied as a tool for optimizing the operational conditions of the photo-degradation of highly concentrated PY12 wastewater from a textile industry located in the suburbs of Medellin (Colombia). The Box-Behnken design (BBD) was chosen for response optimization. The photo-Fenton process was carried out in a laboratory-scale batch photo-reactor. A multifactorial experimental design was proposed, including the following variables: the initial dyestuff concentration, the H(2)O(2) and Fe(+2) concentrations, and the UV wavelength radiation. The photo-Fenton process performed at the optimized conditions resulted in ca. 100% dyestuff decolorization, 92% COD degradation, and 82% TOC degradation. A kinetic study was carried out, including the identification of some intermediate compounds generated during the oxidation process. The water biodegradability reached a final BOD(5)/COD ratio of 0.86.
Optimization of electrocoagulation process for the treatment of landfill leachate
NASA Astrophysics Data System (ADS)
Huda, N.; Raman, A. A.; Ramesh, S.
2017-06-01
The main problem of landfill leachate is its diverse composition, comprising persistent organic pollutants (POPs) that must be removed before discharge into the environment. In this study, the treatment of leachate using electrocoagulation (EC) was investigated, with iron used as both the anode and the cathode. Response surface methodology was used for experimental design and to study the effects of the operational parameters. A central composite design was used to study the effects of initial pH, inter-electrode distance, and electrolyte concentration on color and COD removals. The process could remove up to 84% of color and 49.5% of COD. The experimental data were fitted to second-order polynomial equations. All three factors were found to significantly affect color removal, whereas electrolyte concentration was the most significant parameter affecting COD removal. Numerical optimization was conducted to obtain the optimum process performance. Further work will be conducted toward integrating EC with other wastewater treatment processes such as electro-Fenton.
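The second-order polynomial models fitted in central-composite-design studies like the one above can be estimated by ordinary least squares on quadratic features. A minimal sketch with synthetic data; the coded factors and coefficient values are made up for illustration, not the study's results:

```python
import numpy as np

# Fit the second-order (quadratic) response model typical of central
# composite designs: y = b0 + sum(bi*xi) + sum(bij*xi*xj) + sum(bii*xi^2),
# via ordinary least squares.
def quadratic_features(X):
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i, k)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 3))  # coded pH, electrode distance, electrolyte
y = 50 + 5 * X[:, 0] - 3 * X[:, 2] + 4 * X[:, 0] * X[:, 2]  # made-up removal %
beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
print(np.round(beta[:4], 2))  # intercept and the three linear coefficients
```

With noise-free synthetic data the least-squares fit recovers the planted coefficients exactly, which is a useful sanity check before fitting real removal data.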
Integrated CFD modeling of gas turbine combustors
NASA Technical Reports Server (NTRS)
Fuller, E. J.; Smith, C. E.
1993-01-01
3D, curvilinear, multi-domain CFD analysis is becoming a valuable tool in gas turbine combustor design. Used as a supplement to experimental testing, CFD analysis can provide improved understanding of combustor aerodynamics and can be used to qualitatively assess new combustor designs. This paper discusses recent advancements in CFD combustor methodology, including the timely integration of the design (i.e., CAD) and analysis (i.e., CFD) processes. AlliedSignal's F124 combustor was analyzed at maximum power conditions. The assumed turbulence levels at the nozzle/swirler inlet were shown to be very important in the prediction of combustor exit temperatures. Predicted exit temperatures were compared to experimental rake data, and good overall agreement was seen. Exit radial temperature profiles were well predicted, while the predicted pattern factor was 25 percent higher than the harmonic-averaged experimental pattern factor.
Methods for Improving Information from ’Undesigned’ Human Factors Experiments.
Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction
Design and Analysis of a Sensor System for Cutting Force Measurement in Machining Processes
Liang, Qiaokang; Zhang, Dan; Coppola, Gianmarc; Mao, Jianxu; Sun, Wei; Wang, Yaonan; Ge, Yunjian
2016-01-01
Multi-component force sensors have infiltrated a wide variety of automation products since the 1970s. However, one seldom finds full-component sensor systems available in the market for cutting force measurement in machine processes. In this paper, a new six-component sensor system with a compact monolithic elastic element (EE) is designed and developed to detect the tangential cutting forces Fx, Fy and Fz (i.e., forces along x-, y-, and z-axis) as well as the cutting moments Mx, My and Mz (i.e., moments about x-, y-, and z-axis) simultaneously. Optimal structural parameters of the EE are carefully designed via simulation-driven optimization. Moreover, a prototype sensor system is fabricated, which is applied to a 5-axis parallel kinematic machining center. Calibration experimental results demonstrate that the system is capable of measuring cutting forces and moments with good linearity while minimizing coupling error. Both the Finite Element Analysis (FEA) and calibration experimental studies validate the high performance of the proposed sensor system that is expected to be adopted into machining processes. PMID:26751451
Design and Analysis of a Sensor System for Cutting Force Measurement in Machining Processes.
Liang, Qiaokang; Zhang, Dan; Coppola, Gianmarc; Mao, Jianxu; Sun, Wei; Wang, Yaonan; Ge, Yunjian
2016-01-07
Multi-component force sensors have infiltrated a wide variety of automation products since the 1970s. However, one seldom finds full-component sensor systems available in the market for cutting force measurement in machine processes. In this paper, a new six-component sensor system with a compact monolithic elastic element (EE) is designed and developed to detect the tangential cutting forces Fx, Fy and Fz (i.e., forces along x-, y-, and z-axis) as well as the cutting moments Mx, My and Mz (i.e., moments about x-, y-, and z-axis) simultaneously. Optimal structural parameters of the EE are carefully designed via simulation-driven optimization. Moreover, a prototype sensor system is fabricated, which is applied to a 5-axis parallel kinematic machining center. Calibration experimental results demonstrate that the system is capable of measuring cutting forces and moments with good linearity while minimizing coupling error. Both the Finite Element Analysis (FEA) and calibration experimental studies validate the high performance of the proposed sensor system that is expected to be adopted into machining processes.
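Calibrating a multi-component sensor like the one above amounts to estimating the matrix that maps raw channel outputs to forces and moments from known applied loads. A minimal least-squares sketch with synthetic data; the 6×6 matrix and load set are illustrative, not the paper's calibration:

```python
import numpy as np

# Least-squares calibration sketch: recover the 6x6 matrix C that maps raw
# channel readings V to forces/moments F (F = C @ V) from known loads.
rng = np.random.default_rng(1)
C_true = np.eye(6) + 0.05 * rng.standard_normal((6, 6))  # slight cross-coupling
V = rng.standard_normal((6, 100))    # raw readings for 100 calibration loads
F = C_true @ V                       # known applied forces/moments
C_est = F @ np.linalg.pinv(V)        # least-squares estimate of C
print(np.max(np.abs(C_est - C_true)))  # recovery error is near zero here
```

The off-diagonal entries of the estimated matrix quantify exactly the coupling error the paper's calibration experiments are designed to minimize.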
ERIC Educational Resources Information Center
Vezzoli, Carlo; Penin, Lara
2006-01-01
Purpose: This paper aims to diffuse the concept of a multi-lateral learning process as a means to promote experimental didactics and research (and the cross-fertilization between these two activities) in the field of design of sustainable product-service systems (PSSs) and to consider the university campus as the locus for the design,…
Pisano, Roberto; Fissore, Davide; Barresi, Antonello A; Brayard, Philippe; Chouvenc, Pierre; Woinet, Bertrand
2013-02-01
This paper shows how to optimize the primary drying phase of a parenteral formulation, for both product quality and drying time, via the design space. A non-steady-state model, parameterized with experimentally determined heat and mass transfer coefficients, is used to define the design space when the heat transfer coefficient varies with the position of the vial in the array. The calculations recognize both equipment and product constraints, and also take into account model parameter uncertainty. Examples are given of cycles designed for the same formulation but varying the freezing conditions and the freeze-dryer scale; these are then compared in terms of drying time. Furthermore, the impact of inter-vial variability on the design space, and therefore on the optimized cycle, is addressed. In this regard, a simplified method is presented for cycle design, which reduces the experimental effort required for system qualification. The use of mathematical modeling is demonstrated to be very effective not only for cycle development, but also for solving problems of process transfer. This study showed that inter-vial variability remains significant when vials are loaded on plastic trays, and how inter-vial variability can be taken into account during process design.
Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei
2017-01-01
A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the fewest experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled rapid generation of a process model.
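The self-optimisation approach referred to above treats each flow experiment as a black-box evaluation and lets a derivative-free optimiser propose the next operating conditions. A minimal sketch of such a loop, with a toy yield function standing in for the reactor (the real objective would be a measured yield/cost; the variables, step sizes and optimum here are invented):

```python
# Minimal self-optimisation loop of the kind used in flow chemistry:
# "experiment" is a black box returning an objective to maximise (here a
# toy function peaked at temp=120, residence time=8; a real campaign
# would run the reactor), and a coordinate search proposes the next
# conditions, halving its steps once no move improves the result.

def experiment(temp, res_time):
    """Stand-in for a flow experiment: objective to MAXIMISE."""
    return -((temp - 120.0) / 40.0) ** 2 - ((res_time - 8.0) / 5.0) ** 2

def self_optimise(x, steps, n_iter=20):
    best = experiment(*x)
    n_exp = 1                     # count experiments, the scarce resource
    for _ in range(n_iter):
        improved = False
        for dim in range(len(x)):
            for sign in (+1, -1):
                trial = list(x)
                trial[dim] += sign * steps[dim]
                y = experiment(*trial)
                n_exp += 1
                if y > best:
                    best, x, improved = y, trial, True
        if not improved:
            steps = [s / 2.0 for s in steps]   # refine around the optimum
    return x, best, n_exp

x_opt, y_opt, n = self_optimise([80.0, 2.0], steps=[10.0, 2.0])
print(x_opt, round(y_opt, 4), n)
```

The experiment counter makes the efficiency comparison in the abstract concrete: a campaign is judged by how many reactor runs it needs to reach the objective.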
NASA Astrophysics Data System (ADS)
Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal
2017-03-01
The resistance of polymeric materials to time-dependent plastic deformation is an important requirement of the fused deposition modeling (FDM) design process, its processed products, and their application for long-term loading, durability, and reliability. The creep performance of the material and part processed by FDM is a fundamental criterion for many applications with strict dimensional stability requirements, including medical implants, electrical and electronic products, and various automotive applications. Herein, the effect of FDM fabrication conditions on the flexural creep stiffness of polycarbonate-acrylonitrile-butadiene-styrene parts was investigated. A relatively new class of experimental design called "definitive screening design" was adopted for this investigation. The effects of process variables on flexural creep stiffness were monitored, and the best-suited quadratic polynomial model, with a high coefficient of determination (R2) value, was developed. This study highlights the value of response-surface definitive screening designs in optimizing properties for products and materials, and it demonstrates their role and potential application in material processing and additive manufacturing.
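The modelling step described above, fitting a quadratic polynomial to screening-design responses and judging it by R2, can be sketched in a few lines. The data below are illustrative coded factor levels and responses, not the paper's measurements:

```python
# Fit y = b0 + b1*x + b2*x^2 by least squares to screening responses and
# report the coefficient of determination R^2. One coded factor is shown
# for brevity; the data are invented.

xs = [-1.0, -0.5, 0.0, 0.5, 1.0]      # coded factor level
ys = [2.10, 1.45, 1.22, 1.48, 2.05]   # measured response (illustrative)

def power_sum(p):
    return sum(x ** p for x in xs)

# Normal equations: A @ coef = b for the 3 polynomial coefficients.
A = [[power_sum(i + j) for j in range(3)] for i in range(3)]
b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]

def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# Cramer's rule is fine for a 3x3 system.
d = det3(A)
coef = []
for i in range(3):
    m = [row[:] for row in A]
    for r in range(3):
        m[r][i] = b[r]
    coef.append(det3(m) / d)

pred = [coef[0] + coef[1] * x + coef[2] * x * x for x in xs]
ybar = sum(ys) / len(ys)
r2 = 1 - (sum((y - p) ** 2 for y, p in zip(ys, pred))
          / sum((y - ybar) ** 2 for y in ys))
print([round(c, 3) for c in coef], round(r2, 4))
```

The strong quadratic coefficient and near-unity R2 on this toy data illustrate exactly the kind of curvature a definitive screening design is built to detect.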
Self-assembly kinetics of microscale components: A parametric evaluation
NASA Astrophysics Data System (ADS)
Carballo, Jose M.
The goal of the present work is to develop and evaluate a parametric model of a basic microscale self-assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging. Recent research efforts have proposed adapting nanoscale self-assembly processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly, capillary) to spontaneously assemble components. However, there are challenges to implementing microscale SA as a commercial process, and the lack of design tools prevents simple process optimization. Previous efforts have each characterized a specific aspect of the SA process, but existing microscale SA models do not characterize the inter-component interactions. All existing models have simplified the outcome of SA interactions as an experimentally derived value specific to a particular configuration, instead of evaluating that outcome as a function of component-level parameters (such as speed, geometry, bonding energy and direction). The present study parameterizes the outcome of interactions and evaluates the effect of key parameters, closing a gap among existing microscale SA models and adding a key piece toward a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for the probability of assembly of basic SA interactions. A basic SA interaction is defined as the event in which a single part arrives on an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation and incidence angle for the component and the assembly site. Second, an experimental SA system was designed and implemented to create individual SA interactions while controlling process parameters independently.
SA experiments measured the outcome of SA interactions, while studying the independent effects of each parameter. As a first step towards a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low kinetic energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. The results from this work indicate that SA could be modeled as an energy-based process due to the small path dependence effects. Assembly probability is linearly related to the orientation probability. The proportionality constant is based on the area fraction of the sites with an amplification factor. This amplification factor accounts for the ability of capillary forces to align parts with only very small areas of contact when they have a low kinetic energy. Results provide unprecedented insight about SA interactions. The present study is a key step towards completing a basic model of a general SA process. Moreover, the outcome from this work can complement existing SA process models, in order to create a complete design tool for microscale SA systems. In addition to SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that a major contributor to experimental variation is the stochastic nature of experimental SA interactions and the limited sample size of the experiments. Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize the uncertainty in future SA experiments.
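The energy-based picture above, assembly occurring when a part's kinetic energy is below the capillary binding energy and capillary forces can amplify small contact areas into alignment, lends itself to the kind of Monte Carlo simulation the study describes. A sketch, with the threshold and the factor-of-five area amplification as invented stand-ins for the study's fitted parameters:

```python
# Monte Carlo sketch of single self-assembly interactions: a part
# assembles when its kinetic energy is below the binding energy and a
# random landing overlap falls inside the (capillary-amplified) site
# area fraction. Thresholds and the 5x amplification are illustrative
# assumptions, not the study's fitted values.
import random

def interaction(rng, e_kinetic, e_binding, site_area_fraction):
    """One part arrival; returns True if the part assembles."""
    overlap = rng.random()                           # random landing position
    amplified = min(1.0, site_area_fraction * 5.0)   # capillary self-alignment
    return e_kinetic < e_binding and overlap < amplified

def assembly_yield(n, e_kinetic, e_binding, site_area_fraction, seed=1):
    rng = random.Random(seed)
    hits = sum(interaction(rng, e_kinetic, e_binding, site_area_fraction)
               for _ in range(n))
    return hits / n

low  = assembly_yield(10_000, e_kinetic=0.2, e_binding=1.0, site_area_fraction=0.1)
high = assembly_yield(10_000, e_kinetic=2.0, e_binding=1.0, site_area_fraction=0.1)
print(round(low, 3), high)
```

Repeating such runs with different seeds also reproduces the study's point about experimental variation: the scatter across finite samples comes from the stochastic interactions themselves.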
ERIC Educational Resources Information Center
Datchuk, Shawn M.; Kubina, Richard M.
2013-01-01
Students with writing difficulties and learning disabilities struggle with many aspects of the writing process, including use of sentence-level skills. This literature review summarizes results from 19 published articles that used single-case or group-experimental and quasi-experimental designs to investigate effects of intervention on the…
ERIC Educational Resources Information Center
Bozorgian, Hossein; Pillay, Hitendra
2013-01-01
Listening used in language teaching refers to a complex process that allows us to understand spoken language. The current study, conducted in Iran with an experimental design, investigated the effectiveness of teaching listening strategies delivered in L1 (Persian) and its effect on listening comprehension in L2. Five listening strategies:…
Carolyn Hunsaker
2013-01-01
The Kings River Experimental Watersheds (KREW) study was designed to (1) characterize the variability in watershed attributes considered important to understanding processes and health of headwater streams and forest watersheds and (2) evaluate forest restoration treatments. The KREW is a paired watershed experiment located in the headwaters of the Kings River Basin...
Experimental Study of Middle-Term Training in Social Cognition in Preschoolers
ERIC Educational Resources Information Center
Houssa, Marine; Nader-Grosbois, Nathalie
2016-01-01
In an experimental design, we examined the effects of middle-term training in social information processing (SIP) and in Theory of Mind (ToM) on preschoolers' social cognition and social adjustment. 48 preschoolers took part in a pre-test and post-test session involving cognitive, socio-cognitive and social adjustment (direct and indirect)…
Structural and functional connectivity as a driver of hillslope erosion following disturbance
USDA-ARS?s Scientific Manuscript database
Hydrologic response to rainfall input on fragmented or burnt hillslopes is strongly influenced by the ensuing connectivity of runoff and erosion processes. Yet, cross-scale process connectivity is seldom evaluated in field studies due to scale limitations in experimental design. This study quantified...
Self-Disclosure as an Exchange Process: Reinforcement Effects.
ERIC Educational Resources Information Center
Taylor, Dalmas A.
In association with an extensive examination of the disclosure literature, this paper describes two laboratory studies designed to yield information regarding the effects of reinforcement on self-disclosing behaviors in an exchange process. In one series, the experimenters manipulated the patterns of personal reward/cost experiences, hypothesizing…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cristescu, I.; Cristescu, I. R.; Doerr, L.
2008-07-15
The ITER Isotope Separation System (ISS) and Water Detritiation System (WDS) should be integrated in order to reduce potential chronic tritium emissions from the ISS. This is achieved by routing the top (protium) product from the ISS to a feed point near the bottom end of the WDS Liquid Phase Catalytic Exchange (LPCE) column. This provides an additional barrier against ISS emissions and should mitigate the memory effects due to process parameter fluctuations in the ISS. To support the research activities needed to characterize the performance of various components of the WDS and ISS processes under the various working conditions and configurations needed for the ITER design, an experimental facility called TRENTA, representative of the ITER WDS and ISS protium separation column, has been commissioned and is in operation at TLK. The experimental program on the TRENTA facility is conducted to provide the necessary design data related to the relevant ITER operating modes. The operational availability and performance of the ISS-WDS have an impact on the ITER fuel cycle subsystems, with consequences for the design integration. Preliminary experimental data from the TRENTA facility are presented. (authors)
Nie, Lei; Hu, Mingming; Yan, Xu; Guo, Tingting; Wang, Haibin; Zhang, Sheng; Qu, Haibin
2018-05-03
This case study describes a successful application of quality by design (QbD) principles to the development of a coupling process for insulin degludec. Failure mode effects analysis (FMEA) was first used to recognize critical process parameters (CPPs). Five CPPs, including coupling temperature (Temp), pH of desB30 solution (pH), reaction time (Time), desB30 concentration (Conc), and molar equivalent of ester per mole of desB30 insulin (MolE), were then investigated using a fractional factorial design. The curvature effect was found to be significant, indicating the need for second-order models. Afterwards, a central composite design was built by augmenting the study with star and center points. Regression models were developed for the CPPs to predict the purity and yield of predegludec using the above experimental data. The R2 and adjusted R2 values were higher than 96% and 93%, respectively, for the two models. The Q2 values were more than 80%, indicating good predictive ability. MolE was found to be the most significant factor affecting both the yield and purity of predegludec. Temp, pH, and Conc were also significant for predegludec purity, while Time appeared to remarkably influence the yield model. The multi-dimensional design space and normal operating region (NOR) with a robust setpoint were determined using a probability-based Monte Carlo simulation method. The verified experimental results showed that the design space was reliable and effective. This study enriches the understanding of the acetylation process and is instructive for other complicated operations in biopharmaceutical engineering.
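The augmentation step described above, extending a two-level factorial core with star and center points to form a central composite design, follows a standard recipe. A sketch of the run-matrix construction in coded units, shown for three factors for brevity (the study used five CPPs):

```python
# Build a central composite design in coded units: a 2^k factorial core,
# axial (star) points at distance alpha on each axis, and replicated
# center points. alpha = (2^k)^(1/4) gives the common rotatable choice.
from itertools import product

def central_composite(k, alpha=None, n_center=4):
    alpha = alpha if alpha is not None else (2 ** k) ** 0.25   # rotatable
    runs = [list(p) for p in product((-1.0, 1.0), repeat=k)]   # factorial core
    for i in range(k):                                         # star points
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            runs.append(pt)
    runs += [[0.0] * k for _ in range(n_center)]               # center points
    return runs

design = central_composite(3)
print(len(design), design[8])
```

For three factors this yields 8 factorial, 6 star and 4 center runs; the star points are what let the second-order (curvature) terms noted in the abstract be estimated.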
González-Sáiz, J M; Esteban-Díez, I; Rodríguez-Tecedor, S; Pérez-Del-Notario, N; Arenzana-Rámila, I; Pizarro, C
2014-12-15
The aim of the present work was to evaluate the effect of the main factors conditioning accelerated ageing processes (oxygen dose, chip dose, wood origin, toasting degree and maceration time) on the phenolic and chromatic profiles of red wines, using a multivariate strategy based on experimental design methodology. The results revealed that the concentrations of monomeric anthocyanins and flavan-3-ols could be modified through the application of particular experimental conditions. This is particularly remarkable since changes in the phenolic profile were closely linked to the changes observed in the chromatic parameters. The main strength of this study lies in the possibility of using its conclusions as a basis for making wines with specific colour properties based on quality criteria. To our knowledge, the influence of such a large number of alternative ageing parameters on wine phenolic composition and chromatic attributes has not previously been studied using a comprehensive experimental design methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.
Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie
2016-01-01
The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
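The core of the ASAP idea, updating posterior probabilities over competing generative models after each observation and choosing the next stimulus to disambiguate them, can be illustrated with a deliberately small toy. The two models, the noise level and the stimulus grid below are all invented; a full ASAP would also weight the stimulus choice by the current posterior rather than a fixed disagreement criterion:

```python
# Toy sequential model comparison in the spirit of ASAP: two candidate
# generative models predict the response to a stimulus; each observation
# updates the log-evidence of both, and the next stimulus is the one
# where the models' predictions disagree most. Models, noise and grid
# are illustrative inventions.
import math, random

stimuli = [x / 10.0 for x in range(-20, 21)]
models = {"linear": lambda x: 0.8 * x,
          "quadratic": lambda x: 0.4 * x * x}
SIGMA = 0.3   # observation noise

def loglik(pred, y):
    return -0.5 * ((y - pred) / SIGMA) ** 2

def run_asap(truth, n_trials=15, seed=0):
    rng = random.Random(seed)
    logp = {m: 0.0 for m in models}          # flat prior over models
    for _ in range(n_trials):
        # Active sampling: maximise disagreement between the models.
        x = max(stimuli,
                key=lambda s: abs(models["linear"](s) - models["quadratic"](s)))
        y = models[truth](x) + rng.gauss(0.0, SIGMA)   # simulated response
        for m, f in models.items():
            logp[m] += loglik(f(x), y)
    z = max(logp.values())                   # normalise to posteriors
    w = {m: math.exp(v - z) for m, v in logp.items()}
    total = sum(w.values())
    return {m: v / total for m, v in w.items()}

post = run_asap(truth="quadratic")
print({m: round(p, 4) for m, p in post.items()})
```

Because the sampled stimulus sits where the models' predictions diverge most, a handful of trials is enough to drive the posterior of the true model toward one, which is the efficiency argument made in the abstract.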
Design and validation of instruments to measure knowledge.
Elliott, T E; Regal, R R; Elliott, B A; Renier, C M
2001-01-01
Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.
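The reliability testing mentioned in the steps above is commonly quantified with an internal-consistency statistic such as Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch with invented instrument scores (rows are respondents, columns are items); the abstract does not state which reliability statistic the authors used, so this is one standard choice:

```python
# Cronbach's alpha for an instrument: rows = respondents, columns = items.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
# The scores below are invented for illustration.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    k = len(scores[0])
    items = [[row[j] for row in scores] for j in range(k)]   # per-item columns
    totals = [sum(row) for row in scores]                    # per-respondent sums
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

scores = [
    [4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 2, 3], [4, 4, 4],
]
print(round(cronbach_alpha(scores), 3))
```

Values near 1 indicate that the items move together, i.e., the instrument measures one construct consistently.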
NASA Astrophysics Data System (ADS)
Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam
2018-05-01
Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.
Development of Switchable Polarity Solvent Draw Solutes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Aaron D.
Results of a computational fluid dynamics (CFD) study of flow and heat transfer in a printed circuit heat exchanger (PCHE) geometry are presented. CFD results obtained from a two-plate model are compared to corresponding experimental results for validation. This process provides the basis for further application of the CFD code to PCHE design and performance analysis in a variety of internal flow geometries. As part of the code verification and validation (V&V) process, CFD simulation of a single semicircular straight channel under laminar isothermal conditions was also performed and compared to theoretical results; this comparison yielded excellent agreement with the theoretical values. The two-plate CFD model based on the experimental PCHE design overestimated the effectiveness and underestimated the pressure drop. However, the discrepancy between the CFD results and the experimental data was found to be caused mainly by uncertainty in the heat exchanger geometry introduced during fabrication: the CFD results obtained using a slightly smaller channel diameter yielded good agreement with the experimental data. A separate investigation revealed that the average channel diameter of the OSU PCHE after diffusion bonding was 1.93 mm on the cold fluid side and 1.90 mm on the hot fluid side, both smaller than the nominal design value. Consequently, the CFD code was shown to have sufficient capability to evaluate the heat exchanger's thermal-hydraulic performance.
NASA Astrophysics Data System (ADS)
Ueno, Tetsuro; Hino, Hideitsu; Hashimoto, Ai; Takeichi, Yasuo; Sawada, Masahiro; Ono, Kanta
2018-01-01
Spectroscopy is a widely used experimental technique, and enhancing its efficiency can have a strong impact on materials research. We propose an adaptive design for spectroscopy experiments that uses a machine learning technique to improve efficiency. We examined X-ray magnetic circular dichroism (XMCD) spectroscopy for the applicability of a machine learning technique to spectroscopy. An XMCD spectrum was predicted by Gaussian process modelling with learning of an experimental spectrum using a limited number of observed data points. Adaptive sampling of data points with maximum variance of the predicted spectrum successfully reduced the total data points for the evaluation of magnetic moments while providing the required accuracy. The present method reduces the time and cost for XMCD spectroscopy and has potential applicability to various spectroscopies.
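The adaptive-sampling loop described above, fit a Gaussian process to the points measured so far, then measure next where the predictive variance is largest, can be sketched compactly. A useful property makes the sketch simple: the GP predictive variance depends only on where measurements were taken, not on their values. The RBF length scale and noise level below are fixed by hand rather than learned, and the grid is an invented stand-in for the photon-energy axis:

```python
# Sketch of GP-based adaptive sampling: repeatedly add the grid point
# with the largest Gaussian-process predictive variance given the points
# measured so far. Kernel hyperparameters are hand-picked assumptions.
import math

def rbf(a, b, length=0.15):
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def solve(a, b):   # Gaussian elimination with partial pivoting
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def gp_variance(xs, x_star, noise=1e-6):
    """Predictive variance k(x*,x*) - k*^T K^{-1} k* at x_star."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    k_star = [rbf(a, x_star) for a in xs]
    v = solve(K, k_star)          # K^{-1} k*
    return rbf(x_star, x_star) - sum(ks * vi for ks, vi in zip(k_star, v))

grid = [i / 100.0 for i in range(101)]
measured = [0.0, 0.5, 1.0]        # start with three observations
for _ in range(7):                # adaptively add seven more points
    nxt = max((x for x in grid if x not in measured),
              key=lambda x: gp_variance(measured, x))
    measured.append(nxt)
print(sorted(round(x, 2) for x in measured))
```

The selected points repeatedly bisect the largest unmeasured gaps, which is why far fewer observations suffice than with a uniform scan of the spectrum.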
NASA Astrophysics Data System (ADS)
Shaylinda, M. Z. N.; Hamidi, A. A.; Mohd, N. A.; Ariffin, A.; Irvan, D.; Hazreek, Z. A. M.; Nizam, Z. M.
2018-04-01
In this research, the performance of polyferric chloride and tapioca flour as composite coagulants for partially stabilized leachate was investigated. Response surface methodology (RSM) was used to optimize the coagulation and flocculation process. Central composite design, a standard design tool in RSM, was applied to evaluate the interactions and effects of dose and pH. A dose of 0.2 g/L Fe and a pH of 4.71 were the optimum values suggested by RSM. An experimental test under these optimum conditions resulted in 95.9%, 94.6% and 50.4% removal of SS, color and COD, respectively. The percentage difference recorded between the experimental and model responses was <5%. Therefore, it can be concluded that RSM is an appropriate optimization tool for the coagulation and flocculation process.
Fashion sketch design by interactive genetic algorithms
NASA Astrophysics Data System (ADS)
Mok, P. Y.; Wang, X. X.; Xu, J.; Kwok, Y. L.
2012-11-01
Computer-aided design is vitally important for modern industry, particularly the creative industries. The fashion industry faces intense pressure to shorten its product development process. In this paper, a methodology is proposed for sketch design based on interactive genetic algorithms. The sketch design system consists of a sketch design model, a database and a multi-stage sketch design engine. First, a sketch design model is developed, based on the knowledge of fashion design, to describe fashion product characteristics using parameters. Second, a database is built based on the proposed sketch design model to define general style elements. Third, a multi-stage sketch design engine is used to construct the design. Moreover, an interactive genetic algorithm (IGA) is used to accelerate the sketch design process. The experimental results demonstrate that the proposed method is effective in helping laypersons achieve satisfying fashion design sketches.
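The IGA loop described above can be sketched as follows: candidate sketches are parameter vectors, user ratings drive selection, and crossover plus mutation produce the next generation. In a true interactive GA the fitness comes from the user; the `rate` function and its hidden "taste" vector below are stand-ins so the sketch can run unattended, and the parameter count and GA settings are invented:

```python
# Minimal interactive-genetic-algorithm loop: designs are parameter
# vectors, a rating function (a stand-in for the human evaluator) drives
# selection, and one-point crossover plus Gaussian point mutation breed
# the next generation. All settings are illustrative.
import random

rng = random.Random(42)
N_PARAMS = 6            # e.g. coded collar, sleeve, waistline styles...
POP, GENS = 12, 30

def rate(design):       # proxy for interactive user scoring in [0, 1]
    target = [0.2, 0.9, 0.5, 0.1, 0.7, 0.4]   # hidden "taste" being learned
    return 1.0 - sum(abs(a - b) for a, b in zip(design, target)) / N_PARAMS

def evolve():
    pop = [[rng.random() for _ in range(N_PARAMS)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=rate, reverse=True)
        parents = pop[:POP // 2]                 # elitist selection
        children = []
        while len(children) < POP - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_PARAMS)
            child = a[:cut] + b[cut:]            # one-point crossover
            i = rng.randrange(N_PARAMS)          # point mutation
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=rate)

best = evolve()
print(round(rate(best), 3))
```

Replacing `rate` with a user prompt turns this into the interactive loop the paper describes; the GA's job is to converge on the user's taste in few rating rounds.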
Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider
NASA Astrophysics Data System (ADS)
Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.
2010-03-01
In this paper, a critical modification to an affordable polymer-based split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and microinjection molding. Specifically, the original and the modified designs were numerically simulated using the commercial finite element method software ANSYS CFX to assist in redesigning the micromixer. The simulation results showed that both designs are capable of mixing, while the modified design has much improved performance. Mixing experiments with two different fluids, carried out using the original and the modified mixers, again showed significantly improved mixing uniformity for the latter: the measured mixing coefficient was 0.11 for the original design and 0.065 for the improved one. The manufacturing process developed for device fabrication, based on ultraprecision machining and microinjection molding, has the advantages of high dimensional precision, low cost and manufacturing flexibility.
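The mixing coefficient quoted above (0.11 vs. 0.065, lower meaning better mixed) is, in one common definition, the relative standard deviation of concentration samples across the outlet: 0 for a perfectly mixed stream. The paper does not spell out its exact formula, so the sketch below uses that common definition with invented samples:

```python
# One common mixing metric: the relative standard deviation (coefficient
# of variation) of concentration samples across the mixer outlet.
# 0 = perfectly mixed. Sample values are invented for illustration.

def mixing_coefficient(samples):
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return (var ** 0.5) / mean

poorly_mixed = [0.9, 0.8, 0.2, 0.1, 0.85, 0.15]   # two streams still distinct
well_mixed   = [0.52, 0.49, 0.50, 0.51, 0.48, 0.50]
print(round(mixing_coefficient(poorly_mixed), 3),
      round(mixing_coefficient(well_mixed), 3))
```

On these invented profiles the metric cleanly separates an unmixed bilayer from a nearly uniform outlet, mirroring the 0.11-to-0.065 improvement reported for the redesigned mixer.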
NASA Airframe Icing Research Overview Past and Current
NASA Technical Reports Server (NTRS)
Potapczuk, Mark
2009-01-01
This slide presentation reviews NASA's past and current research in the area of airframe icing. Both the historical experimental efforts and the model development undertaken to understand the process and problem of ice formation are reviewed. This work has resulted in new experimental methods, advanced icing simulation software, flight dynamics models and experimental databases that have an impact on the design, testing, construction, certification and qualification of aircraft and their sub-systems.
Influence of detergents on water drift in cooling towers
NASA Astrophysics Data System (ADS)
Vitkovicova, Rut
The influence of detergents on water drift from a cooling tower was experimentally investigated. A model cooling tower was used for these measurements, specifically an experimental aerodynamic line designed for measuring and monitoring the processes taking place around liquid-phase drift eliminators. The effect of different concentrations of detergent in the cooling water on the drift of water droplets from a commonly used type of eliminator was observed with visualization methods.
General Science, Ninth Grade: Theme III and Theme IV. Experimental.
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.
This document was designed to help teachers provide ninth grade students in New York City with opportunities to learn about scientific processes as well as basic reasoning skills which underlie problem-solving processes in scientific and nonscientific disciplines. The first section of the guide, "The Environment," contains lessons which…
A Virtual Laboratory for Digital Signal Processing
ERIC Educational Resources Information Center
Dow, Chyi-Ren; Li, Yi-Hsung; Bai, Jin-Yu
2006-01-01
This work designs and implements a virtual digital signal processing laboratory, VDSPL. VDSPL consists of four parts: mobile agent execution environments, mobile agents, DSP development software, and DSP experimental platforms. The network capability of VDSPL is created by using mobile agent and wrapper techniques without modifying the source code…
The Red and White Yeast Lab: An Introduction to Science as a Process.
ERIC Educational Resources Information Center
White, Brian T.
1999-01-01
Describes an experimental system based on an engineered strain of bakers' yeast that is designed to involve students in the process by which scientific knowledge is generated. Students are asked to determine why the yeast grow to form a reproducible pattern of red and white. (WRM)
Development of Optimal Stressor Scenarios for New Operational Energy Systems
2017-12-01
Analyzing the previous model using a design of experiments (DOE) and regression analysis provides critical information about the associated operational...from experimentation. The resulting system requirements can be used to revisit the design requirements and develop a more robust system. This process...stressor scenarios for acceptance testing.
Application of Plackett-Burman experimental design in the development of muffin using adlay flour
NASA Astrophysics Data System (ADS)
Valmorida, J. S.; Castillo-Israel, K. A. T.
2018-01-01
The Plackett-Burman experimental design was applied to identify significant formulation and process variables in the development of a muffin using adlay flour. Of the seven screened variables, the levels of sugar, the levels of butter and the baking temperature had the most significant influence on the product model in terms of physicochemical properties and sensory acceptability. The results further demonstrate the effectiveness of the Plackett-Burman design in choosing the best adlay variety for muffin production. Hence, the statistical method used in this study permits an efficient selection of the important variables in the development of a muffin from adlay, which can then be optimized using response surface methodology.
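Plackett-Burman matrices like the one used above are built from a published generator row: for the 12-run design, the standard row of 11 signs is cyclically shifted to give 11 rows, and a row of all minus signs is appended. Each of up to 11 factors takes one column; seven screened variables, as in this study, fit comfortably. A sketch of the construction with built-in balance checks:

```python
# Generate the 12-run Plackett-Burman screening design: cyclic shifts of
# the standard N=12 generator row plus a final all-minus row. Every
# column is balanced (six +1s, six -1s) and any two columns are
# orthogonal, which is what makes main-effect screening cheap.

GEN = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]   # standard N=12 generator

def plackett_burman_12():
    rows = [GEN[-i:] + GEN[:-i] for i in range(11)]  # 11 cyclic shifts
    rows.append([-1] * 11)                           # final all-minus row
    return rows

design = plackett_burman_12()
col_sums = [sum(row[j] for row in design) for j in range(11)]
dot01 = sum(row[0] * row[1] for row in design)       # orthogonality spot-check
print(len(design), col_sums[0], dot01)
```

Twelve runs thus screen up to 11 two-level factors; assigning the seven real variables to seven columns leaves the rest as dummy columns, often used to estimate error.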
Seal material development test program
NASA Technical Reports Server (NTRS)
1971-01-01
A program designed to characterize an experimental fluoroelastomer material, designated AF-E-124D, is examined. Tests conducted include liquid nitrogen load compression tests, flexure tests and valve seal tests, ambient and elevated temperature compression set tests, and cleaning and flushing fluid exposure tests. The results of these tests indicate that AF-E-124D is a good choice for a cryogenic seal, since it exhibits good low-temperature sealing characteristics and resistance to permanent set. The experimental status of this fluoroelastomer is stressed; recommended activity includes definition and control of critical processing to ensure consistent material properties. Design, fabrication and testing of this and other materials in valve and static seal applications are recommended.
A Module Experimental Process System Development Unit (MEPSDU)
NASA Technical Reports Server (NTRS)
1981-01-01
Subsequent to the design review, a series of tests was conducted on simulated modules to demonstrate that all environmental specifications (wind loading, hailstone impact, thermal cycling, and humidity cycling) are satisfied by the design. All tests, except hailstone impact, were successfully completed. The assembly sequence was simplified by virtue of eliminating the frame components and assembly steps. Performance was improved by reducing the module edge border required to accommodate the frame of the preliminary design module. An ultrasonic rolling spot bonding technique was selected for use in the machine to perform the aluminum interconnect to cell metallization electrical joints required in the MEPSDU module configuration. This selection was based on extensive experimental tests and economic analyses.
Iurian, Sonia; Turdean, Luana; Tomuta, Ioan
2017-01-01
This study focuses on the development of a drug product using a risk-assessment-based approach within the quality by design paradigm. A prolonged-release system was proposed for paliperidone (Pal) delivery, containing Kollidon® SR as an insoluble matrix agent and hydroxypropyl cellulose, hydroxypropyl methylcellulose (HPMC), or sodium carboxymethyl cellulose as a hydrophilic polymer. The experimental part was preceded by the identification of potential sources of variability through Ishikawa diagrams, and failure mode and effects analysis was used to identify the critical process parameters that were further optimized by design of experiments. A D-optimal design was used to investigate the effects of the Kollidon SR ratio (X1), the type of hydrophilic polymer (X2), and the percentage of hydrophilic polymer (X3) on the percentages of dissolved Pal over 24 h (Y1-Y9). Effects expressed as regression coefficients and response surfaces were generated, along with a design space for the preparation of a target formulation in an experimental area with low error risk. The optimal formulation contained 27.62% Kollidon SR and 8.73% HPMC and achieved the prolonged release of Pal, with a low burst effect, at ratios very close to the ones predicted by the model. Thus, the parameters with the highest impact on final product quality were studied, and safe ranges were established for their variations. Finally, a risk mitigation and control strategy was proposed to assure the quality of the system through constant process monitoring.
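A D-optimal design like the one above is chosen by maximizing the determinant of the information matrix X'X for the assumed model. As a minimal sketch of that criterion, the snippet below scores two hypothetical 4-run candidate sets for a main-effects model in two coded factors (echoing X1 and X3 from the abstract; the candidate points themselves are invented for illustration).

```python
# Minimal sketch of the D-optimality criterion behind a D-optimal design:
# score candidate run sets by det(X'X) for the assumed model.

def model_row(x1, x3):
    return [1.0, x1, x3]          # intercept + two main effects

def det3(m):
    """Determinant of a 3x3 matrix (rule of Sarrus / cofactor expansion)."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def d_criterion(points):
    X = [model_row(x1, x3) for x1, x3 in points]
    XtX = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(3)]
           for i in range(3)]
    return det3(XtX)

factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]       # spread over the region
clustered = [(0, 0), (0.1, 0), (0, 0.1), (0.1, 0.1)]   # poorly spread

print(d_criterion(factorial), d_criterion(clustered))
```

A D-optimal exchange algorithm simply searches the candidate pool for the run set with the largest such determinant, which is why well-spread points beat clustered ones.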
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busack, Craig A.; Schroder, Steven L.; Young, Sewall F.
2002-11-01
Genetic work for 2001 consisted of two major phases, both reported on here. The first is a DNA microsatellite analysis of several hundred juveniles from the experimental spawning channel at the Cle Elum Supplementation Research Facility, using the genetic markers to assign the juveniles to parents and thus judge the reproductive success of individual fish. The second is a reevaluation and revision of plans for studying domestication in the spring chinook supplementation effort. The pedigree analysis was significant in three respects. First, it showed that this approach can be successfully applied to the spawning channel research. Second, it showed that this approach does indeed yield very useful information about the relative reproductive success of fish in the channel. Finally, it showed that this information can yield additional information about the experimental design. Of the 961 juveniles on which analysis was attempted, 774 yielded enough genetic information to be used in the pedigree analysis. Of these, 754 were assigned to males and females known to have been placed into the channel. The other 20 were all assignable to females, but their sires were unknown. The genotypes of 17 of these were consistent with a single theoretical male genotype, suggesting that a single precocial male sired them. The inferred parentage of the fish demonstrated that there had been substantial leakage of juveniles from one section of the channel into another. Reproductive success of females was fairly even, but success of males varied considerably. In a group of seven males (including the hypothetical one), one contributed 79% of the progeny analyzed, and three contributed none. The domestication experimental design evaluation was prompted by a critical review of the project by the Independent Scientific Review Panel (ISRP).
The ISRP review set in motion a design revision process that extended beyond the contract period; the report presented here is intended to be an account of our work through the end of the contract period, so it does not include developments beyond that point. As such, combined with the upcoming 2002 report, it will provide a complete record of the experimental design revision process. The current report contains the following: (1) an explanation of the general concept of domestication, and why domestication is a concern in the YKFP spring chinook program; (2) a discussion of the basics of experimental design for domestication studies; (3) a history of domestication experimental design in the YKFP; (4) a review of potential designs that would answer the ISRP's criticisms; (5) a revised design containing the following elements: a control line under continuous hatchery culture (i.e., no spawning in the wild), and use of the Naches population, where appropriate, as a wild control line; (6) cryopreservation of sperm for later evaluation of long-term genetic trend; and (7) continuous monitoring of phenotypic trend in the supplemented line.
Earth integrated design: office dormitory facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapira, H. B.; Barnes, P. R.
1980-01-01
The generation process of the design of the Joint Institute for Heavy Ion Research is described. Architectural and energy considerations are discussed. The facility will contain living quarters for guest scientists who come to Oak Ridge to conduct short experiments and sleeping alcoves for local researchers on long experimental shifts as well as office space. (MHR)
Knotworking and the Visibilization of Learning in Building Design
ERIC Educational Resources Information Center
Kerosuo, Hannele; Mäki, Tarja; Korpela, Jenni
2015-01-01
Purpose: This paper aims to study the visibilization of learning in the context of developing a new collaborative practice, knotworking, in building design. The case under study describes the process of learning from the initiation of knotworking to its experimentation. The implementation of new building information modeling tools acted as an…
NASA Astrophysics Data System (ADS)
Triplett, Michael D.; Rathman, James F.
2009-04-01
Using statistical experimental design methodologies, the solid lipid nanoparticle design space was found to be more robust than previously shown in the literature. Formulation and high-shear homogenization process effects on solid lipid nanoparticle size distribution, stability, drug loading, and drug release were investigated. Experimentation indicated that stearic acid was the optimal lipid and sodium taurocholate the optimal cosurfactant, with an optimum lecithin to sodium taurocholate ratio of 3:1, and revealed an inverse relationship between mixing time and speed and nanoparticle size and polydispersity. Having defined the base solid lipid nanoparticle system, β-carotene was incorporated into stearic acid nanoparticles to investigate the effects of introducing a drug into the base system. The presence of β-carotene produced a significant effect on the optimal formulation and process conditions, but the design space was found to be robust enough to accommodate the drug. β-Carotene entrapment efficiency averaged 40%, and β-carotene was retained in the nanoparticles for 1 month. As demonstrated herein, solid lipid nanoparticle technology can be sufficiently robust from a design standpoint to become commercially viable.
Synthetic in vitro transcriptional oscillators
Kim, Jongmin; Winfree, Erik
2011-01-01
The construction of synthetic biochemical circuits from simple components illuminates how complex behaviors can arise in chemistry and builds a foundation for future biological technologies. A simplified analog of genetic regulatory networks, in vitro transcriptional circuits, provides a modular platform for the systematic construction of arbitrary circuits and requires only two essential enzymes, bacteriophage T7 RNA polymerase and Escherichia coli ribonuclease H, to produce and degrade RNA signals. In this study, we design and experimentally demonstrate three transcriptional oscillators in vitro. First, a negative feedback oscillator comprising two switches, regulated by excitatory and inhibitory RNA signals, showed up to five complete cycles. To demonstrate modularity and to explore the design space further, a positive-feedback loop was added that modulates and extends the oscillatory regime. Finally, a three-switch ring oscillator was constructed and analyzed. Mathematical modeling guided the design process, identified experimental conditions likely to yield oscillations, and explained the system's robust response to interference by short degradation products. Synthetic transcriptional oscillators could prove valuable for systematic exploration of biochemical circuit design principles and for controlling nanoscale devices and orchestrating processes within artificial cells. PMID:21283141
NASA Astrophysics Data System (ADS)
Chen, Zhen; Wei, Zhengying; Wei, Pei; Chen, Shenggui; Lu, Bingheng; Du, Jun; Li, Junfeng; Zhang, Shuzhe
2017-12-01
In this work, a set of experiments was designed to investigate the effect of process parameters on the relative density of AlSi10Mg parts manufactured by selective laser melting (SLM). The influence of the dominant parameters, laser scan speed v, laser power P and hatch spacing H, on powder melting and densification behavior was studied experimentally. In addition, the laser energy density was introduced to evaluate the combined effect of these dominant parameters, so as to control the SLM process integrally. As a result, a high relative density (>97%) was obtained by SLM at an optimized laser energy density of 3.5-5.5 J/mm². Moreover, a parameter-densification map was established to visually select the optimum process parameters for SLM-processed AlSi10Mg parts with elevated density and the required mechanical properties. The results provide important experimental guidance for obtaining AlSi10Mg components with full density and gradient functional porosity by SLM.
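The J/mm² units reported above imply an areal energy density. A common definition consistent with those units, assumed here since the abstract does not give the formula, is E = P / (v · H) (laser power divided by scan speed times hatch spacing); the parameter values below are illustrative, not the paper's.

```python
# Hedged sketch of the areal laser energy density used to compare SLM
# parameter sets, assuming E = P / (v * H).

def energy_density(power_w, scan_speed_mm_s, hatch_mm):
    """Areal energy density in J/mm^2."""
    return power_w / (scan_speed_mm_s * hatch_mm)

def in_optimal_window(e, low=3.5, high=5.5):
    """Check against the 3.5-5.5 J/mm^2 window reported in the abstract."""
    return low <= e <= high

# Illustrative parameter set (not from the paper): 350 W, 1000 mm/s,
# 0.08 mm hatch spacing.
e = energy_density(350.0, 1000.0, 0.08)
print(e, in_optimal_window(e))  # -> 4.375 True
```

Collapsing three parameters into one scalar like this is what makes a single parameter-densification map possible: many (P, v, H) combinations share the same energy density.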
NASA Technical Reports Server (NTRS)
Rey, Charles A.
1991-01-01
The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.
NASA Astrophysics Data System (ADS)
Rusu-Anghel, S.
2017-01-01
Analytical modeling of the cement manufacturing process is difficult because of its complexity and has not yielded sufficiently precise mathematical models. In this paper, a fuzzy system for automatic control of the clinkering process was designed, based on a statistical model of the process and on the knowledge of human experts.
Sun, Rui; Ismail, Tamer M; Ren, Xiaohan; Abd El-Salam, M
2015-05-01
In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady-state model and an experimental study were employed to investigate the combustion of municipal solid waste (MSW) in a fixed-bed reactor. Conservation equations for the waste bed were implemented to describe the incineration process. The gas phase turbulence was modeled using the k-ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization and char burnout were calculated according to the waste property characteristics. The simulation results were then compared with experimental data for different moisture contents of MSW, showing that the incineration process of waste in the fixed bed is reasonably simulated. The simulated solid temperature, gas species and process rates in the bed agree with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The whole bed combustion process slows greatly as MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed-bed MSW incinerators. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Petrovic, J.; Pale, P.; Jeren, B.
2017-01-01
This study aimed to investigate the effects of using online formative assessments on students' learning achievements. Using a quasi-experimental study design with one control group (no formative assessments available), and two experimental groups receiving feedback in available online formative assessments (knowledge of the correct response--KCR,…
High Fidelity Modeling of Field Reversed Configuration (FRC) Thrusters
2017-04-22
signatures which can be used for direct, non-invasive comparison with experimental diagnostics can be produced. This research will be directly... experimental campaign is critical to developing general design philosophies for low-power plasmoid formation, the complexity of non-linear plasma processes... advanced space propulsion. The work consists of numerical method development, physical model development, and systematic studies of the non-linear
Strategic Mobility 21: Baseline Joint Experimentation Campaign Plan
2008-06-19
including energy. The Value Stream Analysis Future State then designed Kaizens (process optimizations) for an improved Future State to help drive waste...Recommended Improvements and Experimentation Opportunities Initial recommended Kaizens (improvement opportunities) for waste reduction, constraint...Trucking, Service Craft Logistics, BNSF, and Madison Warehouse, Inc. • Kaizen 1 (Figure 17): Full upload electronically of the Dole ANS files • Kaizen
Development of Crystallizer for Advanced Aqueous Reprocessing Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tadahiro Washiya; Atsuhiro Shibata; Toshiaki Kikuchi
2006-07-01
Crystallization is one of the promising technologies for a future fuel reprocessing process, offering safety and economic advantages. The Japan Atomic Energy Agency (JAEA) (former Japan Nuclear Cycle Development Institute), Mitsubishi Material Corporation and Saitama University have been developing the crystallization process. In a previous study, we carried out experimental studies under uranium, MOX and spent fuel conditions, and flowsheet analysis was considered. In association with these studies, an innovative continuous crystallizer and its system were developed to ensure high process performance. From the design study, an annular-type continuous crystallizer was selected as the most promising design, and its performance was confirmed by small-scale tests and an engineering-scale demonstration under uranium crystallization conditions. In this paper, the design study and the demonstration test results are described. (authors)
NASA Astrophysics Data System (ADS)
Li, Qing; Lin, Haibo; Xiu, Yu-Feng; Wang, Ruixue; Yi, Chuijie
A test platform for wheat precision seeding based on image processing techniques was designed to support the development of a wheat precision seed metering device with high efficiency and precision. Using image processing techniques, the platform gathers images of seeds (wheat) falling from the seed metering device onto the conveyor belt. These data are then processed and analyzed to calculate the qualified rate, reseeding rate, leakage sowing rate, etc. This paper introduces the overall structure and design parameters of the platform and the hardware and software of the image acquisition system, as well as the method of seed identification and seed-spacing measurement based on image thresholding and locating each seed's center. Analysis of the experimental results shows that the measurement error is less than ±1 mm.
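The seed-identification step described above (threshold the image, then locate each seed's center) can be sketched as follows. The tiny synthetic "image" stands in for a camera frame of seeds on the conveyor belt; the real system's pixel-to-millimetre calibration is not modeled here.

```python
# Sketch: threshold a grayscale image, label connected bright regions
# (seeds) by flood fill, and report each region's centroid.

def find_seed_centers(image, threshold):
    rows, cols = len(image), len(image[0])
    binary = [[1 if image[r][c] >= threshold else 0 for c in range(cols)]
              for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    centers = []
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                # Flood-fill one connected region (4-connectivity).
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centers.append((cy, cx))
    return centers

# Two bright 2x2 "seeds" on a dark background.
img = [[0, 0, 0, 0, 0, 0],
       [0, 9, 9, 0, 0, 0],
       [0, 9, 9, 0, 8, 8],
       [0, 0, 0, 0, 8, 8],
       [0, 0, 0, 0, 0, 0]]
print(find_seed_centers(img, 5))  # -> [(1.5, 1.5), (2.5, 4.5)]
```

Seed spacing then follows directly as the distance between successive centers, which is where the ±1 mm measurement error is assessed.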
Szilágyi, N; Kovács, R; Kenyeres, I; Csikor, Zs
2013-01-01
Biofilm development in a fixed-bed biofilm reactor system performing municipal wastewater treatment was monitored, with the aim of accumulating colonization and maximum biofilm mass data usable in engineering practice for process design purposes. Initially a 6-month experimental period was selected, during which the biofilm formation and the performance of the reactors were monitored. The results were analyzed by two methods: for simple, steady-state process design purposes, the maximum biofilm mass on carriers versus influent load and a time constant of the biofilm growth were determined, whereas for design approaches using dynamic models, a simple biofilm mass prediction model including attachment and detachment mechanisms was selected and fitted to the experimental data. According to a detailed statistical analysis, the collected data did not allow us to determine both the time constant of biofilm growth and the maximum biofilm mass on carriers at the same time. The observed maximum biofilm mass could be determined with a reasonable error and ranged between 438 g TS/m² and 843 g TS/m² of carrier surface, depending on influent load and hydrodynamic conditions. The parallel analysis of the attachment-detachment model showed that the experimental data set allowed us to determine the attachment rate coefficient, which was in the range of 0.05-0.4 m d⁻¹, depending on influent load and hydrodynamic conditions.
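A minimal form of an attachment-detachment biofilm mass model like the one fitted above is dM/dt = k_att·S − k_det·M, where M is biofilm mass per carrier area and S the influent load. The abstract does not give the exact model equations, so this linear form and the rate constants below are illustrative assumptions; k_att is chosen inside the paper's reported 0.05-0.4 m/d range.

```python
# Sketch of an assumed attachment-detachment model,
#   dM/dt = k_att * load - k_det * M,
# integrated with an explicit Euler scheme.

def simulate_biofilm(k_att, k_det, load, days, dt=0.1):
    m, t = 0.0, 0.0
    trajectory = [m]
    while t < days:
        m += dt * (k_att * load - k_det * m)
        t += dt
        trajectory.append(m)
    return trajectory

k_att, k_det, load = 0.2, 0.05, 150.0   # assumed, units kept consistent
traj = simulate_biofilm(k_att, k_det, load, days=180)

# This model saturates at M_max = k_att*load/k_det (600 here), which
# falls inside the observed 438-843 g TS/m^2 range.
print(traj[-1])
```

The saturation level and the time constant 1/k_det are exactly the two quantities the study tried to estimate; in this linear model they trade off against each other, which echoes the identifiability problem reported in the abstract.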
Semester-long inquiry-based molecular biology laboratory: Transcriptional regulation in yeast.
Oelkers, Peter M
2017-03-04
A single-semester molecular biology laboratory has been developed in which students design and execute a project examining transcriptional regulation in Saccharomyces cerevisiae. Three weeks of planning are allocated to developing a hypothesis through literature searches and use of bioinformatics. Common experimental plans address a cell process and how three genes that encode proteins involved in that process are transcriptionally regulated in response to changing environmental conditions. Planning includes designing oligonucleotides to amplify the putative promoters of the three genes of interest. After the PCR, each product is cloned proximal to β-galactosidase in a yeast reporter plasmid. Techniques used include agarose electrophoresis, extraction of DNA from agarose, plasmid purification from bacteria, restriction digestion, ligation, and bacterial transformation. This promoter/reporter plasmid is then transformed into yeast. Transformed yeast are cultured under conditions prescribed in the experimental design and lysed, and β-galactosidase activity is measured. The course provides an independent research experience in a group setting. Notebooks are maintained online with regular feedback. Projects culminate with the presentation of a poster worth 60% of the grade. Over the last three years, about 65% of students met expectations for experimental design, data acquisition, and analysis. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(2):145-151, 2017.
Molecular interactions of alcohols with zeolite BEA and MOR frameworks.
Stückenschneider, Kai; Merz, Juliane; Schembecker, Gerhard
2013-12-01
Zeolites can adsorb small organic molecules such as alcohols from a fermentation broth. Also in the zeolite-catalyzed conversion of alcohols to biofuels, biochemicals, or gasoline, adsorption is the first step. Several studies have investigated the adsorption of alcohols in different zeolites experimentally, but computational investigations in this field have mostly been restricted to zeolite MFI. In this study, the adsorption of C1-C4 alcohols in BEA and MOR was investigated using density functional theory (DFT). Calculated adsorption geometries and the corresponding energies of the designed cluster models were comparable to periodic calculations, and the adsorption energies were in the same range as the corresponding computational and experimental values reported in the literature for zeolite MFI. Thus, BEA and MOR may be good adsorption materials for alcohols in the field of downstream processing and catalysis. Aside from the DFT calculations, adsorption isotherms were determined experimentally in this study from aqueous solutions. For BEA, the adsorption of significant amounts of alcohol from aqueous solution was observed experimentally. In contrast, MOR was loaded with only a very small amount of alcohol. Although differences were found between the affinities obtained from gas-phase DFT calculations and those observed experimentally in aqueous solution, the computational data presented here represent molecular level information on the geometries and energies of C1-C4 alcohols adsorbed in zeolites BEA and MOR. This knowledge should prove very useful in the design of zeolite materials intended for use in adsorption and catalytic processes, as it allows adsorption behavior to be predicted via judiciously designed computational models.
NASA Astrophysics Data System (ADS)
Amran, M. A. M.; Idayu, N.; Faizal, K. M.; Sanusi, M.; Izamshah, R.; Shahir, M.
2016-11-01
In this study, the main objective was to determine the percentage difference in part weight between experimental and simulation work. The effect of process parameters on the weight of the plastic part was also investigated. The process parameters involved were mould temperature, melt temperature, injection time and cooling time. Autodesk Simulation Moldflow software was used to run the simulation of the plastic part, and the Taguchi method was selected as the design of experiments. The simulation results were then validated against the experimental results. It was found that the minimum and maximum percentage differences in part weight between simulation and experiment were 0.35% and 1.43%, respectively. In addition, the most significant parameter affecting part weight was the mould temperature, followed by melt temperature, injection time and cooling time.
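The two computations underlying such a study (the per-run percentage difference between simulated and measured weight, and a Taguchi signal-to-noise ratio for ranking parameter levels) can be sketched as follows. The weights below are hypothetical, not the study's data, and smaller-is-better S/N is assumed since deviation from the target weight should be minimized.

```python
# Sketch: percentage difference and Taguchi smaller-is-better S/N ratio.
import math

def percent_difference(simulated, measured):
    return abs(simulated - measured) / measured * 100.0

def sn_smaller_is_better(values):
    """Taguchi smaller-is-better S/N = -10*log10(mean(y^2))."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

sim = [5.02, 5.10, 4.95, 5.06]   # g, hypothetical simulated weights
exp = [5.00, 5.05, 5.00, 5.01]   # g, hypothetical measured weights

diffs = [percent_difference(s, e) for s, e in zip(sim, exp)]
print([round(d, 2) for d in diffs], round(sn_smaller_is_better(diffs), 2))
```

In a full Taguchi analysis, the S/N ratio is computed per orthogonal-array run and averaged per factor level; the level ranking that results is how a study identifies mould temperature as the dominant parameter.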
Behaviors of printed circuit boards due to microwave supported curing process of coating materials.
Bremerkamp, Felix; Nowottnick, Mathias; Seehase, Dirk; Bui, Trinh Dung
2012-01-01
The application of a microwave-supported curing process for coatings in the electronics industry poses a challenge. Here the implementation of this technology is presented. Within the scope of the investigation, special PCB test layouts were designed and the polymer curing process was examined by dielectric analysis. Furthermore, the coupling of microwave radiation with conductive PCB structures was analyzed experimentally by means of special test boards. The formation of standing waves and a regular heating distribution along the conductive wires on the PCB could be observed. The experimental results were compared with numerical simulation. In this context, the numerical analysis of microwave-PCB interaction led to important findings concerning wave propagation on wired PCBs. The final evaluation demonstrated a substantial similarity between numerical simulations and experimental results.
Inverse problems and optimal experiment design in unsteady heat transfer processes identification
NASA Technical Reports Server (NTRS)
Artyukhin, Eugene A.
1991-01-01
Experimental-computational methods for estimating characteristics of unsteady heat transfer processes are analyzed. The methods are based on the principles of distributed parameter system identification. The theoretical basis of such methods is the numerical solution of nonlinear ill-posed inverse heat transfer problems and optimal experiment design problems. Numerical techniques for solving problems are briefly reviewed. The results of the practical application of identification methods are demonstrated when estimating effective thermophysical characteristics of composite materials and thermal contact resistance in two-layer systems.
López, Alejandro; Coll, Andrea; Lescano, Maia; Zalazar, Cristina
2017-05-05
In this work, the suitability of the UV/H2O2 process for the degradation of a commercial herbicide mixture was studied. Glyphosate, the most widely used herbicide in the world, was mixed with other herbicides that have residual activity, such as 2,4-D and atrazine. Modeling of the process response as a function of specific operating conditions, namely the initial pH and the initial H2O2 to total organic carbon (TOC) molar ratio, was performed by response surface methodology (RSM). The results showed that a second-order polynomial regression model could describe and predict the system behavior well within the tested experimental region, and it correctly explained the variability in the experimental data. Experimental values were in good agreement with the modeled ones, confirming the significance of the model and highlighting the success of RSM for UV/H2O2 process modeling. Phytotoxicity evolution throughout the photolytic degradation process was checked through germination tests, indicating that the phytotoxicity of the herbicide mixture was significantly reduced after the treatment. The end point for the treatment under the operating conditions giving maximum TOC conversion was also identified.
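Once an RSM study has fitted its second-order polynomial, the model is used to locate the operating point with maximum response inside the experimental region. The sketch below does that with a grid search over two coded factors (pH as x1, H2O2/TOC ratio as x2); the regression coefficients are invented for illustration, since the study fits its own from the data.

```python
# Hedged sketch: maximize a fitted second-order RSM model over the
# coded experimental region [-1, 1] x [-1, 1].

def predicted_conversion(x1, x2, b):
    # Full second-order polynomial in two coded factors.
    return (b[0] + b[1] * x1 + b[2] * x2
            + b[3] * x1 * x1 + b[4] * x2 * x2 + b[5] * x1 * x2)

b = [70.0, 4.0, 6.0, -5.0, -8.0, 1.5]   # hypothetical coefficients

best = max(
    ((x1 / 10.0, x2 / 10.0)
     for x1 in range(-10, 11) for x2 in range(-10, 11)),
    key=lambda p: predicted_conversion(p[0], p[1], b),
)
print(best, predicted_conversion(best[0], best[1], b))
```

With negative quadratic terms, the surface has an interior maximum, which is the situation RSM is designed to exploit; a finer grid or the analytic stationary point of the polynomial would refine the optimum further.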
NASA Astrophysics Data System (ADS)
Azarov, A. V.; Zhukova, N. S.; Kozlovtseva, E. Yu; Dobrinsky, D. R.
2018-05-01
The article considers obtaining mathematical models to assess the efficiency of dust collectors using the integrated analysis and data management system STATISTICA Design of Experiments. The procedure for obtaining mathematical models and processing data is illustrated by laboratory studies on a mounted installation containing a dust collector in counter-swirling flows (CSF), using gypsum dust of various fractions. The experimental studies were planned so as to reduce the number of experiments and the cost of experimental research. A second-order Box-Behnken design was used, which reduced the number of trials from 81 to 27. The procedure for statistical analysis of the Box-Behnken design data using standard tools of the integrated analysis and data management system STATISTICA Design of Experiments is described. Results of statistical data processing, with significance estimates for the coefficients and adequacy checks of the mathematical models, are presented.
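The 81-to-27 reduction quoted above is consistent with a four-factor study: a full three-level factorial needs 3^k runs, while a Box-Behnken design needs 2k(k−1) edge-midpoint runs plus a few center points (three are assumed here, since the abstract does not state the count).

```python
# Run counts: full three-level factorial vs Box-Behnken design.

def full_factorial_runs(k, levels=3):
    return levels ** k

def box_behnken_runs(k, center_points=3):
    # Each pair of factors contributes a 2^2 factorial (4 runs) while
    # the remaining factors sit at their center level.
    return 2 * k * (k - 1) + center_points

k = 4
print(full_factorial_runs(k), box_behnken_runs(k))  # -> 81 27
```

The saving grows with k, which is why Box-Behnken designs are a standard choice for fitting second-order models when full three-level factorials become unaffordable.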
Design and experimental verification for optical module of optical vector-matrix multiplier.
Zhu, Weiwei; Zhang, Lei; Lu, Yangyang; Zhou, Ping; Yang, Lin
2013-06-20
Optical computing is a new method of implementing signal processing functions. Multiplication between a vector and a matrix is an important arithmetic algorithm in the signal processing domain. The optical vector-matrix multiplier (OVMM) is an optoelectronic system that carries out this operation, consisting of an electronic module and an optical module. In this paper, we propose an optical module for an OVMM. To eliminate cross talk and make full use of the optical elements, an elaborately designed structure involving spherical lenses and cylindrical lenses is used in this optical system. The optical design software package ZEMAX is used to optimize the parameters and simulate the whole system. Finally, experimental data were obtained to evaluate the overall performance of the system. The results of both simulation and experiment indicate that the constructed system can successfully multiply a matrix with dimensions of 16 by 16 and a vector with a dimension of 16.
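For reference, the operation the OVMM implements optically is ordinary matrix-vector multiplication; an electronic check like the sketch below is a natural way to validate the optical module's output against a known-correct result (the 16×16 test case here is illustrative, not the paper's test pattern).

```python
# Plain software reference for the OVMM operation: y = M @ x.

def matvec(matrix, vector):
    assert all(len(row) == len(vector) for row in matrix)
    # Each output element is the dot product of one matrix row with
    # the input vector.
    return [sum(m * v for m, v in zip(row, vector)) for row in matrix]

# A 16x16 identity times a ramp vector returns the vector unchanged.
n = 16
identity = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
ramp = list(range(n))
print(matvec(identity, ramp))
```

In the optical system, each of the 16 dot products is computed in parallel by spreading the input vector across the matrix mask and summing light intensities, which is the source of the speed advantage over this sequential loop.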
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
Product design for energy reduction in concurrent engineering: An Inverted Pyramid Approach
NASA Astrophysics Data System (ADS)
Alkadi, Nasr M.
Energy factors in product design in concurrent engineering (CE) are becoming an emerging dimension for several reasons: (a) the rising interest in "green design and manufacturing"; (b) national energy security concerns and the dramatic increase in energy prices; (c) global competition in the marketplace and global climate change commitments, including carbon taxes and emission trading systems; and (d) the widespread recognition of the need for sustainable development. This research presents a methodology for the intervention of energy factors in the concurrent engineering product development process to significantly reduce the manufacturing energy requirement. The work presented here is the first attempt at integrating design for energy into the concurrent engineering framework. It adds an important tool to the DFX toolbox for evaluating the impact of design decisions on the product manufacturing energy requirement early in the design phase. The research hypothesis states that "product manufacturing energy requirement is a function of design parameters". The hypothesis was tested through experimental work in machining and heat treating, conducted at the manufacturing lab of the Industrial and Management Systems Engineering Department (IMSE) at West Virginia University (WVU) and at a major U.S. steel manufacturing plant, respectively. The objective of the machining experiment was to study the effect of changing specific product design parameters (material type and diameter) and process design parameters (metal removal rate) on the input power requirement of a gear head lathe through defined sets of machining experiments. The objective of the heat treating experiment was to study the effect of varying the product charging temperature on the fuel consumption of a walking beam reheat furnace.
The experimental work in both areas has revealed important insights into energy utilization in machining and heat-treating processes and its variance based on product, process, and system design parameters. An in-depth evaluation of how design and manufacturing normally happen in concurrent engineering provided a framework for developing energy system levels in machining within the concurrent engineering environment using the "Inverted Pyramid Approach" (IPA). The IPA features varying levels of energy-based output information depending on the input design parameters that are available during each stage (level) of the product design. The experimental work, the in-depth evaluation of design and manufacturing in CE, and the developed energy system levels in machining provided a solid base for developing the model for design for energy reduction in CE. The model was used to analyze an example part for which 12 evolving designs were thoroughly reviewed to investigate the sensitivity of energy to design parameters in machining. The model allowed product design teams to address manufacturing energy concerns early in the design stage. As a result, ranges for energy-sensitive design parameters impacting product manufacturing energy consumption were found at earlier levels. As the designer proceeds to deeper levels in the model, this range tightens and results in significant energy reductions.
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package called NETS/PROSSS aimed at reducing the computing time required to solve a complex design problem is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process and make it possible to converge to an optimum solution with significantly fewer iterations.
A senior manufacturing laboratory for determining injection molding process capability
NASA Technical Reports Server (NTRS)
Wickman, Jerry L.; Plocinski, David
1992-01-01
The following is a laboratory experiment designed to further understanding of materials science. This subject material is directed at an upper level undergraduate/graduate student in an Engineering or Engineering Technology program. It is assumed that the student has a thorough understanding of the process and quality control. The format of this laboratory does not follow that which is normally recommended because of the nature of process capability and that of the injection molding equipment and tooling. This laboratory is instead developed to be used as a point of departure for determining process capability for any process in either a quality control laboratory or a manufacturing environment where control charts, process capability, and experimental or product design are considered important topics.
When Theater Comes to Engineering Design: Oh How Creative They Can Be.
Pfeiffer, Ferris M; Bauer, Rachel E; Borgelt, Steve; Burgoyne, Suzanne; Grant, Sheila; Hunt, Heather K; Pardoe, Jennie J; Schmidt, David C
2017-07-01
The creative process is fun, complex, and sometimes frustrating, but it is critical to the future of our nation and to progress in science, technology, engineering, and mathematics (STEM), as well as other fields. Thus, we set out to see if implementing methods of active learning typical of the theater department could impact the creativity of senior capstone design students in the bioengineering (BE) department. Senior bioengineering capstone design students were allowed to self-select into groups. Prior to the beginning of coursework, all students completed a validated survey measuring engineering design self-efficacy. The control and experimental groups both received standard instruction, but in addition the experimental group received 1 h per week of creativity training developed by a theater professor. Following the semester, the students again completed the self-efficacy survey. The surveys were examined to identify differences in the initial and final self-efficacy of the experimental and control groups over the course of the semester. An analysis of variance was used to compare the experimental and control groups, with p < 0.05 considered significant. Students in the experimental group reported a more than twofold increase in confidence (4.8 for the control group versus 10.9 for the experimental group). Additionally, students in the experimental group were more motivated and less anxious when engaging in engineering design following the semester of creativity instruction. The results of this pilot study indicate that there is significant potential to improve engineering students' creative self-efficacy through the implementation of a "curriculum of creativity" developed using theater methods.
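The group comparison above rests on a one-way ANOVA; with two groups this reduces to a single F statistic. A minimal sketch, with invented self-efficacy gain scores (not the study's data):

```python
# One-way ANOVA F statistic for two groups (control vs experimental).
# The score lists are fabricated for illustration only.
control = [4.0, 5.5, 4.8, 5.0, 4.7]
experimental = [10.2, 11.5, 10.9, 11.0, 10.9]

def anova_f(*groups):
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    # Between-group sum of squares and degrees of freedom
    ss_b = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    df_b = len(groups) - 1
    # Within-group sum of squares and degrees of freedom
    ss_w = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_w = len(all_vals) - len(groups)
    return (ss_b / df_b) / (ss_w / df_w)

f = anova_f(control, experimental)
print(round(f, 1))
```

The F value would then be compared against the F distribution with (df_b, df_w) degrees of freedom to obtain the p-value reported in the abstract.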
Experimental Design for Parameter Estimation of Gene Regulatory Networks
Timmer, Jens
2012-01-01
Systems biology aims for building quantitative models to address unresolved issues in molecular biology. In order to describe the behavior of biological cells adequately, gene regulatory networks (GRNs) are intensively investigated. As the validity of models built for GRNs depends crucially on the kinetic rates, various methods have been developed to estimate these parameters from experimental data. For this purpose, it is favorable to choose the experimental conditions yielding maximal information. However, existing experimental design principles often rely on unfulfilled mathematical assumptions or become computationally demanding with growing model complexity. To solve this problem, we combined advanced methods for parameter and uncertainty estimation with experimental design considerations. As a showcase, we optimized three simulated GRNs in one of the challenges from the Dialogue for Reverse Engineering Assessment and Methods (DREAM). This article presents our approach, which was awarded the best performing procedure at the DREAM6 Estimation of Model Parameters challenge. For fast and reliable parameter estimation, local deterministic optimization of the likelihood was applied. We analyzed identifiability and precision of the estimates by calculating the profile likelihood. Furthermore, the profiles provided a way to uncover a selection of most informative experiments, from which the optimal one was chosen using additional criteria at every step of the design process. In conclusion, we provide a strategy for optimal experimental design and show its successful application on three highly nonlinear dynamic models. Although presented in the context of the GRNs to be inferred for the DREAM6 challenge, the approach is generic and applicable to most types of quantitative models in systems biology and other disciplines. PMID:22815723
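The profile-likelihood idea used above can be sketched on a toy model: for each fixed value of the parameter of interest, the remaining (nuisance) parameters are re-optimized, and the shape of the resulting curve reveals identifiability. The exponential-decay model and data below are invented, not the DREAM6 networks:

```python
import math

# Toy model y = a * exp(-k * t); data generated near a = 2, k = 0.5 (invented).
t = [0, 1, 2, 3, 4]
y = [2.00, 1.21, 0.74, 0.45, 0.27]

def sse(a, k):
    """Sum of squared residuals for parameters (a, k)."""
    return sum((yi - a * math.exp(-k * ti)) ** 2 for ti, yi in zip(t, y))

def profile_k(k, a_grid):
    """Profile out the nuisance parameter a: minimise SSE over a for fixed k."""
    return min(sse(a, k) for a in a_grid)

a_grid = [1.5 + 0.01 * i for i in range(101)]   # a in [1.5, 2.5]
k_grid = [0.30 + 0.01 * i for i in range(41)]   # k in [0.30, 0.70]
profile = {round(k, 2): profile_k(k, a_grid) for k in k_grid}
# A sharp minimum indicates an identifiable parameter; a flat profile would not.
k_best = min(profile, key=profile.get)
print(k_best)
```

In practice the profile is computed on the likelihood rather than the SSE, and its curvature around the optimum gives the confidence intervals used to rank candidate experiments.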
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noe, F; Diadone, Isabella; Lollmann, Marc
There is a gap between kinetic experiment and simulation in their views of the dynamics of complex biomolecular systems. Whereas experiments typically reveal only a few readily discernible exponential relaxations, simulations often indicate complex multistate behavior. Here, a theoretical framework is presented that reconciles these two approaches. The central concept is dynamical fingerprints, which contain peaks at the time scales of the dynamical processes involved, with amplitudes determined by the experimental observable. Fingerprints can be generated from both experimental and simulation data, and their comparison by matching peaks permits assignment of structural changes present in the simulation to experimentally observed relaxation processes. The approach is applied here to a test case interpreting single-molecule fluorescence correlation spectroscopy experiments on a set of fluorescent peptides with molecular dynamics simulations. The peptides exhibit complex kinetics shown to be consistent with the apparent simplicity of the experimental data. Moreover, the fingerprint approach can be used to design new experiments with site-specific labels that optimally probe specific dynamical processes in the molecule under investigation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilkey, Lindsay
This milestone presents a demonstration of the High-to-Low (Hi2Lo) process in the VVI focus area. Validation and additional calculations with the commercial computational fluid dynamics code STAR-CCM+ were performed using a 5x5 fuel assembly with non-mixing geometry and spacer grids. This geometry was based on the benchmark experiment provided by Westinghouse. Results from the simulations were compared to existing experimental data and to the subchannel thermal-hydraulics code COBRA-TF (CTF). An uncertainty quantification (UQ) process was developed for the STAR-CCM+ model, and results of the STAR UQ were communicated to CTF. Results from STAR-CCM+ simulations were used as experimental design points in CTF to calibrate the mixing parameter β and compared to results obtained using experimental data points. This demonstrated that CTF's β parameter can be calibrated to match existing experimental data more closely. The Hi2Lo process for the STAR-CCM+/CTF code coupling was documented in this milestone, which is closely linked to the L3:VVI.H2LP15.01 milestone report.
Effects of video-game play on information processing: a meta-analytic investigation.
Powers, Kasey L; Brooks, Patricia J; Aldrich, Naomi J; Palladino, Melissa A; Alfieri, Louis
2013-12-01
Do video games enhance cognitive functioning? We conducted two meta-analyses based on different research designs to investigate how video games impact information-processing skills (auditory processing, executive functions, motor skills, spatial imagery, and visual processing). Quasi-experimental studies (72 studies, 318 comparisons) compare habitual gamers with controls; true experiments (46 studies, 251 comparisons) use commercial video games in training. Using random-effects models, video games led to improved information processing in both the quasi-experimental studies, d = 0.61, 95% CI [0.50, 0.73], and the true experiments, d = 0.48, 95% CI [0.35, 0.60]. Whereas the quasi-experimental studies yielded small to large effect sizes across domains, the true experiments yielded negligible effects for executive functions, which contrasted with the small to medium effect sizes in other domains. The quasi-experimental studies appeared more susceptible to bias than were the true experiments, with larger effects being reported in higher-tier than in lower-tier journals, and larger effects reported by the most active research groups in comparison with other labs. The results are further discussed with respect to other moderators and limitations in the extant literature.
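The pooled effect sizes reported above come from random-effects models; their common building block is the inverse-variance weighted mean of the study-level effects. A minimal sketch with invented effect sizes d and variances (not the meta-analysis data):

```python
# Inverse-variance weighted mean effect size: each study's d is weighted by
# the reciprocal of its variance, so precise studies count for more.
# The (d, variance) pairs below are fabricated for illustration.
studies = [(0.61, 0.04), (0.48, 0.03), (0.55, 0.05)]

def pooled_effect(studies):
    weights = [1 / v for _, v in studies]
    num = sum(w * d for (d, _), w in zip(studies, weights))
    return num / sum(weights)

d_bar = pooled_effect(studies)
print(round(d_bar, 3))
```

A full random-effects model additionally estimates the between-study variance (e.g. via DerSimonian-Laird) and adds it to each study's variance before weighting.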
Rincon, Sergio A; Paoletti, Anne
2016-01-01
Unveiling the function of a novel protein is a challenging task that requires careful experimental design. Yeast cytokinesis is a conserved process that involves modular structural and regulatory proteins. For such proteins, an important step is to identify their domains and structural organization. Here we briefly discuss a collection of methods commonly used for sequence alignment and prediction of protein structure, which represent powerful tools for the identification of homologous domains and the design of structure-function approaches to experimentally test the function of multi-domain proteins such as those implicated in yeast cytokinesis.
Hydrodynamic cavitation: from theory towards a new experimental approach
NASA Astrophysics Data System (ADS)
Lucia, Umberto; Gervino, Gianpiero
2009-09-01
Hydrodynamic cavitation is analysed by a global thermodynamics principle following an approach based on the maximum irreversible entropy variation that has already given promising results for open systems and has been successfully applied in specific engineering problems. In this paper we present a new phenomenological method to evaluate the conditions inducing cavitation. We think this method could be useful in the design of turbo-machineries and related technologies: it represents both an original physical approach to cavitation and an economical saving in planning because the theoretical analysis could allow engineers to reduce the experimental tests and the costs of the design process.
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Nystrom, G. A.; Bardina, J.; Lombard, C. K.
1987-01-01
This paper describes the application of the conservative supra characteristic method (CSCM) to predict the flow around two-dimensional slot-injection-cooled cavities in hypersonic flow. Seven different numerical solutions are presented that model three different experimental designs. The calculations manifest outer flow conditions including the effects of nozzle/lip geometry, angle of attack, nozzle inlet conditions, boundary and shear layer growth, and turbulence on the surrounding flow. The calculations were performed for analysis prior to wind tunnel testing, for sensitivity studies early in the design process. Qualitative and quantitative understanding of the flows for each of the cavity designs, along with design recommendations, are provided. The present paper demonstrates the ability of numerical schemes, such as the CSCM method, to play a significant role in the design process.
NASA Astrophysics Data System (ADS)
Roosta, M.; Ghaedi, M.; Shokri, N.; Daneshfar, A.; Sahraei, R.; Asghari, A.
2014-01-01
The present study applied experimental design optimization to the removal of malachite green (MG) from aqueous solution by ultrasound-assisted adsorption onto gold nanoparticles loaded on activated carbon (Au-NP-AC). This nanomaterial was characterized using different techniques such as FESEM, TEM, BET, and UV-vis measurements. The effects of variables such as pH, initial dye concentration, adsorbent dosage (g), temperature and sonication time on MG removal were studied using a central composite design (CCD), and the optimum experimental conditions were found with a desirability function (DF) combined with response surface methodology (RSM). Fitting the experimental equilibrium data to various isotherm models such as the Langmuir, Freundlich, Tempkin and Dubinin-Radushkevich models showed the suitability and applicability of the Langmuir model. The applicability of kinetic models such as the pseudo-first-order, pseudo-second-order, Elovich and intraparticle diffusion models was tested for the experimental data; the pseudo-second-order equation and intraparticle diffusion models control the kinetics of the adsorption process. A small amount of the proposed adsorbent (0.015 g) is applicable for successful removal of MG (RE > 99%) in a short time (4.4 min) with high adsorption capacity (140-172 mg g-1).
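The Langmuir fit referenced above is commonly done on the linearised form Ce/qe = Ce/qm + 1/(qm*KL), so ordinary least squares recovers the capacity qm and constant KL. The equilibrium data below are synthetic, generated from assumed values qm = 170 mg/g and KL = 0.5 L/mg (chosen to sit inside the capacity range the abstract reports), not taken from the study:

```python
# Linearised Langmuir isotherm: plotting Ce/qe against Ce gives a line with
# slope 1/qm and intercept 1/(qm*KL). Data below are synthetic (assumed
# qm = 170 mg/g, KL = 0.5 L/mg).
qm_true, kl_true = 170.0, 0.5
Ce = [0.5, 1, 2, 5, 10, 20]                       # equilibrium conc. (mg/L)
qe = [qm_true * kl_true * c / (1 + kl_true * c) for c in Ce]

x = Ce
y = [c / q for c, q in zip(Ce, qe)]               # Ce/qe
n = len(x)
xm, ym = sum(x) / n, sum(y) / n
slope = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
         / sum((xi - xm) ** 2 for xi in x))
intercept = ym - slope * xm
qm = 1 / slope                                    # recovered capacity (mg/g)
kl = slope / intercept                            # recovered Langmuir constant
print(round(qm, 1), round(kl, 2))
```

With real (noisy) data the recovered parameters deviate from the generating values, and the fit quality across isotherm models is what supports the model selection described in the abstract.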
13 CFR 121.701 - What SBIR programs are subject to size determinations?
Code of Federal Regulations, 2010 CFR
2010-01-01
... between any Federal agency and any small business for the performance of experimental, developmental, or... design, development, and improvement of prototypes and new processes to meet specific requirements. ...
13 CFR 121.701 - What SBIR programs are subject to size determinations?
Code of Federal Regulations, 2012 CFR
2012-01-01
... between any Federal agency and any small business for the performance of experimental, developmental, or... design, development, and improvement of prototypes and new processes to meet specific requirements. ...
13 CFR 121.701 - What SBIR programs are subject to size determinations?
Code of Federal Regulations, 2011 CFR
2011-01-01
... between any Federal agency and any small business for the performance of experimental, developmental, or... design, development, and improvement of prototypes and new processes to meet specific requirements. ...
NASA Technical Reports Server (NTRS)
Humenik, F. M.; Bosque, M. A.
1983-01-01
A fundamental experimental database for turbulent flow mixing models is provided, enabling better prediction of more complex turbulent chemically reacting flows. An analytical application to combustor design is provided, along with a better fundamental understanding of the combustion process.
Klima, J
2011-02-01
An overview of possible mechanisms by which sonication can influence electrochemical processes is given. Four mechanisms are discussed: acoustic streaming; microstreaming and turbulence due to cavitation; formation of microjets in the course of the collapse of cavitation bubbles; and shock waves. Possible effects are illustrated with several examples. The most effective process is the formation of microjets, which can not only decrease the diffusion layer thickness below 1 μm, but also activate (depassivate) the electrode surface. A design of an experimental arrangement with maximum participation of microjets is proposed. Two approaches are proposed: focusing the ultrasound on the working electrode and reducing energy losses by over-pressure; and "tuning" the reactor to obtain resonance, i.e. formation of stationary waves by activating the reactor in its resonant mode. Copyright © 2010 Elsevier B.V. All rights reserved.
Ge Sun; Johnny Boggs; Steven G. McNulty; Devendra M. Amatya; Carl C. Trettin; Zhaohua Dai; James M. Vose; Ileana B. La Torre Torres; Timothy Callahan
2008-01-01
Understanding the hydrologic processes is the first step in making sound watershed management decisions including designing Best Management Practices for nonpoint source pollution control. Over the past fifty years, various forest experimental watersheds have been instrumented across the Carolinas through collaborative studies among federal, state, and private...
ERIC Educational Resources Information Center
Akpo, Essegbemon; Crane, Todd A.; Vissoh, Pierre V.; Tossou, Rigobert C.
2015-01-01
Purpose: Changing research design and methodologies regarding how researchers articulate with end-users of technology is an important consideration in developing sustainable agricultural practices. This paper analyzes a joint experiment as a multi-stakeholder process and contributes to understand how the way of organizing social learning affects…
Structural and functional connectivity as a driver of hillslope erosion following disturbance
C. Jason Williams; Frederick B. Pierson; Pete Robichaud; Osama Z. Al-Hamdan; Jan Boll; Eva K. Strand
2016-01-01
Hydrologic response to rainfall on fragmented or burnt hillslopes is strongly influenced by the ensuing connectivity of runoff and erosion processes. Yet cross-scale process connectivity is seldom evaluated in field studies owing to scale limitations in experimental design. This study quantified surface susceptibility and hydrologic response across point to...
A THEORETICAL MODEL FOR RESEARCH IN EDUCATION.
ERIC Educational Resources Information Center
ARMSTRONG, JENNY R.
THE FAILURE OF EDUCATIONAL RESEARCH TO CONTRIBUTE LARGE CONSISTENT BODIES OF KNOWLEDGE ABOUT THE EDUCATIONAL PROCESS HAS BEEN DUE TO FIVE MAJOR FACTORS--(1) FAULTY EXPERIMENTAL DESIGN, (2) FAILURE TO CONSIDER ALL OF THE MAJOR INPUT ELEMENTS OF THE EDUCATIONAL PROCESS, (3) FAILURE TO MAKE MEANINGFUL COMPARISONS (FOR EXAMPLE THE CONTROL GROUP IS NOT…
Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity
Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.
2010-01-01
An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183
Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.
Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L
2010-02-01
An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.
Online optimal experimental re-design in robotic parallel fed-batch cultivation facilities.
Cruz Bournazou, M N; Barz, T; Nickel, D B; Lopez Cárdenas, D C; Glauche, F; Knepper, A; Neubauer, P
2017-03-01
We present an integrated framework for the online optimal experimental re-design applied to parallel nonlinear dynamic processes that aims to precisely estimate the parameter set of macro kinetic growth models with minimal experimental effort. This provides a systematic solution for rapid validation of a specific model to new strains, mutants, or products. In biosciences, this is especially important as model identification is a long and laborious process which is continuing to limit the use of mathematical modeling in this field. The strength of this approach is demonstrated by fitting a macro-kinetic differential equation model for Escherichia coli fed-batch processes after 6 h of cultivation. The system includes two fully-automated liquid handling robots; one containing eight mini-bioreactors and another used for automated at-line analyses, which allows for the immediate use of the available data in the modeling environment. As a result, the experiment can be continually re-designed while the cultivations are running using the information generated by periodical parameter estimations. The advantages of an online re-computation of the optimal experiment are proven by a 50-fold lower average coefficient of variation on the parameter estimates compared to the sequential method (4.83% instead of 235.86%). The success obtained in such a complex system is a further step towards a more efficient computer aided bioprocess development. Biotechnol. Bioeng. 2017;114: 610-619. © 2016 Wiley Periodicals, Inc.
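The comparison metric above, the coefficient of variation on parameter estimates, is simply the sample standard deviation divided by the mean, expressed as a percentage. A sketch with fabricated estimate sequences (the 4.83% and 235.86% figures are the paper's and are not reproduced here):

```python
import statistics

# Coefficient of variation (CV = stdev / mean, in %) over repeated parameter
# estimates. Both estimate lists are invented: a tight "online re-design"
# sequence and a scattered "sequential" sequence.
online = [0.101, 0.099, 0.104, 0.098, 0.100]
sequential = [0.10, 0.21, 0.05, 0.33, 0.08]

def cv_percent(xs):
    return 100 * statistics.stdev(xs) / statistics.mean(xs)

print(round(cv_percent(online), 2), round(cv_percent(sequential), 2))
```

A lower CV across repeated estimations indicates that the experimental design pins the parameter down more reproducibly, which is the sense in which the online re-design outperforms the sequential method.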
Approach to design space from retrospective quality data.
Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon
2016-01-01
Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies have a large amount of recorded data about their processes. This work establishes the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production was carried out, using as experimental factors the results of a previous risk analysis and eight key parameters (quality specifications) that encompassed process and quality control data. Statgraphics 5.0 software was applied, and the data were processed to obtain eight DS as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data; however, this approach is of considerable practical interest, as it yields the DS with minimal investment in experiments, since actual production batch data are processed statistically.
NASA Astrophysics Data System (ADS)
Cuetos, M. J.; Gómez, X.; Escapa, A.; Morán, A.
Various mixtures incorporating a simulated organic fraction of municipal solid wastes and blood from a poultry slaughterhouse were used as substrate in a dark fermentation process for the production of hydrogen. The individual and interactive effects of hydraulic retention time (HRT), solid content in the feed (%TS) and proportion of residues (%Blood) on bio-hydrogen production were studied in this work. A central composite design and response surface methodology were employed to determine the optimum conditions for the hydrogen production process. Experimental results were approximated by a second-order model, with the principal effects of the three factors considered being statistically significant (P < 0.05). The production of hydrogen obtained from the experimental point at conditions close to best operability was 0.97 L Lr^-1 day^-1. Moreover, a desirability function was employed in order to optimize the process when a second, methanogenic, phase is coupled with it. In this last case, the optimum conditions lead to a reduction in the production of hydrogen when the optimization process involves the maximization of intermediary products.
Benchmark tests for a Formula SAE Student car prototyping
NASA Astrophysics Data System (ADS)
Mariasiu, Florin
2011-12-01
Aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. In designing and developing vehicles, dedicated CFD (Computational Fluid Dynamics) software packages are used in the computer simulation process to determine a vehicle's aerodynamic characteristics. However, the results obtained by this faster and cheaper method are validated by experiments in wind tunnel tests, which are expensive and in which complex testing equipment is used at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation, and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four piezoresistive force sensors of the FlexiForce type.
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
Mortier, Séverine Thérèse F C; Van Bockstal, Pieter-Jan; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas
2016-06-01
Large molecules, such as biopharmaceuticals, are considered the key driver of growth for the pharmaceutical industry. Freeze-drying is the preferred way to stabilise these products when needed. However, it is an expensive, inefficient, time- and energy-consuming process. During freeze-drying, there are only two main process variables to be set, i.e. the shelf temperature and the chamber pressure, preferably in a dynamic way. This manuscript focuses on the essential use of uncertainty analysis for the determination and experimental verification of the dynamic primary drying Design Space for pharmaceutical freeze-drying. Traditionally, the chamber pressure and shelf temperature are kept constant during primary drying, leading to less than optimal process conditions. In this paper it is demonstrated how a mechanistic model of the primary drying step makes it possible to determine the optimal dynamic values for both process variables during processing, resulting in a dynamic Design Space with a well-known risk of failure. This allows running the primary drying process step as time-efficiently as possible, while guaranteeing that the temperature at the sublimation front does not exceed the collapse temperature. The Design Space is the multidimensional combination and interaction of input variables and process parameters leading to the expected product specifications with a controlled (i.e., high) probability. Therefore, inclusion of parameter uncertainty is an essential part of the definition of the Design Space, although it is often neglected. To quantitatively assess the inherent uncertainty in the parameters of the mechanistic model, an uncertainty analysis was performed to establish the borders of the dynamic Design Space, i.e. a time-varying shelf temperature and chamber pressure, associated with a specific risk of failure. A risk-of-failure acceptance level of 0.01%, i.e. a 'zero-failure' situation, results in an increased primary drying process time compared to the deterministic dynamic Design Space; however, the risk of failure is under control. Experimental verification revealed that only a risk-of-failure acceptance level of 0.01% yielded a guaranteed zero-defect quality end-product. The computed process settings with a risk-of-failure acceptance level of 0.01% resulted in a decrease of more than half of the primary drying time in comparison with a regular, conservative cycle with fixed settings. Copyright © 2016. Published by Elsevier B.V.
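The risk-of-failure acceptance level described above can be illustrated with a Monte-Carlo sketch: sample the uncertain model parameters and count the fraction of runs in which the sublimation-front temperature exceeds the collapse temperature. The surrogate temperature model, the collapse temperature, and the parameter distribution below are all invented; the actual work uses a mechanistic freeze-drying model:

```python
import random

# Monte-Carlo estimate of the risk of failure at one candidate Design Space
# point. Every number here is invented for illustration.
random.seed(42)
T_COLLAPSE = -32.0  # degC, assumed collapse temperature

def front_temperature(shelf_temp, heat_coeff):
    # Toy surrogate: front temperature rises with the uncertain heat-transfer
    # coefficient. NOT the mechanistic model from the paper.
    return shelf_temp - 15.0 + 5.0 * heat_coeff

def risk_of_failure(shelf_temp, n=100_000):
    """Fraction of sampled parameter draws where the front exceeds collapse."""
    fails = 0
    for _ in range(n):
        heat_coeff = random.gauss(0.0, 1.0)  # uncertain parameter draw
        if front_temperature(shelf_temp, heat_coeff) > T_COLLAPSE:
            fails += 1
    return fails / n

print(risk_of_failure(-20.0))
```

Sweeping the shelf temperature (and chamber pressure) over time and keeping only settings whose estimated risk stays below the acceptance level traces out the borders of the dynamic Design Space.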
Leisure Activities for the Development of Creative Intelligence in Mathematical Problem Solving
ERIC Educational Resources Information Center
Castro, Angélica Mercedes Tumbaco; Guerra, Galo Ernestro Cabanilla; Brito, Christian Antonio Pavón; Chávez, Tannia Gabriela Acosta
2018-01-01
The present work studies the influence of leisure activities on the creative intelligence of the students. An experimental pre- and post-test design was carried out with individuals selected in a sampling process. The design identifies the ease of students to place themselves in possible contexts and solve them mathematically through Polya's…
ERIC Educational Resources Information Center
Backman, Desiree; Gonzaga, Gian; Sugerman, Sharon; Francis, Dona; Cook, Sara
2011-01-01
Objective: To examine the impact of fresh fruit availability at worksites on the fruit and vegetable consumption and related psychosocial determinants of low-wage employees. Design: A prospective, randomized block experimental design. Setting: Seven apparel manufacturing and 2 food processing worksites. Participants: A convenience sample of 391…
Granular activated carbon (GAC) is an effective treatment technique for the removal of some toxic organics from drinking water or wastewater, however, it can be a relatively expensive process, especially if it is designed improperly. A rapid method for the design of large-scale f...
Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules
NASA Astrophysics Data System (ADS)
Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix
2009-02-01
Laser diodes and diode laser bars are key components in high-power semiconductor lasers and solid-state laser systems. During manufacture, the assembly of the fast-axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high-volume production. In this paper the results of an intermediate milestone are reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high-precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far-field and near-field intensity distributions. Feedback from these cameras, processed by a powerful and efficient image processing algorithm, controls a five-axis precision motion system to optimize the fast-axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the algorithm of the adjustment as well as experimental results. A critical discussion of the results will close the talk.
NASA Technical Reports Server (NTRS)
Bache, George
1993-01-01
Validation of CFD codes is a critical first step in the process of developing CFD design capability. The MSFC Pump Technology Team has recognized the importance of validation and has thus funded several experimental programs designed to obtain CFD-quality validation data. The first data set to become available is for the SSME High Pressure Fuel Turbopump Impeller. LDV data were taken at the impeller inlet (to obtain a reliable inlet boundary condition) and at three radial positions at the impeller discharge. Our CFD code, TASCflow, is used within the propulsion and commercial pump industry as a tool for pump design. The objective of this work, therefore, is to further validate TASCflow for application in pump design. TASCflow was used to predict flow at the impeller discharge for flowrates of 80, 100 and 115 percent of design flow. Comparison to data has been made with encouraging results.
ERIC Educational Resources Information Center
Johnson, Theodore M.; Alfke, Dorothy
The purpose of this study was to investigate whether success in the Science - A Process Approach (SAPA) process of classification designed for primary grade children is contingent upon the children's developmental level as defined by Piaget's theory. The investigators sought to determine whether children who had reached the concrete operational…
NASA Technical Reports Server (NTRS)
1980-01-01
Technical activities are reported in the design of processes, facilities, and equipment for producing silicon at a rate and price commensurate with production goals for low-cost solar cell modules. The silane-silicone process has the potential for providing high-purity polysilicon on a commercial scale at a price of fourteen dollars per kilogram by 1986 (1980 dollars). Commercial process, economic analysis, process support research and development, and quality control are discussed.
ERIC Educational Resources Information Center
Rattanavich, Saowalak
2017-01-01
This experimental study aims to investigate the effects of three vocational English classes, each one academic semester in duration, and using the concentrated language encounter approach and reciprocal peer teaching strategies. This study employed a time-series design with one pre-experiment and two post-experiments. Discourse and frequency…
ERIC Educational Resources Information Center
Tunaboylu, Ceren; Demir, Ergül
2017-01-01
The aim of this study is to investigate the effect of using the interactive whiteboard in mathematics teaching process on the 7th-grade students' achievement. This study was conducted as experimental design. Experimental and control groups were composed of 58 7th-grade students from one school in the 2015-2016 educational year in Ankara. As a…
Study of the Effect of Swelling on Irradiation Assisted Stress Corrosion Cracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teysseyre, Sebastien Paul
2016-09-01
This report describes the methodology used to study the effect of swelling on the crack growth rate of an irradiation-assisted stress corrosion crack propagating in highly irradiated stainless steel 304 material, irradiated to 33 dpa in the Experimental Breeder Reactor-II. The material selection, specimen design, experimental apparatus and processes are described. The results of the current test are presented.
Experimental design for evaluating WWTP data by linear mass balances.
Le, Quan H; Verheijen, Peter J T; van Loosdrecht, Mark C M; Volcke, Eveline I P
2018-05-15
A stepwise experimental design procedure to obtain reliable data from wastewater treatment plants (WWTPs) was developed. The proposed procedure aims at determining sets of additional measurements (besides available ones) that guarantee the identifiability of key process variables, which means that their value can be calculated from other, measured variables, based on available constraints in the form of linear mass balances. Among all solutions, i.e. all possible sets of additional measurements allowing the identifiability of all key process variables, the optimal solutions were found taking into account two objectives, namely the accuracy of the identified key variables and the cost of additional measurements. The results of this multi-objective optimization problem were represented in a Pareto-optimal front. The presented procedure was applied to a full-scale WWTP. Detailed analysis of the relation between measurements allowed the determination of groups of overlapping mass balances. Adding measured variables could only serve in identifying key variables that appear in the same group of mass balances. Besides, the application of the experimental design procedure to these individual groups significantly reduced the computational effort in evaluating available measurements and planning additional monitoring campaigns. The proposed procedure is straightforward and can be applied to other WWTPs with or without prior data collection. Copyright © 2018 Elsevier Ltd. All rights reserved.
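The identifiability notion used in this procedure — a key variable is identifiable when its value follows uniquely from the measured variables through the linear mass balances — can be sketched as a simple rank test. This is an illustrative sketch with a hypothetical two-balance flow network, not the authors' actual procedure:

```python
import numpy as np

def identifiable_unknowns(A, measured):
    """Given linear balances A @ x = 0 and the indices of measured
    variables, return the unknown indices whose values are uniquely
    determined. An unknown j is identifiable iff the unit row e_j lies
    in the row space of A restricted to the unknown columns."""
    n = A.shape[1]
    unknown = [j for j in range(n) if j not in measured]
    A_u = A[:, unknown]                      # coefficients of unknowns
    r = np.linalg.matrix_rank(A_u)
    ident = []
    for k, j in enumerate(unknown):
        e = np.zeros((1, len(unknown)))
        e[0, k] = 1.0
        if np.linalg.matrix_rank(np.vstack([A_u, e])) == r:
            ident.append(j)
    return ident

# Hypothetical balances over 4 flows: x0 = x1 + x2 and x2 = x3
A = np.array([[1.0, -1.0, -1.0, 0.0],
              [0.0,  0.0,  1.0, -1.0]])
print(identifiable_unknowns(A, measured={0, 1}))  # both x2 and x3 follow
```

Enumerating which measurement sets make all key variables identifiable, and scoring each set by cost and accuracy, yields the Pareto front described in the abstract.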
DOE Office of Scientific and Technical Information (OSTI.GOV)
Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard
2016-12-29
The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchange system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four different tasks corresponding to these specific objectives. The first task involved selection of IHX candidates and developing steady-state designs for those. The second task involved modeling of the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project at steady-state and transient conditions to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and a newly developed high-temperature molten salt facility.
Cetinceviz, Yucel; Bayindir, Ramazan
2012-05-01
The network requirements of control systems in industrial applications increase day by day. Internet-based control systems and various fieldbus systems have been designed to meet these requirements. This paper describes an Internet-based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the distance connection of the distributed control devices at the lowest levels of the industrial production environment are provided by fieldbus networks. In this paper, an Internet-based control system that meets the system requirements with a new-generation communication structure, called a wired/wireless hybrid system, has been designed at the field level and carried out to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The system comprises a hardware structure with a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP), and a software structure with the WinCC flexible program used for the SCADA (Supervisory Control And Data Acquisition) screen and the SIMATIC MANAGER package program ("STEP7") used for hardware and network configuration and for downloading the control program to the PLC. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Rui, E-mail: Sunsr@hit.edu.cn; Ismail, Tamer M., E-mail: temoil@aucegypt.edu; Ren, Xiaohan
Highlights: • The effects of moisture content on the burning process of MSW are investigated. • A two-dimensional mathematical model was built to simulate the combustion process. • Temperature distributions, process rates, and gas species were measured and simulated. • The conversion ratios of C/CO and N/NO in MSW are inverse to moisture content. - Abstract: In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady-state model and an experimental study were employed to investigate the combustion process of municipal solid waste (MSW) in a fixed bed reactor. Conservation equations of the waste bed were implemented to describe the incineration process. The gas phase turbulence was modeled using the k–ε turbulence model and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste property characteristics. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulation results for solid temperature, gas species and process rates in the bed accord with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The whole bed combustion process slows greatly as MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW.
NASA Technical Reports Server (NTRS)
Lung, Shun-fat; Pak, Chan-gi
2008-01-01
Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of dynamic properties of structures. Accurate rigid body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization (MDAO) tool is introduced to optimize the objective function and constraints such that the mass properties, the natural frequencies, and the mode shapes are matched to the target data as well as the mass matrix being orthogonalized.
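The "minimizing the difference between analytical and experimental results" step can be sketched as a scalar objective handed to an optimizer. The frequencies and weights below are hypothetical; this is a sketch of the kind of objective an MDAO loop would minimize, not the tool described in the abstract:

```python
def updating_objective(freqs_model, freqs_test, weights):
    """Weighted sum of squared relative natural-frequency errors --
    a typical scalar objective for finite element model updating,
    minimized subject to constraints on mass properties and
    mode-shape correlation (not shown here)."""
    return sum(w * ((fm - ft) / ft) ** 2
               for fm, ft, w in zip(freqs_model, freqs_test, weights))

# Hypothetical: two analytical frequencies vs. measured (test) values
print(updating_objective([10.2, 25.9], [10.0, 26.5], [1.0, 1.0]))
```

An optimizer then adjusts model parameters (stiffness, mass distribution) until this objective is small while the constraints hold.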
Bhansali, Archita H; Sangani, Darshan S; Mhatre, Shivani K; Sansgiry, Sujit S
2018-01-01
To compare three over-the-counter (OTC) Drug Facts panel versions for information-processing optimization among college students. University of Houston students (N = 210) participated in a cross-sectional survey from January to May 2010. The current FDA label was compared to two experimental labels, developed using CHREST theory, that tested information processing by re-positioning the warning information within the Drug Facts panel. Congruency was defined as placing like information together. Information processing was evaluated using the OTC medication Label Evaluation Process Model (LEPM): label comprehension, ease-of-use, attitude toward the product, product evaluation, and purchase intention. The experimental label with chunked congruent information (uses-directions-other information-warnings) was rated significantly higher than the current FDA label and had the best average scores among the LEPM information-processing variables. If replications uphold these findings, the FDA label design might be revised to improve information processing.
Ceramic processing: Experimental design and optimization
NASA Technical Reports Server (NTRS)
Weiser, Martin W.; Lauben, David N.; Madrid, Philip
1992-01-01
The objectives of this paper are to: (1) gain insight into the processing of ceramics and how green processing can affect the properties of ceramics; (2) investigate the technique of slip casting; (3) learn how heat treatment and temperature contribute to density, strength, and effects of under and over firing to ceramic properties; (4) experience some of the problems inherent in testing brittle materials and learn about the statistical nature of the strength of ceramics; (5) investigate orthogonal arrays as tools to examine the effect of many experimental parameters using a minimum number of experiments; (6) recognize appropriate uses for clay based ceramics; and (7) measure several different properties important to ceramic use and optimize them for a given application.
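Objective (5) above — using orthogonal arrays to examine many parameters with a minimum number of experiments — can be illustrated with the smallest two-level array, L4, which covers three factors in four runs instead of the eight a full factorial would need. The response values are hypothetical:

```python
# L4 orthogonal array: three two-level factors in 4 runs
# (a full 2^3 factorial would need 8 runs)
L4 = [(-1, -1, -1), (-1, +1, +1), (+1, -1, +1), (+1, +1, -1)]

def main_effects(runs, responses):
    """Mean response at the high level minus mean response at the low
    level, per factor -- the standard orthogonal-array main-effect
    estimate, valid because each level pair appears equally often."""
    k = len(runs[0])
    effects = []
    for f in range(k):
        hi = [y for r, y in zip(runs, responses) if r[f] > 0]
        lo = [y for r, y in zip(runs, responses) if r[f] < 0]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical fired-density responses (g/cm^3) for the four runs
y = [2.1, 2.4, 2.6, 2.9]
print(main_effects(L4, y))
```

The factor with the largest main effect is the one to tune first when optimizing a property such as fired density or strength.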
Multivariate analysis techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bendavid, Josh; Fisher, Wade C.; Junk, Thomas R.
2016-01-01
The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
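A minimal sketch of the signal/background separation idea: a one-variable Gaussian likelihood-ratio cut on synthetic data. Real multivariate analyses use many variables and trained classifiers; the means, widths and event counts here are invented for illustration:

```python
import math
import random

def lr_select(x, mu_s, mu_b, sigma=1.0, cut=1.0):
    """Keep an event if its signal/background Gaussian likelihood
    ratio exceeds `cut` -- the simplest likelihood-based selection."""
    ls = math.exp(-0.5 * ((x - mu_s) / sigma) ** 2)
    lb = math.exp(-0.5 * ((x - mu_b) / sigma) ** 2)
    return ls / lb > cut

random.seed(0)
sig = [random.gauss(2.0, 1.0) for _ in range(1000)]  # synthetic signal
bkg = [random.gauss(0.0, 1.0) for _ in range(1000)]  # synthetic background
eff = sum(lr_select(x, 2.0, 0.0) for x in sig) / 1000  # signal efficiency
rej = 1 - sum(lr_select(x, 2.0, 0.0) for x in bkg) / 1000  # bkg rejection
print(round(eff, 2), round(rej, 2))
```

With unit cut and equal widths, the selection reduces to cutting at the midpoint between the two means; the trade-off between efficiency and rejection is what more sophisticated multivariate methods optimize.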
Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj
2015-01-01
Design of experiments (DOE), a component of Quality by Design (QbD), is systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study to understand the effects of process variables in a bead milling process used for manufacture of drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed and bead volume. Responses analyzed for evaluating these effects and interactions were milling time, particle size and process yield. Process validation batches were executed using the optimum process conditions obtained from software Design-Expert® to evaluate both the repeatability and reproducibility of bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). The desirability function used to optimize the response variables and observed responses were in agreement with experimental values. These results demonstrated the reliability of selected model for manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of DOE approach to optimize critical process parameters in the manufacture of drug nanoparticles.
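The face-centered central composite design named above has a fixed coded-point structure: 2^k factorial corners, 2k axial points on the cube faces, and a centre point. A sketch that enumerates it for the 3-factor case (the factor names are those from the abstract; the code is illustrative, not the Design-Expert® software):

```python
from itertools import product

def face_centered_ccd(k):
    """Coded design points of a face-centered central composite design:
    2^k factorial corners, 2k axial (face-centre) points at +/-1 on one
    axis, and a single centre point (alpha = 1, so all codes are in
    {-1, 0, +1})."""
    corners = list(product([-1, 1], repeat=k))
    axial = []
    for i in range(k):
        for s in (-1, 1):
            pt = [0] * k
            pt[i] = s
            axial.append(tuple(pt))
    return corners + axial + [(0,) * k]

# 3 factors (e.g. motor speed, pump speed, bead volume):
# 8 corners + 6 axial + 1 centre = 15 distinct coded points
design = face_centered_ccd(3)
print(len(design))
```

Each coded point is mapped to real factor ranges, the runs are executed (centre points usually replicated), and a quadratic response-surface model is fitted to responses such as milling time and particle size.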
Experimental early-stage coalification of a peat sample and a peatified wood sample from Indonesia
Orem, W.H.; Neuzil, S.G.; Lerch, H.E.; Cecil, C.B.
1996-01-01
Experimental coalification of a peat sample and a buried wood sample from domed peat deposits in Indonesia was carried out to examine chemical structural changes in organic matter during early-stage coalification. The experiment (125 °C, 408 atm lithostatic pressure, and 177 atm fluid pressure for 75 days) was designed to maintain both lithostatic and fluid pressure on the sample, but allow by-products that may retard coalification to escape. We refer to this design as a geologically open system. Changes in the elemental composition, and 13C NMR and FTIR spectra of the peat and wood after experimental coalification suggest preferential thermal decomposition of O-containing aliphatic organic compounds (probably cellulose) during early-stage coalification. The elemental compositions and 13C NMR spectra of the experimentally coalified peat and wood were generally similar to those of Miocene coal and coalified wood samples from Indonesia. Yields of lignin phenols in the peat and wood samples decreased following experimental coalification; the wood sample exhibited a larger change. Lignin phenol yields from the experimentally coalified peat and wood were comparable to yields of lignin phenols from Miocene Indonesian lignite and coalified wood. Changes in syringyl/vanillyl and p-hydroxy/vanillyl ratios suggest direct demethoxylation as a secondary process to demethylation of methoxyl groups during early coalification, and changes in lignin phenol yields and acid/aldehyde ratios point to a coupling between demethoxylation processes and reactions in the alkyl side-chain bonds at the α-carbon of lignin phenols.
2011-03-01
[Table residue: Levene's test output, testing the null hypothesis that the error variance of the dependent variable is equal across groups (Design: Intercept + POP-UP).] The design also limited the number of intended treatments. The experimental design originally was supposed to test all three adverse events that threaten
Alloy design for aircraft engines
NASA Astrophysics Data System (ADS)
Pollock, Tresa M.
2016-08-01
Metallic materials are fundamental to advanced aircraft engines. While perceived as mature, emerging computational, experimental and processing innovations are expanding the scope for discovery and implementation of new metallic materials for future generations of advanced propulsion systems.
Design and Field Experimentation of a Cooperative ITS Architecture Based on Distributed RSUs.
Moreno, Asier; Osaba, Eneko; Onieva, Enrique; Perallos, Asier; Iovino, Giovanni; Fernández, Pablo
2016-07-22
This paper describes a new cooperative Intelligent Transportation System architecture that aims to enable collaborative sensing services. The main goal of this architecture is to improve transportation efficiency and performance. The system, which has been proven within the participation in the ICSI (Intelligent Cooperative Sensing for Improved traffic efficiency) European project, encompasses the entire process of capture and management of available road data. For this purpose, it applies a combination of cooperative services and methods for data sensing, acquisition, processing and communication amongst road users, vehicles, infrastructures and related stakeholders. Additionally, the advantages of using the proposed system are exposed. The most important of these advantages is the use of a distributed architecture, moving the system intelligence from the control centre to the peripheral devices. The global architecture of the system is presented, as well as the software design and the interaction between its main components. Finally, functional and operational results observed through the experimentation are described. This experimentation has been carried out in two real scenarios, in Lisbon (Portugal) and Pisa (Italy).
Modeling Remineralization of Desalinated Water by Micronized Calcite Dissolution.
Hasson, David; Fine, Larissa; Sagiv, Abraham; Semiat, Raphael; Shemer, Hilla
2017-11-07
A widely used process for remineralization of desalinated water consists of dissolution of calcite particles by flow of acidified desalinated water through a bed packed with millimeter-size calcite particles. An alternative process consists of calcite dissolution by slurry flow of micron-size calcite particles with acidified desalinated water. The objective of this investigation is to provide theoretical models enabling design of remineralization by calcite slurry dissolution with carbonic and sulfuric acids. Extensive experimental results are presented displaying the effects of acid concentration, slurry feed concentration, and dissolution contact time. The experimental data are shown to agree to within 10% with theoretical predictions based on the simplifying assumption that the slurry consists of uniform particles represented by the surface mean diameter of the powder. Agreement between theory and experiment improves by 1-8% when the powder size distribution is taken into account. Apart from the practical value of this work in providing a hitherto lacking design tool for a novel technology, the paper has the merit of being among the very few publications providing experimental confirmation of the theory describing reaction kinetics in a segregated flow system.
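The "surface mean diameter" used to represent the powder as uniform particles is the Sauter mean, d32 = Σ n·d³ / Σ n·d². Computing it from a discrete size distribution is straightforward; the example calcite distribution below is hypothetical:

```python
def surface_mean_diameter(diams, fracs):
    """Sauter (surface) mean diameter d32 = sum(f*d^3) / sum(f*d^2)
    for a discrete size distribution given as diameters and number
    fractions -- the single 'uniform particle' size used in the
    simplified dissolution model."""
    num = sum(f * d ** 3 for d, f in zip(diams, fracs))
    den = sum(f * d ** 2 for d, f in zip(diams, fracs))
    return num / den

# Hypothetical micron-size calcite powder distribution
d = [2.0, 5.0, 10.0]   # diameters, microns
f = [0.5, 0.3, 0.2]    # number fractions
print(surface_mean_diameter(d, f))
```

d32 preserves the volume-to-surface ratio of the powder, which is why it is the natural single diameter for a surface-reaction-controlled dissolution model; the residual 1-8% discrepancy is what accounting for the full distribution recovers.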
Capacitive Deionization of High-Salinity Solutions
Sharma, Ketki; Gabitto, Jorge; Mayes, Richard T.; ...
2014-12-22
Desalination of high salinity solutions has been studied using a novel experimental technique and a theoretical model. Neutron imaging has been employed to visualize lithium ions in mesoporous carbon materials, which are used as electrodes in capacitive deionization for water desalination. Experiments were conducted with a flow-through capacitive deionization cell designed for neutron imaging and with lithium chloride (6LiCl) as the electrolyte. Sequences of neutron images have been obtained at a relatively high concentration of lithium chloride (6LiCl) solution to provide information on the transport of ions within the electrodes. A new model that computes the individual ionic concentration profiles inside mesoporous carbon electrodes has been used to simulate the capacitive deionization process. Modifications have also been introduced into the simulation model to calculate results at high electrolyte concentrations. Experimental data and simulation results provide insight into why capacitive deionization is not effective for desalination of high ionic-strength solutions. The combination of experimental information, obtained through neutron imaging, with the theoretical model will help in the design of capacitive deionization devices, which can improve the process for high ionic-strength solutions.
A Module Experimental Process System Development Unit (MEPSDU)
NASA Technical Reports Server (NTRS)
1981-01-01
The purpose of this program is to demonstrate the technical readiness of a cost-effective process sequence that has the potential for the production of flat-plate photovoltaic modules meeting the 1986 price goal of $0.70 or less per peak watt. Program efforts included: preliminary design review; preliminary cell fabrication using the proposed process sequence; verification of sandblasting back cleanup; study of resist parameters; evaluation of pull strength of the proposed metallization; measurement of contact resistance of electroless Ni contacts; optimization of process parameters; design of the MEPSDU module; identification and testing of insulator tapes; development of a lamination process sequence; and identification of, discussions and demonstrations with, and visits to candidate equipment vendors, including evaluation of proposals for a tabbing and stringing machine.
Laboratory investigations of earthquake dynamics
NASA Astrophysics Data System (ADS)
Xia, Kaiwen
In this thesis this will be attempted through controlled laboratory experiments that are designed to mimic natural earthquake scenarios. The earthquake dynamic rupturing process itself is a complicated phenomenon, involving dynamic friction, wave propagation, and heat production. Because controlled experiments can produce results without assumptions needed in theoretical and numerical analysis, the experimental method is thus advantageous over theoretical and numerical methods. Our laboratory fault is composed of carefully cut photoelastic polymer plates (Homalite-100, polycarbonate) held together by uniaxial compression. As a unique unit of the experimental design, a controlled exploding wire technique provides the triggering mechanism of laboratory earthquakes. Three important components of real earthquakes (i.e., pre-existing fault, tectonic loading, and triggering mechanism) correspond to and are simulated by frictional contact, uniaxial compression, and the exploding wire technique. Dynamic rupturing processes are visualized using the photoelastic method and are recorded via a high-speed camera. Our experimental methodology, which is full-field, in situ, and non-intrusive, has better control and diagnostic capacity compared to other existing experimental methods. Using this experimental approach, we have investigated several problems: dynamics of earthquake faulting occurring along homogeneous faults separating identical materials, earthquake faulting along inhomogeneous faults separating materials with different wave speeds, and earthquake faulting along faults with a finite low-wave-speed fault core. We have observed supershear ruptures, sub-Rayleigh to supershear rupture transition, crack-like to pulse-like rupture transition, the self-healing (Heaton) pulse, and rupture directionality.
NASA Astrophysics Data System (ADS)
Huang, Wenkai; Huan, Shi; He, Junfeng; Jiang, Jichang
2018-03-01
In a split Hopkinson pressure bar (SHPB) experiment, the pasting quality of the strain gauges directly affects the accuracy of the measurement results. Traditionally, the strain gauges are pasted manually by the experimenter. In the process of pasting, it is easy to shift or twist the strain gauge, and the experimental results are greatly affected by human factors. In this paper, a novel automatic pasting device for strain gauges is designed and developed, which can be used to paste the strain gauges accurately and rapidly. The paste quality is reliable, and it can guarantee the consistency of SHPB experimental measurements. We found that a clamping force of 74 N achieved a success rate of 97%, whilst ensuring good adhesion.
Coelho, Pedro G; Hollister, Scott J; Flanagan, Colleen L; Fernandes, Paulo R
2015-03-01
Bone scaffolds for tissue regeneration require an optimal trade-off between biological and mechanical criteria. Optimal designs may be obtained using topology optimization (homogenization approach) and prototypes produced using additive manufacturing techniques. However, the process from design to manufacture remains a research challenge and will be a requirement of FDA design controls for engineered scaffolds. This work investigates how the design-to-manufacture chain affects the reproducibility of complex optimized design characteristics in the manufactured product. The design and prototypes are analyzed taking into account the computational assumptions and the final mechanical properties determined through mechanical tests. The scaffold is an assembly of unit cells, so scale-size effects on the mechanical response under finite periodicity are investigated and compared with predictions from the homogenization method, which assumes, in the limit, infinitely repeated unit cells. Results show that a limited number of unit cells (3-5 repeated on a side) introduces some scale effects, but the discrepancies are below 10%. Higher discrepancies are found when comparing the experimental data to numerical simulations, due to differences between the manufactured and designed scaffold feature shapes and sizes, as well as micro-porosities introduced by the manufacturing process. However, good regression correlations (R² > 0.85) were found between numerical and experimental values, with slopes close to 1 for 2 out of 3 designs. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
Iurian, Sonia; Turdean, Luana; Tomuta, Ioan
2017-01-01
This study focuses on the development of a drug product based on a risk assessment-based approach, within the quality by design paradigm. A prolonged release system was proposed for paliperidone (Pal) delivery, containing Kollidon® SR as an insoluble matrix agent and hydroxypropyl cellulose, hydroxypropyl methylcellulose (HPMC), or sodium carboxymethyl cellulose as a hydrophilic polymer. The experimental part was preceded by the identification of potential sources of variability through Ishikawa diagrams, and failure mode and effects analysis was used to deliver the critical process parameters that were further optimized by design of experiments. A D-optimal design was used to investigate the effects of Kollidon SR ratio (X1), the type of hydrophilic polymer (X2), and the percentage of hydrophilic polymer (X3) on the percentages of dissolved Pal over 24 h (Y1–Y9). Effects expressed as regression coefficients and response surfaces were generated, along with a design space for the preparation of a target formulation in an experimental area with low error risk. The optimal formulation contained 27.62% Kollidon SR and 8.73% HPMC and achieved the prolonged release of Pal, with low burst effect, at ratios that were very close to the ones predicted by the model. Thus, the parameters with the highest impact on the final product quality were studied, and safe ranges were established for their variations. Finally, a risk mitigation and control strategy was proposed to assure the quality of the system, by constant process monitoring. PMID:28331293
NASA Astrophysics Data System (ADS)
Reza, Syed Azer
This dissertation proposes the use of the emerging Micro-Electro-Mechanical Systems (MEMS) and agile-lensing optical device technologies to design novel and powerful signal conditioning and sensing modules for advanced applications in optical communications, physical parameter sensing, and RF/optical signal processing. For example, these new module designs have experimentally demonstrated exceptional features such as stable-loss broadband operation and signal filtering with a high (> 60 dB) optical dynamic range. The first part of the dissertation describes the design and demonstration of digital MEMS-based signal processing modules for communication systems and sensor networks using the TI DLP (Digital Light Processing) technology. Examples of such modules include optical power splitters, narrowband and broadband variable fiber-optic attenuators, and spectral shapers and filters. Compared to prior works, these all-digital designs have advantages of repeatability, accuracy, and reliability that are essential for advanced communications and sensor applications. The next part of the dissertation proposes, analyzes, and demonstrates the use of analog opto-fluidic agile-lensing technology for sensor networks and test-and-measurement systems. Novel optical module designs for distance sensing, liquid-level sensing, three-dimensional object shape sensing, and variable photonic delay lines are presented and experimentally demonstrated. Compared to prior-art module designs, the proposed analog-mode modules have exceptional performance, particularly in extreme environments (e.g., caustic liquids) where the free-space agile-beam-based sensors provide remote, non-contact access for physical sensing operations. The dissertation also presents novel modules involving hybrid analog-digital photonic designs that use different optical device technologies to deliver the best features of both analog and digital optical device operation and control. Digital controls are achieved through the digital MEMS technology, and analog controls are realized with opto-fluidic agile-lensing and acousto-optic technologies. For example, variable fiber-optic attenuators and spectral filters are proposed using the hybrid design. Compared to prior-art module designs, these hybrid designs provide a higher module dynamic range and increased resolution that are critical in various advanced system applications. In summary, the dissertation shows the added power of hybrid optical designs using both digital and analog photonic signal processing versus all-digital or all-analog module designs.
Design Validation Methodology Development for an Aircraft Sensor Deployment System
NASA Astrophysics Data System (ADS)
Wowczuk, Zenovy S.
The OCULUS 1.0 sensor deployment concept design, developed in 2004 at West Virginia University (WVU), outlined the general concept of a deployment system to be used on a C-130 aircraft. As a sequel, a new system, OCULUS 1.1, has been developed and designed. The new system transforms the concept design into a safety-of-flight design and enhances it into a pre-production system to be used as the test bed for gaining full military certification approval. The OCULUS 1.1 system implements a standard deployment system and procedure to go along with a design suited for military certification and implementation. The design process included analysis of the system's critical components and the generation of a critical-component holistic model to be used as an analysis tool for future payload modifications made to the system. Following completion of the OCULUS 1.1 design, preparations and procedures for obtaining military airworthiness certification are described. The airworthiness process includes working with the agency that oversees all modifications to the normal operating procedures of military C-130 aircraft and preparing the system for an experimental flight test. The critical steps in this process include developing a complete documentation package that details the analysis performed on the OCULUS 1.1 system and designing the experimental flight test plan used to analyze the system. Following approval of the documentation and design of experiment, an experimental flight test of the OCULUS 1.1 system was performed to verify the safety and airworthiness of the system. This test successfully proved that the OCULUS 1.1 system design was airworthy and approved for military use. The OCULUS 1.1 deployment system offers an open-architecture design that is ideal for use as a testing platform for developmental airborne sensors. The system's patented deployment methodology presents a simple approach to reaching the system's final operating position, which offers the most robust field-of-view area among rear-ramp deployment systems.
NASA Astrophysics Data System (ADS)
Park, Sahnggi; Kim, Kap-Joong; Kim, Duk-Jun; Kim, Gyungock
2009-02-01
Third-order ring resonators were designed, and their resonance frequency deviations were analyzed experimentally by fabricating them with E-beam lithography and ICP etching in a CMOS nano-fabrication laboratory. We developed a reliable method to experimentally identify and reduce the deviation of each ring's resonance frequency before completion of the fabrication process. The identified deviations can be minimized using the approach presented in this paper. We expect this method to provide a significant step toward realizing high-order multi-channel ring resonators.
Techniques for video compression
NASA Technical Reports Server (NTRS)
Wu, Chwan-Hwa
1995-01-01
In this report, we present our study on a multiprocessor implementation of an MPEG2 encoding algorithm. First, we compare two approaches to implementing video standards, VLSI technology and multiprocessor processing, in terms of design complexity, applications, and cost. Then we evaluate the functional modules of the MPEG2 encoding process in terms of their computation time. Two crucial modules are identified based on this evaluation. We then present our experimental study on the multiprocessor implementation of these two modules. Data partitioning is used for job assignment. Experimental results show that a high speedup ratio and good scalability can be achieved with this job assignment strategy.
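The data-partitioning job assignment described in the report can be illustrated with a small sketch (a hypothetical Python stand-in; the report's actual processor count, partition granularity, and module names are not given in this abstract). Frames are split into contiguous chunks, one per processor, so each worker's encoding work stays local to its own chunk:

```python
def partition_frames(num_frames, num_workers):
    """Split frame indices into contiguous chunks, one per worker.

    Contiguous assignment keeps each worker's references local to
    its own chunk (a simplifying assumption for illustration).
    """
    base, extra = divmod(num_frames, num_workers)
    chunks, start = [], 0
    for w in range(num_workers):
        size = base + (1 if w < extra else 0)
        chunks.append(list(range(start, start + size)))
        start += size
    return chunks

def encode_chunk(frames):
    """Placeholder for per-worker encoding work (DCT, quantization, VLC)."""
    return {"frames": len(frames), "first": frames[0] if frames else None}
```

Because the chunks are independent, each can be handed to a separate processor, which is what yields the near-linear speedup the report describes.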
Application Of Numerical Modelling To Ribbed Wire Rod Dimensions Precision Increase
NASA Astrophysics Data System (ADS)
Szota, Piotr; Mróz, Sebastian; Stefanik, Andrzej
2007-05-01
The paper presents the results of theoretical and experimental investigations of the process of rolling square ribbed wire rod designed for concrete reinforcement. Numerical modelling of the process of rolling in the finishing and pre-finishing grooves was carried out using the Forge2005® software. In the investigation, particular consideration was given to the analysis of the effect of pre-finished band shape on the formation of ribs on the finished wire rod in the finishing groove. The results of theoretical studies were verified in experimental tests, which were carried out in a wire rolling mill.
Design of integration-ready metasurface-based infrared absorbers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogando, Karim, E-mail: karim@cab.cnea.gov.ar; Pastoriza, Hernán
2015-07-28
We introduce an integration-ready design of a metamaterial infrared absorber that is highly compatible with many kinds of fabrication processes. We present the results of an exhaustive experimental characterization, including an analysis of the effects of single meta-atom geometrical parameters and of the collective arrangement. We compare the results with the theoretical interpretations proposed in the literature and, based on the results, develop a set of practical design rules for metamaterial absorbers in the infrared region.
Scanning and Measuring Device for Diagnostic of Barrel Bore
NASA Astrophysics Data System (ADS)
Marvan, Ales; Hajek, Josef; Vana, Jan; Dvorak, Radim; Drahansky, Martin; Jankovych, Robert; Skvarek, Jozef
The article discusses the mechanical design, electronics, and software of a robot for the diagnosis of gun barrels with calibers from 120 mm to 155 mm. This diagnostic device is intended primarily for experimental research and for verification of appropriate methods and technologies for diagnosing the bores of main guns. The article also discusses the design of the sensors and software, and the issues of data processing and reconstruction of images obtained by scanning the surface of the bore.
Secula, Marius Sebastian; Cretescu, Igor; Cagnon, Benoit; Manea, Liliana Rozemarie; Stan, Corneliu Sergiu; Breaban, Iuliana Gabriela
2013-07-10
The aim of this study was to determine the effects of main factors and interactions on the color removal performance from dye solutions using the electrocoagulation process enhanced by adsorption on Granular Activated Carbon (GAC). In this study, a mathematical approach was conducted using a two-level fractional factorial design (FFD) for a given dye solution. Three textile dyes were used: Acid Blue 74, Basic Red 1, and Reactive Black 5. The experimental factors and their respective levels were: current density (2.73 or 27.32 A/m²), initial pH of the aqueous dye solution (3 or 9), electrocoagulation time (20 or 180 min), GAC dose (0.1 or 0.5 g/L), support electrolyte (2 or 50 mM), initial dye concentration (0.05 or 0.25 g/L), and current type (Direct Current, DC, or Alternative Pulsed Current, APC). GAC-enhanced electrocoagulation performance was analyzed statistically in terms of removal efficiency, electrical energy consumption, and electrode material consumption, using polynomial model equations. The statistical significance of the GAC dose level for the performance of GAC-enhanced electrocoagulation, and the experimental conditions that favor operating the electrocoagulation process in the APC regime, were determined. The local optimal experimental conditions were established using a multi-objective desirability function method.
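A two-level fractional factorial design like the seven-factor one above can be enumerated programmatically. The sketch below builds a 2^(7-4) fraction (8 runs) from three base factors using the illustrative generators D=AB, E=AC, F=BC, G=ABC; the abstract does not state which fraction or generators the study actually used, so this is an assumption for illustration only:

```python
from itertools import product

def ffd_2_7_4():
    """Coded runs (-1/+1) of a 2^(7-4) resolution-III fraction.

    Columns: base factors A, B, C, then the generated factors
    D=AB, E=AC, F=BC, G=ABC (an illustrative generator choice).
    """
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        runs.append((a, b, c, a * b, a * c, b * c, a * b * c))
    return runs

# Hypothetical mapping of the study's seven factors onto columns A..G.
factor_names = ["current density", "initial pH", "EC time",
                "GAC dose", "support electrolyte",
                "dye concentration", "current type"]
```

Every column is balanced and any two columns are orthogonal, which is what allows all seven main effects to be estimated from only 8 of the 128 full-factorial runs.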
Using factorial experimental design to evaluate the separation of plastics by froth flotation.
Salerno, Davide; Jordão, Helga; La Marca, Floriana; Carvalho, M Teresa
2018-03-01
This paper proposes the use of factorial experimental design as a standard experimental method in the application of froth flotation to plastic separation, instead of the commonly used OVAT method (manipulation of one variable at a time). Furthermore, as is common practice in minerals flotation, the parameters of the kinetic model were used as process responses rather than the recovery of plastics in the separation products. To explain and illustrate the proposed methodology, a set of 32 experimental tests was performed using mixtures of two polymers with approximately the same density, PVC and PS (with mineral charges), with particle size ranging from 2 to 4 mm. The manipulated variables were frother concentration, air flow rate, and pH. A three-level full factorial design was conducted. Models relating the manipulated variables and their interactions to the responses (first-order kinetic model parameters) were built. The Corrected Akaike Information Criterion was used to select the best-fit model, and an analysis of variance (ANOVA) was conducted to identify the statistically significant terms of the model. It was shown that froth flotation can be used to efficiently separate PVC from PS with mineral charges by reducing the floatability of PVC, which largely depends on the action of pH. Within the tested interval, this is the factor that most affects the flotation rate constants. The results obtained show that the pure error may be of the same magnitude as the sum of squares of the errors, suggesting that there is significant variability within the same experimental conditions. Thus, special care is needed when evaluating and generalizing the process. Copyright © 2017 Elsevier Ltd. All rights reserved.
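The first-order kinetic model whose parameters serve as responses above is commonly written R(t) = R_inf(1 - exp(-kt)), where R_inf is the maximum attainable recovery and k the flotation rate constant. A minimal fitting sketch follows (pure Python, with a crude grid search standing in for proper nonlinear least squares; the data used below are synthetic, not from the paper):

```python
import math

def kinetic_model(t, r_inf, k):
    """First-order flotation model: cumulative recovery at time t."""
    return r_inf * (1.0 - math.exp(-k * t))

def fit_first_order(times, recoveries):
    """Grid search for (r_inf, k) minimizing the sum of squared errors.

    A sketch only; a real analysis would use nonlinear least squares
    (e.g. Levenberg-Marquardt) rather than an exhaustive grid.
    """
    best = (None, None, float("inf"))
    for r_inf in [x / 100 for x in range(50, 101)]:     # 0.50 .. 1.00
        for k in [x / 100 for x in range(1, 201)]:      # 0.01 .. 2.00
            sse = sum((kinetic_model(t, r_inf, k) - r) ** 2
                      for t, r in zip(times, recoveries))
            if sse < best[2]:
                best = (r_inf, k, sse)
    return best[:2]
```

Using the fitted (R_inf, k) pair as the response, rather than a single recovery value, is what lets the factorial analysis attribute changes in flotation speed to frother concentration, air flow rate, and pH.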
NASA Astrophysics Data System (ADS)
Liu, Xuanjun; Zeng, Xinwu; Gao, Dongbao; Shen, Weidong; Wang, Jianli; Wang, Shengchun
2017-03-01
The reflection characteristics of a unit cell consisting of a subwavelength circular hole and a rigid wall were discussed theoretically, and it was found that the phase shift of the reflected waves can cover almost the full 2π span by adjusting the hole radius when acoustic waves impinge normally on the cell. Based on the analytical formulas, an acoustic metasurface (AMS) sample constructed from an array of unit cells with different radii was designed and fabricated. The sound pressure fields induced by the sample were measured with the experimental setup, and the reflected field pattern was derived after data processing. Experimental results and COMSOL simulations both demonstrated that the designed AMS can reflect acoustic waves into an unusual yet controllable direction, verifying the correctness of the AMS theory and design presented in this paper. Simulations also show that the designed AMS has a narrow working bandwidth of 50 Hz around 800 Hz and a total thickness of about 1/8 of the incident wavelength, giving it potential for the miniaturization and integration of acoustic devices.
Methodological considerations in the design and implementation of clinical trials.
Cirrincione, Constance T; Lavoie Smith, Ellen M; Pang, Herbert
2014-02-01
To review study design issues related to clinical trials led by oncology nurses, with special attention to those conducted within the cooperative group setting; to emphasize the importance of the statistician's role in the process of clinical trials. Studies available at clinicaltrials.gov using experimental designs that have been published in peer-reviewed journals; cooperative group trials are highlighted. The clinical trial is a primary means to test intervention efficacy. A properly designed and powered study with clear and measurable objectives is as important as the intervention itself. Collaboration among the study team, including the statistician, is central in developing and conducting appropriately designed studies. For optimal results, collaboration is an ongoing process that should begin early on. Copyright © 2014 Elsevier Inc. All rights reserved.
The Role of Formal Experiment Design in Hypersonic Flight System Technology Development
NASA Technical Reports Server (NTRS)
McClinton, Charles R.; Ferlemann, Shelly M.; Rock, Ken E.; Ferlemann, Paul G.
2002-01-01
Hypersonic airbreathing engine (scramjet) powered vehicles are being considered to replace conventional rocket-powered launch systems. Effective utilization of scramjet engines requires careful integration with the air vehicle. This integration synergistically combines aerodynamic forces with the propulsive cycle functions of the engine. Due to the highly integrated nature of the hypersonic vehicle design problem, the large flight envelope, and the large number of design variables, a statistical design approach is effective. Modern Design-of-Experiments (MDOE) has been used throughout the Hyper-X program for both systems analysis and experimental testing. Applications of MDOE fall into four categories: (1) experimental testing; (2) studies of unit phenomena; (3) refining engine design; and (4) full vehicle system optimization. The MDOE process also provides analytical models, which are used to document lessons learned, supplement low-level design tools, and accelerate future studies. This paper discusses the design considerations for scramjet-powered vehicles and the specifics of MDOE as utilized for Hyper-X, and presents highlights from the use of these MDOE methods within the Hyper-X Program.
USDA-ARS?s Scientific Manuscript database
Solution blow spinning (SBS) is a process to produce non-woven fiber sheets with high porosity and an extremely large amount of surface area. In this study, a Box-Behnken experimental design (BBD) was used to optimize the processing parameters for the production of nanofibers from polymer solutions ...
ERIC Educational Resources Information Center
van Hooft, Edwin A. J.; Born, Marise Ph.
2012-01-01
Intentional response distortion or faking among job applicants completing measures such as personality and integrity tests is a concern in personnel selection. The present study aimed to investigate whether eye-tracking technology can improve our understanding of the response process when faking. In an experimental within-participants design, a…
Improving Working Memory and Processing Speed of Students with Dyslexia in Nigeria
ERIC Educational Resources Information Center
Adubasim, Ijeoma
2018-01-01
This study investigated effective strategies for improving working memory and processing speed of students identified with dyslexia in Nigeria. The study adopted a quasi-experimental research design with the population made up of twenty four thousand seven hundred and twenty seven (24,727) senior secondary school students (S.S.2) in all the public…
NASA Astrophysics Data System (ADS)
Runnova, Anastasiya; Zhuravlev, Maxim; Kulanin, Roman; Protasov, Pavel; Hramov, Alexander; Koronovskii, Alexey
2018-02-01
In this paper we study the correlation between neurophysiological processes and personal characteristics arising in human higher mental functions. We find that the activity of the brain correlates with the results of psychological tests (according to the Cattell test). Experimental studies and mathematical processing are described for an experimental design involving the registration of multi-channel human EEG data in two phases: passive wakefulness (background) and special psychological testing (active phase).
Process characterization and Design Space definition.
Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine
2016-09-01
Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. These comprise risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, and studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' which the manufacturing process can have on the variability of the CQAs, which in turn defines the testing and monitoring strategy. Copyright © 2016 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Chen, Shu-cheng S.
2011-01-01
The axial-flow turbine off-design computer program AXOD has been upgraded to include the outlet guide vane (OGV) among its acceptable turbine configurations. The mathematical bases and the techniques used for the code implementation are described and discussed at length in this paper. This extended capability is verified and validated with two cases of highly loaded fan-drive turbines designed and tested in the V/STOL Program of NASA. The first case is a 4 1/2-stage turbine with an average stage loading factor of 4.66, designed by Pratt & Whitney Aircraft. The second case is a 3 1/2-stage turbine with an average loading factor of 4.0, designed in-house by the NASA Lewis Research Center (now the NASA Glenn Research Center). Both cases were experimentally tested in the turbine facility located at the Glenn Research Center. The processes conducted in these studies are described in detail in this paper, and the results are presented and discussed in comparison with the experimental data. The AXOD results and the experimental data are in excellent agreement.
Using R in experimental design with BIBD: An application in health sciences
NASA Astrophysics Data System (ADS)
Oliveira, Teresa A.; Francisco, Carla; Oliveira, Amílcar; Ferreira, Agostinho
2016-06-01
Considering the implementation of an experimental design in any field, the experimenter must pay particular attention to, and look for the best strategies in, the following steps: planning the design selection, conducting the experiments, collecting the observed data, and proceeding to analysis and interpretation of results. The focus is on providing both a deep understanding of the problem under research and a powerful experimental process at a reduced cost. Mainly thanks to its ability to separate sources of variation, experimental design has long been strongly recommended in the health sciences. Particular attention has been devoted to block designs, and more precisely to Balanced Incomplete Block Designs (BIBDs); their relevance here stems from the fact that these designs allow testing simultaneously a number of treatments larger than the block size. Our example refers to a possible study of inter-rater reliability in Parkinson's disease, using the UPDRS (Unified Parkinson's Disease Rating Scale) to test whether there are significant differences between the specialists who evaluate the patients' performance. Statistical studies of this disease are described, for example, in Richards et al. (1994), where the authors investigate the inter-rater reliability of the Unified Parkinson's Disease Rating Scale motor examination. We consider a simulation of a practical situation in which the patients were observed by different specialists and the UPDRS, assessing the impact of Parkinson's disease on patients, was recorded. Assigning treatments to the subjects following a particular BIBD(9,24,8,3,2) structure, we illustrate that BIB designs can be used as a powerful tool to solve emerging problems in this area. Since a structure with repeated blocks allows some block contrasts with minimum variance, see Oliveira et al. (2006), the design with cardinality 12 was selected for the example. R software was used for the computations.
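The BIBD(9, 24, 8, 3, 2) referenced above has v = 9 treatments, b = 24 blocks, r = 8 replications per treatment, block size k = 3, and pair concurrence λ = 2. Any BIBD must satisfy the counting identities bk = vr and λ(v - 1) = r(k - 1), plus Fisher's inequality b ≥ v. A small check (plain Python, offered alongside the study's R analysis purely as an illustration):

```python
def bibd_params_consistent(v, b, r, k, lam):
    """Necessary conditions for a BIBD(v, b, r, k, lambda) to exist."""
    return (b * k == v * r                    # count treatment slots two ways
            and lam * (v - 1) == r * (k - 1)  # count pairs through one treatment
            and b >= v)                       # Fisher's inequality
```

These conditions are necessary but not sufficient; an actual design with these parameters still has to be constructed, which is what packages in R do for the analysis described in the abstract.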
Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang
2017-03-01
In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high-shear wet granulation process of the ginkgo leaf tablet, based on the design space, was established to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of the granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure mode and effects analysis (FMEA). A Plackett-Burman experimental design was used to screen the pCPPs; the results demonstrated that the binder amount, the wet massing time, and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high-shear wet granulation process was developed within the pCPP ranges based on a Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the lack-of-fit values were more than 0.1, indicating that the relationship between CQAs and CPPs could be well described by the mathematical models. D50 could be controlled within 170 to 500 μm, and the bulk density within 0.30 to 0.44 g·cm⁻³, using any CPP combination within the design space. Moreover, granules produced with process parameters within the design space also met the tensile strength requirement of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
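For three factors, a Box-Behnken design like the one above consists of the 12 midpoints of the edges of the coded factor cube plus replicated center points. It can be generated as follows (a sketch in coded units; the run order and number of center points are illustrative assumptions, not the study's actual plan):

```python
from itertools import combinations, product

def box_behnken_3(n_center=3):
    """Coded 3-factor Box-Behnken design: 12 edge midpoints + center runs."""
    runs = []
    for i, j in combinations(range(3), 2):       # choose two active factors
        for a, b in product((-1, 1), repeat=2):  # vary them at -1/+1
            run = [0, 0, 0]                      # third factor at midpoint
            run[i], run[j] = a, b
            runs.append(tuple(run))
    runs += [(0, 0, 0)] * n_center               # replicated center points
    return runs
```

Each non-center run varies exactly two factors at ±1 while holding the third at its midpoint, which avoids the extreme corner combinations yet still supports fitting a full quadratic model of the kind used to map the design space.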
An integrated biotechnology platform for developing sustainable chemical processes.
Barton, Nelson R; Burgard, Anthony P; Burk, Mark J; Crater, Jason S; Osterhout, Robin E; Pharkya, Priti; Steer, Brian A; Sun, Jun; Trawick, John D; Van Dien, Stephen J; Yang, Tae Hoon; Yim, Harry
2015-03-01
Genomatica has established an integrated computational/experimental metabolic engineering platform to design, create, and optimize novel high performance organisms and bioprocesses. Here we present our platform and its use to develop E. coli strains for production of the industrial chemical 1,4-butanediol (BDO) from sugars. A series of examples are given to demonstrate how a rational approach to strain engineering, including carefully designed diagnostic experiments, provided critical insights about pathway bottlenecks, byproducts, expression balancing, and commercial robustness, leading to a superior BDO production strain and process.
[An experimental research on the fabrication of the fused porcelain to CAD/CAM molar crown].
Dai, Ning; Zhou, Yongyao; Liao, Wenhe; Yu, Qing; An, Tao; Jiao, Yiqun
2007-02-01
This paper introduces the fabrication process of a fused-porcelain molar crown with CAD/CAM technology. First, the prepared tooth data were acquired with a 3D optical measuring system. Then, the inner surface was reconstructed and the outer surface shape designed with computer-aided design software. Finally, a miniature high-speed NC milling machine was used to produce the fused-porcelain CAD/CAM molar crown. The results proved that the fabrication process is reliable and efficient, and that the quality of the dental restoration is stable and precise.
System software for the finite element machine
NASA Technical Reports Server (NTRS)
Crockett, T. W.; Knott, J. D.
1985-01-01
The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.
2010-06-01
Markus, 1994). Media richness theory rests on the assumption that organizations process information to reduce uncertainty and equivocality (Daft ... Organization Design), 554-571. Daft, R. L., & Macintosh, N. B. (1981). A tentative exploration into the amount and equivocality of information ... design and customization. For instance, recent research demonstrates further how the performance of both Hierarchy and Edge organizations is
Screen Time: Alumni Magazines Have Their Designs on Mobile Devices
ERIC Educational Resources Information Center
Walker, Theresa
2011-01-01
Alumni magazines have their designs on mobile devices. The efforts are tied together, no matter the platform, by a desire for the magazine to be where its readers are and a spirit of experimentation that is akin to what is happening with social media. None of the magazine editors went into this process with any numerical expectations for…
ERIC Educational Resources Information Center
Arendasy, Martin E.; Sommer, Markus; Gittler, Georg
2010-01-01
Marked gender differences in three-dimensional mental rotation have been broadly reported in the literature in the last few decades. Various theoretical models and accounts were used to explain the observed differences. Within the framework of linking item design features of mental rotation tasks to cognitive component processes associated with…
TECHNOLOGY AND MANPOWER IN DESIGN AND DRAFTING 1965-75. MANPOWER RESEARCH BULLETIN NUMBER 12.
ERIC Educational Resources Information Center
Office of Manpower Policy, Evaluation, and Research (DOL), Washington, DC.
As part of an experimental and demonstration project launched by the Department of Labor in 1965 to emphasize likely future technological and manpower changes, this study of the design and drafting process aimed to identify the major technological changes in the next 10 years, to determine the extent and rate of diffusion of these changes, and to…
ERIC Educational Resources Information Center
Smith, Robert A.; Pontiggia, Laura; Waterman, Carrie; Lichtenwalner, Meghan
2010-01-01
This paper is based upon experiments developed as part of a Directed Research course designed to provide undergraduate biology students experience in the principles and processes of the scientific method used in biological research. The project involved the evaluation of herbal remedies used in many parts of the world in the treatment of diseases…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.
2008-09-01
The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g. ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e. fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain necessary microstructural data to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e. thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate constitutive models. Such a process-linked-structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research will be illustrated to show the usefulness and applications of this integrated approach.
Design, development, testing and validation of a Photonics Virtual Laboratory for the study of LEDs
NASA Astrophysics Data System (ADS)
Naranjo, Francisco L.; Martínez, Guadalupe; Pérez, Ángel L.; Pardo, Pedro J.
2014-07-01
This work presents the design, development, testing and validation of a Photonics Virtual Laboratory, highlighting the study of LEDs. The study was conducted from a conceptual, experimental and didactic standpoint, using e-learning and m-learning platforms. Specifically, teaching tools have been developed that help ensure that our students achieve meaningful learning. The scientific aspect, such as the study of LEDs, has been brought together with techniques for the generation and transfer of knowledge through the selection, hierarchization and structuring of information using concept maps. To validate the didactic materials developed, procedures with various assessment tools were used for the collection and processing of data, applied in the context of an experimental design. Additionally, a statistical analysis was performed to determine the validity of the materials developed. The assessment was designed to validate the contributions of the new materials over the traditional method of teaching, and to quantify the learning achieved by students, in order to draw conclusions that serve as a reference for their application in teaching and learning processes, and to comprehensively validate the work carried out.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandina N. Rao; Subhash C. Ayirala; Madhav M. Kulkarni
This report describes the progress of the project ''Development and Optimization of Gas-Assisted Gravity Drainage (GAGD) Process for Improved Light Oil Recovery'' for the duration of the second project year (October 1, 2003--September 30, 2004). There are three main tasks in this research project. Task 1 is scaled physical model study of GAGD process. Task 2 is further development of vanishing interfacial tension (VIT) technique for miscibility determination. Task 3 is determination of multiphase displacement characteristics in reservoir rocks. In Section I, preliminary design of the scaled physical model using the dimensional similarity approach has been presented. Scaled experiments on the current physical model have been designed to investigate the effect of Bond and capillary numbers on GAGD oil recovery. An experimental plan to study the effect of spreading coefficient and reservoir heterogeneity has been presented. Results from the GAGD experiments to study the effect of operating mode, Bond number and capillary number on GAGD oil recovery have been reported. These experiments suggest that the type of the gas does not affect the performance of GAGD in immiscible mode. The cumulative oil recovery has been observed to vary exponentially with Bond and capillary numbers, for the experiments presented in this report. A predictive model using the bundle of capillary tube approach has been developed to predict the performance of free gravity drainage process. In Section II, a mechanistic Parachor model has been proposed for improved prediction of IFT as well as to characterize the mass transfer effects for miscibility development in reservoir crude oil-solvent systems. Sensitivity studies on model results indicate that provision of a single IFT measurement in the proposed model is sufficient for reasonable IFT predictions.
An attempt has been made to correlate the exponent (n) in the mechanistic model with normalized solute compositions present in both fluid phases. IFT measurements were carried out in a standard ternary liquid system of benzene, ethanol and water using drop shape analysis and capillary rise techniques. The experimental results indicate strong correlation among the three thermodynamic properties: solubility, miscibility and IFT. The miscibility determined from IFT measurements for this ternary liquid system is in good agreement with phase diagram and solubility data, which clearly indicates the sound conceptual basis of the VIT technique to determine fluid-fluid miscibility. Model fluid systems have been identified for VIT experimentation at elevated pressures and temperatures. Section III comprises the experimental study aimed at evaluating the multiphase displacement characteristics of the various gas injection EOR process performances using Berea sandstone cores. During this reporting period, extensive literature review was completed to: (1) study the gravity drainage concepts, (2) identify the various factors influencing gravity stable gas injection processes, (3) identify various multiphase mechanisms and fluid dynamics operative during the GAGD process, and (4) identify important dimensionless groups governing the GAGD process performance. Furthermore, the dimensional analysis of the GAGD process, using Buckingham-Pi theorem to isolate the various dimensionless groups, as well as experimental design based on these dimensionless quantities have been completed in this reporting period. On the experimental front, recommendations from previous WAG and CGI have been used to modify the experimental protocol. This report also includes results from scaled preliminary GAGD displacements as well as the details of the planned GAGD corefloods for the next quarter.
The technology transfer activities have mainly consisted of preparing technical papers, progress reports and discussions with industry personnel for possible GAGD field tests.
Robichaud, Guillaume; Dixon, R. Brent; Potturi, Amarnatha S.; Cassidy, Dan; Edwards, Jack R.; Sohn, Alex; Dow, Thomas A.; Muddiman, David C.
2010-01-01
Through a multi-disciplinary approach, the air amplifier is being evolved as a highly engineered device to improve detection limits of biomolecules when using electrospray ionization. Several key aspects have driven the modifications to the device through experimentation and simulations. We have developed a computer simulation that accurately portrays actual conditions, and the results from these simulations are corroborated by the experimental data. These computer simulations can be used to predict outcomes from future designs, resulting in a design process that is efficient in terms of financial cost and time. We have fabricated a new device with annular gap control over a range of 50 to 70 μm using piezoelectric actuators. This has enabled us to obtain better aerodynamic performance when compared to the previous design (2× more vacuum) and also more reproducible results. This is allowing us to study a broader experimental space than the previous design, which is critical in guiding future directions. This work also presents and explains the principles behind a fractional factorial design of experiments methodology for testing a large number of experimental parameters in an orderly and efficient manner, to understand and optimize the critical parameters that lead to improved detection limits while minimizing the number of experiments performed. Preliminary results showed that severalfold improvements could be obtained under certain operating conditions (up to 34-fold). PMID:21499524
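The fractional factorial methodology described in this abstract can be sketched generically; the two-level design, factor count, and generator below are illustrative assumptions, not the authors' actual experimental plan:

```python
from itertools import product

def fractional_factorial(k):
    """Two-level 2^(k-1) fractional factorial design in coded units.

    The first k-1 factors form a full factorial over (-1, +1); the last
    factor is aliased with the product of the others (defining relation
    I = ABC...), halving the number of runs versus a full factorial.
    """
    runs = []
    for base in product([-1, 1], repeat=k - 1):
        generated = 1
        for level in base:
            generated *= level
        runs.append(tuple(base) + (generated,))
    return runs

design = fractional_factorial(4)  # 8 runs instead of 16 for 4 factors
```

Each run is a tuple of coded factor levels; the aliasing trades the ability to separate certain interactions for half the experimental effort.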
Kerr, Kathleen F; Serikawa, Kyle A; Wei, Caimiao; Peters, Mette A; Bumgarner, Roger E
2007-01-01
The reference design is a practical and popular choice for microarray studies using two-color platforms. In the reference design, the reference RNA uses half of all array resources, leading investigators to ask: What is the best reference RNA? We propose a novel method for evaluating reference RNAs and present the results of an experiment that was specially designed to evaluate three common choices of reference RNA. We found no compelling evidence in favor of any particular reference. In particular, a commercial reference showed no advantage in our data. Our experimental design also enabled a new way to test the effectiveness of pre-processing methods for two-color arrays. Our results favor using intensity normalization and foregoing background subtraction. Finally, we evaluate the sensitivity and specificity of data quality filters, and we propose a new filter that can be applied to any experimental design and does not rely on replicate hybridizations.
Experimental Sea Slicks in the Marsen (Maritime Remote Sensing) Exercise.
1980-10-30
Experimental slicks with various surface properties were generated in the North Sea as part of the MARSEN (Maritime Remote Sensing) exercise. The one...with remote sensing instrumentation. Because of the numerous effects of surface films on air-sea interfacial processes, these experiments were designed...information was obtained on the influence of sea surface films on the interpretation of signals received by remote sensing systems. Criteria for the
NASA Astrophysics Data System (ADS)
Reza, M.; Ibrahim, M.; Rahayu, Y. S.
2018-01-01
This research aims to develop problem-based learning oriented teaching materials to improve students’ mastery of concepts and critical thinking skills. Its procedure was divided into two phases: a developmental phase and an experimental phase. This developmental research used the Four-D model; however, the development process did not involve the last stage, disseminate. The teaching and learning materials developed consist of a lesson plan, student handbook, student worksheet, achievement test and critical thinking skill test. The experimental phase employed a one-group pretest-posttest design. Results show that the validity of the teaching materials developed was good, and reveal enhanced student activity and positive responses to the teaching and learning process. Furthermore, the learning materials improve the students’ mastery of concepts and critical thinking skills.
Project Hope: changing care delivery for the substance abuse patient.
Swenson-Britt, E; Carrougher, G; Martin, B W; Brackley, M
2000-03-01
Project Hope is a program designed to assist healthcare providers in the assessment, care, referral, and follow-up of the hospitalized substance abuse patient. First implemented in 1990 at what is now called University Hospital in San Antonio, Texas, the program has influenced care in a positive way through change in the attitude and knowledge of personnel, administrators, and community. In this paper, the authors provide an overview of the approaches utilized, the improvement process, and the outcomes obtained from this project. To formally evaluate the effectiveness of Project Hope, a quasi-experimental study with a Solomon four-group design was conducted. Eighty nurses from various educational backgrounds and experience with alcohol were divided into groups by nursing unit. A normative-reeducative intervention was applied as described by Chin and Benne. Tests of cognition showed significant change (p < .01) in the experimental group; no significant change was found for attitudes. Reasons for these findings and lessons learned from the process are described.
Gaussian process surrogates for failure detection: A Bayesian experimental design approach
NASA Astrophysics Data System (ADS)
Wang, Hongqiao; Lin, Guang; Li, Jinglai
2016-05-01
An important task of uncertainty quantification is to identify the probability of undesired events, in particular, system failures, caused by various sources of uncertainties. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation in which the underlying computer models are extremely expensive, and in this setting, determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inferences of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method are demonstrated by both academic and practical examples.
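As a rough illustration of the surrogate idea (not the paper's adaptive limit-state design method), a minimal Gaussian process regressor can stand in for an expensive limit-state function inside a Monte Carlo failure-probability estimate; the toy limit-state function, kernel length scale, and uniform training grid are assumptions:

```python
import numpy as np

def gp_posterior_mean(x_train, y_train, x_test, length=0.2, jitter=1e-6):
    """Posterior mean of a zero-mean GP with an RBF kernel."""
    def kern(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length) ** 2)
    K = kern(x_train, x_train) + jitter * np.eye(len(x_train))
    return kern(x_test, x_train) @ np.linalg.solve(K, y_train)

# Hypothetical expensive limit-state function; "failure" means g(x) < 0.
g = lambda x: np.cos(3.0 * x) - 0.2

x_train = np.linspace(0.0, 1.0, 12)   # a handful of "expensive" runs
y_train = g(x_train)

# Cheap surrogate evaluations over many Monte Carlo samples.
x_mc = np.random.default_rng(0).uniform(0.0, 1.0, 100_000)
p_fail = float(np.mean(gp_posterior_mean(x_train, y_train, x_mc) < 0.0))
```

The paper's contribution lies in choosing the training points adaptively near the failure boundary; the uniform grid here is only a placeholder for that design step.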
Development of a new continuous process for mixing of complex non-Newtonian fluids
NASA Astrophysics Data System (ADS)
Migliozzi, Simona; Mazzei, Luca; Sochon, Bob; Angeli, Panagiota; Thames Multiphase Team; Coral Project Collaboration
2017-11-01
Design of new continuous mixing operations poses many challenges, especially when dealing with highly viscous non-Newtonian fluids. Knowledge of complex rheological behaviour of the working mixture is crucial for development of an efficient process. In this work, we investigate the mixing performance of two different static mixers and the effects of the mixture rheology on the manufacturing of novel non-aqueous-based oral care products using experimental and computational fluid dynamic methods. The two liquid phases employed, i.e. a carbomer suspension in polyethylene glycol and glycerol, start to form a gel when they mix. We studied the structure evolution of the liquid mixture using time-resolved rheometry and we obtained viscosity rheograms at different phase ratios from pressure drop measurements in a customized mini-channel. The numerical results and rheological model were validated with experimental measurements carried out in a specifically designed setup. EPSRS-CORAL.
Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.
Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B
2012-09-15
Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.
Linguistic attention control: attention shifting governed by grammaticized elements of language.
Taube-Schiff, Marlene; Segalowitz, Norman
2005-05-01
In 2 experiments, the authors investigated attention control for tasks involving the processing of grammaticized linguistic stimuli (function words) contextualized in sentence fragments. Attention control was operationalized as shift costs obtained with adult speakers of English in an alternating-runs experimental design (R. D. Rogers & S. Monsell, 1995). Experiment 1 yielded significant attention shift costs between tasks involving judgments about the meanings of grammatical function words. The authors used a 3-stage experimental design (G. Wylie & A. Allport, 2000), and the emerging pattern of results implicated task set reconfiguration, not task set inertia, in these shift costs. Experiment 2 further demonstrated that shift costs were lower when the tasks involved shared attentional resources (processing the same grammatical dimension) versus unshared resources (different grammatical dimensions). The authors discuss the results from a cognitive linguistic perspective and consider their implications for the view that language itself can serve a special attention-directing function.
Metrological aspects of enzyme production
NASA Astrophysics Data System (ADS)
Kerber, T. M.; Dellamora-Ortiz, G. M.; Pereira-Meirelles, F. V.
2010-05-01
Enzymes are frequently used in biotechnology to carry out specific biological reactions, either in industrial processes or for the production of bioproducts and drugs. Microbial lipases are an important group of biotechnologically valuable enzymes that have widely diversified applications. Lipase production by microorganisms is described in several published papers; however, none of them refer to metrological evaluation and the estimation of the uncertainty in measurement. Moreover, few of them refer to process optimization through experimental design. The objectives of this work were to enhance lipase production in shaken flasks with Yarrowia lipolytica cells employing experimental design, and to evaluate the uncertainty in measurement of lipase activity. The highest lipolytic activity obtained was about three- and fivefold higher than the reported activities of CRMs BCR-693 and BCR-694, respectively. Lipase production by Y. lipolytica cells aiming at classification as a certified reference material is recommended after further purification and stability studies.
Experimental development of processes to produce homogenized alloys of immiscible metals, phase 3
NASA Technical Reports Server (NTRS)
Reger, J. L.
1976-01-01
An experimental drop tower package was designed and built for use in a drop tower. This effort consisted of a thermal analysis, container/heater fabrication, and assembly of an expulsion device for rapid quenching of heated specimens during low gravity conditions. Six gallium-bismuth specimens with compositions in the immiscibility region (50 a/o of each element) were processed in the experimental package: four during low gravity conditions and two under a one gravity environment. One of the one gravity processed specimens did not have telemetry data and was subsequently excluded from analysis since the processing conditions were not known. Metallurgical, Hall effect, resistivity, and superconductivity examinations were performed on the five specimens. Examination of the specimens showed that the gallium was dispersed in the bismuth. The low gravity processed specimens showed a relatively uniform distribution of gallium, with particle sizes of 1 micrometer or less, in contrast to the one gravity control specimen. Comparison of the cooling rates of the dropped specimens versus microstructure indicated that low cooling rates are more desirable.
NASA Astrophysics Data System (ADS)
Li, N.; Mohamed, M. S.; Cai, J.; Lin, J.; Balint, D.; Dean, T. A.
2011-05-01
Formability of steel and aluminium alloys in hot stamping and cold die quenching processes is studied in this research. Viscoplastic-damage constitutive equations are developed and determined from experimental data for the prediction of viscoplastic flow and ductility of the materials. The determined unified constitutive equations are then implemented into the commercial Finite Element code Abaqus/Explicit via a user defined subroutine, VUMAT. An FE process simulation model and numerical procedures are established for the modeling of hot stamping processes for a spherical part with a central hole. Different failure modes (failure takes place either near the central hole or in the mid span of the part) are obtained. To validate the simulation results, a test programme is developed, a test die set has been designed and manufactured, and tests have been carried out for the materials with different forming rates. It has been found that very close agreement between experimental and numerical process simulation results is obtained for the ranges of temperatures and forming rates investigated.
Clima, Lilia; Ursu, Elena L; Cojocaru, Corneliu; Rotaru, Alexandru; Barboiu, Mihail; Pinteala, Mariana
2015-09-28
The complexes formed by DNA and polycations have received great attention owing to their potential application in gene therapy. In this study, the binding efficiency between double-stranded oligonucleotides (dsDNA) and branched polyethylenimine (B-PEI) has been quantified by processing of the images captured from the gel electrophoresis assays. The central composite experimental design has been employed to investigate the effects of controllable factors on the binding efficiency. On the basis of experimental data and the response surface methodology, a multivariate regression model has been constructed and statistically validated. The model has enabled us to predict the binding efficiency depending on experimental factors, such as concentrations of dsDNA and B-PEI as well as the initial pH of solution. The optimization of the binding process has been performed using simplex and gradient methods. The optimal conditions determined for polyplex formation have yielded a maximal binding efficiency close to 100%. In order to reveal the mechanism of complex formation at the atomic-scale, a molecular dynamic simulation has been carried out. According to the computation results, B-PEI amine hydrogen atoms have interacted with oxygen atoms from dsDNA phosphate groups. These interactions have led to the formation of hydrogen bonds between macromolecules, stabilizing the polyplex structure.
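The central composite design and response-surface regression used in this study can be sketched generically; the two-factor face-centered design and the toy quadratic response below are illustrative, not the study's dsDNA/B-PEI data:

```python
import numpy as np
from itertools import product

def central_composite(k):
    """Face-centered central composite design in coded units for k factors:
    2^k factorial corners, 2k axial points, and one center point."""
    corners = [tuple(map(float, p)) for p in product([-1, 1], repeat=k)]
    axial = []
    for i in range(k):
        for s in (-1.0, 1.0):
            pt = [0.0] * k
            pt[i] = s
            axial.append(tuple(pt))
    return np.array(corners + axial + [(0.0,) * k])

def fit_quadratic(X, y):
    """Least-squares fit of the full quadratic response surface
    y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

X = central_composite(2)  # 9 runs for 2 factors
y = 5 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1] + 3 * X[:, 0] ** 2
coef = fit_quadratic(X, y)  # recovers [5, 2, -1, 0.5, 3, 0]
```

With the fitted surface in hand, optimal factor settings can then be searched numerically, as the study does with simplex and gradient methods.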
Shao, Jing-Yuan; Qu, Hai-Bin; Gong, Xing-Chu
2018-05-01
In this work, two algorithms for design space calculation (the overlapping method and the probability-based method) were compared using data collected from the extraction process of Codonopsis Radix as an example. In the probability-based method, experimental error is simulated to calculate the probability of reaching the standard. The effects of several parameters on the calculated design space were studied, including the number of simulations, the step length, and the acceptable probability threshold. For the extraction process of Codonopsis Radix, 10,000 simulations and a calculation step length of 0.02 lead to a satisfactory design space. In general, the overlapping method is easy to understand and can be realized by several kinds of commercial software without writing programs, but it does not indicate the reliability of the process evaluation indexes when operating within the design space. The probability-based method is computationally more complex, but it ensures that the process indexes reach the standard within the acceptable probability threshold, and it produces no abrupt probability change at the edge of the design space. Therefore, the probability-based method is recommended for design space calculation. Copyright© by the Chinese Pharmaceutical Association.
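The probability-based calculation described here can be sketched as a Monte Carlo scan over candidate operating conditions; the regression model, error magnitude, specification, and grids below are hypothetical stand-ins for the fitted extraction-process model:

```python
import numpy as np

rng = np.random.default_rng(42)

def passing_probability(temp, time, n_sim=10_000, sd=0.5, spec=8.0):
    """Probability that the response meets the specification once
    simulated experimental error is added to the model prediction."""
    predicted = 2.0 + 0.05 * temp + 0.8 * time      # toy fitted model
    noisy = predicted + rng.normal(0.0, sd, n_sim)  # simulated error
    return float(np.mean(noisy >= spec))

# Keep only grid points whose probability of reaching the standard
# exceeds the acceptable threshold; these points form the design space.
threshold = 0.9
design_space = [
    (T, t)
    for T in np.arange(60, 101, 2)        # temperature grid, step 2
    for t in np.arange(1.0, 5.01, 0.25)   # extraction-time grid, step 0.25
    if passing_probability(T, t) >= threshold
]
```

Unlike the overlapping method, each retained point carries an explicit probability of meeting the standard, so reliability does not change abruptly at the boundary of the accepted region.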
NASA Astrophysics Data System (ADS)
Katin, Viktor; Kosygin, Vladimir; Akhtiamov, Midkhat
2017-10-01
This paper substantiates a method of mathematical planning for experimental research in the process of selecting the most efficient types of burner for tubular refinery furnaces of vertical-cylindrical design. It considers in detail an experimental plan of the 4×4 Latin square type for studying the impact of three factors, each varied over four levels. On the basis of the experimental research, we have developed practical recommendations on the use of optimal burners for two-step fuel combustion.
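A Latin square like the 4×4 plan mentioned above can be constructed cyclically; this is the generic construction, not the authors' specific randomized layout:

```python
def latin_square(n):
    """Cyclic n x n Latin square: treatment (r + c) mod n appears
    exactly once in every row and every column."""
    return [[(r + c) % n for c in range(n)] for r in range(n)]

# Rows and columns absorb two blocking factors; the cell symbol assigns
# the third (treatment) factor, so three 4-level factors fit in 16 runs
# instead of the 64 a full factorial would require.
square = latin_square(4)
```

In practice the rows, columns, and symbols would each be randomized before assigning real factor levels.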
NASA Technical Reports Server (NTRS)
Knox, J.; Fulda, P.; Howard, D.; Ritter, J.; Levan, M.
2007-01-01
The design and testing of a vacuum-swing adsorption process to remove metabolic water and carbon dioxide gases from NASA's Orion crew exploration vehicle atmosphere is presented. For the Orion spacecraft, the sorbent-based atmosphere revitalization (SBAR) system must remove all metabolic water, a technology approach that has not been used in previous spacecraft life support systems. Design and testing of a prototype SBAR in sub-scale and full-scale configurations is discussed. Experimental and analytical investigations of dual-ended and single-ended vacuum desorption are presented. An experimental investigation of thermal linking between adsorbing and desorbing columns is also presented.
Design and experimental investigations on a small scale traveling wave thermoacoustic engine
NASA Astrophysics Data System (ADS)
Chen, M.; Ju, Y. L.
2013-02-01
A small scale traveling wave or Stirling thermoacoustic engine with a resonator of only 1 m length was designed, constructed and tested by using nitrogen as working gas. The small heat engine achieved a steady working frequency of 45 Hz. The pressure ratio reached 1.189, with an average charge pressure of 0.53 MPa and a heating power of 1.14 kW. The temperature and the pressure characteristics during the onset and damping processes were also observed and discussed. The experimental results demonstrated that the small engine possessed the potential to drive a Stirling-type pulse tube cryocooler.
Quiet Clean Short-haul Experimental Engine (QCSEE). Composite fan frame subsystem test report
NASA Technical Reports Server (NTRS)
Stotler, C. L., Jr.; Bowden, J. H.
1977-01-01
The element and subcomponent testing conducted to verify the composite fan frame design of two experimental high bypass geared turbofan engines and propulsion systems for short haul passenger aircraft is described. Emphasis is placed on the propulsion technology required for future externally blown flap aircraft with engines located both under the wing and over the wing, including technology in composite structures and digital engine controls. The element tests confirmed that the processes used in the frame design would produce the predicted mechanical properties. The subcomponent tests verified that the detail structural components of the frame had adequate structural integrity.
A mobile robots experimental environment with event-based wireless communication.
Guinaldo, María; Fábregas, Ernesto; Farias, Gonzalo; Dormido-Canto, Sebastián; Chaos, Dictino; Sánchez, José; Dormido, Sebastián
2013-07-22
An experimental platform to communicate between a set of mobile robots through a wireless network has been developed. The mobile robots get their position through a camera which performs as sensor. The video images are processed in a PC and a Waspmote card sends the corresponding position to each robot using the ZigBee standard. A distributed control algorithm based on event-triggered communications has been designed and implemented to bring the robots into the desired formation. Each robot communicates to its neighbors only at event times. Furthermore, a simulation tool has been developed to design and perform experiments with the system. An example of usage is presented.
NASA Astrophysics Data System (ADS)
Ashat, Ali; Pratama, Heru Berian
2017-12-01
Successful assessment of the size of the Ciwidey-Patuha geothermal field required integrated analysis of data from all aspects to determine the optimum capacity to be installed. Resource assessment involves significant uncertainty in subsurface information and multiple development scenarios for the field. Therefore, this paper applies an experimental design approach to geothermal numerical simulation of Ciwidey-Patuha to generate a probabilistic resource assessment. This process assesses the impact of the evaluated parameters on resources and the interactions between these parameters. The methodology successfully estimated the maximum resources with a polynomial function covering the entire range of possible values of the important reservoir parameters.
NASA Technical Reports Server (NTRS)
Hale, Mark A.
1996-01-01
Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure: with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. 
An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed which supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process comprising the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure was developed, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment). This represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and lets a designer maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies. The development of this infrastructure leads to a number of significant intellectual contributions: 1) a new approach to implementing IPPD with the aid of a computer; 2) a formal Design Experiment; 3) a combined Process and Specification architecture that is language-based; 4) an infrastructure for exploring design; 5) an integration strategy for implementing computer resources; and 6) a seamless modeling language. The need for these contributions is underscored by the demand from industry and government agencies for the development of these technologies.
Verma, Arjun; Fratto, Brian E.; Privman, Vladimir; Katz, Evgeny
2016-01-01
We consider flow systems that have been utilized for small-scale biomolecular computing and digital signal processing in binary-operating biosensors. Signal measurement is optimized by designing a flow-reversal cuvette and analyzing the experimental data to theoretically extract the pulse shape, as well as reveal the level of noise it possesses. Noise reduction is then carried out numerically. We conclude that this can be accomplished physically via the addition of properly designed well-mixing flow-reversal cell(s) as an integral part of the flow system. This approach should enable improved networking capabilities and potentially not only digital but analog signal-processing in such systems. Possible applications in complex biocomputing networks and various sense-and-act systems are discussed. PMID:27399702
Antonelli, Raissa; de Araújo, Karla Santos; Pires, Ricardo Francisco; Fornazari, Ana Luiza de Toledo; Granato, Ana Claudia; Malpass, Geoffroy Roger Pointer
2017-10-28
The present paper presents (1) the optimization of electrochemical free-chlorine production using an experimental design approach, and (2) the application of the optimum conditions obtained to the photo-assisted electrochemical degradation of simulated textile effluent. In the experimental design, the influence of inter-electrode gap, pH, NaCl concentration and current was considered. It was observed that all four variables studied are significant for the process, with NaCl concentration and current being the most significant for free chlorine production. The maximum free chlorine production was obtained at a current of 2.33 A and an NaCl concentration of 0.96 mol dm(-3). The application of the optimized conditions with simultaneous UV irradiation resulted in up to 83.1% Total Organic Carbon removal and 100% colour removal over 180 min of electrolysis. The results indicate that a systematic (statistical) approach to the electrochemical treatment of pollutants can save time and reagents.
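A two-level screening design over the four factors named above can be generated mechanically. The factor ranges below are illustrative placeholders, not the values actually tested in the paper:

```python
from itertools import product

# Hypothetical low/high levels for the four factors studied (illustrative only).
factors = {
    "gap_mm":       (5.0, 20.0),
    "pH":           (3.0, 9.0),
    "NaCl_mol_dm3": (0.1, 1.0),
    "current_A":    (0.5, 2.5),
}

# Full 2^4 factorial: every combination of low/high levels (16 runs).
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 16 runs
for run in runs[:2]:
    print(run)
```

Running all 16 combinations (rather than varying one factor at a time) is what lets the analysis separate main effects from interactions between the variables.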
Quiroga-Campano, Ana L; Panoskaltsis, Nicki; Mantalaris, Athanasios
2018-03-02
Demand for high-value biologics, a rapidly growing pipeline, and pressure from competition, time-to-market and regulators, necessitate novel biomanufacturing approaches, including Quality by Design (QbD) principles and Process Analytical Technologies (PAT), to facilitate accelerated, efficient and effective process development platforms that ensure consistent product quality and reduced lot-to-lot variability. Herein, QbD and PAT principles were incorporated within an innovative in vitro-in silico integrated framework for upstream process development (UPD). The central component of the UPD framework is a mathematical model that predicts dynamic nutrient uptake and average intracellular ATP content, based on biochemical reaction networks, to quantify and characterize energy metabolism and its adaptive response, metabolic shifts, to maintain ATP homeostasis. The accuracy and flexibility of the model depends on critical cell type/product/clone-specific parameters, which are experimentally estimated. The integrated in vitro-in silico platform and the model's predictive capacity reduced burden, time and expense of experimentation resulting in optimal medium design compared to commercially available culture media (80% amino acid reduction) and a fed-batch feeding strategy that increased productivity by 129%. The framework represents a flexible and efficient tool that transforms, improves and accelerates conventional process development in biomanufacturing with wide applications, including stem cell-based therapies. Copyright © 2018. Published by Elsevier Inc.
El-Naggar, Noura El-Ahmady; El-Shweihy, Nancy M; El-Ewasy, Sara M
2016-09-20
Due to the broad range of clinical and industrial applications of cholesterol oxidase, the isolation and screening of bacterial strains producing an extracellular form of cholesterol oxidase is of great importance. One hundred and thirty actinomycete isolates were screened for cholesterol oxidase activity. Among them, strain NEAE-42 displayed the highest extracellular cholesterol oxidase activity; it was selected and identified as Streptomyces cavourensis strain NEAE-42. The optimization of different process parameters for cholesterol oxidase production by Streptomyces cavourensis strain NEAE-42 was carried out using a Plackett-Burman experimental design and response surface methodology. Fifteen variables were screened using the Plackett-Burman experimental design; cholesterol, initial pH and (NH4)2SO4 were the most significant positive independent variables affecting cholesterol oxidase production. A central composite design was chosen to elucidate the optimal concentrations of the selected process variables for cholesterol oxidase production. Cholesterol oxidase production by Streptomyces cavourensis strain NEAE-42 after the optimization process was 20.521 U/mL, which is higher than the result obtained with the basal medium before Plackett-Burman screening (3.31 U/mL), a 6.19-fold increase. The cholesterol oxidase production level obtained in this study (20.521 U/mL) by the statistical method is higher than many of the reported values.
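A Plackett-Burman matrix of the kind used above to screen many variables in few runs can be built from a classic cyclic generator. The 12-run construction below is the generic textbook one, not the specific design matrix of this study (which screened fifteen variables and would use a larger, e.g. 20-run, design):

```python
import numpy as np

# Classic 12-run Plackett-Burman generator row (+1/-1 coding). Cyclically
# shifting it 11 times and appending an all -1 row yields the design matrix.
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
rows = [np.roll(gen, i) for i in range(11)]
rows.append(-np.ones(11, dtype=int))
D = np.array(rows)

# The design is orthogonal: columns are balanced and mutually uncorrelated,
# so 11 main effects can be screened in only 12 runs.
print(D.shape)          # (12, 11)
print((D.T @ D)[0, 1])  # off-diagonal inner products are 0
```

Main effects are then estimated exactly as in a factorial design, by contrasting mean responses at the +1 and -1 levels of each column.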
Das, Anup Kumar; Mandal, Vivekananda; Mandal, Subhash C
2014-01-01
Extraction forms the first and most basic step in research on natural products for drug discovery, and a poorly optimised and planned extraction methodology can jeopardise the entire mission. The objective was to provide a vivid picture of different chemometric tools and planning for process optimisation and method development in the extraction of botanical material, with emphasis on microwave-assisted extraction (MAE). A review of studies involving the application of chemometric tools in combination with MAE of botanical materials was undertaken in order to discover the significant extraction factors. To optimise a response by fine-tuning those factors, experimental design or statistical design of experiments (DoE), a core area of study in chemometrics, was then used for statistical analysis and interpretation. In this review a brief explanation of the different aspects and methodologies related to MAE of botanical materials that were subjected to experimental design, along with some general chemometric tools and the steps involved in the practice of MAE, is presented. A detailed study of the various factors and responses involved in the optimisation is also presented. This article will assist in obtaining a better insight into the chemometric strategies of process optimisation and method development, which will in turn improve the decision-making process in selecting influential extraction parameters. Copyright © 2013 John Wiley & Sons, Ltd.
Activating schoolyards: study design of a quasi-experimental schoolyard intervention study.
Andersen, Henriette Bondo; Pawlowski, Charlotte Skau; Scheller, Hanne Bebendorf; Troelsen, Jens; Toftager, Mette; Schipperijn, Jasper
2015-05-31
The aim of the Activating Schoolyards Study is to develop, implement, document and assess a comprehensive schoolyard intervention to promote physical activity (PA) during school recess for primary school children (grade 4-8). The intervention is designed to implement organizational and structural changes in the physical environment. The study builds on a quasi-experimental study design using a mixed method approach including: 1) an exploratory study aimed at providing input for the developing process; 2) an evaluation of the effect of the interventions using a combination of accelerometer, GPS and GIS; 3) a process evaluation facilitating the intervention development process and identifying barriers and facilitators in the implementation process; 4) a post-intervention end-user evaluation aimed at exploring who uses the schoolyards and how the schoolyards are used. The seven project schools (cases) were selected by means of an open competition and the interventions were developed using a participatory bottom-up approach. The participatory approach and case selection strategy make the study design novel. The use of a mixed methods design including qualitative as well as quantitative methods can be seen as a strength, as the different types of data complement each other and results of one part of the study informed the following parts. A unique aspect of our study is the use of accelerometers in combination with GPS and GIS in the effect evaluation to objectively determine where and how active the students are in the schoolyard, before and after the intervention. This provides a type of data that, to our knowledge, has not been used before in schoolyard interventions. Exploring the change in behavior in relation to specific intervention elements in the schoolyard will lead to recommendations for schools undergoing schoolyard renovations at some point in the future.
Process engineering and scale-up of autotrophic Clostridium strain P11 syngas fermentation
NASA Astrophysics Data System (ADS)
Kundiyana, Dimple Kumar Aiyanna
Scope and Method of Study. Biomass gasification followed by fermentation of syngas to ethanol is a potential process for producing bioenergy. The process is currently being researched at laboratory and pilot scale in an effort to optimize the process conditions and make the process feasible for commercial production of ethanol and other biofuels such as butanol and propanol. The broad objectives of the research were to improve ethanol yields during syngas fermentation and to design an economical fermentation process. The research included four statistically designed experimental studies in serum bottles and bench-scale and pilot-scale fermentors: to screen alternate fermentation media components, to determine the effect of process parameters such as pH, temperature and buffer on syngas fermentation, to determine the effect of key limiting nutrients of the acetyl-CoA pathway in a continuous series reactor design, and to scale up the syngas fermentation in a 100-L pilot-scale fermentor. Findings and Conclusions. The first experimental study identified cotton seed extract (CSE) as a feasible medium for Clostridium strain P11 fermentation. The study showed that CSE at 0.5 g L-1 can potentially replace all the standard Clostridium strain P11 fermentation media components, while using a media buffer did not significantly improve ethanol production when used in fermentation with CSE. Scale-up of the CSE fermentation in 2-L and 5-L stirred tank fermentors showed a 25% increase in ethanol yield. The second experimental study showed that syngas fermentation at 32°C without buffer was associated with higher ethanol concentration and reduced lag time in switching to solventogenesis. Conducting fermentation at 40°C, or lowering the incubation pH to 5.0, resulted in reduced cell growth and no production of ethanol or acetic acid. The third experiment studied the effect of three limiting nutrients, calcium pantothenate, vitamin B12 and CoCl2, on syngas fermentation.
Results indicated that it is possible to modulate the product formation by limiting key nutrients of acetyl-CoA pathway and using a continuous fermentation in two-stage fermentor design to improve ethanol yields. The last experimental study was conducted to commission a pilot scale fermentor, and subsequently scale-up the Clostridium strain P11 fermentation from a bench-scale to a pilot scale 100-L fermentor. Results indicated a six-fold improvement in ethanol concentration (25.3 g L-1 at the end of 59 d) compared to previous Clostridium strain P11 and Clostridium carboxidivorans fermentations plus the formation of other compounds such as isopropyl alcohol, acetic acid and butanol, which are of commercial importance.
Experimental design of a twin-column countercurrent gradient purification process.
Steinebach, Fabian; Ulmer, Nicole; Decker, Lara; Aumann, Lars; Morbidelli, Massimo
2017-04-07
As is typical for separation processes, single-unit batch chromatography exhibits a trade-off between purity and yield. The twin-column MCSGP (multi-column countercurrent solvent gradient purification) process allows alleviating such trade-offs, particularly in the case of difficult separations. In this work an efficient and reliable procedure for the design of the twin-column MCSGP process is developed. It is based on a single batch chromatogram, which is selected as the design chromatogram. The derived MCSGP operation is not intended to provide optimal performance; rather, it provides the target product as in the selected fraction of the batch chromatogram, but with higher yield. The design procedure is illustrated for the isolation of the main charge isoform of a monoclonal antibody from Protein A eluate with ion-exchange chromatography. The main charge isoform was obtained at a purity and yield greater than 90%. At the same time, process-related impurities such as HCP and leached Protein A, as well as aggregates, were at least equally well removed. Additionally, the impact of several design parameters on the process performance in terms of purity, yield, productivity and buffer consumption is discussed. The obtained results can be used for further fine-tuning of the process parameters so as to improve performance. Copyright © 2017 Elsevier B.V. All rights reserved.
Finckenor, M; Byrd-Bredbenner, C
2000-03-01
To develop and evaluate the long-term effectiveness of an intervention program, based on preaction-stage-oriented change processes of the Transtheoretical Model of Behavior Change, that could be delivered in a group setting to help participants lower dietary fat intake. An enhanced version of the nonequivalent control group experimental design was used. Entire sections of an undergraduate introductory nutrition science course were assigned to an experimental, pretest/posttest control, or posttest-only control group. Daily fat intake and stage of change of the experimental and pretest/posttest control groups were determined at the pretest and posttest and 1-year later at a follow-up test. Every 1 to 2 weeks during the study, stage of change of the experimental group was assessed. Daily fat intake of the experimental group was assessed at study midpoint. Daily fat intake and stage of change of the posttest-only control group was determined at the posttest. Pretest results were used to place participants of the experimental and pretest/posttest control groups in either the preaction stage (i.e., precontemplation, contemplation, or preparation) or the action/maintenance stage. The sample consisted of 38, 30, and 42 undergraduate students who were assigned to the experimental, pretest/posttest control, and posttest-only control groups, respectively. The experimental group participated in a group-based, dietary fat intake intervention that included a series of 11 lessons taught over a 14-week period. Each lesson was based on 1 or 2 of the preaction-stage-oriented change processes of the Transtheoretical Model. Data were evaluated to determine the effects of the intervention program on long-term dietary fat reduction and stage of change progression. Analysis of variance, repeated-measures analysis of variance, and paired t tests. For pretest and posttest dietary fat intake scores, stage and time were significant, and there was a significant time-by-stage interaction. 
Time was significant for pretest and posttest stage scores. Subjects in the preaction-stage experimental group significantly increased their mean stage of change and reduced their fat intake between the pretest and posttest; these changes persisted for 1 year. Pretest/posttest control group participants who began in a preaction stage also significantly increased their mean stage and reduced fat intake by the posttest, but these changes did not endure until the follow-up test. This intervention program produced an enduring, significant reduction in mean dietary fat consumption and a significant progression in mean stage of change of subjects in the experimental group who were in the preaction stage. It may be appropriate to design group interventions to use preaction stage processes rather than the more traditionally used action and maintenance stages change processes.
Study into penetration speed during laser cutting of brain tissues.
Yilbas, Z; Sami, M; Patiroglu, T
1998-01-01
The applications of CO2 continuous-wave lasers in neurosurgery have become important in recent years. Theoretical considerations of laser applicability in medicine are subsequently confirmed experimentally. To obtain precise operation in the laser cutting process, further theoretical developments and experimental studies need to be conducted. Consequently, in the present study, the heat transfer mechanism taking place during laser-tissue interaction is introduced using Fourier theory. The results obtained from the theoretical model are compared with the experimental results. In connection with this, an experiment is designed to measure the penetration speed during the laser cutting process. The measurement is carried out using an optical method. It is found that the penetration speeds obtained from theory and experiment are in good agreement.
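The order of magnitude of such a penetration speed can be estimated from a simple 1-D energy balance in which the absorbed laser irradiance heats and vaporizes water-like tissue. Every number below is an illustrative assumption, not a measured tissue property or a result from the paper:

```python
import math

# Assumed beam parameters (illustrative, not from the study).
P = 20.0                        # laser power, W
r = 0.5e-3                      # beam radius, m
I = P / (math.pi * r ** 2)      # irradiance, W/m^2

# Water-like tissue properties (illustrative round numbers).
rho = 1000.0    # density, kg/m^3
c = 4186.0      # specific heat, J/(kg K)
dT = 63.0       # heating from 37 C to 100 C, K
Lv = 2.26e6     # latent heat of vaporization, J/kg

# Steady ablation front speed: absorbed flux / (energy to heat + vaporize
# a unit volume). Conduction losses are neglected in this crude balance.
v = I / (rho * (c * dT + Lv))   # penetration speed, m/s
print(round(v * 1000, 2))       # ~10 mm/s for these assumed values
```

A Fourier-conduction model of the kind used in the paper would refine this by accounting for heat diffusing ahead of the cutting front, which lowers the effective speed.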
Roosta, M; Ghaedi, M; Shokri, N; Daneshfar, A; Sahraei, R; Asghari, A
2014-01-24
The present study applied experimental design optimization to the removal of malachite green (MG) from aqueous solution by ultrasound-assisted adsorption onto gold nanoparticles loaded on activated carbon (Au-NP-AC). This nanomaterial was characterized using different techniques such as FESEM, TEM, BET and UV-vis measurements. The effects of variables such as pH, initial dye concentration, adsorbent dosage (g), temperature and sonication time on MG removal were studied using a central composite design (CCD), and the optimum experimental conditions were found with a desirability function (DF) combined with response surface methodology (RSM). Fitting the experimental equilibrium data to various isotherm models such as the Langmuir, Freundlich, Tempkin and Dubinin-Radushkevich models shows the suitability and applicability of the Langmuir model. The applicability of kinetic models such as the pseudo-first-order, pseudo-second-order, Elovich and intraparticle diffusion models was tested on the experimental data; the pseudo-second-order equation and the intraparticle diffusion model control the kinetics of the adsorption process. A small amount of the proposed adsorbent (0.015 g) is applicable for the successful removal of MG (RE > 99%) in a short time (4.4 min) with high adsorption capacity (140-172 mg g(-1)). Copyright © 2013. Published by Elsevier B.V.
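Testing equilibrium data against the Langmuir model, as done above, is commonly carried out via the linearized form Ce/qe = 1/(KL*qmax) + Ce/qmax, so a straight-line fit recovers the two parameters. The sketch below fits synthetic data with made-up parameter values rather than the study's measurements:

```python
import numpy as np

# Synthetic equilibrium data (Ce in mg/L, qe in mg/g) generated from a
# Langmuir isotherm with assumed qmax = 160 mg/g and KL = 0.5 L/mg.
Ce = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0])
qmax_true, KL_true = 160.0, 0.5
qe = qmax_true * KL_true * Ce / (1 + KL_true * Ce)

# Linearized Langmuir: Ce/qe = Ce/qmax + 1/(KL*qmax), so a linear fit of
# Ce/qe against Ce gives qmax from the slope and KL from slope/intercept.
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qmax = 1 / slope
KL = slope / intercept
print(round(qmax, 1), round(KL, 2))  # recovers 160.0 0.5
```

With real, noisy data one would also compare the goodness of fit (e.g. R^2) against the linearized Freundlich and other isotherms, which is how the suitability of the Langmuir model is judged.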
Life on rock. Scaling down biological weathering in a new experimental design at Biosphere-2
NASA Astrophysics Data System (ADS)
Zaharescu, D. G.; Dontsova, K.; Burghelea, C. I.; Chorover, J.; Maier, R.; Perdrial, J. N.
2012-12-01
Biological colonization and weathering of bedrock on Earth is a major driver of landscape and ecosystem development, its effects reaching into other major systems such as climate and the geochemical cycles of elements. In order to understand how microbe-plant-mycorrhizae communities interact with bedrock in the first phases of mineral weathering, we developed a novel experimental design in the Desert Biome at Biosphere-2, University of Arizona (U.S.A.). This presentation will focus on the development of the experimental setup. Briefly, six enclosed modules were designed to hold 288 experimental columns that will accommodate 4 rock types and 6 biological treatments. Each module is developed on 3 levels. A lower volume, able to withstand the weight of both the rock material and the rest of the structure, accommodates the sampling elements. A middle volume houses the experimental columns in a dark chamber. A clear upper section forms the habitat exposed to sunlight. This volume is completely sealed from the exterior, allowing complete control of its air and water parameters. All modules are connected in parallel to a double air-purification system that delivers a permanent air flow. This setup is expected to provide a model experiment able to test important processes in the rock-life interaction at grain-to-molecular scale.
Experimental results of active control on a large structure to suppress vibration
NASA Technical Reports Server (NTRS)
Dunn, H. J.
1991-01-01
Three design methods, Linear Quadratic Gaussian with Loop Transfer Recovery (LQG/LTR), H-infinity, and mu-synthesis, are used to obtain compensators for suppressing the vibrations of a 10-bay vertical truss structure, a component typical of what may be used to build a large space structure. For the design process the plant dynamic characteristics of the structure were determined experimentally using an identification method. The resulting compensators were implemented on a digital computer and tested for their ability to suppress the first bending mode response of the 10-bay vertical truss. Time histories of the measured motion are presented, and modal damping obtained during the experiments are compared with analytical predictions. The advantages and disadvantages of using the various design methods are discussed.
Cognitive search model and a new query paradigm
NASA Astrophysics Data System (ADS)
Xu, Zhonghui
2001-06-01
This paper proposes a cognitive model in which people begin to search for pictures using semantic content and find the right picture by judging whether its visual content is a proper visualization of the semantics desired. The essential point is that human search is not just a process of matching computation on visual features but rather a process of visualization of the known semantic content. For people to search electronic images the way they do manually in the model, we suggest that querying be a semantics-driven process, like design. A query-by-design paradigm is proposed in the sense that what you design is what you find. Unlike query-by-example, query-by-design allows users to specify the semantic content through an iterative and incremental interaction process, so that a retrieval can start with association and identification of the given semantic content and be refined as further visual cues become available. An experimental image retrieval system, Kuafu, has been under development using the query-by-design paradigm, and an iconic language is adopted.
Teżyk, Michał; Jakubowska, Emilia; Milanowski, Bartłomiej; Lulek, Janina
2017-10-01
The aim of this study was to optimize the tablet compression process and identify the film-coating critical process parameters (CPPs) affecting critical quality attributes (CQAs) using a quality by design (QbD) approach. Design of experiments (DOE) and regression methods were employed to investigate the hardness, disintegration time and thickness of uncoated tablets as functions of slugging and tableting compression force (CPPs). A Plackett-Burman experimental design was applied to identify, among the selected coating process parameters (drying and preheating time, atomization air pressure, spray rate, air volume, inlet air temperature, and drum pressure), those that may influence the hardness and disintegration time of coated tablets. As a result of the research, a design space was established to facilitate an in-depth understanding of the relationship between CPPs and CQAs of the intermediate product (uncoated tablets). Screening revealed that spray rate and inlet air temperature are the two most important factors affecting the hardness of coated tablets. At the same time, none of the tested coating factors influenced disintegration time. The observation was confirmed by film coating of pilot-size batches.
Zakrzewska-Koltuniewicz, Grażyna; Herdzik-Koniecko, Irena; Cojocaru, Corneliu; Chajduk, Ewelina
2014-06-30
The paper deals with the experimental design and optimization of the leaching process of uranium and associated metals from low-grade Polish ores. The chemical elements of interest for extraction from the ore were U, La, V, Mo, Yb and Th. Sulphuric acid was used as the leaching reagent. Based on the design of experiments, second-order regression models were constructed to approximate the leaching efficiency of the elements. Graphical illustrations using 3-D surface plots were employed to identify the main, quadratic and interaction effects of the factors. A multi-objective optimization method based on the desirability approach was applied in this study. The optimum conditions were determined as P = 5 bar, T = 120 °C and t = 90 min. Under these optimal conditions, the overall extraction performance is 81.43% (U), 64.24% (La), 98.38% (V), 43.69% (Yb), 76.89% (Mo) and 97.00% (Th). Copyright © 2014 Elsevier B.V. All rights reserved.
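The desirability approach mentioned above maps each response onto [0, 1] and combines the individual desirabilities with a geometric mean, which is then maximized over the factor space. The sketch below reuses the reported extraction percentages, but the acceptance ranges (lo/hi) are arbitrary illustrative choices, not the targets used in the study:

```python
import numpy as np

def desirability(y, lo, hi):
    """Linear larger-is-better desirability: 0 at/below lo, 1 at/above hi."""
    return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))

# Leaching efficiencies (%) at the optimum, from the abstract; the 20-100%
# acceptance range applied to every element is an invented illustration.
yields = {"U": 81.43, "La": 64.24, "V": 98.38, "Yb": 43.69, "Mo": 76.89, "Th": 97.00}
d = [desirability(v, lo=20.0, hi=100.0) for v in yields.values()]

# Overall desirability = geometric mean; one poor response drags it down.
D = float(np.prod(d) ** (1 / len(d)))
print(round(D, 3))
```

In the full method, D is evaluated on the fitted second-order regression models and the factor settings (here P, T, t) are varied to maximize it, giving a single compromise optimum across all six metals.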
NASA Technical Reports Server (NTRS)
Keen, Jill M.; Evans, Kurt B.; Schiffman, Robert L.; Deweese, C. Darrell; Prince, Michael E.
1995-01-01
Experimental design testing was conducted to identify critical parameters of an aqueous spray process intended for cleaning solid rocket motor metal components (steel and aluminum). A two-level, six-parameter, fractional factorial matrix was constructed and executed for two cleaners, Brulin 815 GD and Diversey Jettacin. The matrix parameters included cleaner temperature and concentration, wash density, wash pressure, rinse pressure, and dishwasher type. Other spray parameters (nozzle stand-off, rinse water temperature, wash and rinse time, dry conditions, and type of rinse water (deionized)) were held constant. Matrix response testing utilized discriminating bond specimens (fracture energy and tensile adhesion strength) which represent critical production bond lines. Overall, Jettacin spray cleaning was insensitive to the range of conditions tested for all parameters and exhibited bond strengths significantly above the TCA test baseline for all bond lines tested. Brulin 815 GD was sensitive to cleaning temperature, but produced bond strengths above the TCA test baseline even at the lower temperatures. Ultimately, the experimental design database was used to recommend process parameter settings for future aqueous spray cleaning characterization work.
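A two-level fractional factorial of the kind used here trades run count for aliasing by generating some factor columns as products of others. The sketch below shows a generic 2^(6-2) construction with assumed generators E = ABC and F = BCD; the actual matrix and resolution used in this test program are not given in the abstract:

```python
from itertools import product

# Base factors A-D take all low/high combinations; E and F are derived
# from the (assumed) defining relations E = ABC and F = BCD.
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c
    f = b * c * d
    runs.append((a, b, c, d, e, f))

# 16 runs cover six two-level factors, versus 64 for the full 2^6 design.
print(len(runs))  # 16
```

The price of the reduction is that some interactions are aliased with other effects; the choice of generators determines which, and screening studies accept this in exchange for far fewer experiments.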
Factors that influence the tribocharging of pulverulent materials in compressed-air devices
NASA Astrophysics Data System (ADS)
Das, S.; Medles, K.; Mihalcioiu, A.; Beleca, R.; Dragan, C.; Dascalescu, L.
2008-12-01
Tribocharging of pulverulent materials in compressed-air devices is a typical multi-factorial process. This paper aims at demonstrating the value of using the design-of-experiments methodology in association with virtual instrumentation for quantifying the effects of various process variables and of their interactions, as a prerequisite for the development of new tribocharging devices for industrial applications. The study is focused on the tribocharging of PVC powders in compressed-air devices similar to those employed in electrostatic painting. A classical 2^3 full-factorial design (3 factors at two levels) was employed for conducting the experiments. The response function was the charge/mass ratio of the material collected in a modified Faraday cage at the exit of the tribocharging device. The charge/mass ratio was found to increase with the injection pressure and the vortex pressure in the tribocharging device, and to decrease with increasing feed rate. In the present study an in-house design-of-experiments software package was employed for statistical analysis of the experimental data and validation of the experimental model.
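Main effects in a two-level full factorial are estimated by contrasting the mean response at the high and low level of each factor. The response function below is a made-up linear stand-in for the measured charge/mass ratio, chosen only so that the signs match the trends reported above:

```python
from itertools import product

# 2^3 full factorial in coded -1/+1 levels:
# factors = (injection pressure, vortex pressure, feed rate).
runs = list(product((-1, 1), repeat=3))

def response(inj, vortex, feed):
    """Hypothetical charge/mass ratio: rises with both pressures, falls with feed rate."""
    return 1.0 + 0.4 * inj + 0.25 * vortex - 0.3 * feed

y = [response(*r) for r in runs]

# Effect of factor j = mean response at +1 minus mean response at -1.
effects = []
for j in range(3):
    hi = sum(yi for r, yi in zip(runs, y) if r[j] == 1) / 4
    lo = sum(yi for r, yi in zip(runs, y) if r[j] == -1) / 4
    effects.append(hi - lo)

print([round(e, 2) for e in effects])  # [0.8, 0.5, -0.6]
```

For a purely linear response each effect is exactly twice the corresponding coded-model coefficient; interaction effects are computed the same way using products of columns.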
Finite element simulation and Experimental verification of Incremental Sheet metal Forming
NASA Astrophysics Data System (ADS)
Kaushik Yanamundra, Krishna; Karthikeyan, R., Dr.; Naranje, Vishal, Dr
2018-04-01
Incremental sheet metal forming is now a proven manufacturing technique that can be employed to obtain application-specific, customized, symmetric or asymmetric shapes required by the automobile or biomedical industries, such as car body parts, dental implants or knee implants. Finite element simulation of the metal forming process is performed successfully using the explicit dynamics analysis of commercial FE software. The simulation is mainly useful for optimization of the process as well as design of the final product. This paper focuses on simulating the incremental sheet metal forming process in ABAQUS and validating the results using experimental methods. The shapes generated for testing are trapezoid, dome and elliptical shapes, whose G-codes are written and fed into a CNC milling machine with an attached forming tool having a hemispherical bottom. The same pre-generated coordinates are used to simulate similar machining conditions in ABAQUS, and the tool forces, stresses and strains in the workpiece during machining are obtained as output data. The forces were recorded experimentally using a dynamometer. The experimental and simulated results were then compared, and conclusions were drawn.
NASA Astrophysics Data System (ADS)
Taschuk, M. T.; Tucker, R. T.; LaForge, J. M.; Beaudry, A. L.; Kupsta, M. R.; Brett, M. J.
2013-12-01
The vapour-liquid-solid glancing angle deposition (VLS-GLAD) process is capable of producing complex nanotree structures with control over azimuthal branch orientation and height. We have developed a thin film growth simulation including ballistic deposition, simplified surface diffusion, and droplet-mediated cubic crystal growth for the VLS-GLAD process using the Unreal(TM) Development Kit. The use of a commercial game engine has provided an interactive environment while allowing a custom physics implementation. Our simulation's output is verified against experimental data, including a volumetric film reconstruction produced using focused ion beam and scanning electron microscopy (SEM), crystallographic texture, and morphological characteristics such as branch orientation. We achieve excellent morphological and texture agreement with experimental data, as well as qualitative agreement with SEM imagery. The simplified physics in our model reproduces the experimental films, indicating the dominant role flux geometry plays in the VLS-GLAD competitive growth process responsible for azimuthally oriented branches and biaxial crystal texture evolution. The simulation's successful reproduction of experimental data indicates that it should have predictive power in designing novel VLS-GLAD structures.
True and Quasi-Experimental Designs. ERIC/AE Digest.
ERIC Educational Resources Information Center
Gribbons, Barry; Herman, Joan
Among the different types of experimental design are two general categories: true experimental designs and quasi-experimental designs. True experimental designs include more than one purposively created group, common measured outcomes, and random assignment. Quasi-experimental designs are commonly used when random assignment is not practical or…
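The defining contrast the digest draws — random assignment versus intact groups — can be made concrete with a small sketch (illustrative only; participant labels and the seed are invented):

```python
import random

def randomly_assign(participants, n_groups=2, seed=7):
    """Randomly assign participants to groups -- the defining feature
    of a true experimental design, as opposed to the intact or
    self-selected groups typical of a quasi-experiment."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    # Deal the shuffled list round-robin into n_groups groups.
    return {g: shuffled[g::n_groups] for g in range(n_groups)}

groups = randomly_assign([f"P{i:02d}" for i in range(20)])
```

With 20 participants and two groups, each group receives 10 members chosen without regard to any pre-existing characteristic, which is what licenses causal inference in the true experimental case.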
De Beer, T R M; Allesø, M; Goethals, F; Coppens, A; Heyden, Y Vander; De Diego, H Lopez; Rantanen, J; Verpoort, F; Vervaet, C; Remon, J P; Baeyens, W R G
2007-11-01
The aim of the present study was to propose a strategy for the implementation of a Process Analytical Technology system in freeze-drying processes. Mannitol solutions, some supplemented with NaCl, were used as models to freeze-dry. Noninvasive, in-line Raman measurements were performed continuously during lyophilization of the solutions to monitor in real time the mannitol solid state, the end points of the different process steps (freezing, primary drying, secondary drying), and physical phenomena occurring during the process. At-line near-infrared (NIR) and X-ray powder diffractometry (XRPD) measurements were done to confirm the Raman conclusions and to obtain additional information. The spectra collected during the processes were analyzed using principal component analysis and multivariate curve resolution. A two-level full factorial design was used to study the influence of process (freezing rate) and formulation variables (concentration of mannitol, concentration of NaCl, volume of freeze-dried sample) upon freeze-drying. Raman spectroscopy was able to monitor (i) the mannitol solid state (amorphous, alpha, beta, delta, and hemihydrate), (ii) several process step end points (end of mannitol crystallization during freezing, primary drying), and (iii) physical phenomena occurring during freeze-drying (onset of ice nucleation, onset of mannitol crystallization during the freezing step, onset of ice sublimation). NIR proved to be a more sensitive tool than Raman spectroscopy for monitoring sublimation, while XRPD helped to identify the mannitol hemihydrate in the samples. The experimental design results showed that several process and formulation variables significantly influence different aspects of lyophilization and that both are interrelated.
Raman spectroscopy (in-line) and NIR spectroscopy and XRPD (at-line) not only allowed the real-time monitoring of mannitol freeze-drying processes but also helped (in combination with experimental design) us to understand the process.
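The two-level full factorial design mentioned above is straightforward to enumerate: with k factors coded -1 (low) and +1 (high), the design has 2**k runs. A minimal sketch (the factor names are only our shorthand for the four variables in the abstract):

```python
from itertools import product

def full_factorial_2level(factors):
    """Enumerate all runs of a two-level full factorial design.
    Each factor is coded -1 (low) / +1 (high), giving 2**k runs
    for k factors."""
    runs = []
    for levels in product([-1, +1], repeat=len(factors)):
        runs.append(dict(zip(factors, levels)))
    return runs

# Hypothetical coding of the four variables studied in the abstract:
# one process variable and three formulation variables.
design = full_factorial_2level(
    ["freezing_rate", "mannitol_conc", "NaCl_conc", "fill_volume"])
```

Four factors yield 16 runs, which is why two-level full factorials stay practical only for a handful of variables; screening designs (fractional factorial, Plackett-Burman) trade resolution for fewer runs.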
Supersonic, nonlinear, attached-flow wing design for high lift with experimental validation
NASA Technical Reports Server (NTRS)
Pittman, J. L.; Miller, D. S.; Mason, W. H.
1984-01-01
Results of the experimental validation are presented for the three-dimensional cambered wing which was designed to achieve attached supercritical cross flow for lifting conditions typical of supersonic maneuver. The design point was a lift coefficient of 0.4 at Mach 1.62 and 12 deg angle of attack. Results from the nonlinear full potential method are presented to show the validity of the design process, along with results from linear theory codes. Longitudinal force and moment data and static pressure data were obtained in the Langley Unitary Plan Wind Tunnel at Mach numbers of 1.58, 1.62, 1.66, 1.70, and 2.00 over an angle of attack range of 0 to 14 deg at a Reynolds number of 2.0 × 10^6 per foot. Oil flow photographs of the upper surface were obtained at M = 1.62 for alpha approx. = 8, 10, 12, and 14 deg.
Liu, Guohai; Yang, Junqin; Chen, Ming; Chen, Qian
2014-01-01
A fault-tolerant permanent-magnet vernier (FT-PMV) machine is designed for direct-drive applications, incorporating the merits of high torque density and high reliability. Based on the so-called magnetic gearing effect, PMV machines have the ability of high torque density by introducing the flux-modulation poles (FMPs). This paper investigates the fault-tolerant characteristic of PMV machines and provides a design method, which is able to not only meet the fault-tolerant requirements but also keep the ability of high torque density. The operation principle of the proposed machine has been analyzed. The design process and optimization are presented specifically, such as the combination of slots and poles, the winding distribution, and the dimensions of PMs and teeth. By using the time-stepping finite element method (TS-FEM), the machine performances are evaluated. Finally, the FT-PMV machine is manufactured, and the experimental results are presented to validate the theoretical analysis.
High Level Analysis, Design and Validation of Distributed Mobile Systems with
NASA Astrophysics Data System (ADS)
Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.
System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested across a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.
Designing and Implementing a Constructivist Chemistry Laboratory Program.
ERIC Educational Resources Information Center
Blakely, Alan
2000-01-01
Describes a constructivist chemistry laboratory approach based on students' personal experiences where students had the opportunity to develop their own experimental processes. Points out both the fruitfulness and difficulties of using a graduate student as a teaching assistant. (YDS)
Study on processing immiscible materials in zero gravity
NASA Technical Reports Server (NTRS)
Reger, J. L.; Mendelson, R. A.
1975-01-01
An experimental investigation was conducted to evaluate the mixing of immiscible metal combinations under several process conditions. Under one gravity, these included thermal processing, thermal plus electromagnetic mixing, and thermal plus acoustic mixing. The same process methods were applied during free fall at the MSFC drop tower facility. The design of the drop tower apparatus providing the electromagnetic and acoustic mixing equipment is included, and a thermal model was prepared to design the specimen and cooling procedure. The materials systems studied were Ca-La, Cd-Ga and Al-Bi; evaluation of the processed samples included morphology and electronic property measurements. The morphology was examined using optical and scanning electron microscopy and microprobe analyses. The superconducting transition temperatures were characterized using an impedance-change tuned-coil method.
Comparison of optimization algorithms for the slow shot phase in HPDC
NASA Astrophysics Data System (ADS)
Frings, Markus; Berkels, Benjamin; Behr, Marek; Elgeti, Stefanie
2018-05-01
High-pressure die casting (HPDC) is a popular manufacturing process for aluminum processing. The slow shot phase is the first phase of this process; during it, the molten metal is pushed towards the cavity under moderate plunger movement. The so-called shot curve describes this plunger movement, and a good design of the shot curve is important for producing high-quality cast parts. Three partially competing process goals characterize the slow shot phase: (1) reducing air entrapment, (2) avoiding temperature loss, and (3) minimizing the oxide caused by air-aluminum contact. Due to the rough process conditions, with high pressure and temperature, it is hard to design the shot curve experimentally. A few design rules exist that are based on theoretical considerations; nevertheless, the quality of the shot curve design still depends on the experience of the machine operator. To improve the shot curve, it is natural to use numerical optimization. This work compares different optimization strategies for the slow shot phase. The aim is to find the best optimization approach on a simple test problem.
Experimental design and statistical methods for improved hit detection in high-throughput screening.
Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert
2010-09-01
Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
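The bias-removal step can be sketched as a trimmed-mean polish. The code below is a simplified, pure-Python illustration of the idea of iteratively subtracting robust row and column effects from a plate of raw signals; it is not the authors' exact implementation, and the plate data are synthetic:

```python
def trimmed_mean(values, trim=0.1):
    """Mean after dropping the lowest and highest `trim` fraction."""
    v = sorted(values)
    k = int(len(v) * trim)
    core = v[k:len(v) - k] if len(v) > 2 * k else v
    return sum(core) / len(core)

def polish(plate, trim=0.1, iterations=5):
    """Remove additive row and column biases from a plate of raw HTS
    signals by repeatedly subtracting trimmed-mean row and column
    effects, leaving residuals in which hits can be benchmarked."""
    data = [row[:] for row in plate]
    for _ in range(iterations):
        for r, row in enumerate(data):
            effect = trimmed_mean(row, trim)
            data[r] = [x - effect for x in row]
        for c in range(len(data[0])):
            effect = trimmed_mean([row[c] for row in data], trim)
            for row in data:
                row[c] -= effect
    return data

# A plate that is pure additive row + column bias polishes to zero.
plate = [[10 + r + c for c in range(6)] for r in range(4)]
residuals = polish(plate)
```

On real data the residuals retain the biological signal plus random error, which the replicate-based statistical models described above then benchmark against chance.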
Applied Integrated Design in Composite UAV Development
NASA Astrophysics Data System (ADS)
Vasić, Zoran; Maksimović, Stevan; Georgijević, Dragutin
2018-04-01
This paper presents a modern approach to the integrated development of an Unmanned Aerial Vehicle (UAV) made of laminated composite materials, from conceptual design through detail design, strength and stiffness analyses, definition and management of design and production data, detailed test results and other activities related to the development of laminated composite structures, with their main particularities in comparison to metal structures. Special attention is focused on the management of product data during the life cycle of a UAV and on experimental tests of its composite wing. Experience shows that automating the management of product data during the life cycle, as well as the manufacturing processes, is inevitable if a company wants to obtain cheaper, high-quality composite aircraft structures. One of the most effective ways of managing product data today is Product Lifecycle Management (PLM). In terms of PLM, a spectrum of special measures and provisions has to be implemented when defining fiber-reinforced composite structures in comparison to designing with metals, as elaborated in the paper.
Validation of the FEA of a deep drawing process with additional force transmission
NASA Astrophysics Data System (ADS)
Behrens, B.-A.; Bouguecha, A.; Bonk, C.; Grbic, N.; Vucetic, M.
2017-10-01
In order to meet requirements of the automotive industry, such as decreasing CO2 emissions by reducing vehicle mass in the car body, the chassis and the powertrain, continuous innovation and further development of existing production processes are required. In sheet metal forming processes, the process limits and component characteristics are defined by the process-specific loads. When the load limits are exceeded, failure occurs in the material; this can be avoided by an additional force transmission activated in the deep drawing process before the process limit is reached. This contribution deals with experimental investigations of a forming process with additional force transmission aimed at extending the process limits. Based on FEA, a tool system is designed and developed by IFUM. For this purpose, the steel material HCT600 is analyzed numerically. Within the experimental investigations, deep drawing processes with and without the additional force transmission are carried out, and the produced rectangular cups are compared. Subsequently, the identical deep drawing processes are investigated numerically. The values of the punch reaction force and displacement are estimated and compared with the experimental results; thus, the validation of the material model is successfully carried out at the process scale. For further quantitative verification of the FEA results, the experimentally determined geometry of the rectangular cup is measured optically with the ATOS system from GOM mbH and digitally compared using the external software Geomagic® Qualify™. The goal of this paper is to verify the transferability of the FEA model for a conventional deep drawing process to a deep drawing process with additional force transmission with a counter punch.
Optimization of turning process through the analytic flank wear modelling
NASA Astrophysics Data System (ADS)
Del Prete, A.; Franchi, R.; De Lorenzis, D.
2018-05-01
In the present work, the approach used to optimize the process capabilities for the machining of Oil&Gas components is described. These components are machined by turning stainless steel castings. For this purpose, a proper Design Of Experiments (DOE) plan was designed and executed; as output of the experimentation, data about tool wear were collected. The DOE was designed starting from the cutting speed and feed values recommended by the tool manufacturer; the depth of cut was kept constant. Wear data were obtained by observing the tool flank wear under an optical microscope, with data acquisition carried out at regular intervals of working time. Through statistical data and regression analysis, analytical models of the flank wear and the tool life were obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, under a constraint on a defined flank wear level. The technique used to solve the optimization problem is Multi Objective Particle Swarm Optimization (MOPS). The optimization results, validated by a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.
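The structure of the trade-off — production time versus tool consumption, coupled through a tool-life model — can be sketched as follows. All coefficients here are invented for illustration (not the fitted values from the paper), and a brute-force weighted-sum scan stands in for the particle-swarm optimizer the authors actually use:

```python
def tool_life_minutes(speed, feed):
    """Hypothetical regression model of tool life (min) as a function
    of cutting speed (m/min) and feed (mm/rev). Coefficients are
    illustrative only."""
    return 4.0e5 / (speed ** 1.8 * feed ** 0.6)

def evaluate(speed, feed, cut_length_mm=1.2e6):
    """Rough proxy: machining time falls with speed*feed, while the
    number of tools consumed is time divided by tool life."""
    machining_time = cut_length_mm / (speed * 1000.0 * feed)  # min
    tools_used = machining_time / tool_life_minutes(speed, feed)
    return machining_time, tools_used

# Scan the feasible window, minimising a weighted sum of the two
# competing objectives (a stand-in for the MOPS search).
best = None
for speed in range(150, 301, 10):
    for feed in (0.15, 0.20, 0.25, 0.30):
        t, n = evaluate(speed, feed)
        score = t + 50.0 * n  # weight on tool consumption is arbitrary
        if best is None or score < best[0]:
            best = (score, speed, feed)
```

Higher speed and feed cut the machining time but wear tools faster; the weighted score (or, in the paper, the Pareto front found by MOPS) balances the two under the flank wear constraint.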
Nasri Nasrabadi, Mohammad Reza; Razavi, Seyed Hadi
2010-04-01
In this work, we applied statistical experimental design to a fed-batch process for optimization of tricarboxylic acid (TCA) cycle intermediates in order to achieve high-level production of canthaxanthin from Dietzia natronolimnaea HS-1 cultured in beet molasses. A fractional factorial design (screening test) was first conducted on five TCA cycle intermediates. Of the five TCA cycle intermediates investigated via the screening tests, alpha-ketoglutarate, oxaloacetate and succinate were selected based on their statistically significant (P < 0.05) and positive effects on canthaxanthin production. These significant factors were optimized by means of response surface methodology (RSM) in order to achieve high-level production of canthaxanthin. The experimental results of the RSM were fitted with a second-order polynomial equation by means of a multiple regression technique to identify the relationship between canthaxanthin production and the three TCA cycle intermediates. By means of this statistical design under a fed-batch process, the optimum conditions required to achieve the highest level of canthaxanthin (13,172 ± 25 µg l(-1)) were determined as follows: alpha-ketoglutarate, 9.69 mM; oxaloacetate, 8.68 mM; succinate, 8.51 mM. Copyright 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
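The second-order polynomial fit at the heart of RSM reduces, in one variable, to ordinary least squares on the basis 1, x, x². A self-contained sketch with synthetic data (not the canthaxanthin model):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x**2 -- the one-factor
    analogue of the second-order RSM model in the abstract."""
    n = len(xs)
    s = lambda p: sum(x ** p for x in xs)
    # Normal equations X'X b = X'y for the design columns 1, x, x^2.
    A = [[n,    s(1), s(2)],
         [s(1), s(2), s(3)],
         [s(2), s(3), s(4)]]
    b = [sum(y * x ** p for x, y in zip(xs, ys)) for p in range(3)]
    # Gaussian elimination with partial pivoting, then back substitution.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * c for a, c in zip(A[r], A[col])]
            b[r] -= f * b[col]
    coeffs = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, 3))) / A[r][r]
    return coeffs  # b0, b1, b2

# Recover a known curve exactly: y = 2 + 3x - 0.5x^2 (no noise).
xs = [0, 1, 2, 3, 4, 5]
b0, b1, b2 = fit_quadratic(xs, [2 + 3 * x - 0.5 * x * x for x in xs])
```

The stationary point of the fitted curve, x* = -b1 / (2·b2), is the RSM optimum; with these toy coefficients x* = 3. In the multi-factor case the same idea gives the optimal intermediate concentrations reported above.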
ERIC Educational Resources Information Center
Terzian, Mary A.; Li, Jilan; Fraser, Mark W.; Day, Steven H.; Rose, Roderick A.
2015-01-01
This article describes the findings from an efficacy trial of a school-based, universal prevention program designed to reduce aggressive behavior by strengthening emotion regulation and social information-processing (SIP) skills. Three cohorts of third graders (N = 479) participated in this study. The first cohort participated in the Making…
Difference among Levels of Inquiry: Process Skills Improvement at Senior High School in Indonesia
ERIC Educational Resources Information Center
Hardianti, Tuti; Kuswanto, Heru
2017-01-01
The objective of the research concerned here was to discover the difference in effectiveness among Levels 2, 3, and 4 of inquiry learning in improving students' process skills. The research was a quasi-experimental study using the pretest-posttest non-equivalent control group research design. Three sample groups were selected by means of cluster…
ERIC Educational Resources Information Center
Durmaz, Hüsnüye
2016-01-01
The aim of this study is to investigate the effects of an instructional intervention on enhancing pre-service science teachers' (PSTs) science process skills (SPSs) and to identify problems in using SPSs through the Laboratory Applications in Science Education-I course (LASE-I). A one-group pretest-posttest pre-experimental design was employed. An…
Deep Processing of Long-Distance Dependencies in L2 English: The Case of Anaphora
ERIC Educational Resources Information Center
Wang, Yi-Ting
2012-01-01
Since the seminal work of Alan Juffs, second language (L2) sentence processing has become a central research topic in the field of L2 acquisition. Beyond a general hunger for new data offered by new experimental techniques and designs, intellectual concerns with system-wide understanding of first-language (L1)-L2 differences extended to the…
Experimental Designs in Sentence Processing Research: A Methodological Review and User's Guide
ERIC Educational Resources Information Center
Keating, Gregory D.; Jegerski, Jill
2015-01-01
Since the publication of Clahsen and Felser's (2006) keynote article on grammatical processing in language learners, the online study of sentence comprehension in adult second language (L2) learners has quickly grown into a vibrant and prolific subfield of SLA. As online methods begin to establish a foothold in SLA research, it is important…
ERIC Educational Resources Information Center
Davis, Wesley K.
This comparative study evaluated the writing growth of 97 college freshmen before and after instruction to determine whether a process-centered mode of teaching had a more significant impact on discourse coherence in composition than a traditional form-centered mode of instruction. The study used a pretest/posttest, quasi-experimental design with both…
ERIC Educational Resources Information Center
Özenç, Emine Gül
2016-01-01
The purpose of this study is to find out whether process oriented writing exercises/activities have any effect on the achievement and attitude of preservice teachers as well as to set forth the opinions of primary preservice teachers on process oriented writing approach. In the research one classroom was designated as experimental group (N = 35)…
Design and performance study of an orthopaedic surgery robotized module for automatic bone drilling.
Boiadjiev, George; Kastelov, Rumen; Boiadjiev, Tony; Kotev, Vladimir; Delchev, Kamen; Zagurski, Kazimir; Vitkov, Vladimir
2013-12-01
Many orthopaedic operations involve drilling and tapping before screws are inserted into a bone. This drilling is usually performed manually, which introduces many problems: attaining a specific drilling accuracy, preventing blood vessels from breaking, and minimizing drill oscillations that would widen the hole. Bone overheating is the most important problem. To avoid such problems and reduce the subjective factor, automated drilling is recommended. Because numerous parameters influence the drilling process, this study examined several experimental methods concerned with the identification of technical drilling parameters, including the bone resistance force and the temperature during drilling. During the drilling process, the following parameters were monitored: time, linear velocity, angular velocity, resistance force, penetration depth, and temperature. Specific drilling effects were revealed during the experiments. The accuracy was improved at the starting point of the drilling, and the error for the entire process was less than 0.2 mm. The temperature deviations were kept within tolerable limits. The results of various experiments with different drilling velocities, drill bit diameters, and penetration depths are presented in tables, as are the curves of the resistance force and temperature with respect to time. Real-time digital indications of the progress of the drilling process are shown. Automatic bone drilling could entirely solve the problems that usually arise during manual drilling. An experimental setup was designed to identify bone drilling parameters such as the resistance force arising from variable bone density, the appropriate mechanical drilling torque, the linear speed of the drill, and the electromechanical characteristics of the motors, drives, and corresponding controllers. Automatic drilling guarantees greater safety for the patient.
Moreover, the robot presented is user-friendly because it is simple to set robot tasks, and process data are collected in real time. Copyright © 2013 John Wiley & Sons, Ltd.
Wet scrubbing of biomass producer gas tars using vegetable oil
NASA Astrophysics Data System (ADS)
Bhoi, Prakashbhai Ramabhai
The overall aims of this research study were to generate novel design data and to develop an equilibrium-stage-based thermodynamic model of a vegetable oil based wet scrubbing system for the removal of model tar compounds (benzene, toluene and ethylbenzene) found in biomass producer gas. The specific objectives were to design, fabricate and evaluate a vegetable oil based wet scrubbing system and to optimize the design and operating variables, i.e., packed bed height, vegetable oil type, solvent temperature, and solvent flow rate. The experimental wet packed bed scrubbing system includes a liquid distributor specifically designed to distribute a highly viscous vegetable oil uniformly and a mixing section designed to generate a desired concentration of tar compounds in a simulated air stream. A gas chromatography/mass spectrometry method and calibration protocol were developed to quantify the tar compounds. Experimental data were analyzed statistically using the analysis of variance (ANOVA) procedure. Statistical analysis showed that both soybean and canola oils are potential solvents, providing comparable removal efficiency of tar compounds. The experimental height equivalent to a theoretical plate (HETP) was determined to be 0.11 m for the vegetable oil based scrubbing system. Packed bed height and solvent temperature had a highly significant (p < 0.05) effect on the removal of the model tar compounds. The packing-specific constants, Ch and CP,0, for the Billet and Schultes pressure drop correlation were determined as 2.52 and 2.93, respectively. The equilibrium-stage-based thermodynamic model predicted the removal efficiency of the model tar compounds within 1-6%, 1-4% and 1-2% of the experimental data for benzene, toluene and ethylbenzene, respectively, at a solvent temperature of 30 °C. The NRTL-PR property model and UNIFAC for estimating binary interaction parameters are recommended for modeling absorption of tar compounds in vegetable oils.
Bench scale experimental data from the wet scrubbing system would be useful in the design and operation of a pilot scale vegetable oil based system. The process model, validated using experimental data, would be a key design tool for the design and optimization of a pilot scale vegetable oil based system.
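The reported HETP converts directly into a theoretical-stage count, and an equilibrium-stage removal estimate then follows from the Kremser relation. In the sketch below the 0.11 m HETP is from the study, while the bed height and absorption factor are assumed values for illustration:

```python
def theoretical_stages(bed_height_m, hetp_m):
    """Number of equilibrium stages in a packed bed: N = Z / HETP."""
    return bed_height_m / hetp_m

def kremser_removal(absorption_factor, n_stages):
    """Kremser equation for the fraction of solute absorbed in an
    N-stage countercurrent column, with absorption factor A = L/(m*G)
    (solvent rate over equilibrium slope times gas rate)."""
    a, n = absorption_factor, n_stages
    if abs(a - 1.0) < 1e-12:
        return n / (n + 1.0)
    return (a ** (n + 1) - a) / (a ** (n + 1) - 1.0)

# Assumed 0.33 m bed with the reported HETP of 0.11 m: 3 stages.
n = theoretical_stages(0.33, 0.11)
frac_removed = kremser_removal(2.0, n)
```

With an assumed absorption factor of 2, three stages already absorb about 93% of the solute, which illustrates why packed bed height appeared as a significant variable in the ANOVA.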
TEMPUS: A facility for containerless electromagnetic processing onboard spacelab
NASA Technical Reports Server (NTRS)
Lenski, H.; Willnecker, R.
1990-01-01
The electromagnetic containerless processing facility TEMPUS was recently assigned to a flight on the IML-2 mission. In comparison to the TEMPUS facility already flown on a sounding rocket, several improvements had to be implemented, in particular related to safety, resource management, and the ability to process different samples with different requirements in a single mission. The basic design of the facility as well as the expected processing capabilities are presented. Two operational aspects turned out to strongly influence the facility design: control of sample motion (first experimental results indicate that crew or ground interaction will be necessary to minimize residual sample motion during processing); and exchange of RF coils (during processing in vacuum, evaporated sample material will condense on the cold surfaces and may force a coil exchange when a critical thickness is exceeded).
Semisupervised Gaussian Process for Automated Enzyme Search.
Mellor, Joseph; Grigoras, Ioana; Carbonell, Pablo; Faulon, Jean-Loup
2016-06-17
Synthetic biology is today harnessing the design of novel and greener biosynthesis routes for the production of added-value chemicals and natural products. The design of novel pathways often requires a detailed selection of enzyme sequences to import into the chassis at each of the reaction steps. To address such design requirements in an automated way, we present here a tool for exploring the space of enzymatic reactions. Given a reaction and an enzyme, the tool provides a probability estimate that the enzyme catalyzes the reaction. Our tool first considers the similarity of a reaction to known biochemical reactions with respect to signatures around their reaction centers. Signatures are defined based on chemical transformation rules by using extended connectivity fingerprint descriptors. A semisupervised Gaussian process model associated with the similar known reactions then provides the probability estimate. The Gaussian process model uses information about both the reaction and the enzyme in providing the estimate. These estimates were validated experimentally by the application of the Gaussian process model to a newly identified metabolite in Escherichia coli in order to search for the enzymes catalyzing its associated reactions. Furthermore, we show with several pathway design examples how such ability to assign probability estimates to enzymatic reactions provides the potential to assist in bioengineering applications, providing experimental validation to our proposed approach. To the best of our knowledge, the proposed approach is the first application of Gaussian processes dealing with biological sequences and chemicals; the use of a semisupervised Gaussian process framework is also novel in the context of machine learning applied to bioinformatics. However, the ability of an enzyme to catalyze a reaction depends on the affinity between the substrates of the reaction and the enzyme. This affinity is generally quantified by the Michaelis constant KM.
Therefore, we also demonstrate using Gaussian process regression to predict KM given a substrate-enzyme pair.
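The regression step can be illustrated with a toy, pure-Python Gaussian process. This is an analogue of the kind of model the authors use (e.g. predicting a quantity such as KM from a substrate-enzyme representation), not their implementation; the kernel, data, and noise level are all assumed:

```python
import math

def rbf(x1, x2, length=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination (A small and well-posed)."""
    n = len(A)
    A = [row[:] for row in A]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            A[r] = [x - f * y for x, y in zip(A[r], A[c])]
            b[r] -= f * b[c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c]
                           for c in range(r + 1, n))) / A[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-6):
    """GP regression posterior mean at x_star: k_*^T (K + noise*I)^-1 y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    k_star = [rbf(x, x_star) for x in xs]
    alpha = solve(K, ys[:])
    return sum(a * k for a, k in zip(alpha, k_star))

# With near-zero noise, predicting at a training input recovers its label.
xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]
mean_at_1 = gp_predict(xs, ys, 1.0)
```

The real model additionally exploits unlabeled reaction-enzyme pairs (the semisupervised part) and operates on fingerprint descriptors rather than scalars, but the posterior-mean algebra is the same.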
ERIC Educational Resources Information Center
Rees, Alan M; Schultz, Douglas G.
An empirical study of the nature and variability of the relevance judgment process was conducted from July 1, 1965 to September 30, 1967. Volume I of the final report presents a literature review and statement of the theoretical framework of the study, a discussion of the experimental design and a summary of data analyses. The study had two…
Diagnostic evaluations of microwave generated helium and nitrogen plasma mixtures
NASA Technical Reports Server (NTRS)
Haraburda, Scott S.; Hawley, Martin C.; Dinkel, Duane W.
1990-01-01
The goal of this work is to continue developing a fundamental understanding of plasma processes as applied to spacecraft propulsion. The diagnostic experiments used calorimetric, dimensional, and spectroscopic measurements with the TM 011 and TM 012 modes in the resonant cavity. These experimental techniques are highly important in furthering the understanding of plasma phenomena and in designing rocket thrusters. Several experimental results are included using nitrogen and helium gas mixtures.
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
These workbooks are part of a Mexican series of instructional materials designed for Spanish-speaking adults who are in the process of becoming literate or have recently become literate in their native language. They provide readings and exercises for developing literacy skills. Pictures and fill-in-the-blank exercises appear frequently. Volume 1…
ERIC Educational Resources Information Center
Liou, Wei-Kai; Bhagat, Kaushal Kumar; Chang, Chun-Yen
2018-01-01
The aim of this study is to design and implement a digital interactive globe system (DIGS), by integrating low-cost equipment to make DIGS cost-effective. DIGS includes a data processing unit, a wireless control unit, an image-capturing unit, a laser emission unit, and a three-dimensional hemispheric body-imaging screen. A quasi-experimental study…
ERIC Educational Resources Information Center
Balcombe, Jonathan P., Comp.
This paper lists 35 studies in biology which can be tailored to suit the full range of student age groups and are designed to involve most or all of the key elements of the scientific process (study design, data collection and presentation, and experimental manipulation). Examples of some studies are: (1) study the growth of molds on food items…
Emerson, Mitchell R; Gallagher, Ryan J; Marquis, Janet G; LeVine, Steven M
2009-01-01
Advancing the understanding of the mechanisms involved in the pathogenesis of multiple sclerosis (MS) likely will lead to new and better therapeutics. Although important information about the disease process has been obtained from research on pathologic specimens, peripheral blood lymphocytes and MRI studies, the elucidation of detailed mechanisms has progressed largely through investigations using animal models of MS. In addition, animal models serve as an important tool for the testing of putative interventions. The most commonly studied model of MS is experimental autoimmune encephalomyelitis (EAE). This model can be induced in a variety of species and by various means, but there has been concern that the model may not accurately reflect the disease process, and more importantly, it may give rise to erroneous findings when it is used to test possible therapeutics. Several reasons have been given to explain the shortcomings of this model as a useful testing platform, but one idea provides a framework for improving the value of this model, and thus, it deserves careful consideration. In particular, the idea asserts that EAE studies are inadequately designed to enable appropriate evaluation of putative therapeutics. Here we discuss problem areas within EAE study designs and provide suggestions for their improvement. This paper is principally directed at investigators new to the field of EAE, although experienced investigators may find useful suggestions herein. PMID:19389303
Methods for processing high-throughput RNA sequencing data.
Ares, Manuel
2014-11-03
High-throughput sequencing (HTS) methods for analyzing RNA populations (RNA-Seq) are gaining rapid application to many experimental situations. The steps in an RNA-Seq experiment require thought and planning, especially because the expense in time and materials is currently higher and the protocols are far less routine than those used for other high-throughput methods, such as microarrays. As always, good experimental design will make analysis and interpretation easier. Having a clear biological question, an idea about the best way to do the experiment, and an understanding of the number of replicates needed will make the entire process more satisfying. Whether the goal is capturing transcriptome complexity from a tissue or identifying small fragments of RNA cross-linked to a protein of interest, conversion of the RNA to cDNA followed by direct sequencing using the latest methods is a developing practice, with new technical modifications and applications appearing every day. Even more rapid are the development and improvement of methods for analysis of the very large amounts of data that arrive at the end of an RNA-Seq experiment, making considerations regarding reproducibility, validation, visualization, and interpretation increasingly important. This introduction is designed to review and emphasize a pathway of analysis from experimental design through data presentation that is likely to be successful, with the recognition that better methods are right around the corner. © 2014 Cold Spring Harbor Laboratory Press.
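One of the earliest analysis steps the introduction alludes to — making read counts comparable across replicates of different sequencing depth — can be sketched as counts-per-million normalization. This is an illustrative first step only (gene names and counts are invented; real pipelines add gene-length and composition corrections):

```python
def counts_per_million(counts):
    """Library-size normalisation (CPM): scale each sample's raw read
    counts by its total, times 1e6, so samples sequenced to different
    depths can be compared."""
    normed = {}
    for sample, genes in counts.items():
        total = sum(genes.values())
        normed[sample] = {g: c * 1e6 / total for g, c in genes.items()}
    return normed

# Two replicates with a twofold difference in sequencing depth.
raw = {"rep1": {"geneA": 120, "geneB": 880},
       "rep2": {"geneA": 300, "geneB": 1700}}
cpm = counts_per_million(raw)
```

After normalization, the residual between-replicate differences reflect biological and technical variability, which is exactly what the replicate-count and power considerations in the text are about.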
Zeeman, Heidi; Kendall, Elizabeth; Whitty, Jennifer A; Wright, Courtney J; Townsend, Clare; Smith, Dianne; Lakhani, Ali; Kennerley, Samantha
2016-03-15
Identifying the housing preferences of people with complex disabilities is a much-needed but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure that resources are used effectively and the needs of consumers with complex disabilities are properly met. This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, government housing services) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing Analytic Hierarchy Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision-making processes for inclusive housing development (e.g., risk, opportunity, cost, and benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested.
It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much-needed blueprint for market investment to develop viable, consumer-directed inclusive housing options for people with complex disabilities.
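The AHP methodology proposed above reduces stakeholders' pairwise judgments to a priority vector via the principal eigenvector of a comparison matrix, with a consistency check on the judgments. A minimal sketch follows; the three criteria and the comparison values are hypothetical illustrations, not the study's instrument:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three housing criteria
# (accessibility vs. location vs. cost) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
])

def ahp_priorities(matrix):
    """Return the AHP priority vector (normalized principal eigenvector)
    and the consistency ratio (CR < 0.1 is conventionally acceptable)."""
    vals, vecs = np.linalg.eig(matrix)
    k = np.argmax(vals.real)                 # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                             # normalize to sum to 1
    n = matrix.shape[0]
    ci = (vals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
    return w, ci / ri

w, cr = ahp_priorities(A)
```

With these illustrative judgments, accessibility receives the largest priority and the matrix is nearly consistent, so the CR falls well below the conventional 0.1 threshold.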
Castorena-Cortés, G; Roldán-Carrillo, T; Zapata-Peñasco, I; Reyes-Avila, J; Quej-Aké, L; Marín-Cruz, J; Olguín-Lora, P
2009-12-01
Microcosm assays and a Taguchi experimental design were used to assess the biodegradation of an oil sludge produced by a gas processing unit. The study showed that biodegradation of the sludge sample is feasible despite the high level of pollutants and the complexity of the sludge. The physicochemical and microbiological characterization of the sludge revealed a high concentration of hydrocarbons (334,766 ± 7,001 mg kg⁻¹ dry matter, d.m.) comprising a variety of compounds with between 6 and 73 carbon atoms in their structure, whereas the concentrations of Fe and sulfide were 60,000 mg kg⁻¹ d.m. and 26,800 mg kg⁻¹ d.m., respectively. A Taguchi L₉ experimental design comprising four variables (moisture, nitrogen source, surfactant concentration, and oxidant agent) at three levels each was performed, showing that moisture and nitrogen source are the variables that most affect CO₂ production and total petroleum hydrocarbon (TPH) degradation. The best experimental treatment yielded a TPH removal of 56,092 mg kg⁻¹ d.m. and was carried out under the following conditions: 70% moisture, no oxidant agent, 0.5% surfactant, and NH₄Cl as the nitrogen source.
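An L₉(3⁴) array like the one used here screens four three-level factors in only nine runs, arranged so that every pair of factor levels co-occurs exactly once across any two columns. A minimal sketch of the standard array and a factor mapping follows; only the 70% moisture, 0.5% surfactant, and NH₄Cl levels come from the abstract, and the remaining level values are placeholders:

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 factors, levels coded 0/1/2.
L9 = np.array([
    [0, 0, 0, 0],
    [0, 1, 1, 1],
    [0, 2, 2, 2],
    [1, 0, 1, 2],
    [1, 1, 2, 0],
    [1, 2, 0, 1],
    [2, 0, 2, 1],
    [2, 1, 0, 2],
    [2, 2, 1, 0],
])

# Map coded levels to the study's factors. Levels marked below as from the
# abstract are real; the others are hypothetical placeholders.
factors = {
    "moisture_pct":    [50, 60, 70],                   # 70% from abstract
    "nitrogen_source": ["none", "NH4NO3", "NH4Cl"],    # NH4Cl from abstract
    "surfactant_pct":  [0.0, 0.5, 1.0],                # 0.5% from abstract
    "oxidant":         ["none", "low", "high"],
}

# Expand the coded array into nine concrete run recipes.
runs = [
    {name: levels[code] for (name, levels), code in zip(factors.items(), row)}
    for row in L9
]
```

Each column is balanced (every level appears three times), which is what lets main effects of the four factors be estimated independently from only nine microcosms instead of the 3⁴ = 81 of a full factorial.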
Evaluation of Selected Chemical Processes for Production of Low-cost Silicon, Phase 3
NASA Technical Reports Server (NTRS)
Blocher, J. M.; Browning, M. F.
1979-01-01
Refinements of the design of the 50 MT/year Experimental Process System Development Unit were made and competitive bids were received from mechanical, electrical, and structural contractors. Bids on most of the equipment were received and cataloged. Emergency procedures were defined to counter a variety of contingencies disclosed in operations and safety reviews. Experimental work with an electrolytic cell for zinc chloride disclosed no significant increase in power efficiency by steps taken to increase electrolyte circulation. On the basis of materials compatibility and permeability tests, 310 stainless steel was chosen for the shell of the fluidized-bed reactor and SiC-coated graphite for the liner.
Optimization of Nd: YAG Laser Marking of Alumina Ceramic Using RSM And ANN
NASA Astrophysics Data System (ADS)
Peter, Josephine; Doloi, B.; Bhattacharyya, B.
2011-01-01
The present research paper deals with artificial neural network (ANN)- and response surface methodology (RSM)-based mathematical modeling, together with an optimization analysis of the marking characteristics of alumina ceramic. The experiments were planned and carried out based on design of experiments (DOE). The paper also analyses the influence of the major laser marking process parameters, and the optimal combination of laser marking process parameter settings has been obtained. The RSM-optimal output is validated through experimentation and an ANN predictive model. Good agreement is observed between the results of the ANN predictive model and the actual experimental observations.
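RSM optimization of this kind typically fits a second-order polynomial to the DOE runs and then locates the stationary point of the fitted surface. A minimal sketch, assuming two hypothetical coded factors (e.g., laser power and scan speed) and synthetic noise-free responses rather than the study's data:

```python
import numpy as np

# Two coded factors on a 5x5 grid spanning [-1, 1] (a stand-in design).
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()

# Synthetic "true" second-order model: b0, b1, b2, b11, b22, b12.
true = np.array([10.0, 2.0, -1.5, -3.0, -2.0, 0.5])
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
y = X @ true                                   # noise-free responses

# Least-squares fit of the quadratic response surface.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: set the gradient of the fitted quadratic to zero,
# i.e. solve B x = -[b1, b2] with B the Hessian of the surface.
B = np.array([[2 * beta[3], beta[5]],
              [beta[5], 2 * beta[4]]])
x_opt = np.linalg.solve(B, -beta[1:3])
```

Because both quadratic coefficients are negative here, the stationary point is a maximum of the fitted surface; in a real study one would check the Hessian's sign (and confirm the point lies inside the design region) before declaring it the optimal parameter setting.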
NASA Technical Reports Server (NTRS)
Fusaro, Robert L.; Jones, Steven P.; Jansen, Ralph
1996-01-01
A complete evaluation of the tribological characteristics of a given material/mechanical system is a time-consuming operation, since the friction and wear process is extremely system-sensitive. As a result, experimental designs (e.g., Latin square, Taguchi) have been implemented in an attempt not only to reduce the total number of experimental combinations needed to fully characterize a material/mechanical system, but also to acquire life data for a system without having to perform an actual life test. Unfortunately, these experimental designs still require a great deal of testing, and the output does not always produce meaningful information. To further reduce the amount of experimental testing required, this study employs a computer neural network model to investigate different material/mechanical systems. The work focuses on modeling the wear behavior while showing the feasibility of using neural networks to predict life data. The model is capable of identifying which input variables influence the tribological behavior of the particular material/mechanical system being studied, based on the specifications of the overall system.
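A neural-network wear model of the kind described maps system inputs to a wear response by fitting weights to observed runs. A minimal sketch, assuming a one-hidden-layer regressor trained on synthetic load/speed data; the architecture, inputs, and data are illustrative, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(200, 2))                 # coded load, sliding speed
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2)[:, None]     # synthetic wear response

# One hidden layer of 8 tanh units, linear output.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)                     # loss before training

# Full-batch gradient descent on mean-squared error.
for _ in range(2000):
    h, pred = forward(X)
    err = (pred - y) / len(X)                         # dMSE/dpred (factor 2 folded into lr)
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = err @ W2.T * (1 - h ** 2)                    # backprop through tanh
    gW1 = X.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)
```

Once trained, a model like this can be queried at untested input combinations, which is the sense in which a network can stand in for additional wear experiments within the region covered by the training data.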
Probing the effects of surface hydrophobicity and tether orientation on antibody-antigen binding
NASA Astrophysics Data System (ADS)
Bush, Derek B.; Knotts, Thomas A.
2017-04-01
Antibody microarrays have the potential to revolutionize molecular detection for many applications, but their current use is limited by poor reliability, and efforts to change this have not yielded fruitful results. One difficulty which limits the rational engineering of next-generation devices is that little is known, at the molecular level, about the antibody-antigen binding process near solid surfaces. Atomic-level structural information is scant because typical experimental techniques (X-ray crystallography and NMR) cannot be used to image proteins bound to surfaces. To overcome this limitation, this study uses molecular simulation and an advanced, experimentally validated, coarse-grain, protein-surface model to compare Fab-lysozyme binding in bulk solution and when the Fab is tethered to hydrophobic and hydrophilic surfaces. The results show that the tether site in the Fab, as well as the surface hydrophobicity, significantly impacts the binding process, and they suggest that the optimal design involves tethering Fabs upright on a hydrophilic surface. The results offer an unprecedented, molecular-level picture of the binding process and give hope that the rational design of protein microarrays is possible.
Ti film deposition process of a plasma focus: Study by an experimental design
NASA Astrophysics Data System (ADS)
Inestrosa-Izurieta, M. J.; Moreno, J.; Davis, S.; Soto, L.
2017-10-01
The plasma generated by plasma focus (PF) devices has physical characteristics, notably energetic ions and electrons, that are substantially different from those of conventional plasma devices used for plasma nanofabrication, offering new and unique opportunities in the processing and synthesis of nanomaterials. This article presents the use of a plasma focus of tens of joules, PF-50J, for the deposition of materials sprayed from the anode by the plasma dynamics in the axial direction. This work focuses on determining the most significant effects of the technological parameters of the system on the obtained depositions through the use of a statistical experimental design. The results allow a qualitative understanding of the Ti film deposition process in our PF device in terms of four different events provoked by the plasma dynamics: i) electrical erosion of the outer material of the anode; ii) substrate ablation, generating an interlayer; iii) electron-beam deposition of material from the center of the anode; and iv) heat load, provoking clustering or even melting of the deposition surface.
Short Duration Reduced Gravity Drop Tower Design and Development
NASA Astrophysics Data System (ADS)
Osborne, B.; Welch, C.
The industrial and commercial development of space-related activities is intimately linked to the ability to conduct reduced gravity research. Reduced gravity experimentation is important to many diverse fields of research in the understanding of fundamental and applied aspects of physical phenomena. Both terrestrial and extra-terrestrial experimental facilities are currently available to allow researchers access to reduced gravity environments. This paper discusses two drop tower designs, a 2.0 second facility built in Australia and a proposed 2.2 second facility in the United Kingdom. Both drop towers utilise a drag shield to isolate the falling experiment from air drag forces during the test. The design and development of The University of Queensland's (Australia) 2.0 second drop tower, including its specifications and operational procedures, is discussed first. Sensitive aspects of the design process are examined. Future plans are then presented for a new short duration (2.2 sec) ground-based reduced gravity drop tower. The new drop tower has been designed for Kingston University (United Kingdom) to support teaching and research in the field of reduced gravity physics. The design has been informed by the previous UQ drop tower design process; it utilises a catapult mechanism to increase test time and incorporates features to allow participants from a variety of backgrounds (from high school students through to university researchers) to learn and experiment in reduced gravity. Operational performance expectations for this new facility are also discussed.