Science.gov

Sample records for community-driven model designed

  1. A community-driven global reconstruction of human metabolism.

    PubMed

    Thiele, Ines; Swainston, Neil; Fleming, Ronan M T; Hoppe, Andreas; Sahoo, Swagatika; Aurich, Maike K; Haraldsdottir, Hulda; Mo, Monica L; Rolfsson, Ottar; Stobbe, Miranda D; Thorleifsson, Stefan G; Agren, Rasmus; Bölling, Christian; Bordel, Sergio; Chavali, Arvind K; Dobson, Paul; Dunn, Warwick B; Endler, Lukas; Hala, David; Hucka, Michael; Hull, Duncan; Jameson, Daniel; Jamshidi, Neema; Jonsson, Jon J; Juty, Nick; Keating, Sarah; Nookaew, Intawat; Le Novère, Nicolas; Malys, Naglis; Mazein, Alexander; Papin, Jason A; Price, Nathan D; Selkov, Evgeni; Sigurdsson, Martin I; Simeonidis, Evangelos; Sonnenschein, Nikolaus; Smallbone, Kieran; Sorokin, Anatoly; van Beek, Johannes H G M; Weichart, Dieter; Goryanin, Igor; Nielsen, Jens; Westerhoff, Hans V; Kell, Douglas B; Mendes, Pedro; Palsson, Bernhard Ø

    2013-05-01

    Multiple models of human metabolism have been reconstructed, but each represents only a subset of our knowledge. Here we describe Recon 2, a community-driven, consensus 'metabolic reconstruction', which is the most comprehensive representation of human metabolism that is applicable to computational modeling. Compared with its predecessors, the reconstruction has improved topological and functional features, including ∼2× more reactions and ∼1.7× more unique metabolites. Using Recon 2 we predicted changes in metabolite biomarkers for 49 inborn errors of metabolism with 77% accuracy when compared to experimental data. Mapping metabolomic data and drug information onto Recon 2 demonstrates its potential for integrating and analyzing diverse data types. Using protein expression data, we automatically generated a compendium of 65 cell type-specific models, providing a basis for manual curation or investigation of cell-specific metabolic properties. Recon 2 will facilitate many future biomedical studies and is freely available at http://humanmetabolism.org/. PMID:23455439
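
    Recon 2 is intended for constraint-based computational modeling such as flux balance analysis (FBA). The sketch below runs FBA on a toy three-reaction network using scipy's linear-programming solver; the stoichiometry, reaction names, and bounds are invented for illustration and are not part of Recon 2, which in practice would be loaded from its SBML distribution into a constraint-based toolbox.

```python
# Minimal flux balance analysis (FBA) sketch on a toy network.
# The stoichiometric matrix and bounds are illustrative only; a real study
# would load Recon 2 (e.g., from SBML) with a constraint-based toolbox.
import numpy as np
from scipy.optimize import linprog

# Toy network: A_ext -> A -> B -> biomass; columns are reactions, rows metabolites.
#              R_uptake  R_conv  R_biomass
S = np.array([[ 1,       -1,      0],   # metabolite A
              [ 0,        1,     -1]])  # metabolite B
b = np.zeros(S.shape[0])                # steady state: S v = 0
bounds = [(0, 10), (0, 1000), (0, 1000)]  # flux bounds (uptake capped at 10)

c = np.array([0, 0, -1.0])              # maximize biomass flux (minimize its negative)
res = linprog(c, A_eq=S, b_eq=b, bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)  # expected: 10 (limited by uptake)
print("flux distribution:", res.x)
```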

  2. Improved calibration of organic SST proxies via community-driven Bayesian, spatially-varying regression

    NASA Astrophysics Data System (ADS)

    Tierney, J. E.; Tingley, M.

    2013-12-01

    Improving the calibration of SST proxies is fundamental to providing accurate estimates of past changes in sea-surface temperatures. Existing calibrations assume spatially and temporally constant regression terms, but this may not adequately capture the influence of both known and unknown secondary environmental factors on proxy response. As an alternative, we propose a BAYesian, SPAtially-varying Regression model (BAYSPAR) for general application to marine organic geochemical SST proxies. The calibration model treats regression parameters as slowly-varying functions in space and allows for a full propagation of errors in both the proxy and the SST field. Initial application of the technique to the TEX86 proxy demonstrates that it yields better-behaved residuals than previous calibrations and therefore improves SST estimates in certain regions. Two different prediction models allow users to apply the calibration to either Neogene or "deep-time" data, the latter of which uses an analog approach. Traditionally, calibrations for SST proxies are updated incrementally via individual publications over a period of many years, and in some cases the coretop collections that form these calibrations are left unarchived. To facilitate both up-to-date prediction and data archiving, BAYSPAR will be designed to reflect community-based improvements in knowledge and data in real time via a semi-autonomous updating process. Users may enter new coretop data into a portal on the web, and after a screening procedure, the data will be added to the calibration model, which will then be autonomously updated and made available to the users. In this way, calibration of SST proxies becomes a community-driven process.
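
    BAYSPAR itself uses spatially varying regression parameters and an analog mode for deep time; neither is reproduced here. As a minimal sketch of the underlying idea of Bayesian proxy calibration, the code below fits a single-site conjugate Bayesian linear regression of a TEX86-like proxy on SST using synthetic coretop data; all numbers are illustrative.

```python
# Minimal Bayesian linear calibration sketch (single-site, conjugate normal model).
# This is NOT the BAYSPAR model: spatially varying coefficients and the analog
# mode are omitted, and all numbers below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
sst_true = rng.uniform(5, 30, size=80)                  # synthetic coretop SSTs (deg C)
tex = 0.3 + 0.015 * sst_true + rng.normal(0, 0.02, 80)  # synthetic TEX86-like proxy

# Regression: tex = a + b * SST + noise, with the noise s.d. assumed known here.
X = np.column_stack([np.ones_like(sst_true), sst_true])
sigma2 = 0.02 ** 2
prior_mean = np.zeros(2)
prior_cov = np.eye(2) * 10.0                            # weakly informative prior

post_prec = np.linalg.inv(prior_cov) + X.T @ X / sigma2
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (np.linalg.inv(prior_cov) @ prior_mean + X.T @ tex / sigma2)
print("posterior mean of (intercept, slope):", post_mean)

# Prediction: invert the calibration for a new proxy value (point estimate).
tex_new = 0.60
sst_est = (tex_new - post_mean[0]) / post_mean[1]
print("estimated SST for TEX86 = 0.60:", sst_est)
```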

  3. The Role of Community-Driven Data Curation for Enterprises

    NASA Astrophysics Data System (ADS)

    Curry, Edward; Freitas, Andre; O'Riáin, Sean

    With increased utilization of data within their operational and strategic processes, enterprises need to ensure data quality and accuracy. Data curation is a process that can ensure the quality of data and its fitness for use. Traditional approaches to curation are struggling with increased data volumes and near real-time demands for curated data. In response, curation teams have turned to community crowd-sourcing and semi-automated metadata tools for assistance. This chapter provides an overview of data curation, discusses the business motivations for curating data and investigates the role of community-based data curation, focusing on internal communities and pre-competitive data collaborations. The chapter is supported by case studies from Wikipedia, The New York Times, Thomson Reuters, the Protein Data Bank and ChemSpider, from which best practices for both the social and technical aspects of community-driven data curation are described.

  4. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    Aiming to connect the Earth science community to accelerate the rate of discovery, NASA Earth Exchange (NEX) has established an online repository and platform so that researchers can publish and share their tools and models with colleagues. In recent years, workflows have become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge (such as sharable workflows) is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts is currently very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes. Meanwhile, researchers often do not have time to fully document these processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. Therefore, this project aims to develop a proactive recommendation technology based on collective NEX user behaviors. In this way, we aim to promote and encourage process and workflow reuse within NEX. Specifically, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in social cognitive theory, which holds that people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties), and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from
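
    The abstract describes recommending artifacts from collective user behavior. The sketch below shows one simple way such a recommender could work: ranking workflow components by how often other users combined them with components the current user already employs. The usage logs and component names are hypothetical and do not reflect the NEX implementation.

```python
# Minimal co-occurrence recommender sketch: suggest workflow components that
# other users frequently used together with what the current user already uses.
# Usage logs and component names are hypothetical.
from collections import Counter
from itertools import combinations

usage_logs = [
    {"regrid", "cloud_mask", "ndvi"},
    {"regrid", "ndvi", "trend_fit"},
    {"cloud_mask", "ndvi", "trend_fit"},
    {"regrid", "cloud_mask"},
]

# Count how often each pair of components co-occurs in a user's workflow history.
co_counts = Counter()
for items in usage_logs:
    for a, b in combinations(sorted(items), 2):
        co_counts[(a, b)] += 1

def recommend(owned, top_n=3):
    """Rank components not yet used by their co-occurrence with owned components."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a in owned and b not in owned:
            scores[b] += n
        elif b in owned and a not in owned:
            scores[a] += n
    return scores.most_common(top_n)

# Components a "regrid" user has not tried yet, ranked by how often others co-used them.
print(recommend({"regrid"}))
```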

  5. Community-Driven School Reform: Parents Making a Difference in Education.

    ERIC Educational Resources Information Center

    Bilby, Sheila Beachum

    2002-01-01

    Community-driven school reform is receiving greater attention now as communities become more closely involved with their schools. One such program is a grassroots, faith-based organization called People Acting for Community Together (PACT). PACT is one of about 150 community-organizing groups nationwide that have worked to improve student learning…

  6. Community-driven research on environmental sources of H. pylori infection in arctic Canada

    PubMed Central

    Hastings, Emily V; Yasui, Yutaka; Hanington, Patrick; Goodman, Karen J; Working Group, The CANHelp

    2014-01-01

    The role of environmental reservoirs in H. pylori transmission remains uncertain due to technical difficulties in detecting living organisms in sources outside the stomach. Residents of some Canadian Arctic communities worry that contamination of the natural environment is responsible for the high prevalence of H. pylori infection in the region. This analysis aims to estimate associations between exposure to potential environmental sources of biological contamination and prevalence of H. pylori infection in Arctic Canada. Using data from 3 community-driven H. pylori projects in the Northwest and Yukon Territories, we estimated effects of environmental exposures on H. pylori prevalence, using odds ratios (OR) and 95% confidence intervals (CI) from multilevel logistic regression models to adjust for household and community effects. Investigated exposures include: untreated drinking water; livestock; dogs; cats; mice or mouse droppings in the home; cleaning fish or game. Our analysis did not identify environmental exposures associated clearly with increased H. pylori prevalence, except any exposure to mice or mouse droppings (OR = 4.6, CI = 1.2–18), reported by 11% of participants. Our multilevel models showed H. pylori clustering within households, but environmental exposures accounted for little of this clustering; instead, much of it was accounted for by household composition (especially: having infected household members; number of children). Like the scientific literature on this topic, our results do not clearly implicate or rule out environmental reservoirs of H. pylori; thus, the topic remains a priority for future research. Meanwhile, H. pylori prevention research should seek strategies for reducing direct transmission from person to person. PMID:25483330
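
    The reported OR = 4.6 (CI 1.2-18) comes from multilevel logistic regression models that are not reproduced here. As a minimal sketch of the underlying measure, the code below computes a crude odds ratio with a Woolf (log-based) 95% confidence interval from a 2x2 table with made-up counts.

```python
# Crude odds ratio and Woolf (log-based) 95% CI from a 2x2 exposure/outcome table.
# The counts below are made up for illustration; the paper's OR = 4.6 came from
# multilevel logistic regression adjusting for household and community effects.
import math

# rows: exposed (mice/droppings) vs unexposed; cols: H. pylori positive vs negative
a, b = 18, 4     # exposed:   positive, negative
c, d = 120, 110  # unexposed: positive, negative

or_hat = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(or_hat) - 1.96 * se_log_or)
hi = math.exp(math.log(or_hat) + 1.96 * se_log_or)
print(f"OR = {or_hat:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```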

  7. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    NASA Astrophysics Data System (ADS)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    In general, a complex software project with high standards for code quality requires automated tools to help developers with repetitive and tedious tasks such as compiling on different platforms and configurations, running unit tests as well as end-to-end tests, and generating distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited. Testing developed code on more than the developer's PC is therefore a task which is often neglected, and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, differences in simulation results between changes, etc.) by the CI system, which automatically responds to the pull request or by email on success or failure with detailed reports, possibly requesting improvements to the modifications. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps entry barriers to getting involved in the project low and permits an agile development process that concentrates on feature additions rather than software maintenance procedures.
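
    As a rough illustration of the kind of checks such a CI system runs before a merge, the sketch below chains configure/build/test steps and fails fast on the first error. The cmake/ctest commands, directory layout, and step list are assumptions for illustration and do not describe the actual OpenGeoSys, Travis, or Jenkins configuration.

```python
# Sketch of a local "CI-like" pre-merge check runner. The build-system commands
# (cmake/ctest) and directory layout are assumptions for illustration, not the
# actual OpenGeoSys or Jenkins configuration.
import subprocess
import sys

STEPS = [
    ("configure",  ["cmake", "-S", ".", "-B", "build"]),
    ("build",      ["cmake", "--build", "build"]),
    ("unit tests", ["ctest", "--test-dir", "build", "--output-on-failure"]),
]

def run_checks():
    for name, cmd in STEPS:
        print(f"== {name}: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"step '{name}' failed; request changes on the pull request")
            return 1
    print("all checks passed; ready for core-team review")
    return 0

if __name__ == "__main__":
    sys.exit(run_checks())
```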

  8. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
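
    As an illustration of a theoretical cost function that maps geometric design features to material cost and labor content, the sketch below computes part cost from panel area, thickness, ply count, and fastener count. All coefficients and rates are invented and do not reflect the COSTADE formulation.

```python
# Sketch of a parametric cost function: material cost plus labor content as a
# function of geometric features. Coefficients are invented for illustration
# and do not reflect the COSTADE cost model.
def part_cost(area_m2, thickness_mm, n_plies, n_fasteners,
              material_rate=120.0,   # $/kg of composite material (assumed)
              density=1600.0,        # kg/m^3 (assumed)
              layup_rate_hr_per_ply_m2=0.15, fastener_hr=0.05,
              labor_rate=95.0):      # $/hr (assumed)
    mass = area_m2 * (thickness_mm / 1000.0) * density
    material = mass * material_rate
    labor_hours = n_plies * area_m2 * layup_rate_hr_per_ply_m2 + n_fasteners * fastener_hr
    labor = labor_hours * labor_rate
    return {"mass_kg": mass, "material_$": material,
            "labor_hr": labor_hours, "total_$": material + labor}

# Simple trade study over ply count for a 2 m^2 skin panel.
for plies in (8, 12, 16):
    print(plies, part_cost(area_m2=2.0, thickness_mm=plies * 0.19,
                           n_plies=plies, n_fasteners=40))
```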

  9. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a data base and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress is presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  10. Fracture design modelling

    SciTech Connect

    Crichlow, H.B.; Crichlow, H.B.

    1980-02-07

    A design tool is discussed whereby the various components that enter the design process of a hydraulic fracturing job are combined to provide a realistic appraisal of a stimulation job in the field. An interactive computer model is used to solve the problem numerically to obtain the effects of various parameters on the overall behavior of the system.

  11. Ligand modeling and design

    SciTech Connect

    Hay, B.P.

    1997-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in the cost-effective removal of specific radionuclides from nuclear waste streams. Organic ligands with metal ion specificity are critical components in the development of solvent extraction and ion exchange processes that are highly selective for targeted radionuclides. The traditional approach to the development of such ligands involves lengthy programs of organic synthesis and testing, which in the absence of reliable methods for screening compounds before synthesis, results in wasted research effort. The author's approach breaks down and simplifies this costly process with the aid of computer-based molecular modeling techniques. Commercial software for organic molecular modeling is being configured to examine the interactions between organic ligands and metal ions, yielding an inexpensive, commercially or readily available computational tool that can be used to predict the structures and energies of ligand-metal complexes. Users will be able to correlate the large body of existing experimental data on structure, solution binding affinity, and metal ion selectivity to develop structural design criteria. These criteria will provide a basis for selecting ligands that can be implemented in separations technologies through collaboration with other DOE national laboratories and private industry. The initial focus will be to select ether-based ligands that can be applied to the recovery and concentration of the alkali and alkaline earth metal ions including cesium, strontium, and radium.

  12. The CECAM Electronic Structure Library: community-driven development of software libraries for electronic structure simulations

    NASA Astrophysics Data System (ADS)

    Oliveira, Micael

    The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software as libraries that can be contributed to and used by the community. Besides making it possible to share the burden of developing and maintaining complex pieces of software, these libraries can also become a target for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how these can be useful for the electronic structure community.

  13. The Healthy Start Initiative: A Community-Driven Approach to Infant Mortality Reduction. Volume IV: Community Outreach.

    ERIC Educational Resources Information Center

    Lightsey, Debra, Ed.; Gwinner, Valerie, Ed.

    The Healthy Start Initiative is a national 5-year demonstration program that uses a broad range of community-driven, system development approaches to reduce infant mortality and improve the health and well-being of women, infants, children, and families. This volume, fourth in the series, deals with the topic of community outreach and is based on…

  14. Ligand modeling and design

    SciTech Connect

    Hay, B.

    1996-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in applications for the cost-effective removal of specific radionuclides from nuclear waste streams.

  15. A Community-Driven Intervention in Tuftonboro, New Hampshire, Succeeds in Altering Water Testing Behavior

    PubMed Central

    Paul, Michael P.; Rigrod, Pierce; Wingate, Steve; Borsuk, Mark E.

    2016-01-01

    Maximum contaminant levels created by the U.S. Environmental Protection Agency under the Safe Drinking Water Act do not apply to private wells. Rather, the onus is on individual households to undertake regular water testing. Several barriers exist to testing and treating water from private wells, including a lack of awareness about both well water as a potential source of contaminants and government-recommended water testing schedules; a health literacy level that may not be sufficient to interpret complex environmental health messages; the inconvenience of water testing; the financial costs of testing and treatment; and a myriad of available treatment options. The existence of these barriers is problematic because well water can be a source of hazardous contaminants. This article describes an initiative—undertaken by the Tuftonboro (New Hampshire) Conservation Commission, with support from state agencies and a research program at Dartmouth College—to increase water testing rates in a rural region with a relatively high number of wells. The project prompted more water tests at the state laboratory in one day than in the prior six years. This suggests that community-driven, collaborative efforts to overcome practical barriers could be successful at raising testing rates and ultimately improving public health. PMID:26738316

  16. A Community-Driven Intervention in Tuftonboro, New Hampshire, Succeeds in Altering Water Testing Behavior.

    PubMed

    Paul, Michael P; Rigrod, Pierce; Wingate, Steve; Borsuk, Mark E

    2015-12-01

    Maximum contaminant levels created by the U.S. Environmental Protection Agency under the Safe Drinking Water Act do not apply to private wells. Rather, the onus is on individual households to undertake regular water testing. Several barriers exist to testing and treating water from private wells, including a lack of awareness about both well water as a potential source of contaminants and government-recommended water testing schedules; a health literacy level that may not be sufficient to interpret complex environmental health messages; the inconvenience of water testing; the financial costs of testing and treatment; and a myriad of available treatment options. The existence of these barriers is problematic because well water can be a source of hazardous contaminants. This article describes an initiative--undertaken by the Tuftonboro (New Hampshire) Conservation Commission, with support from state agencies and a research program at Dartmouth College--to increase water testing rates in a rural region with a relatively high number of wells. The project prompted more water tests at the state laboratory in one day than in the prior six years. This suggests that community-driven, collaborative efforts to overcome practical barriers could be successful at raising testing rates and ultimately improving public health. PMID:26738316

  17. The Micronutrient Genomics Project: a community-driven knowledge base for micronutrient research.

    PubMed

    van Ommen, Ben; El-Sohemy, Ahmed; Hesketh, John; Kaput, Jim; Fenech, Michael; Evelo, Chris T; McArdle, Harry J; Bouwman, Jildau; Lietz, Georg; Mathers, John C; Fairweather-Tait, Sue; van Kranen, Henk; Elliott, Ruan; Wopereis, Suzan; Ferguson, Lynnette R; Méplan, Catherine; Perozzi, Giuditta; Allen, Lindsay; Rivero, Damariz

    2010-12-01

    Micronutrients influence multiple metabolic pathways including oxidative and inflammatory processes. Optimum micronutrient supply is important for the maintenance of homeostasis in metabolism and, ultimately, for maintaining good health. With advances in systems biology and genomics technologies, it is becoming feasible to assess the activity of single and multiple micronutrients in their complete biological context. Existing research collects fragments of information, which are not stored systematically and are thus not optimally disseminated. The Micronutrient Genomics Project (MGP) was established as a community-driven project to facilitate the development of systematic capture, storage, management, analyses, and dissemination of data and knowledge generated by biological studies focused on micronutrient-genome interactions. Specifically, the MGP creates a public portal and open-source bioinformatics toolbox for all "omics" information and evaluation of micronutrient and health studies. The core of the project focuses on access to, and visualization of, genetic/genomic, transcriptomic, proteomic and metabolomic information related to micronutrients. For each micronutrient, an expert group is or will be established combining the various relevant areas (including genetics, nutrition, biochemistry, and epidemiology). Each expert group will (1) collect all available knowledge, (2) collaborate with bioinformatics teams towards constructing the pathways and biological networks, and (3) publish their findings on a regular basis. The project is coordinated in a transparent manner, regular meetings are organized and dissemination is arranged through tools, a toolbox web portal, a communications website and dedicated publications. PMID:21189865

  18. Social responsibility and research ethics in community-driven studies of industrialized hog production.

    PubMed Central

    Wing, Steve

    2002-01-01

    Environmental health research can document exposures and health effects that result from inequitable relationships between communities of low income or people of color and the institutions that derive benefits (profits, federal and state funding or services, avoidance of wastes) from activities and policies that burden these communities. Researchers, most of whom work in relatively privileged institutions, are placed in situations of conflicting loyalties if they conduct research in collaboration with, or on behalf of, communities burdened by environmental injustices. These conflicts can threaten the self-interest of researchers and may raise social and ethical issues that do not typically arise in research projects that respond to the agendas of institutions. This article describes how we addressed issues of research ethics and social responsibility in environmental health research on industrialized hog production in North Carolina. Researchers and institutional review boards are not well prepared to address ethical issues when interests of entire communities, as well as individual research participants, are involved. Community-driven research partnerships can help address problems in research ethics and can enhance the social responsibility of researchers and their institutions. PMID:12003746

  19. Solid model design simplification

    SciTech Connect

    Ames, A.L.; Rivera, J.J.; Webb, A.J.; Hensinger, D.M.

    1997-12-01

    This paper documents an investigation of approaches to improving the quality of Pro/Engineer-created solid model data for use by downstream applications. The investigation identified a number of sources of problems caused by deficiencies in Pro/Engineer's geometric engine, and developed prototype software capable of detecting many of these problems and guiding users towards simplified, useable models. The prototype software was tested using Sandia production solid models, and provided significant leverage in attacking the simplification problem.

  20. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
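
    A minimal sketch of the code-generation idea: a domain model instantiated with industry-specific knowledge (here a plain dictionary) is used to emit a class definition, which is then compiled and exercised. The domain content, attribute names, and template are hypothetical and are not the system described in the abstract.

```python
# Minimal sketch of model-based code generation: a domain model (here a plain
# dict) is instantiated with industry-specific knowledge and used to emit code.
# The domain content and the generated class are hypothetical.
DOMAIN_MODEL = {
    "entity": "Policy",
    "attributes": [("policy_id", "str"), ("premium", "float"), ("active", "bool")],
    "invariants": ["premium >= 0"],
}

def generate_class(model):
    lines = [f"class {model['entity']}:"]
    args = ", ".join(f"{n}: {t}" for n, t in model["attributes"])
    lines.append(f"    def __init__(self, {args}):")
    for inv in model["invariants"]:
        lines.append(f"        assert {inv}, 'invariant violated: {inv}'")
    for name, _ in model["attributes"]:
        lines.append(f"        self.{name} = {name}")
    return "\n".join(lines)

source = generate_class(DOMAIN_MODEL)
print(source)
namespace = {}
exec(source, namespace)                       # verify the generated code compiles
p = namespace["Policy"]("P-1", 250.0, True)   # and is consistent with the model
print(p.premium)
```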

  1. Family and community driven response to intimate partner violence in post-conflict settings.

    PubMed

    Kohli, Anjalee; Perrin, Nancy; Mpanano, Remy Mitima; Banywesize, Luhazi; Mirindi, Alfred Bacikenge; Banywesize, Jean Heri; Mitima, Clovis Murhula; Binkurhorhwa, Arsène Kajabika; Bufole, Nadine Mwinja; Glass, Nancy

    2015-12-01

    This study explores risk factors, individual and family consequences and community-driven responses to intimate partner violence (IPV) in post-conflict eastern Democratic Republic of Congo (DRC). This qualitative study was conducted in 3 rural villages in South Kivu Province of DRC, an area that has experienced prolonged conflict. Participants included 13 female survivors and 5 male perpetrators of IPV as reported during baseline data collection for the parent study, an impact evaluation of the Congolese-led livestock microfinance program, Pigs for Peace. Participants described social and behavioral circumstances that increase risk for IPV; social, health and economic consequences for women and their families; and resources to protect women and their families. Social and behavioral factors reported by survivors and perpetrators indicate that IPV was linked to the husband's alcohol consumption, household economic instability, the male desire to maintain his position as head of the family, and perceived disrespect of the husband by the wife. In addition to well-known health consequences of IPV, women reported negative social consequences, such as stigma, resulting in barriers to the well-being of the family. Survivors and perpetrators described the impact of IPV on their children, specifically the lack of proper parental guidance and the lack of safety and stability, which could result in the child(ren) misbehaving and using violence in their relationships, resulting in further stigma towards the child and family. Strategies employed by survivors to protect themselves and their families include placating behaviors (e.g., not responding to insults, trying to meet household demands). Perpetrators who tried to reduce the impact of IPV reported a preference for social and financial control of their partner rather than physical violence, believing this to be less severe. Participants described community and family based social support systems including couple's mediation, responsible partner and

  2. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments, particularly whether the estimation of copula parameters can be enhanced by optimizing experimental conditions and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616
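
    The copula-specific equivalence theorem is not reproduced here. As a generic sketch of a "quick check" of design efficiency, the code below compares the D-criterion (log-determinant of the information matrix) of two candidate designs for a simple quadratic regression model; the model and designs are illustrative only.

```python
# Generic D-efficiency check sketch: compare the D-criterion (log-determinant of
# the Fisher information) of two candidate designs for a quadratic regression
# y = b0 + b1*x + b2*x^2. This illustrates checking design efficiency in general,
# not the copula-specific equivalence theorem of the paper.
import numpy as np

def information_matrix(xs):
    F = np.column_stack([np.ones_like(xs), xs, xs ** 2])  # model Jacobian rows
    return F.T @ F

def log_det(design):
    sign, ld = np.linalg.slogdet(information_matrix(np.asarray(design, float)))
    return ld

design_a = [-1.0, 0.0, 1.0]   # classic D-optimal points for a quadratic on [-1, 1]
design_b = [-1.0, -0.5, 1.0]  # an alternative 3-point design
print("log det, design A:", log_det(design_a))
print("log det, design B:", log_det(design_b))
print("relative D-efficiency of B:",
      np.exp((log_det(design_b) - log_det(design_a)) / 3))  # p = 3 parameters
```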

  3. Electromagnetic modeling in accelerator designs

    SciTech Connect

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator designs. Traditionally, electromagnetic modeling of accelerators has been limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. Through the above conferences, it is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both of these areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas will be described.

  4. Modeling Languages Refine Vehicle Design

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Cincinnati, Ohio's TechnoSoft Inc. is a leading provider of object-oriented modeling and simulation technology used for commercial and defense applications. With funding from Small Business Innovation Research (SBIR) contracts issued by Langley Research Center, the company continued development on its adaptive modeling language, or AML, originally created for the U.S. Air Force. TechnoSoft then created what is now known as its Integrated Design and Engineering Analysis Environment, or IDEA, which can be used to design a variety of vehicles and machinery. IDEA's customers include clients in green industries, such as designers for power plant exhaust filtration systems and wind turbines.

  5. Object Oriented Modeling and Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    The Object Oriented Modeling and Design seminar is intended for software professionals and students. It covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system, the state model describes the aspects of a system that change over time as well as control behavior, and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object-oriented programming may benefit the learning of modeling and good design. Specific expectations are: create a class model; read, recognize, and describe a class model; describe association and link; show abstract classes used with multiple inheritance; explain metadata, reification and constraints; group classes into a package; read, recognize, and describe a state model; explain states and transitions; read, recognize, and describe an interaction model; explain use cases and use case relationships; show concurrency in an activity diagram; and show object interactions in a sequence diagram.

  6. Protein designs in HP models

    NASA Astrophysics Data System (ADS)

    Gupta, Arvind; Khodabakhshi, Alireza Hadj; Maňuch, Ján; Rafiey, Arash; Stacho, Ladislav

    2009-07-01

    The inverse protein folding problem is that of designing an amino acid sequence which folds into a prescribed shape. This problem arises in drug design, where a particular structure is necessary to ensure proper protein-protein interactions, and could have applications in nanotechnology. A major challenge in designing proteins with native folds that attain a specific shape is to avoid proteins that have multiple native folds (unstable proteins). In this technical note we present our results on protein designs in the variant of the Hydrophobic-Polar (HP) model introduced by Dill [6] on the 2D square lattice. The HP model distinguishes only polar and hydrophobic monomers and only counts the number of hydrophobic contacts in the energy function. To achieve better stability of our designs we use the Hydrophobic-Polar-Cysteine (HPC) model, which distinguishes a third type of monomer called "cysteine" and also incorporates disulfide bridges (SS-bridges) into the energy function. We present stable designs on the 2D square lattice and the 3D hexagonal prism lattice in the HPC model.
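
    A minimal sketch of the HP energy function on the 2D square lattice: every pair of hydrophobic monomers that are lattice neighbours but not chain neighbours contributes one favourable contact. The sequence and fold below are toy examples, and the HPC model's cysteine/SS-bridge term is omitted.

```python
# HP-model energy sketch on the 2D square lattice: each H-H pair that is
# adjacent on the lattice but not adjacent along the chain contributes -1.
# The sequence and self-avoiding fold below are toy examples; the HPC model's
# cysteine/SS-bridge term is not included.
def hp_energy(sequence, fold):
    """sequence: string of 'H'/'P'; fold: list of (x, y) lattice coordinates."""
    assert len(sequence) == len(fold) and len(set(fold)) == len(fold)  # self-avoiding
    pos = {coord: i for i, coord in enumerate(fold)}
    energy = 0
    for i, (x, y) in enumerate(fold):
        if sequence[i] != "H":
            continue
        for nb in ((x + 1, y), (x, y + 1)):        # each lattice edge counted once
            j = pos.get(nb)
            if j is not None and sequence[j] == "H" and abs(i - j) > 1:
                energy -= 1
    return energy

seq  = "HPHPHH"
fold = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 2), (1, 2)]  # a 2x3 serpentine walk
print(hp_energy(seq, fold))   # -1 for this toy fold (one non-bonded H-H contact)
```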

  7. Generic Model Host System Design

    SciTech Connect

    Chu, Chungming; Wu, Juhao; Qiang, Ji; Shen, Guobao

    2012-06-22

    There are many simulation codes for accelerator modelling; each has particular strengths, but none covers everything. A platform which can host multiple modelling tools would be ideal for various purposes. The model platform, along with infrastructure support, can be used not only for online applications but also for offline purposes. A collaboration has been formed with the aim of providing such a platform. In order to achieve such a platform, a set of common physics data structures has to be defined. An Application Programming Interface (API) for physics applications should also be defined within a model data provider. A preliminary platform design and prototype is discussed.
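
    As a sketch of what a common "model data provider" API could look like, the code below defines an abstract provider interface plus a trivial in-memory implementation. All class, method, and element names are invented for illustration and are not the collaboration's actual data structures or API.

```python
# Sketch of a "model data provider" API: a common interface that different
# accelerator modelling codes could implement so applications can swap engines.
# All class and method names here are invented for illustration.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TwissRow:                 # one row of a shared physics data structure
    element: str
    s: float                    # longitudinal position [m]
    beta_x: float
    beta_y: float

class ModelProvider(ABC):
    @abstractmethod
    def load_lattice(self, name: str) -> None: ...
    @abstractmethod
    def set_parameter(self, element: str, field: str, value: float) -> None: ...
    @abstractmethod
    def run(self) -> List[TwissRow]: ...

class ToyProvider(ModelProvider):
    """Trivial in-memory engine standing in for a real simulation code."""
    def __init__(self):
        self.params: Dict[str, float] = {}
    def load_lattice(self, name):
        self.name = name
    def set_parameter(self, element, field, value):
        self.params[f"{element}.{field}"] = value
    def run(self):
        k = self.params.get("QF1.k1", 0.0)
        return [TwissRow("QF1", 1.0, 10.0 / (1.0 + k), 12.0 * (1.0 + k))]

provider: ModelProvider = ToyProvider()
provider.load_lattice("demo_ring")
provider.set_parameter("QF1", "k1", 0.2)
print(provider.run())
```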

  8. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  9. Innovative and Community-Driven Communication Practices of the South Carolina Cancer Prevention and Control Research Network

    PubMed Central

    Brandt, Heather M.; Freedman, Darcy A.; Adams, Swann Arp; Young, Vicki M.; Ureda, John R.; McCracken, James Lyndon; Hébert, James R.

    2014-01-01

    The South Carolina Cancer Prevention and Control Research Network (SC-CPCRN) is 1 of 10 networks funded by the Centers for Disease Control and Prevention and the National Cancer Institute (NCI) that works to reduce cancer-related health disparities. In partnership with federally qualified health centers and community stakeholders, the SC-CPCRN uses evidence-based approaches (eg, NCI Research-tested Intervention Programs) to disseminate and implement cancer prevention and control messages, programs, and interventions. We describe the innovative stakeholder- and community-driven communication efforts conducted by the SC-CPCRN to improve overall health and reduce cancer-related health disparities among high-risk and disparate populations in South Carolina. We describe how our communication efforts are aligned with 5 core values recommended for dissemination and implementation science: 1) rigor and relevance, 2) efficiency and speed, 3) collaboration, 4) improved capacity, and 5) cumulative knowledge. PMID:25058673

  10. Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data

    NASA Astrophysics Data System (ADS)

    Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.

    2007-12-01

    Advances in ecology and environmental science increasingly depend on information from multiple disciplines to tackle broader and more complex questions about the natural world. Such advances, however, are hindered by data heterogeneity, which impedes the ability of researchers to discover, interpret, and integrate relevant data that have been collected by others. Here, we outline two community-building initiatives for improving data interoperability in the ecological and environmental sciences, one that is well-established (the Ecological Metadata Language [EML]), and another that is actively underway (a unified model for observations and measurements). EML is a metadata specification developed for the ecology discipline, and is based on prior work done by the Ecological Society of America and associated efforts to ensure a modular and extensible framework to document ecological data. EML "modules" are designed to describe one logical part of the total metadata that should be included with any ecological dataset. EML was developed through a series of working meetings, ongoing discussion forums and email lists, with participation from a broad range of ecological and environmental scientists, as well as computer scientists and software developers. Where possible, EML adopted syntax from existing metadata standards in other disciplines (e.g., Dublin Core, Content Standard for Digital Geospatial Metadata, and more). Although EML has not yet been ratified through a standards body, it has become the de facto metadata standard for a large range of ecological data management projects, including the Long Term Ecological Research Network, the National Center for Ecological Analysis and Synthesis, and the Ecological Society of America. The second community-building initiative is based on work through the Scientific Environment for Ecological Knowledge (SEEK) as well as a recent workshop on multi-disciplinary data management. This initiative aims at improving

  11. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  12. Design issues for population growth models

    PubMed Central

    López Fidalgo, J.; Ortiz Rodríguez, I.M.

    2010-01-01

    We briefly review and discuss design issues for population growth and decline models. We then use a flexible growth and decline model as an illustrative example and apply optimal design theory to find optimal sampling times for estimating model parameters, specific parameters and interesting functions of the model parameters for the model with two real applications. Robustness properties of the optimal designs are investigated when nominal values or the model is mis-specified, and also under a different optimality criterion. To facilitate use of optimal design ideas in practice, we also introduce a website for generating a variety of optimal designs for popular models from different disciplines. PMID:21647244
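
    A minimal sketch of comparing candidate sampling-time designs for a logistic growth model via the D-criterion, using finite-difference parameter sensitivities. The model form, nominal parameter values, and candidate schedules are illustrative and are not the flexible growth-and-decline model analysed in the paper.

```python
# Sketch: compare candidate sampling-time designs for a logistic growth model
# N(t) = K / (1 + ((K - N0)/N0) * exp(-r t)) using the D-criterion
# (log-determinant of J^T J built from parameter sensitivities). Values illustrative.
import numpy as np

def logistic(t, K, r, N0):
    return K / (1.0 + ((K - N0) / N0) * np.exp(-r * t))

def d_criterion(times, theta=(100.0, 0.3, 5.0), h=1e-6):
    times = np.asarray(times, float)
    J = np.empty((times.size, len(theta)))
    for j in range(len(theta)):           # finite-difference sensitivities
        up = list(theta); down = list(theta)
        up[j] += h; down[j] -= h
        J[:, j] = (logistic(times, *up) - logistic(times, *down)) / (2 * h)
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return logdet

print("evenly spaced:", d_criterion([2, 8, 14, 20, 26, 32]))
print("early-heavy:  ", d_criterion([1, 4, 7, 10, 20, 35]))
```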

  13. High and equitable tuberculosis awareness coverage in the community-driven Axshya TB control project in India

    PubMed Central

    Chadha, S. S.; Das, A.; Mohanty, S.; Tonsing, J.

    2015-01-01

    Data from surveys on knowledge, attitudes and practice (KAP) on tuberculosis (TB) conducted under the Axshya project at two time points (baseline 2010–2011 and mid-line 2012–2013) were analysed for changes in coverage and equity of TB awareness after project interventions. Overall coverage increased from 84% at baseline to 88% at midline (5% increase, P < 0.05). In comparison to baseline results, coverage at the midline survey had significantly increased, from 81% to 87% among the rural population, from 81% to 86% among women, from 73% to 85% in the ⩾55 years age group, from 71% to 80% among illiterates and from 73% to 81% in the south zone (P < 0.05). The equity gap among the different study groups (settlement, sex, age, education and zones) decreased from 6–23% at baseline to 3–11% during the midline survey. The maximum decline was observed for type of settlement (rural vs. urban), from 10% to 3% (P < 0.05). This community-driven TB control project has achieved high and equitable coverage of TB awareness, offering valuable lessons for the global community. PMID:26400604
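
    As a sketch of how a change in coverage proportions such as 84% versus 88% could be tested, the code below runs a two-proportion z-test. The sample sizes are hypothetical because the abstract does not report them, and the project's actual analysis may have used a different procedure.

```python
# Two-proportion z-test sketch for a change in awareness coverage (e.g., 84% at
# baseline vs. 88% at midline). The sample sizes below are hypothetical; the
# abstract does not report them, and the project's actual analysis may differ.
import math

def two_proportion_z(p1, n1, p2, n2):
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(0.84, 3000, 0.88, 3000)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```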

  14. Course Design Using an Authentic Studio Model

    ERIC Educational Resources Information Center

    Wilson, Jay R.

    2013-01-01

    Educational Technology and Design 879 is a graduate course that introduces students to the basics of video design and production. In an attempt to improve the learning experience for students a redesign of the course was implemented for the summer of 2011 that incorporated an authentic design studio model. The design studio approach is based on…

  15. Improving Health with Science: Exploring Community-Driven Science Education in Kenya

    NASA Astrophysics Data System (ADS)

    Leak, Anne Emerson

    This study examines the role of place-based science education in fostering student-driven health interventions. While literature shows the need to connect science with students' place and community, there is limited understanding of strategies for doing so. Making such connections is important for underrepresented students who tend to perceive learning science in school as disconnected to their experiences out of school (Aikenhead, Calabrese-Barton, & Chinn, 2006). To better understand how students can learn to connect place and community with science and engineering practices in a village in Kenya, I worked with community leaders, teachers, and students to develop and study an education program (a school-based health club) with the goal of improving knowledge of health and sanitation in a Kenyan village. While students selected the health topics and problems they hoped to address through participating in the club, the topics were taught with a focus on providing opportunities for students to learn the practices of science and health applications of these practices. Students learned chemistry, physics, environmental science, and engineering to help them address the health problems they had identified in their community. Surveys, student artifacts, ethnographic field notes, and interview data from six months of field research were used to examine the following questions: (1) In what ways were learning opportunities planned for using science and engineering practices to improve community health? (2) In what ways did students apply science and engineering practices and knowledge learned from the health club in their school, homes, and community? and (3) What factors seemed to influence whether students applied or intended to apply what they learned in the health club? Drawing on place-based science education theory and community-engagement models of health, process and structural coding (Saldana, 2013) were used to determine patterns in students' applications of their

  16. Knowledge modeling for software design

    NASA Technical Reports Server (NTRS)

    Shaw, Mildred L. G.; Gaines, Brian R.

    1992-01-01

    This paper develops a modeling framework for systems engineering that encompasses systems modeling, task modeling, and knowledge modeling, and allows knowledge engineering and software engineering to be seen as part of a unified developmental process. This framework is used to evaluate what novel contributions the 'knowledge engineering' paradigm has made and how these impact software engineering.

  17. Instructional Design Models: What a Revolution!

    ERIC Educational Resources Information Center

    Faryadi, Qais

    2007-01-01

    This review examines instructional design models and the construction of knowledge. It further explores to identify the chilling benefits of these models for the inputs and outputs of knowledge transfer. This assessment also attempts to define instructional design models through the eyes and the minds of renowned scholars as well as the most…

  18. Designing control system information models

    NASA Technical Reports Server (NTRS)

    Panin, K. I.; Zinchenko, V. P.

    1973-01-01

    Problems encountered in designing information models are discussed. The data cover the condition and functioning of the object of control and the environment involved in the control. Other parameters needed for the model include: (1) information for forming an image of the real situation, (2) data for analyzing and evaluating an evolving situation, (3) planning actions, and (4) data for observing and evaluating the results of model realization.

  19. Challenges in conducting community-driven research created by differing ways of talking and thinking about science: a researcher's perspective.

    PubMed

    Colquhoun, Amy; Geary, Janis; Goodman, Karen J

    2013-01-01

    Increasingly, health scientists are becoming aware that research collaborations that include community partnerships can be an effective way to broaden the scope and enhance the impact of research aimed at improving public health. Such collaborations extend the reach of academic scientists by integrating a variety of perspectives and thus strengthening the applicability of the research. Communication challenges can arise, however, when attempting to address specific research questions in these collaborations. In particular, inconsistencies can exist between scientists and community members in the use and interpretation of words and other language features, particularly when conducting research with a biomedical component. Additional challenges arise from differing perceptions of the investigative process. There may be divergent perceptions about how research questions should and can be answered, and in expectations about requirements of research institutions and research timelines. From these differences, misunderstandings can occur about how the results will ultimately impact the community. These communication issues are particularly challenging when scientists and community members are from different ethnic and linguistic backgrounds that may widen the gap between ways of talking and thinking about science, further complicating the interactions and exchanges that are essential for effective joint research efforts. Community-driven research that aims to describe the burden of disease associated with Helicobacter pylori infection is currently underway in northern Aboriginal communities located in the Yukon and Northwest Territories, Canada, with the goal of identifying effective public health strategies for reducing health risks from this infection. This research links community representatives, faculty from various disciplines at the University of Alberta, as well as territorial health care practitioners and officials. This highly collaborative work will be used to

  20. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  1. Computer Modeling and Visualization in Design Technology: An Instructional Model.

    ERIC Educational Resources Information Center

    Guidera, Stan

    2002-01-01

    Design visualization can increase awareness of issues related to perceptual and psychological aspects of design that computer-assisted design and computer modeling may not allow. A pilot university course developed core skills in modeling and simulation using visualization. Students were consistently able to meet course objectives. (Contains 16…

  2. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The main goal of the research conducted with the support of this grant was to develop design oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design oriented finite element technology and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since responses to changes in geometry are essential in conceptual design of airplanes, as well as the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACSYNT was delivered to NASA Ames.

  3. Space Station Freedom natural environment design models

    NASA Technical Reports Server (NTRS)

    Suggs, Robert M.

    1993-01-01

    The Space Station Freedom program has established a series of natural environment models and databases for utilization in design and operations planning activities. The suite of models and databases that have either been selected from among internationally recognized standards or developed specifically for spacecraft design applications are presented. The models have been integrated with an orbit propagator and employed to compute environmental conditions for planned operations altitudes of Space Station Freedom.

  4. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules are fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  5. Building a Mobile HIV Prevention App for Men Who Have Sex With Men: An Iterative and Community-Driven Process

    PubMed Central

    McDougal, Sarah J; Sullivan, Patrick S; Stekler, Joanne D; Stephenson, Rob

    2015-01-01

    Background Gay, bisexual, and other men who have sex with men (MSM) account for a disproportionate burden of new HIV infections in the United States. Mobile technology presents an opportunity for innovative interventions for HIV prevention. Some HIV prevention apps currently exist; however, it is challenging to encourage users to download these apps and use them regularly. An iterative research process that centers on the community’s needs and preferences may increase the uptake, adherence, and ultimate effectiveness of mobile apps for HIV prevention. Objective The aim of this paper is to provide a case study to illustrate how an iterative community approach to a mobile HIV prevention app can lead to changes in app content to appropriately address the needs and the desires of the target community. Methods In this three-phase study, we conducted focus group discussions (FGDs) with MSM and HIV testing counselors in Atlanta, Seattle, and US rural regions to learn preferences for building a mobile HIV prevention app. We used data from these groups to build a beta version of the app and theater tested it in additional FGDs. A thematic data analysis examined how this approach addressed preferences and concerns expressed by the participants. Results There was an increased willingness to use the app during theater testing than during the first phase of FGDs. Many concerns that were identified in phase one (eg, disagreements about reminders for HIV testing, concerns about app privacy) were considered in building the beta version. Participants perceived these features as strengths during theater testing. However, some disagreements were still present, especially regarding the tone and language of the app. Conclusions These findings highlight the benefits of using an interactive and community-driven process to collect data on app preferences when building a mobile HIV prevention app. Through this process, we learned how to be inclusive of the larger MSM population without

  6. Minority Utility Rate Design Assessment Model

    Energy Science and Technology Software Center (ESTSC)

    2003-01-20

    Econometric model simulates consumer demand response to various user-supplied, two-part tariff electricity rate designs and assesses their economic welfare impact on Black, Hispanic, poor, and majority households.

  7. A Model for Teaching Information Design

    ERIC Educational Resources Information Center

    Pettersson, Rune

    2011-01-01

    The author presents his views on the teaching of information design. The starting point includes some general aspects of teaching and learning. The multidisciplinary structure and content of information design as well as the combined practical and theoretical components influence studies of the discipline. Experiences from working with a model for…

  8. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1993-01-01

    Over the past several years, it has been the primary goal of this grant to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp. This has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (SMART), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, it is the purpose of the personnel of this grant to provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in the areas of computer aided design, geometric surface representation, and parallel algorithms.

  9. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983
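
    As a hedged, toy illustration of the idea (not the authors' sampling-based algorithm; a brute-force search over a hypothetical design grid is used instead), one can search candidate sets of retention intervals for the design that best separates the predictions of competing power-law and exponential forgetting models:

```python
import numpy as np
from itertools import combinations

# Two competing retention models with illustrative (assumed) parameters.
power_model = lambda t, a=0.95, b=0.5: a * (t + 1.0) ** (-b)
expo_model = lambda t, a=0.95, b=0.12: a * np.exp(-b * t)

# Candidate retention intervals in seconds; choose the 3-point design that
# maximizes a simple discriminability criterion (squared prediction difference).
grid = range(1, 61)
best = max(combinations(grid, 3),
           key=lambda d: np.sum((power_model(np.array(d)) - expo_model(np.array(d))) ** 2))
print("most diagnostic retention intervals:", best)
```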

  10. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  11. Instructional Design in Education: New Model

    ERIC Educational Resources Information Center

    Isman, Aytekin

    2011-01-01

    The main goal of the new instructional design model is to organize long term and full learning activities. The new model is based on the theoretical foundation of behaviorism, cognitivism and constructivism. During teaching and learning activities, learners are active and use cognitive, constructivist, or behaviorist learning to construct new…

  12. A compact MOST model for design analysis.

    NASA Technical Reports Server (NTRS)

    Kalinowski, J. J.

    1972-01-01

    A generalized extension of Kotani's (1970) design analysis model is described that is accurate for a majority of modern MOST structures. The generalized model retains much of the ease of parameter measurement characteristics of the original and enough physical correspondence to accurately represent important steady-state environmental effects, such as those caused by radiation.

  13. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  14. Propulsion System Models for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2014-01-01

    The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.

  15. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, resulting in a streamlined exchange of data between programs, reducing errors and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  16. A biofilm model for engineering design.

    PubMed

    Takács, I; Bye, C M; Chapman, K; Dold, P L; Fairlamb, P M; Jones, R M

    2007-01-01

    A biofilm model is presented for process engineering purposes--wastewater treatment plant design, upgrade and optimisation. The model belongs in the 1D dynamic layered biofilm model category, with modifications that allow it to be used with one parameter set for a large range of process situations. The biofilm model is integrated with a general activated sludge/anaerobic digestion model combined with a chemical equilibrium, precipitation and pH module. This allows the model to simulate the complex interactions that occur in the aerobic, anoxic and anaerobic layers of the biofilm. The model has been tested and is shown to match a variety of design guidelines, as well as experimental results from batch testing and full-scale plant operation. Both moving bed bioreactors (MBBR) and integrated fixed film activated sludge (IFAS) systems were simulated using the same model and parameter set. A new steady-state solver generates fast solutions and allows interactive design work with the complex model. PMID:17547002

  17. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  18. Designing and encoding models for synthetic biology

    PubMed Central

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-01-01

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology ‘loop’. PMID:19364720

  19. A Biologically Inspired Network Design Model

    PubMed Central

    Zhang, Xiaoge; Adamatzky, Andrew; Chan, Felix T.S.; Deng, Yong; Yang, Hai; Yang, Xin-She; Tsompanas, Michail-Antisthenis I.; Sirakoulis, Georgios Ch.; Mahadevan, Sankaran

    2015-01-01

    A network design problem is to select a subset of links in a transport network that satisfies passenger or cargo transportation demands while minimizing the overall cost of transportation. We propose a mathematical model of the foraging behaviour of the slime mould P. polycephalum to solve the network design problem and construct optimal transport networks. In our algorithm, a traffic flow between any two cities is estimated using a gravity model. The flow is imitated by the model of the slime mould. The model converges to a steady state, which represents a solution of the problem. We validate our approach on examples of major transport networks in Mexico and China. By comparing networks developed in our approach with the man-made highways, networks developed by the slime mould, and a cellular automata model inspired by slime mould, we demonstrate the flexibility and efficiency of our approach. PMID:26041508
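
    The gravity model mentioned above is a standard way to estimate origin-destination demand; a minimal sketch with hypothetical city populations, coordinates, and constants (none taken from the paper) might look like this:

```python
import numpy as np

# Hypothetical city populations and planar coordinates (km); purely illustrative.
populations = np.array([8.9e6, 1.5e6, 2.2e6])
coords = np.array([[0.0, 0.0], [300.0, 50.0], [120.0, 400.0]])

def gravity_flows(pop, xy, k=1e-6, beta=2.0):
    """Estimate origin-destination flows T_ij = k * P_i * P_j / d_ij**beta."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # no self-flow
    return k * np.outer(pop, pop) / d**beta

print(np.round(gravity_flows(populations, coords), 1))
```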

  20. Global optimization of bilinear engineering design models

    SciTech Connect

    Grossmann, I.; Quesada, I.

    1994-12-31

    Recently Quesada and Grossmann have proposed a global optimization algorithm for solving NLP problems involving linear fractional and bilinear terms. This model has been motivated by a number of applications in process design. The proposed method relies on the derivation of a convex NLP underestimator problem that is used within a spatial branch and bound search. This paper explores the use of alternative bounding approximations for constructing the underestimator problem. These are applied in the global optimization of problems arising in different engineering areas and for which different relaxations are proposed depending on the mathematical structure of the models. These relaxations include linear and nonlinear underestimator problems. Reformulations that generate additional estimator functions are also employed. Examples from process design, structural design, portfolio investment and layout design are presented.
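
    The convex underestimators referred to for bilinear terms are typically the McCormick envelopes; as a generic illustration (standard textbook form, not specific to this paper), a bilinear term w = xy with bounds x in [x^L, x^U] and y in [y^L, y^U] is relaxed by the four linear inequalities:

```latex
\begin{aligned}
w &\ge x^{L} y + x\, y^{L} - x^{L} y^{L}, &\qquad w &\ge x^{U} y + x\, y^{U} - x^{U} y^{U},\\
w &\le x^{U} y + x\, y^{L} - x^{U} y^{L}, &\qquad w &\le x^{L} y + x\, y^{U} - x^{L} y^{U}.
\end{aligned}
```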

  1. Panoramic imaging perimeter sensor design and modeling

    SciTech Connect

    Pritchard, D.A.

    1993-12-31

    This paper describes the conceptual design and preliminary performance modeling of a 360-degree imaging sensor. This sensor combines automatic perimeter intrusion detection with immediate visual assessment and is intended to be used for fast deployment around fixed or temporary high-value assets. The sensor requirements, compiled from various government agencies, are summarized. The conceptual design includes longwave infrared and visible linear array technology. An auxiliary millimeter-wave sensing technology is also considered for use during periods of infrared and visible obscuration. The infrared detectors proposed for the sensor design are similar to the Standard Advanced Dewar Assembly Types Three A and B (SADA-IIIA/B). An overview of the sensor and processor is highlighted. The infrared performance of this sensor design has been predicted using existing thermal imaging system models and is described in the paper. Future plans for developing a prototype are also presented.

  2. Human visual performance model for crewstation design

    NASA Astrophysics Data System (ADS)

    Larimer, James O.; Prevost, Michael P.; Arditi, Aries R.; Azueta, Steven; Bergen, James R.; Lubin, Jeffrey

    1991-08-01

    In a cockpit, the crewstation of an airplane, the ability of the pilot to unambiguously perceive rapidly changing information both internal and external to the crewstation is critical. To assess the impact of crewstation design decisions on the pilot's ability to perceive information, the designer needs a means of evaluating the trade-offs that result from different designs. The Visibility Modeling Tool (VMT) provides the designer with a CAD tool for assessing these trade-offs. It combines the technologies of computer graphics, computational geometry, human performance modeling and equipment modeling into a computer-based interactive design tool. Through a simple interactive interface, a designer can manipulate design parameters such as the geometry of the cockpit, environmental factors such as ambient lighting, pilot parameters such as point of regard and adaptation state, and equipment parameters such as the location of displays, their size and the contrast of displayed symbology. VMT provides an end-to-end analysis that answers questions such as 'Will the pilot be able to read the display?' Performance data can be projected, in the form of 3D contours, into the crewstation graphic model, providing the designer with a footprint of the operator's visual capabilities, defining, for example, the regions in which fonts of a particular type, size and contrast can be read without error. Geometrical data such as the pilot's volume field of view, occlusions caused by facial geometry, helmet margins, and objects in the crewstation can also be projected into the crewstation graphic model with respect to the coordinates of the aviator's eyes and fixation point. The intersections of the projections with objects in the crewstation delineate the area of coverage, masking, or occlusion associated with the objects. Objects in the crewstation space can be projected onto models of the operator's retinas. These projections can be used to provide the designer with the

  3. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  4. Component Latent Trait Models for Test Design.

    ERIC Educational Resources Information Center

    Embretson, Susan Whitely

    Latent trait models are presented that can be used for test design in the context of a theory about the variables that underlie task performance. Examples of methods for decomposing and testing hypotheses about the theoretical variables in task performance are given. The methods can be used to determine the processing components that are involved…

  5. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  6. Community-driven learning activities, creating futures: 30,000 people can't be wrong - can they?

    PubMed

    Dowrick, Peter W

    2007-03-01

    A major vehicle for the practice of community psychology is the organization of community-based activities. My colleagues and I have developed many programs for community learning centers, in-school and after-school programs, and community technology centers. In the last 10 years, 30,000 people (mostly children) have participated in activities designed for enjoyment and learning, with a view to adding protective factors and reducing negative factors in at-risk communities. Development of these programs for literacy, education, and life and work skills has increasingly followed a community-responsive model. Within each program, we created explicit images of future success. That is, people could see themselves being successful where they normally fail: self-modeling with feedforward. Data reports show that individuals generalized and maintained their new skills and attitudes, but the sustainability of programs has been variable. Analysis of the variations indicates the importance of program-level feedforward that brings the future into the present. The discussion includes consideration of how individual-level and community-level practices can inform each other. PMID:17437187

  7. A stress index model for balloon design

    NASA Technical Reports Server (NTRS)

    Smith, I. S.

    1987-01-01

    A NASA stress index model, SINDEX, is discussed which establishes the relative stress magnitudes along a balloon gore as a function of altitude. Application of the model to a database of over 550 balloon flights demonstrates the effectiveness of the method. The results show a strong correlation between stress levels and failure rates, with the point of maximum stress coinciding with the observed failure locations. It is suggested that the model may be used during the balloon design process to lower the levels of stress in the balloon.

  8. Jovian plasma modeling for mission design

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin

    2015-01-01

    The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1", T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approx. 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by approx. 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of the surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function--an important component of the plasma model necessary to compute the particle fluxes between approx. 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design). The modifications and
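
    For reference, one standard form of the kappa (generalized Lorentzian) distribution used in such plasma flux models is

```latex
f_{\kappa}(v) \;\propto\; \left(1 + \frac{v^{2}}{\kappa\,\theta^{2}}\right)^{-(\kappa+1)},
```

    which reduces to a Maxwellian as kappa goes to infinity and produces the power-law suprathermal tail needed to represent fluxes in the few-keV to 100 keV range. (The exact normalization adopted in Nascap-2k is not given in the abstract.)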

  9. Jovian Plasma Modeling for Mission Design

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin

    2015-01-01

    The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1", T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approx. 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by approx. 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of the surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function--an important component of the plasma model necessary to compute the particle fluxes between approx. 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design). The modifications and

  10. EUV Focus Sensor: Design and Modeling

    SciTech Connect

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3-NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths), and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  11. EUV focus sensor: design and modeling

    NASA Astrophysics Data System (ADS)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3-NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths), and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  12. Modelling, analyses and design of switching converters

    NASA Technical Reports Server (NTRS)

    Cuk, S. M.; Middlebrook, R. D.

    1978-01-01

    A state-space averaging method for modelling switching dc-to-dc converters for both continuous and discontinuous conduction mode is developed. In each case the starting point is the unified state-space representation, and the end result is a complete linear circuit model, for each conduction mode, which correctly represents all essential features, namely, the input, output, and transfer properties (static dc as well as dynamic ac small-signal). While the method is generally applicable to any switching converter, it is extensively illustrated for the three common power stages (buck, boost, and buck-boost). The results for these converters are then easily tabulated owing to the fixed equivalent circuit topology of their canonical circuit model. The insights that emerge from the general state-space modelling approach lead to the design of new converter topologies through the study of generic properties of the cascade connection of basic buck and boost converters.
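
    As a hedged illustration of the averaging step in its standard textbook form (not quoted from the paper): with state matrices (A_1, b_1) applying during the fraction d of the switching period and (A_2, b_2) during the remaining 1 - d, the averaged small-ripple model and the resulting ideal dc conversion ratios of the three basic converters are

```latex
\dot{\bar{x}} = \bigl[d A_{1} + (1-d) A_{2}\bigr]\bar{x} + \bigl[d b_{1} + (1-d) b_{2}\bigr] v_{g},
\qquad
M_{\text{buck}} = D,\quad M_{\text{boost}} = \frac{1}{1-D},\quad M_{\text{buck-boost}} = -\frac{D}{1-D}.
```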

  13. Cryogenic Wind Tunnel Models. Design and Fabrication

    NASA Technical Reports Server (NTRS)

    Young, C. P., Jr. (Compiler); Gloss, B. B. (Compiler)

    1983-01-01

    The principal motivating factor was the National Transonic Facility (NTF). Since the NTF can achieve significantly higher Reynolds numbers at transonic speeds than other wind tunnels in the world, and will therefore occupy a unique position among ground test facilities, every effort is being made to ensure that model design and fabrication technology exists to allow researchers to take advantage of this high Reynolds number capability. Since a great deal of experience in designing and fabricating cryogenic wind tunnel models does not exist, and since the experience that does exist is scattered over a number of organizations, there is a need to bring existing experience in these areas together and share it among all interested parties. Representatives from government, the airframe industry, and universities are included.

  14. Human visual performance model for crewstation design

    NASA Technical Reports Server (NTRS)

    Larimer, James; Prevost, Michael; Arditi, Aries; Azueta, Steven; Bergen, James; Lubin, Jeffrey

    1991-01-01

    An account is given of a Visibility Modeling Tool (VMT) which furnishes a crew-station designer with the means to assess configurational tradeoffs, with a view to the impact of various options on the unambiguous access of information to the pilot. The interactive interface of the VMT allows the manipulation of cockpit geometry, ambient lighting, pilot ergonomics, and the displayed symbology. Performance data can be displayed in the form of 3D contours into the crewstation graphic model, thereby yielding an indication of the operator's visual capabilities.

  15. Thermal Transport Model for Heat Sink Design

    NASA Technical Reports Server (NTRS)

    Chervenak, James A.; Kelley, Richard L.; Brown, Ari D.; Smith, Stephen J.; Kilbourne, Caroline A.

    2009-01-01

    A document discusses the development of a finite element model for describing thermal transport through microcalorimeter arrays in order to assist in heat-sinking design. A fabricated multi-absorber transition edge sensor (PoST) was designed in order to reduce device wiring density by a factor of four. The finite element model consists of breaking the microcalorimeter array into separate elements, including the transition edge sensor (TES) and the silicon substrate on which the sensor is deposited. Each element is then broken up into subelements, whose surface area subtends 10 × 10 microns. The heat capacity per unit temperature, thermal conductance, and thermal diffusivity of each subelement are the model inputs, as are the temperatures of each subelement. Numerical integration using the Finite in Time Centered in Space algorithm of the thermal diffusion equation is then performed in order to obtain a temporal evolution of the subelement temperature. Thermal transport across interfaces is modeled using a thermal boundary resistance obtained using the acoustic mismatch model. The document concludes with a discussion of the PoST fabrication. PoSTs are novel because they enable incident x-ray position sensitivity with good energy resolution and low wiring density.
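
    A minimal sketch of the kind of explicit time-stepping described, reduced to one dimension for brevity and with assumed (not the paper's) material constants:

```python
import numpy as np

# Toy 1D thermal diffusion integrated with an explicit forward-time,
# centered-space step, analogous to the subelement integration described above.
nx, dx = 100, 10e-6            # 100 subelements at a 10-micron pitch
alpha = 1e-5                   # thermal diffusivity (m^2/s), assumed value
dt = 0.4 * dx**2 / alpha       # step chosen inside the explicit stability limit

T = np.full(nx, 0.05)          # bath temperature (K), e.g. a 50 mK heat sink
T[nx // 2] += 0.001            # small heat deposition in the central subelement

for _ in range(500):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0] = T[-1] = 0.05        # boundaries pinned to the heat-sink temperature

print(f"peak excursion after relaxation: {T.max() - 0.05:.2e} K")
```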

  16. Modeling and design of millimeter wave gyroklystrons

    NASA Astrophysics Data System (ADS)

    Levush, B.; Blank, M.; Calame, J.; Danly, B.; Nguyen, K.; Pershing, D.; Cooke, S.; Latham, P.; Petillo, J.; Antonsen, T.

    1999-05-01

    A series of high power, high efficiency Ka-band and W-band gyroklystron experiments has been conducted recently at the Naval Research Laboratory (NRL). Stagger tuning of the cavities for bandwidth enhancement is commonly used in conventional multicavity klystrons. The desired stagger tuning is usually achieved via mechanical tuning of the individual cavities. However, in the millimeter-wave regime, particularly in the case of high average power operation, it is desirable to be able to achieve the required stagger tuning by design. The NRL gyroklystron experiments explored the tradeoffs between power, bandwidth, efficiency, and gain to study the effects of large stagger tuning in the millimeter-wave regime without resorting to mechanical tuning of the cavities. Both Ka-band and W-band experiments demonstrated a record power-bandwidth product. The success of the experiments was due in large part to a battery of improved large-signal, stability, and cold test codes employed in the modeling and design stage. Theoretical models that provide the basis for these design simulation tools and the design methodology will be presented.

  17. Understanding backward design to strengthen curricular models.

    PubMed

    Emory, Jan

    2014-01-01

    Nurse educators have responded to the call for transformation in education. Challenges remain in planning curricular implementation to facilitate understanding of essential content for student success on licensure examinations and in professional practice. The conceptual framework Backward Design (BD) can support and guide curriculum decisions. Using BD principles in conjunction with educational models can strengthen and improve curricula. This article defines and describes the BD process, and identifies reported benefits for nursing education. PMID:24743175

  18. Design and modelling of a SMES coil

    NASA Astrophysics Data System (ADS)

    Yuan, Weijia; Campbell, A. M.; Coombs, T. A.

    2010-06-01

    The design of a Superconducting Magnetic Energy Storage (SMES) coil wound with coated conductors is presented. Based on an existing model for coated conductor pancake coils, this paper analyses the magnetic field and current density distribution of the coil at two different operating temperatures, 77 K and 22 K. A comparison table of the critical currents and AC losses at these two temperatures is given. Several steps to improve the transport current of the coil are suggested as well.

  19. Models of Design: Envisioning a Future Design Education

    ERIC Educational Resources Information Center

    Friedman, Ken

    2012-01-01

    This article offers a large-scale view of how design fits in the world economy today, and the role of design education in preparing designers for their economic and professional role. The current context of design involves broad-based historical changes including a major redistribution of geopolitical and industrial power from the West to the…

  20. Computerized design of controllers using data models

    NASA Technical Reports Server (NTRS)

    Irwin, Dennis; Mitchell, Jerrel; Medina, Enrique; Allwine, Dan; Frazier, Garth; Duncan, Mark

    1995-01-01

    The major contributions of the grant effort have been the enhancement of the Compensator Improvement Program (CIP), which resulted in the Ohio University CIP (OUCIP) package, and the development of the Model and Data-Oriented Computer Aided Design System (MADCADS). Incorporation of direct z-domain designs into CIP was tested and determined to be numerically ill-conditioned for the type of lightly damped problems for which the development was intended. Therefore, it was decided to pursue the development of z-plane designs in the w-plane, and to make this conversion transparent to the user. The analytical development needed for this feature, as well as that needed for including compensator damping ratios and DC gain specifications, closed loop stability requirements, and closed loop disturbance rejection specifications into OUCIP, is contained in Section 3. OUCIP was successfully tested with several example systems to verify proper operation of existing and new features. The extension of the CIP philosophy and algorithmic approach to handle modern multivariable controller design criteria was implemented and tested. Several new algorithms for implementing the search approach to modern multivariable control system design were developed and tested. This analytical development, most of which was incorporated into the MADCADS software package, is described in Section 4, which also includes results of the application of MADCADS to the MSFC ACES facility and the Hubble Space Telescope.

  1. Reduced cost mission design using surrogate models

    NASA Astrophysics Data System (ADS)

    Feldhacker, Juliana D.; Jones, Brandon A.; Doostan, Alireza; Hampton, Jerrad

    2016-01-01

    This paper uses surrogate models to reduce the computational cost associated with spacecraft mission design in three-body dynamical systems. Sampling-based least squares regression is used to project the system response onto a set of orthogonal bases, providing a representation of the ΔV required for rendezvous as a reduced-order surrogate model. Models are presented for mid-field rendezvous of spacecraft in orbits in the Earth-Moon circular restricted three-body problem, including a halo orbit about the Earth-Moon L2 libration point (EML-2) and a distant retrograde orbit (DRO) about the Moon. In each case, the initial position of the spacecraft, the time of flight, and the separation between the chaser and the target vehicles are all considered as design inputs. The results show that sample sizes on the order of 10² are sufficient to produce accurate surrogates, with RMS errors reaching 0.2 m/s for the halo orbit and falling below 0.01 m/s for the DRO. A single function call to the resulting surrogate is up to two orders of magnitude faster than computing the same solution using full fidelity propagators. The expansion coefficients solved for in the surrogates are then used to conduct a global sensitivity analysis of the ΔV on each of the input parameters, which identifies the separation between the spacecraft as the primary contributor to the ΔV cost. Finally, the models are demonstrated to be useful for cheap evaluation of the cost function in constrained optimization problems seeking to minimize the ΔV required for rendezvous. These surrogate models show significant advantages for mission design in three-body systems, in terms of both computational cost and capabilities, over traditional Monte Carlo methods.
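
    A hedged sketch of the regression step, using a generic Legendre basis and a synthetic stand-in for the expensive propagator (the actual bases, inputs, and dynamics used by the authors are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the expensive model: delta-V as a function of 3 normalized inputs.
def expensive_dv(x):
    return 1.0 + 2.0 * x[:, 2] + 0.5 * x[:, 0] * x[:, 2] + 0.1 * np.sin(3 * x[:, 1])

X = rng.uniform(-1, 1, size=(100, 3))      # ~10^2 training samples, as in the abstract
y = expensive_dv(X)

def basis(x):
    """Legendre basis up to total degree 2 in 3 variables."""
    p1, p2 = x, 0.5 * (3 * x**2 - 1)
    cols = [np.ones(len(x))]
    cols += [p1[:, i] for i in range(3)] + [p2[:, i] for i in range(3)]
    cols += [p1[:, i] * p1[:, j] for i in range(3) for j in range(i + 1, 3)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)   # least-squares projection

Xt = rng.uniform(-1, 1, size=(1000, 3))               # cheap surrogate evaluation
rms = np.sqrt(np.mean((basis(Xt) @ coef - expensive_dv(Xt)) ** 2))
print(f"surrogate RMS error: {rms:.3f}")
```

    In a polynomial-chaos setting, the squared coefficients of such an orthogonal expansion can be combined into Sobol-type indices, which is the kind of global sensitivity analysis the abstract refers to.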

  2. Steel shear walls, behavior, modeling and design

    SciTech Connect

    Astaneh-Asl, Abolhassan

    2008-07-08

    In recent years, steel shear walls have become one of the more efficient lateral load resisting systems in tall buildings. The basic steel shear wall system consists of a steel plate welded to boundary steel columns and boundary steel beams. In some cases the boundary columns have been concrete-filled steel tubes. Seismic behavior of steel shear wall systems during actual earthquakes and based on laboratory cyclic tests indicates that the systems are quite ductile and can be designed in an economical way to have sufficient stiffness, strength, ductility and energy dissipation capacity to resist seismic effects of strong earthquakes. This paper, after summarizing the past research, presents the results of two tests of an innovative steel shear wall system where the boundary elements are concrete-filled tubes. Then, a review of currently available analytical models of steel shear walls is provided with a discussion of capabilities and limitations of each model. We have observed that the tension-only 'strip model', forming the basis of the current AISC seismic design provisions for steel shear walls, is not capable of predicting the behavior of steel shear walls with length-to-thickness ratio less than about 600, which is the range most common in buildings. The main reason for this shortcoming is that the AISC seismic design provisions ignore the compression field in the shear walls, which can be significant in typical shear walls. The AISC method also is not capable of incorporating stresses in the shear wall due to overturning moments. A more rational seismic design procedure for design of shear walls proposed in 2000 by the author is summarized in the paper. The design method, based on procedures used for design of steel plate girders, takes into account both tension and compression stress fields and is applicable to all values of length-to-thickness ratios of steel shear walls. The method is also capable of including the effect of

  3. Model Reduction for Control System Design

    NASA Technical Reports Server (NTRS)

    Enns, D. F.

    1985-01-01

    An approach and a technique for effectively obtaining reduced order mathematical models of a given large order model for the purposes of synthesis, analysis and implementation of control systems are developed. This approach involves the use of an error criterion which is the H-infinity norm of a frequency weighted error between the full and reduced order models. The weightings are chosen to take into account the purpose for which the reduced order model is intended. A previously unknown error bound in the H-infinity norm for reduced order models obtained from internally balanced realizations was derived. This motivated further development of the balancing technique to include frequency-dependent weightings. This resulted in the frequency weighted balanced realization and a new model reduction technique. Two approaches to designing reduced order controllers were developed. The first involves reducing the order of a high order controller with an appropriate weighting. The second involves linear quadratic Gaussian synthesis based on a reduced order model obtained with an appropriate weighting.
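
    For the unweighted case, the bound now commonly associated with balanced truncation (the frequency-weighted version developed in this work is more involved) states that discarding states r+1 through n of a balanced realization with Hankel singular values sigma_1 >= ... >= sigma_n costs at most

```latex
\lVert G - G_{r} \rVert_{\infty} \;\le\; 2 \sum_{i=r+1}^{n} \sigma_{i}.
```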

  4. Model-Based Design of Biochemical Microreactors

    PubMed Central

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M.; Voll, Lars M.; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P

  5. Model-Based Design of Biochemical Microreactors.

    PubMed

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M; Voll, Lars M; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P

  6. Design of the UCLA general circulation model

    NASA Technical Reports Server (NTRS)

    Arakawa, A.

    1972-01-01

    An edited version is reported of notes distributed at the Summer Workshop on the UCLA General Circulation Model in June 1971. It presents the computational schemes of the UCLA model, along with the mathematical and physical principles on which these schemes are based. Included are the finite difference schemes for the governing fluid-dynamical equations, designed to maintain the important integral constraints and dispersion characteristics of the motion. Also given are the principles of parameterization of cumulus convection by an ensemble of identical clouds. A model of the ground hydrology, involving the liquid, ice and snow states of water, is included. A short summary is given of the scheme for computing solar and infrared radiation transfers through clear and cloudy air.

  7. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  8. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  9. Observer design for a schistosomiasis model.

    PubMed

    Diaby, Mouhamadou; Iggidr, Abderrahman; Sy, Mamadou

    2015-11-01

    This paper deals with state estimation for a schistosomiasis infection dynamical model described by a continuous nonlinear system when only the infected human population is measured. The central idea is studied from two major angles. On the one hand, when all the parameters of the model are supposed to be well known, we construct a simple observer and a high-gain Luenberger observer based on a canonical controller form and conceived for the nonlinear dynamics where it is implemented. On the other hand, when the nonlinear uncertain continuous-time system is in a bounded-error context, we introduce a method for designing a guaranteed interval observer. Numerical simulations are included in order to test the behavior and the performance of the given observers. PMID:26334676
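
    As a hedged, generic illustration of the Luenberger idea on a toy linear system (not the schistosomiasis model itself): the observer copies the plant dynamics and corrects its state with the output error through a gain L chosen so that A - LC is stable.

```python
import numpy as np

# Toy 2-state system x' = A x with only the first state measured, y = C x.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
C = np.array([[1.0, 0.0]])
L = np.array([[2.0], [3.0]])        # gain chosen so that A - L @ C is Hurwitz

dt, steps = 0.01, 2000
x = np.array([1.0, 0.0])            # true (unknown) initial state
xh = np.zeros(2)                    # observer initialized at zero

for _ in range(steps):
    y = C @ x                                              # measurement from the plant
    x = x + dt * (A @ x)                                   # plant (Euler step)
    xh = xh + dt * (A @ xh + (L @ (y - C @ xh)).ravel())   # observer with output correction

print("final estimation error:", np.abs(x - xh))
```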

  10. Generalized mathematical models in design optimization

    NASA Technical Reports Server (NTRS)

    Papalambros, Panos Y.; Rao, J. R. Jagannatha

    1989-01-01

    The theory of optimality conditions of extremal problems can be extended to problems continuously deformed by an input vector. The connection between the sensitivity, well-posedness, stability and approximation of optimization problems is steadily emerging. The authors believe that the important realization here is that the underlying basis of all such work is still the study of point-to-set maps and of small perturbations, yet what has been identified previously as being just related to solution procedures is now being extended to study modeling itself in its own right. Many important studies related to the theoretical issues of parametric programming and large deformation in nonlinear programming have been reported in the last few years, and the challenge now seems to be in devising effective computational tools for solving these generalized design optimization models.

  11. Physiological modeling for hearing aid design

    NASA Astrophysics Data System (ADS)

    Bruce, Ian C.; Young, Eric D.; Sachs, Murray B.

    2002-05-01

    Physiological data from hearing-impaired cats suggest that conventional hearing aid signal-processing schemes do not restore normal auditory-nerve responses to a vowel [Miller et al., J. Acoust. Soc. Am. 101, 3602 (1997)] and can even produce anomalous and potentially confounding patterns of activity [Schilling et al., Hear. Res. 117, 57 (1998)]. These deficits in the neural representation may account at least partially for poor speech perception in some hearing aid users. An amplification scheme has been developed that produces neural responses to a vowel more like those seen in normal cats and that reduces confounding responses [Miller et al., J. Acoust. Soc. Am. 106, 2693 (1999)]. A physiologically accurate model of the normal and impaired auditory periphery would provide simpler and quicker testing of such potential hearing aid designs. Details of such a model, based on that of Zhang et al. [J. Acoust. Soc. Am. 109, 648 (2001)], will be presented. Model predictions suggest that impairment of both outer- and inner-hair cells contribute to the degraded representation of vowels in hearing-impaired cats. The model is currently being used to develop and test a generalization of the Miller et al. speech-processing algorithm described above to running speech. [Work supported by NIDCD Grants DC00109 and DC00023.] a)Now with the Dept. of Electrical and Computer Engineering, McMaster Univ., 1280 Main St. W., Hamilton, ON L8S 4K1, Canada.

  12. Model to Design Drip Hose Lateral Line

    NASA Astrophysics Data System (ADS)

    Ludwig, Rafael; Cury Saad, João Carlos

    2014-05-01

    Introduction The design criterion for non-pressure-compensating drip hose is normally to allow 10% flow variation (Δq) in the lateral line, corresponding to 20% head pressure variation (ΔH). Longer lateral lines in drip irrigation systems using conventional drippers reduce cost, but irrigation uniformity must still be achieved [1]. Allowing higher Δq levels can provide longer lateral lines. [4] proposed the use of a 30% Δq and found that this value resulted in distribution uniformity over 80%. [1] considered it possible to extend the lateral line length by using two emitter spacings in different sections. He assumed that the spacing change point would be at 40% of the total length, because this is approximately the location of the average flow according to [2]. [3] found that, for practical purposes, the average pressure is located at 40% of the length of the lateral line and that by this point 75% of the total pressure head loss (hf) has already occurred. The challenge for designers is therefore to obtain longer lateral lines with high uniformity. Objective The objective of this study was to develop a model to design longer lateral lines using non-pressure-compensating drip hose. Using the developed model, the hypotheses to be evaluated were: a) the use of two different emitter spacings in the same lateral line allows a longer length; b) it is possible to obtain longer lateral lines using higher values of pressure variation in the lateral line, provided the distribution uniformity remains within allowable limits. Methodology A computer program based on the developed model was written in Delphi® and is able to design level lateral lines using non-pressure-compensating drip hose. The input data are: desired distribution uniformity (DU); initial and final pressure in the lateral line; coefficients of the relationship between emitter discharge and pressure head; hose internal diameter; pipe cross-sectional area
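
    A hedged sketch of the step-by-step hydraulic calculation such a program performs, assuming an emitter law q = k·H^x and Hazen-Williams friction loss (all coefficients below are illustrative, not the paper's):

```python
# Walk emitter-by-emitter from the closed downstream end of a level lateral line,
# accumulating flow and Hazen-Williams friction loss over each emitter spacing.
def lateral_profile(n_emitters, spacing_m, d_mm, h_end_m, k=4.0e-7, x=0.5, c_hw=140.0):
    d = d_mm / 1000.0
    h, q_line, heads = h_end_m, 0.0, []
    for _ in range(n_emitters):
        heads.append(h)
        q_line += k * h**x                          # emitter law q = k * H^x (m^3/s)
        hf = 10.67 * (q_line / c_hw) ** 1.852 * spacing_m / d ** 4.87
        h += hf                                     # head required one spacing upstream
    return h, heads                                 # inlet head and per-emitter heads

h_in, heads = lateral_profile(n_emitters=200, spacing_m=0.3, d_mm=16, h_end_m=8.0)
q = [4.0e-7 * hh**0.5 for hh in heads]              # emitter flows along the line
print(f"inlet head: {h_in:.2f} m, flow variation: {(max(q) - min(q)) / max(q):.1%}")
```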

  13. The Healthy Start Initiative: A Community-Driven Approach to Infant Mortality Reduction--Vol. I. Consortia Development.

    ERIC Educational Resources Information Center

    McCoy-Thompson, Meri

    The purpose of the Healthy Start Initiative, a national demonstration program, is to reduce infant mortality by 50 percent in 15 communities. At the heart of the initiative is the belief that the community, guided by a consortium of individuals and organizations from many sectors, can best design and implement the services needed by the women and…

  14. Modeling and control design of a wind tunnel model support

    NASA Technical Reports Server (NTRS)

    Howe, David A.

    1990-01-01

    The 12-Foot Pressure Wind Tunnel at Ames Research Center is being restored. A major part of the restoration is the complete redesign of the aircraft model supports and their associated control systems. An accurate trajectory control servo system capable of positioning a model (with no measurable overshoot) is needed. Extremely small errors in scaled-model pitch angle can increase airline fuel costs for the final aircraft configuration by millions of dollars. In order to make a mechanism sufficiently accurate in pitch, a detailed structural and control-system model must be created and then simulated on a digital computer. The model must contain linear representations of the mechanical system, including masses, springs, and damping, in order to determine system modes. Electrical components, both analog and digital, linear and nonlinear, must also be simulated. The model of the entire closed-loop system must then be tuned to control the modes of the flexible model-support structure. The development of a system model, the control modal analysis, and the control-system design are discussed.
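
    As a rough illustration of the kind of linear lumped-parameter representation the abstract describes (masses, springs, and damping assembled to determine system modes), the sketch below builds a small state-space model and extracts its natural frequencies. The matrices are invented for illustration and do not describe the actual model-support hardware.

      # Illustrative modal analysis of a lumped mass-spring-damper chain.
      # Mass, stiffness, and damping values are made up for this example.
      import numpy as np
      from scipy.linalg import eig

      m = np.diag([50.0, 20.0])                          # kg
      k = np.array([[4.0e5, -1.5e5], [-1.5e5, 1.5e5]])   # N/m
      c = 1e-4 * k                                       # light proportional damping

      # First-order (state-space) form: state = [displacements, velocities]
      n = m.shape[0]
      A = np.block([[np.zeros((n, n)), np.eye(n)],
                    [-np.linalg.solve(m, k), -np.linalg.solve(m, c)]])
      eigvals, _ = eig(A)
      freqs_hz = np.abs(eigvals.imag) / (2 * np.pi)
      print("structural modes (Hz):", np.unique(np.round(freqs_hz, 2)))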

  15. Making designer mutants in model organisms

    PubMed Central

    Peng, Ying; Clark, Karl J.; Campbell, Jarryd M.; Panetta, Magdalena R.; Guo, Yi; Ekker, Stephen C.

    2014-01-01

    Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. In this Review, we summarize the common approaches and applications of these still-evolving tools as they are being used in the most popular model developmental systems. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms. PMID:25336735

  16. S-IC Test Stand Design Model

    NASA Technical Reports Server (NTRS)

    1962-01-01

    At its founding, the Marshall Space Flight Center (MSFC) inherited the Army's Jupiter and Redstone test stands, but much larger facilities were needed for the giant stages of the Saturn V. From 1960 to 1964, the existing stands were remodeled and a sizable new test area was developed. The new comprehensive test complex for propulsion and structural dynamics was unique within the nation and the free world, and it remains so today because it was constructed with foresight to meet future as well as ongoing needs. Construction of the S-IC Static test stand complex began in 1961 in the west test area of MSFC, and was completed in 1964. The S-IC static test stand was designed to develop and test the 138-ft long and 33-ft diameter Saturn V S-IC first stage, or booster stage, weighing in at 280,000 pounds. Required to hold down the brute force of a 7,500,000-pound thrust produced by 5 F-1 engines, the S-IC static test stand was designed and constructed with the strength of hundreds of tons of steel and 12,000,000 pounds of cement, planted down to bedrock 40 feet below ground level. The foundation walls, constructed with concrete and steel, are 4 feet thick. The base structure consists of four towers with 40-foot-thick walls extending upward 144 feet above ground level. The structure was topped by a crane with a 135-foot boom. With the boom in the upright position, the stand was given an overall height of 405 feet, placing it among the highest structures in Alabama at the time. This photo is of the S-IC test stand design model created prior to construction.

  17. S-IC Test Stand Design Model

    NASA Technical Reports Server (NTRS)

    1962-01-01

    At its founding, the Marshall Space Flight Center (MSFC) inherited the Army's Jupiter and Redstone test stands, but much larger facilities were needed for the giant stages of the Saturn V. From 1960 to 1964, the existing stands were remodeled and a sizable new test area was developed. The new comprehensive test complex for propulsion and structural dynamics was unique within the nation and the free world, and it remains so today because it was constructed with foresight to meet future as well as ongoing needs. Construction of the S-IC Static test stand complex began in 1961 in the west test area of MSFC, and was completed in 1964. The S-IC static test stand was designed to develop and test the 138-ft long and 33-ft diameter Saturn V S-IC first stage, or booster stage, weighing in at 280,000 pounds. Required to hold down the brute force of a 7,500,000-pound thrust produced by 5 F-1 engines, the S-IC static test stand was designed and constructed with the strength of hundreds of tons of steel and 12,000,000 pounds of cement, planted down to bedrock 40 feet below ground level. The foundation walls, constructed with concrete and steel, are 4 feet thick. The base structure consists of four towers with 40-foot-thick walls extending upward 144 feet above ground level. The structure was topped by a crane with a 135-foot boom. With the boom in the upright position, the stand was given an overall height of 405 feet, placing it among the highest structures in Alabama at the time. This photo is of the S-IC test stand design model.

  18. Xylose fermentation: Analysis, modelling, and design

    SciTech Connect

    Slininger, P.J.W.

    1988-01-01

    Ethanolic fermentation is a means of utilizing xylose-rich industrial wastes, but an optimized bioprocess is lacking. Pachysolen tannophilus NRRL Y-7124 was the first yeast discovered capable of significant ethanol production from xylose and has served as a model for studies of other yeasts mediating this conversion. However, a comparative evaluation of strains led the authors to focus on Pichia stipitis NRRL Y-7124 as the yeast with the highest potential for application. Given 150 g/l xylose in complex medium, strain Y-7124 functioned optimally at 25-26 C and pH 4-7 to accumulate 56 g/l ethanol with negligible xylitol production. Dissolved oxygen concentration was critical to cell growth, and in order to measure it accurately, a colorimetric assay was developed to allow calibration of electrodes based on oxygen solubility in media of varying composition. Specific growth rate was a Monod function of limiting substrate concentration (oxygen and/or xylose). Both specific ethanol productivity and oxygen uptake rate were growth-associated, but only the former was maintenance-associated. Both growth and fermentation were inhibited by high xylose and ethanol concentrations. Carbon and cofactor balances supported modelling xylose metabolism as a combination of four processes: assimilation, pentose phosphate oxidation, respiration, and ethanolic fermentation. A mathematical model describing the stoichiometry and kinetics was constructed, and its predictive capacity was confirmed by comparing simulated and experimental batch cultures. Consideration of example processes indicated that this model constitutes an important tool for designing the optimum bioprocess for utilizing xylose-rich wastes.
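
    The thesis model itself combines four metabolic processes with fitted stoichiometry and kinetics; as a much-reduced illustration of the Monod-type, growth-associated formulation the abstract describes, the sketch below integrates a batch culture with assumed placeholder parameters, not the fitted constants from the original work.

      # Illustrative Monod-type batch model: growth on xylose with
      # growth-associated ethanol production. Parameters are placeholders.
      from scipy.integrate import solve_ivp

      mu_max, Ks = 0.25, 1.0      # 1/h, g/L
      Yxs, Yps = 0.10, 0.40       # g cell/g xylose, g ethanol/g xylose
      Ki_p = 60.0                 # g/L, simple ethanol inhibition threshold

      def rates(t, y):
          X, S, P = y
          mu = mu_max * S / (Ks + S) * max(0.0, 1 - P / Ki_p)
          dX = mu * X               # growth
          dS = -dX / Yxs            # xylose consumption
          dP = -Yps * dS            # growth-associated ethanol formation
          return [dX, dS, dP]

      sol = solve_ivp(rates, (0, 120), [0.5, 150.0, 0.0], max_step=0.5)
      print(f"final ethanol ~{sol.y[2, -1]:.1f} g/L after {sol.t[-1]:.0f} h")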

  19. Micromachined accelerometer design, modeling and validation

    SciTech Connect

    Davies, B.R.; Bateman, V.I.; Brown, F.A.; Montague, S.; Murray, J.R.; Rey, D.; Smith, J.H.

    1998-04-01

    Micromachining technologies enable the development of low-cost devices capable of sensing motion in a reliable and accurate manner. The development of various surface micromachined accelerometers and gyroscopes to sense motion is an ongoing activity at Sandia National Laboratories. In addition, Sandia has developed a fabrication process for integrating both the micromechanical structures and microelectronics circuitry of Micro-Electro-Mechanical Systems (MEMS) on the same chip. This integrated surface micromachining process provides substantial performance and reliability advantages in the development of MEMS accelerometers and gyros. A Sandia MEMS team developed a single-axis, micromachined silicon accelerometer capable of surviving and measuring very high accelerations, up to 50,000 times the acceleration due to gravity or 50 k-G (actually measured to 46,000 G). The Sandia integrated surface micromachining process was selected for fabrication of the sensor due to the extreme measurement sensitivity potential associated with integrated microelectronics. Measurement electronics capable of measuring attofarad (10^-18 farad) changes in capacitance were required due to the very small accelerometer proof mass (< 200 × 10^-9 gram) used in this surface micromachining process. The small proof mass corresponded to small sensor deflections, which in turn required very sensitive electronics to enable accurate acceleration measurement over a range of 1 to 50 k-G. A prototype sensor, based on a suspended plate mass configuration, was developed and the details of the design, modeling, and validation of the device will be presented in this paper. The device was analyzed using both conventional lumped parameter modeling techniques and finite element analysis tools. The device was tested and performed well over its design range.
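
    A back-of-envelope lumped-parameter calculation illustrates the signal scales behind the abstract's numbers. The proof mass is taken from the abstract; the suspension stiffness, plate area, and gap below are assumed solely for illustration.

      # Back-of-envelope lumped-parameter check for a capacitive MEMS accelerometer.
      # Stiffness, plate area, and gap are assumed illustrative values.
      import math

      m = 200e-12                       # kg (< 200 x 10^-9 gram proof mass)
      k = 500.0                         # N/m, assumed suspension stiffness
      area, gap, eps0 = (200e-6) ** 2, 2e-6, 8.854e-12
      g = 9.81

      x_1g = m * g / k                  # proof-mass deflection at 1 G
      x_fs = m * 50_000 * g / k         # deflection at the 50 kG full scale
      dC_1g = eps0 * area * x_1g / gap**2   # small-signal capacitance change at 1 G
      f0 = math.sqrt(k / m) / (2 * math.pi)
      print(f"1 G: dx = {x_1g*1e12:.1f} pm, dC = {dC_1g*1e18:.2f} aF")
      print(f"50 kG: dx = {x_fs*1e6:.2f} um; resonance ~ {f0/1e3:.0f} kHz")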

  20. A Design Model: The Autism Spectrum Disorder Classroom Design Kit

    ERIC Educational Resources Information Center

    McAllister, Keith; Maguire, Barry

    2012-01-01

    Architects and designers have a responsibility to provide an inclusive built environment. However, for those with a diagnosis of autism spectrum disorder (ASD), the built environment can be a frightening and confusing place, difficult to negotiate and tolerate. The challenge of integrating more fully into society is denied by an alienating built…

  1. Complex Educational Design: A Course Design Model Based on Complexity

    ERIC Educational Resources Information Center

    Freire, Maximina Maria

    2013-01-01

    Purpose: This article aims at presenting a conceptual framework which, theoretically grounded on complexity, provides the basis to conceive of online language courses that intend to respond to the needs of students and society. Design/methodology/approach: This paper is introduced by reflections on distance education and on the paradigmatic view…

  2. Advancing Long Tail Data Capture and Access Through Trusted, Community-Driven Data Services at the IEDA Data Facility

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Carbotte, S. M.; Ferrini, V.; Hsu, L.; Arko, R. A.; Walker, J. D.; O'hara, S. H.

    2012-12-01

    Substantial volumes of data in the Earth Sciences are collected in small- to medium-size projects by individual investigators or small research teams, known as the 'Long Tail' of science. Traditionally, these data have largely stayed 'in the dark', i.e. they have not been properly archived, and have therefore been inaccessible and underutilized. The primary reason has been the lack of appropriate infrastructure, from adequate repositories to resources and support for investigators to properly manage their data, to community standards and best practices. Lack of credit for data management and for the data themselves has contributed to the reluctance of investigators to share their data. IEDA (Integrated Earth Data Applications), a NSF-funded data facility for solid earth geoscience data, has developed a comprehensive suite of data services that are designed to address the concerns and needs of investigators. IEDA's data publication service registers datasets with DOIs and ensures their proper citation and attribution. IEDA is working with publishers on advanced linkages between datasets in the IEDA repository and scientific online articles to facilitate access to the data, enhance their visibility, and augment their use and citation. IEDA's investigator support ranges from individual support for data management to tools, tutorials, and virtual or face-to-face workshops that guide and assist investigators with data management planning, data submission, and data documentation. A critical aspect of IEDA's concept has been the disciplinary expertise within the team and its strong liaison with the science community, as well as community-based governance. These have been fundamental to gaining the trust and support of the community, which has led to significantly improved data preservation and access in the communities served by IEDA.

  3. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1988-01-01

    Research focused on two major areas. The first effort addressed the design and implementation of a technique that allows for the visualization of the real time variation of physical properties. The second effort focused on the design and implementation of an on-line help system with components designed for both authors and users of help information.

  4. Employing ISRU Models to Improve Hardware Design

    NASA Technical Reports Server (NTRS)

    Linne, Diane L.

    2010-01-01

    An analytical model for hydrogen reduction of regolith was used to investigate the effects of several key variables on the energy and mass performance of reactors for a lunar in-situ resource utilization oxygen production plant. Reactor geometry, reaction time, number of reactors, heat recuperation, heat loss, and operating pressure were all studied to guide hardware designers who are developing future prototype reactors. The effects of heat recuperation, where the incoming regolith is pre-heated by the hot spent regolith before transfer, were also investigated for the first time. In general, longer reaction times per batch provide a lower overall energy, but also result in larger and heavier reactors. Three reactors with long heat-up times result in similar energy requirements as a two-reactor system with all other parameters the same. Three reactors with heat recuperation result in energy reductions of 20 to 40 percent compared to a three-reactor system with no heat recuperation. Increasing operating pressure can provide energy reductions similar to heat recuperation for the same reaction times.
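
    A toy per-batch energy balance illustrates why recuperating heat from the spent regolith to the incoming batch reduces total energy. The specific heat, batch size, temperatures, recuperator effectiveness, and the lumped reaction-plus-loss term below are all assumed; they are not the values from the analytical model.

      # Toy per-batch energy balance with and without heat recuperation.
      # All thermophysical numbers are assumed for illustration only.
      cp = 1.0e3                      # J/(kg K), assumed regolith specific heat
      m_batch = 50.0                  # kg per batch, assumed
      T_in, T_rxn = 300.0, 1200.0     # K, feed and reaction temperatures (assumed)
      eta = 0.6                       # recuperator effectiveness (assumed)

      q_sensible = m_batch * cp * (T_rxn - T_in)   # heat the batch to temperature
      q_other = 30e6                               # J, reaction + heat loss (assumed)

      total_no_recup = q_sensible + q_other
      total_recup = (1 - eta) * q_sensible + q_other
      saving = 1 - total_recup / total_no_recup
      print(f"{total_no_recup/1e6:.0f} MJ -> {total_recup/1e6:.0f} MJ per batch "
            f"({saving:.0%} reduction)")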

  5. Reshaping the Boundaries of Community Engagement in Design Education: Global and Local Explorations

    ERIC Educational Resources Information Center

    Hicks, Travis L.; Radtke, Rebekah Ison

    2015-01-01

    Community-driven design is a current movement in the forefront of many designers' practices and on university campuses in design programs. The authors examine work from their respective public state universities' design programs as examples of best practices. In these case studies, the authors share experiences using community-based design…

  6. Rapid Modeling, Assembly and Simulation in Design Optimization

    NASA Technical Reports Server (NTRS)

    Housner, Jerry

    1997-01-01

    A new capability for design is reviewed. This capability provides for rapid assembly of detailed finite element models early in the design process, where costs are most effectively impacted. This creates an engineering environment which enables comprehensive analysis and design optimization early in the design process. Graphical interactive computing makes it possible for the engineer to interact with the design while performing comprehensive design studies. This rapid assembly capability is enabled by the use of Interface Technology to couple independently created models, which can be archived and made accessible to the designer. Results are presented to demonstrate the capability.

  7. A Workforce Design Model: Providing Energy to Organizations in Transition

    ERIC Educational Resources Information Center

    Halm, Barry J.

    2011-01-01

    The purpose of this qualitative study was to examine the change in performance realized by a professional services organization, which resulted in the Life Giving Workforce Design (LGWD) model through a grounded theory research design. This study produced a workforce design model characterized as an organizational blueprint that provides virtuous…

  8. Construction of an Instructional Design Model for Undergraduate Chemistry Laboratory Design: A Delphi Approach

    ERIC Educational Resources Information Center

    Bunag, Tara

    2012-01-01

    The purpose of this study was to construct an instructional systems design model for chemistry teaching laboratories at the undergraduate level to accurately depict the current practices of design experts. This required identifying the variables considered during design, prioritizing and ordering these variables, and constructing a model. Experts…

  9. Designing Electronic Performance Support Systems: Models and Instructional Strategies Employed

    ERIC Educational Resources Information Center

    Nekvinda, Christopher D.

    2011-01-01

    The purpose of this qualitative study was to determine whether instructional designers and performance technologists utilize instructional design models when designing and developing electronic performance support systems (EPSS). The study also explored if these same designers were utilizing instructional strategies within their EPSS to support…

  10. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  11. Learning from the Pros: How Experienced Designers Translate Instructional Design Models into Practice

    ERIC Educational Resources Information Center

    Ertmer, Peggy A.; York, Cindy S.; Gedik, Nuray

    2009-01-01

    Understanding how experienced designers approach complex design problems provides new perspectives on how they translate instructional design (ID) models and processes into practice. In this article, the authors describe the results of a study in which 16 "seasoned" designers shared compelling stories from practice that offered insights into their…

  12. Designing and Evaluating Representations to Model Pedagogy

    ERIC Educational Resources Information Center

    Masterman, Elizabeth; Craft, Brock

    2013-01-01

    This article presents the case for a theory-informed approach to designing and evaluating representations for implementation in digital tools to support Learning Design, using the framework of epistemic efficacy as an example. This framework, which is rooted in the literature of cognitive psychology, is operationalised through dimensions of fit…

  13. Aftbody Closure Model Design: Lessons Learned

    NASA Technical Reports Server (NTRS)

    Capone, Francis J.

    1999-01-01

    An Aftbody Closure Test Program is necessary in order to provide aftbody drag increments that can be added to the drag polars produced by testing the performance models (models 2a and 2b). These models had a truncated fuselage; thus, drag was measured for an incomplete configuration. In addition, trim characteristics cannot be determined with a truncated-fuselage model. The stability and control tests were conducted with a model (model 20) having a flared aftbody. This type of aftbody was needed in order to provide additional clearance between the base of the model and the sting. This was necessary because the high loads imposed on the model during stability and control tests result in large model deflections. For this case, the aftbody model will be used to validate stability and control performance.

  14. Analysis Grid for Environments Linked to Obesity (ANGELO) framework to develop community-driven health programmes in an Indigenous community in Canada.

    PubMed

    Willows, Noreen; Dyck Fehderau, David; Raine, Kim D

    2016-09-01

    Indigenous First Nations people in Canada have high chronic disease morbidity resulting in part from enduring social inequities and colonialism. Obesity prevention strategies developed by and for First Nations people are crucial to improving the health status of this group. The research objective was to develop community-relevant strategies to address childhood obesity in a First Nations community. Strategies were derived from an action-based workshop based on the Analysis Grid for Environments Linked to Obesity (ANGELO) framework. Thirteen community members with wide-ranging community representation took part in the workshop. They combined personal knowledge and experience with community-specific and national research to dissect the broad array of environmental factors that influenced childhood obesity in their community. They then developed community-specific action plans focusing on healthy eating and physical activity for children and their families. Actions included increasing awareness of children's health issues among the local population and community leadership, promoting nutrition and physical activity at school, and improving recreation opportunities. Strengthening children's connection to their culture was considered paramount to improving their well-being; thus, workshop participants developed programmes that included elders as teachers and reinforced families' acquaintance with First Nations foods and activities. The research demonstrated that the ANGELO framework is a participatory way to develop community-driven health programmes. It also demonstrated that First Nations people involved in the creation of solutions to health issues in their communities may focus on decolonising approaches such as strengthening their connection to indigenous culture and traditions. External funds were not available to implement programmes and there was no formal follow-up to determine if community members implemented programmes. Future research needs to examine the

  15. Shuttle passenger couch. [design and performance of engineering model

    NASA Technical Reports Server (NTRS)

    Rosener, A. A.; Stephenson, M. L.

    1974-01-01

    Conceptual design and fabrication of a full-scale shuttle passenger couch engineering model are reported. The model was utilized to verify anthropometric dimensions, reach dimensions, ingress/egress, couch operation, storage space, restraint locations, and crew acceptability. These data were then incorporated in the design of the passenger couch verification model that underwent performance tests.

  16. Integrating Surface Modeling into the Engineering Design Graphics Curriculum

    ERIC Educational Resources Information Center

    Hartman, Nathan W.

    2006-01-01

    It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…

  17. Building an Online Wisdom Community: A Transformational Design Model

    ERIC Educational Resources Information Center

    Gunawardena, Charlotte N.; Jennings, Barbara; Ortegano-Layne, Ludmila C.; Frechette, Casey; Carabajal, Kayleigh; Lindemann, Ken; Mummert, Julia

    2004-01-01

    This paper discusses the development of a new instructional design model based on socioconstructivist learning theories and distance education principles for the design of online wisdom communities and the efficacy of the model drawing on evaluation results from its implementation in Fall 2002. The model, Final Outcome Centered Around Learner…

  18. Simulation Tools Model Icing for Aircraft Design

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Here's a simple science experiment to try: Place an unopened bottle of distilled water in your freezer. After 2-3 hours, if the water is pure enough, you will notice that it has not frozen. Carefully pour the water into a bowl with a piece of ice in it. When it strikes the ice, the water will instantly freeze. One of the most basic and commonly known scientific facts is that water freezes at around 32 F. But this is not always the case. Water lacking any impurities for ice crystals to form around can be supercooled to even lower temperatures without freezing. High in the atmosphere, water droplets can achieve this delicate, supercooled state. When a plane flies through clouds containing these droplets, the water can strike the airframe and, like the supercooled water hitting the ice in the experiment above, freeze instantly. The ice buildup alters the aerodynamics of the plane - reducing lift and increasing drag - affecting its performance and presenting a safety issue if the plane can no longer fly effectively. In certain circumstances, ice can form inside aircraft engines, another potential hazard. NASA has long studied ways of detecting and countering atmospheric icing conditions as part of the Agency's efforts to enhance aviation safety. To do this, the Icing Branch at Glenn Research Center utilizes a number of world-class tools, including the Center's Icing Research Tunnel and the NASA 607 icing research aircraft, a "flying laboratory" for studying icing conditions. The branch has also developed a suite of software programs to help aircraft and icing protection system designers understand the behavior of ice accumulation on various surfaces and in various conditions. One of these innovations is the LEWICE ice accretion simulation software. Initially developed in the 1980s (when Glenn was known as Lewis Research Center), LEWICE has become one of the most widely used tools in icing research and aircraft design and certification. LEWICE has been transformed over

  19. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivities models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility to generate such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.

  20. Communications network design and costing model users manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.

  1. Evolutionary objections to "alien design" models.

    NASA Astrophysics Data System (ADS)

    Coffey, E. J.

    A previous paper demonstrated that the principal supporters of SETI have ignored the biological and evolutionary consequences of a creature's body form. In fact, the supporting evidence they provide actually contradicts their view. The approach they employ is that of the engineer: the process of "designing" a hypothetical creature to a specification irrespective of biological or evolutionary considerations. The principal types of "alien designs" which have been employed shall be discussed, and the evolutionary objections to them given.

  2. The Training Wheel. A Simple Model for Instructional Design.

    ERIC Educational Resources Information Center

    Rogoff, Rosalind L.

    1984-01-01

    The author developed an instructional-design model consisting of four simple steps. The model is in a circular format, rather than the usual linear series form, so it is named the training wheel. (SSH)

  3. Data Base Design Using Entity-Relationship Models.

    ERIC Educational Resources Information Center

    Davis, Kathi Hogshead

    1983-01-01

    The entity-relationship (ER) approach to database design is defined, and a specific example of an ER model (personnel-payroll) is examined. The requirements for converting ER models into specific database management systems are discussed. (Author/MSE)

  4. Humus and humility in ecosystem model design

    NASA Astrophysics Data System (ADS)

    Rowe, Ed

    2015-04-01

    Prediction is central to science. Empirical scientists couch their predictions as hypotheses and tend to deal with simple models such as regressions, but are modellers as much as are those who combine mechanistic hypotheses into more complex models. There are two main challenges for both groups: to strive for accurate predictions, and to ensure that the work is relevant to wider society. There is a role for blue-sky research, but the multiple environmental changes that characterise the 21st century place an onus on ecosystem scientists to develop tools for understanding environmental change and planning responses. Authors such as Funtowicz and Ravetz (1990) have argued that this situation represents "post-normal" science and that scientists should see themselves more humbly as actors within a societal process rather than as arbiters of truth. Modellers aim for generality, e.g. to accurately simulate the responses of a variety of ecosystems to several different environmental drivers. More accurate predictions can usually be achieved by including more explanatory factors or mechanisms in a model, even though this often results in a less efficient, less parsimonious model. This drives models towards ever-increasing complexity, and many models grow until they are effectively unusable beyond their development team. An alternative way forward is to focus on developing component models. Technologies for integrating dynamic models emphasise the removal of the model engine (algorithms) from code which handles time-stepping and the user interface. Developing components also requires some humility on the part of modellers, since collaboration will be needed to represent the whole system, and also because the idea that a simple component can or should represent the entire understanding of a scientific discipline is often difficult to accept. Policy-makers and land managers typically have questions different to those posed by scientists working within a specialism, and models

  5. How to design a financial planning model.

    PubMed

    Hayen, R L

    1983-10-01

    Corporate planning models frequently consist of integrated pro forma income statements, statements of financial position, and cashflow statements. When implemented by utilizing computer-based planning systems, these models allow managers to explore potential decisions in 'what if?' planning analyses. The logic of an integrated financial statement planning model can be arranged following either a 'funds needed to balance' approach or a 'direct' approach. With a funds-needed-to-balance approach, total assets are set equal to total liabilities plus equities to satisfy this fundamental accounting identity. Logic in such models is often difficult to validate. In the direct approach, total assets are calculated independently of total liabilities plus equities, providing an extremely strong test for model validation prior to using the model to assess 'what if' alternatives. In this paper, the author discusses the logic of integrated financial planning models and their implementation with computer-based planning systems. The funds-needed-to-balance approach and the direct approach are described and contrasted to assist corporate planners in evaluating and selecting a method for constructing the logic of corporate planning models. PMID:10299309
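
    The contrast between the two logics can be sketched in a few lines. The accounts and numbers below are invented; the point is only that the funds-needed-to-balance approach forces the accounting identity to hold by construction, whereas the direct approach computes both sides independently so that a mismatch serves as a validation check.

      # Minimal contrast of the two planning-model logics described above.
      # Accounts and numbers are invented for illustration.
      assets_operating = 800.0        # projected independently of financing
      liabilities = 450.0
      equity = 300.0

      # Funds-needed-to-balance: a plug (new borrowing) forces the identity to
      # hold, so Assets == Liabilities + Equity by construction.
      funds_needed = assets_operating - (liabilities + equity)
      liabilities_balanced = liabilities + funds_needed
      assert assets_operating == liabilities_balanced + equity

      # Direct approach: both sides are computed from their own logic; a
      # mismatch is a validation signal rather than something to be plugged away.
      assets_direct = 800.0
      claims_direct = liabilities + equity
      mismatch = assets_direct - claims_direct
      print(f"plug (new financing needed): {funds_needed:.0f}")
      print(f"direct-approach validation gap: {mismatch:.0f}")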

  6. Designing Games for Sport Education: Curricular Models

    ERIC Educational Resources Information Center

    Holt, Brett

    2005-01-01

    Sports Education is becoming a popular alternative curricular model in physical education, opposing the more traditional Multi-activity model. Physical education classes are slowly changing to include sport education. The change comes with the support of the community in the form of Sport Education in Physical Education Program (SEPEP). However,…

  7. Evaluating Models for Partially Clustered Designs

    ERIC Educational Resources Information Center

    Baldwin, Scott A.; Bauer, Daniel J.; Stice, Eric; Rohde, Paul

    2011-01-01

    Partially clustered designs, where clustering occurs in some conditions and not others, are common in psychology, particularly in prevention and intervention trials. This article reports results from a simulation comparing 5 approaches to analyzing partially clustered data, including Type I errors, parameter bias, efficiency, and power. Results…

  8. Designing Public Library Websites for Teens: A Conceptual Model

    ERIC Educational Resources Information Center

    Naughton, Robin Amanda

    2012-01-01

    The main goal of this research study was to develop a conceptual model for the design of public library websites for teens (TLWs) that would enable designers and librarians to create library websites that better suit teens' information needs and practices. It bridges a gap in the research literature between user interface design in…

  9. A Review of Research on Universal Design Educational Models

    ERIC Educational Resources Information Center

    Rao, Kavita; Ok, Min Wook; Bryant, Brian R.

    2014-01-01

    Universal design for learning (UDL) has gained considerable attention in the field of special education, acclaimed for its promise to promote inclusion by supporting access to the general curriculum. In addition to UDL, there are two other universal design (UD) educational models referenced in the literature, universal design of instruction (UDI)…

  10. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  11. STELLA Experiment: Design and Model Predictions

    SciTech Connect

    Kimura, W. D.; Babzien, M.; Ben-Zvi, I.; Campbell, L. P.; Cline, D. B.; Fiorito, R. B.; Gallardo, J. C.; Gottschalk, S. C.; He, P.; Kusche, K. P.; Liu, Y.; Pantell, R. H.; Pogorelsky, I. V.; Quimby, D. C.; Robinson, K. E.; Rule, D. W.; Sandweiss, J.; Skaritka, J.; van Steenbergen, A.; Steinhauer, L. C.; Yakimenko, V.

    1998-07-05

    The STaged ELectron Laser Acceleration (STELLA) experiment will be one of the first to examine the critical issue of staging the laser acceleration process. The BNL inverse free electron laser (IFEL) will serve as a prebuncher to generate ~1 μm long microbunches. These microbunches will be accelerated by an inverse Cerenkov acceleration (ICA) stage. A comprehensive model of the STELLA experiment is described. This model includes the IFEL prebunching, drift and focusing of the microbunches into the ICA stage, and their subsequent acceleration. The model predictions will be presented, including the results of a system error study to determine the sensitivity to uncertainties in various system parameters.

  12. Irradiation Design for an Experimental Murine Model

    SciTech Connect

    Ballesteros-Zebadua, P.; Moreno-Jimenez, S.; Suarez-Campos, J. E.; Celis, M. A.; Larraga-Gutierrez, J. M.; Garcia-Garduno, O. A.; Rubio-Osornio, M. C.; Custodio-Ramirez, V.; Paz, C.

    2010-12-07

    In radiotherapy and stereotactic radiosurgery, small animal experimental models are frequently used, since there are still many unsolved questions about the biological and biochemical effects of ionizing radiation. This work presents a method for small-animal brain radiotherapy compatible with a dedicated 6MV Linac. This rodent model is focused on the research of the inflammatory effects produced by ionizing radiation in the brain. In this work, comparisons between Pencil Beam and Monte Carlo techniques were used to evaluate the accuracy of the dose calculated by a commercial planning system. Challenges in this murine model are discussed.

  13. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  14. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE™ virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  15. New model performance index for engineering design of control systems

    NASA Technical Reports Server (NTRS)

    1970-01-01

    Performance index includes a model representing linear control-system design specifications. Based on a geometric criterion for approximation of the model by the actual system, the index can be interpreted directly in terms of the desired system response model without actually having the model's time response.

  16. Multimedia Learning Design Pedagogy: A Hybrid Learning Model

    ERIC Educational Resources Information Center

    Tsoi, Mun Fie; Goh, Ngoh Khang; Chia, Lian Sai

    2005-01-01

    This paper provides insights on a hybrid learning model for multimedia learning design conceptualized from the Piagetian science learning cycle model and the Kolb's experiential learning model. This model represents learning as a cognitive process in a cycle of four phases, namely, Translating, Sculpting, Operationalizing, and Integrating and is…

  17. [Safety culture: definition, models and design].

    PubMed

    Pfaff, Holger; Hammer, Antje; Ernstmann, Nicole; Kowalski, Christoph; Ommen, Oliver

    2009-01-01

    Safety culture is a multi-dimensional phenomenon. The safety culture of a healthcare organization is high if it has a common stock of knowledge, values and symbols in regard to patients' safety. The article intends to define safety culture in the first step and, in the second step, demonstrate the effects of safety culture. We present the model of safety behaviour and show how safety culture can affect behaviour and produce safe behaviour. In the third step we will look at the causes of safety culture and present the safety-culture model. The main hypothesis of this model is that the safety culture of a healthcare organization strongly depends on its communication culture and its social capital. Finally, we will investigate how the safety culture of a healthcare organization can be improved. Based on the safety culture model, six measures to improve safety culture will be presented. PMID:19998775

  18. The 3 "C" Design Model for Networked Collaborative E-Learning: A Tool for Novice Designers

    ERIC Educational Resources Information Center

    Bird, Len

    2007-01-01

    This paper outlines a model for online course design aimed at the mainstream majority of university academics rather than at the early adopters of technology. It has been developed from work at Coventry Business School where tutors have been called upon to design online modules for the first time. Like many good tools, the model's key strength is…

  19. Model assessment of protective barrier designs

    SciTech Connect

    Fayer, M.J.; Conbere, W.; Heller, P.R.; Gee, G.W.

    1985-11-01

    A protective barrier is being considered for use at the Hanford site to enhance the isolation of previously disposed radioactive wastes from infiltrating water, and plant and animal intrusion. This study is part of a research and development effort to design barriers and evaluate their performance in preventing drainage. A fine-textured soil (the Composite) was located on the Hanford site in sufficient quantity for use as the top layer of the protective barrier. A number of simulations were performed by Pacific Northwest Laboratory to analyze different designs of the barrier using the Composite soil as well as the finer-textured Ritzville silt loam and a slightly coarser soil (Coarse). Design variations included two rainfall rates (16.0 and 30.1 cm/y), the presence of plants, gravel mixed into the surface of the topsoil, an impermeable boundary under the topsoil, and moving the waste form from 10 to 20 m from the barrier edge. The final decision to use barriers for enhanced isolation of previously disposed wastes will be subject to decisions resulting from the completion of the Hanford Defense Waste Environmental Impact Statement, which addresses disposal of Hanford defense high-level and transuranic wastes. The one-dimensional simulation results indicate that each of the three soils, when used as the top layer of the protective barrier, can prevent drainage provided plants are present. Gravel amendments to the upper 30 cm of soil (without plants) reduced evaporation and allowed more water to drain.
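
    The simulations in the study solved unsaturated flow in layered soils; as a far simpler illustration of the water-balance idea (plants remove stored water faster than precipitation adds it, so the fine-soil layer never fills and drains), the toy daily bucket model below uses assumed storage capacity, precipitation, and evapotranspiration values.

      # Toy daily bucket model of a soil barrier: precipitation in,
      # evapotranspiration out, drainage only when storage exceeds capacity.
      # Parameters are assumed and far simpler than the study's simulations.
      capacity = 300.0        # mm of water the fine-soil layer can hold (assumed)
      storage = 150.0         # mm, initial stored water (assumed)
      annual_precip = 160.0   # mm/y (assumed; the study used 16.0 and 30.1 cm/y)
      et_with_plants = 0.6    # mm/day evapotranspiration with plants (assumed)

      drainage = 0.0
      for day in range(365):
          storage += annual_precip / 365.0
          storage = max(0.0, storage - et_with_plants)
          if storage > capacity:
              drainage += storage - capacity
              storage = capacity
      print(f"annual drainage: {drainage:.1f} mm")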

  20. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. PMID:27292581

  1. Development, Evaluation, and Design Applications of an AMTEC Converter Model

    NASA Astrophysics Data System (ADS)

    Spence, Cliff A.; Schuller, Michael; Lalk, Tom R.

    2003-01-01

    Issues associated with the development of an alkali metal thermal-to-electric conversion (AMTEC) converter model that serves as an effective design tool were investigated. The requirements and performance prediction equations for the model were evaluated, and a modeling methodology was established. By defining the requirements and equations for the model and establishing a methodology, it was determined that Thermal Desktop, a recently improved finite-difference software package, could be used to develop a model that serves as an effective design tool. Implementing the methodology within Thermal Desktop provides stability, high resolution, modular construction, easy-to-use interfaces, and modeling flexibility.

  2. Robust power system controller design based on measured models

    SciTech Connect

    Fatehi, F.; Smith, J.R.; Pierre, D.A.

    1996-05-01

    This paper presents combined system identification and controller design methods to dampen low-frequency oscillations in multimachine power systems. An iterative closed-loop identification method is used to find a linear model for the power system. Linear quadratic Gaussian controller design with loop transfer recovery (LQG/LTR), based on a generalized technique for the nonminimum phase (NMP) power system model, is used to design controllers. Simulation results are presented to demonstrate the robustness of controllers based on closed-loop identified plant models and the amount of loop transfer recovery that is possible for NMP plant models.
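
    As an illustration of the LQG building blocks the paper combines (an LQR state-feedback gain plus a Kalman estimator), the sketch below applies them to a toy lightly damped swing-mode model using the python-control package. The model, weights, and noise covariances are assumed; the sketch does not reproduce the paper's loop-transfer-recovery tuning or its generalization for non-minimum-phase plants.

      # LQG building blocks (LQR gain + Kalman estimator) on a toy lightly
      # damped oscillator standing in for a low-frequency power-system mode.
      import numpy as np
      import control

      wn, zeta = 2 * np.pi * 0.5, 0.02           # 0.5 Hz mode, 2% damping (assumed)
      A = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
      B = np.array([[0.0], [1.0]])
      C = np.array([[1.0, 0.0]])

      K, _, _ = control.lqr(A, B, np.diag([10.0, 1.0]), np.array([[1.0]]))
      L, _, _ = control.lqe(A, B, C, np.array([[1.0]]), np.array([[0.01]]))

      cl = np.linalg.eigvals(A - B @ K)
      print("closed-loop damping ratios:", np.round(-cl.real / np.abs(cl), 3))
      print("estimator gain L:", np.round(np.asarray(L).ravel(), 3))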

  3. Model verifies design of mobile data modem

    NASA Technical Reports Server (NTRS)

    Davarian, F.; Sumida, J.

    1986-01-01

    It has been proposed to use differential minimum shift keying (DMSK) modems in spacecraft-based mobile communications systems. For these modems to be employed, the transmitted carrier frequency must be known prior to signal detection. In addition, the time needed by the receiver to lock onto the carrier frequency must be minimized. The present article is concerned with a DMSK modem developed for the Mobile Satellite Service. This device demonstrated fast acquisition time and good performance in the presence of fading. However, certain problems arose in initial attempts to study the acquisition behavior of the AFC loop through breadboard techniques. The development of a software model of the AFC loop is discussed, taking into account two cases which were plotted using the model. Attention is given to a demonstration of the viability of the modem by an approach involving modeling and analysis of the frequency synchronizer.

  4. THE RHIC/AGS ONLINE MODEL ENVIRONMENT: DESIGN AND OVERVIEW.

    SciTech Connect

    SATOGATA,T.; BROWN,K.; PILAT,F.; TAFTI,A.A.; TEPIKIAN,S.; VAN ZEIJTS,J.

    1999-03-29

    An integrated online modeling environment is currently under development for use by AGS and RHIC physicists and commissioners. This environment combines the modeling efforts of both groups in a CDEV [1] client-server design, providing access to expected machine optics and physics parameters based on live and design machine settings. An abstract modeling interface has been designed as a set of adapters [2] around core computational modeling engines such as MAD and UAL/Teapot++ [3]. This approach allows us to leverage existing survey, lattice, and magnet infrastructure, as well as easily incorporate new model engine developments. This paper describes the architecture of the RHIC/AGS modeling environment, including the application interface through CDEV and general tools for graphical interaction with the model using Tcl/Tk. Separate papers at this conference address the specifics of implementation and modeling experience for AGS and RHIC.

  5. Designing Quality Service: The Service Excellence Model.

    ERIC Educational Resources Information Center

    Ellicott, Michael A.; Conard, Rodney J.

    1997-01-01

    Recent experiences of manufacturing and commercial service industries provide insights to college facilities managers for combining downsizing with quality improvement. The Service Excellence Model emphasizes creation of shared responsibility, focus on core service processes, empowerment of cross-functional process-improvement teams, performance…

  6. Stimulus design for model selection and validation in cell signaling.

    PubMed

    Apgar, Joshua F; Toettcher, Jared E; Endy, Drew; White, Forest M; Tidor, Bruce

    2008-02-01

    Mechanism-based chemical kinetic models are increasingly being used to describe biological signaling. Such models serve to encapsulate current understanding of pathways and to enable insight into complex biological processes. One challenge in model development is that, with limited experimental data, multiple models can be consistent with known mechanisms and existing data. Here, we address the problem of model ambiguity by providing a method for designing dynamic stimuli that, in stimulus-response experiments, distinguish among parameterized models with different topologies, i.e., reaction mechanisms, in which only some of the species can be measured. We develop the approach by presenting two formulations of a model-based controller that is used to design the dynamic stimulus. In both formulations, an input signal is designed for each candidate model and parameterization so as to drive the model outputs through a target trajectory. The quality of a model is then assessed by the ability of the corresponding controller, informed by that model, to drive the experimental system. We evaluated our method on models of antibody-ligand binding, mitogen-activated protein kinase (MAPK) phosphorylation and de-phosphorylation, and larger models of the epidermal growth factor receptor (EGFR) pathway. For each of these systems, the controller informed by the correct model is the most successful at designing a stimulus to produce the desired behavior. Using these stimuli we were able to distinguish between models with subtle mechanistic differences or where input and outputs were multiple reactions removed from the model differences. An advantage of this method of model discrimination is that it does not require novel reagents, or altered measurement techniques; the only change to the experiment is the time course of stimulation. Taken together, these results provide a strong basis for using designed input stimuli as a tool for the development of cell signaling models. PMID

  7. Designers Workbench: Towards Real-Time Immersive Modeling

    SciTech Connect

    Kuester, F; Duchaineau, M A; Hamann, B; Joy, K I; Ma, K L

    2001-10-03

    This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or "digital gap" experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.

  8. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.

  9. On Optimal Input Design and Model Selection for Communication Channels

    SciTech Connect

    Li, Yanyan; Djouadi, Seddik M; Olama, Mohammed M

    2013-01-01

    In this paper, the optimal model (structure) selection and input design which minimize the worst case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. Kolmogorov n-width is used to characterize the representation error introduced by model selection, while Gel'fand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input which minimizes the worst case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and the optimal input is an impulse at the start of the observation interval. FIR models are widely popular in communication systems, such as in Orthogonal Frequency Division Multiplexing (OFDM) systems.
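
    The paper's conclusion, that for an FIR model the optimal probing input is an impulse at the start of the observation interval, can be illustrated numerically: with an impulse input, the noisy output samples are direct estimates of the FIR taps. The channel taps and noise level below are invented for the example.

      # Small illustration: an impulse at the start of the observation window
      # makes FIR identification a direct read-off of the output samples.
      # The "true" channel and the noise level are invented.
      import numpy as np

      rng = np.random.default_rng(0)
      h_true = np.array([1.0, 0.6, -0.3, 0.1])     # unknown FIR channel (4 taps)
      N = 32                                        # observation interval length

      u = np.zeros(N); u[0] = 1.0                   # impulse at the start
      y = np.convolve(u, h_true)[:N] + 0.02 * rng.standard_normal(N)

      h_hat = y[: len(h_true)]                      # impulse-response read-off
      print("estimate:", np.round(h_hat, 3))
      print("worst-case tap error:", np.abs(h_hat - h_true).max())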

  10. Designing a flexible grid enabled scientific modeling interface.

    SciTech Connect

    Dvorak, M.; Taylor, J.; Mickelson, S.

    2002-08-15

    The Espresso Scientific Modeling Interface (Espresso) is a scientific modeling productivity tool developed for climate modelers. Espresso was designed to be an extensible interface to both scientific models and Grid resources. It also aims to be a contemporary piece of software that relies on Globus.org's Java CoG Kit for a Grid toolkit and Sun's Java 2 API, and is configured using XML. This article covers the design and implementation of Espresso's Grid functionality and how it interacts with existing scientific models. The authors give specific examples of how they have designed Espresso to perform climate simulations using the PSU/NCAR MM5 atmospheric model. Plans to incorporate the CCSM and FOAM climate models are also discussed.

  11. Design Approaches to Support Preservice Teachers in Scientific Modeling

    NASA Astrophysics Data System (ADS)

    Kenyon, Lisa; Davis, Elizabeth A.; Hug, Barbara

    2011-02-01

    Engaging children in scientific practices is hard for beginning teachers. One such scientific practice with which beginning teachers may have limited experience is scientific modeling. We have iteratively designed preservice teacher learning experiences and materials intended to help teachers achieve learning goals associated with scientific modeling. Our work has taken place across multiple years at three university sites, with preservice teachers focused on early childhood, elementary, and middle school teaching. Based on results from our empirical studies supporting these design decisions, we discuss design features of our modeling instruction in each iteration. Our results suggest some successes in supporting preservice teachers in engaging students in modeling practice. We propose design principles that can guide science teacher educators in incorporating modeling in teacher education.

  12. Jet Pump Design Optimization by Multi-Surrogate Modeling

    NASA Astrophysics Data System (ADS)

    Mohan, S.; Samad, A.

    2015-01-01

    A basic approach to reducing design and optimization time via surrogate modeling is to select the right type of surrogate model for a particular problem, where the model should have good accuracy and prediction capability. A multi-surrogate approach can protect a designer from selecting a wrong surrogate that has high uncertainty in the optimal zone of the design space. Numerical analysis and optimization of a jet pump via multi-surrogate modeling are reported in this work. Design variables including area ratio, mixing tube length to diameter ratio, and setback ratio were introduced to increase the hydraulic efficiency of the jet pump. Reynolds-averaged Navier-Stokes equations were solved and responses were computed. Among the different surrogate models, the Sheppard-function-based surrogate showed better accuracy in data fitting, while the radial basis neural network produced the highest efficiency enhancement. The efficiency enhancement was due to the reduction of losses in the flow passage.
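
    The multi-surrogate idea, fitting several surrogate types and comparing their predictive error before trusting any one of them, can be illustrated on a toy response surface. The sketch below is not the paper's jet-pump CFD problem; it substitutes an analytic test function for the RANS responses and compares a radial-basis-function interpolator against a quadratic response surface, both hypothetical stand-ins for the surrogates discussed in the abstract.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)

# Toy stand-in for the CFD response: "efficiency" as a function of two
# hypothetical design variables (e.g., area ratio and setback ratio).
def efficiency(x):
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + 0.5 * x[:, 0]

X_train = rng.uniform(0, 1, size=(30, 2))
y_train = efficiency(X_train)
X_test = rng.uniform(0, 1, size=(200, 2))
y_test = efficiency(X_test)

# Surrogate 1: radial basis function interpolator.
rbf = RBFInterpolator(X_train, y_train, kernel="thin_plate_spline")
err_rbf = np.sqrt(np.mean((rbf(X_test) - y_test) ** 2))

# Surrogate 2: quadratic polynomial response surface fitted by least squares.
def quad_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(quad_features(X_train), y_train, rcond=None)
err_poly = np.sqrt(np.mean((quad_features(X_test) @ coef - y_test) ** 2))

# Comparing hold-out error across surrogates is the essence of the
# multi-surrogate safeguard described in the abstract.
print(f"RMSE  RBF: {err_rbf:.4f}   quadratic RSM: {err_poly:.4f}")
```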

  13. Modeling Programs Increase Aircraft Design Safety

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Flutter may sound like a benign word when associated with a flag in a breeze, a butterfly, or seaweed in an ocean current. When used in the context of aerodynamics, however, it describes a highly dangerous, potentially deadly condition. Consider the case of the Lockheed L-188 Electra Turboprop, an airliner that first took to the skies in 1957. Two years later, an Electra plummeted to the ground en route from Houston to Dallas. Within another year, a second Electra crashed. In both cases, all crew and passengers died. Lockheed engineers were at a loss as to why the planes' wings were tearing off in midair. For an answer, the company turned to NASA's Transonic Dynamics Tunnel (TDT) at Langley Research Center. At the time, the newly renovated wind tunnel offered engineers the capability of testing aeroelastic qualities in aircraft flying at transonic speeds near or just below the speed of sound. (Aeroelasticity is the interaction between aerodynamic forces and the structural dynamics of an aircraft or other structure.) Through round-the-clock testing in the TDT, NASA and industry researchers discovered the cause: flutter. Flutter occurs when aerodynamic forces acting on a wing cause it to vibrate. As the aircraft moves faster, certain conditions can cause that vibration to multiply and feed off itself, building to greater amplitudes until the flutter causes severe damage or even the destruction of the aircraft. Flutter can impact other structures as well. Famous film footage of the Tacoma Narrows Bridge in Washington in 1940 shows the main span of the bridge collapsing after strong winds generated powerful flutter forces. In the Electra's case, faulty engine mounts allowed a type of flutter known as whirl flutter, generated by the spinning propellers, to transfer to the wings, causing them to vibrate violently enough to tear off. Thanks to the NASA testing, Lockheed was able to correct the Electra's design flaws that led to the flutter conditions and return the

  14. Computer Integrated Manufacturing: Physical Modelling Systems Design. A Personal View.

    ERIC Educational Resources Information Center

    Baker, Richard

    A computer-integrated manufacturing (CIM) Physical Modeling Systems Design project was undertaken in a time of rapid change in the industrial, business, technological, training, and educational areas in Australia. A specification of a manufacturing physical modeling system was drawn up. Physical modeling provides a flexibility and configurability…

  15. Robust Design of Motor PWM Control using Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Zhan, Wei

    A robust design method is developed for Pulse Width Modulation (PWM) motor speed control. A first-principles model for a DC permanent magnet motor is used to build a Simulink model for simulation and analysis. Based on the simulation results, the main factors that contribute to the average speed variation are identified using Design of Experiments (DOE). A robust solution is derived to reduce the average speed control variation using the Response Surface Method (RSM). The robustness of the new design is verified using the simulation model.
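
    As a rough, hypothetical analog of the DOE step described above (not the authors' Simulink model), the sketch below uses a steady-state first-principles expression for a permanent magnet DC motor's speed under an average PWM voltage and screens three factors with a two-level full factorial to see which one drives average-speed variation. All parameter values and factor levels are invented for illustration.

```python
import numpy as np
from itertools import product

# Steady-state speed of a PM DC motor driven at an average PWM voltage
# V_avg = duty * V_supply (first-principles form, hypothetical parameter values).
def steady_speed(duty, V_supply, R, Kt, Ke, b, T_load):
    V_avg = duty * V_supply
    return (Kt * V_avg - R * T_load) / (Kt * Ke + R * b)

nominal = dict(V_supply=12.0, R=1.0, Kt=0.05, Ke=0.05, b=1e-4, T_load=0.02)
duty = 0.6

# Two-level full factorial (a minimal DOE) over three factors suspected to
# drive average-speed variation: supply voltage, winding resistance, load torque.
levels = {"V_supply": (11.4, 12.6), "R": (0.9, 1.1), "T_load": (0.015, 0.025)}
runs, speeds = [], []
for combo in product(*levels.values()):
    params = dict(nominal, **dict(zip(levels, combo)))
    runs.append(combo)
    speeds.append(steady_speed(duty, **params))
speeds = np.array(speeds)

# Main effect of each factor = mean(high level) - mean(low level).
for i, name in enumerate(levels):
    high = speeds[[r[i] == levels[name][1] for r in runs]].mean()
    low = speeds[[r[i] == levels[name][0] for r in runs]].mean()
    print(f"main effect of {name}: {high - low:+.2f} rad/s")
```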

  16. Second generation thermal imaging system design trades modeling

    NASA Astrophysics Data System (ADS)

    Vroombout, Leo O.

    1990-10-01

    The Night Vision Laboratory static performance model is considered for thermal viewing systems. Since the model is not initially intended to be a design tool and is not usable for conducting system or component design trades, it has to be restructured. The approach to updating the first-generation static performance model and to configuring it as a design tool is presented. Second-generation imaging systems exploit infrared focal-plane arrays, high-reliability cryogenic coolers, precision scanning devices, and high-speed digital electronics. They also use optical materials and coatings and optomechanical and electronics packaging techniques.

  17. Computer-based creativity enhanced conceptual design model for non-routine design of mechanical systems

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.

    2014-11-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity-enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. To enhance the capability to introduce new design variables into the conceptual design process and to uncover more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and by making full use of the main and secondary functions of each basic structure when reconstructing the physical structures, new design variables and variants are introduced into the reconstruction process, and a large number of simpler physical structure schemes that organically accomplish the overall function are obtained. The creativity-enhanced conceptual design model presented here has a strong capability for introducing new design variables in the function domain and for finding simpler physical structures that accomplish the overall function; it can therefore be used to solve non-routine conceptual design problems.

  18. A Model-Based Expert System For Digital Systems Design

    NASA Astrophysics Data System (ADS)

    Wu, J. G.; Ho, W. P. C.; Hu, Y. H.; Yun, D. Y. Y.; Parng, T. M.

    1987-05-01

    In this paper, we present a model-based expert system for automatic digital systems design. The goal of digital systems design is to generate a workable and efficient design from high level specifications. The formalization of the design process is a necessity for building an efficient automatic CAD system. Our approach combines model-based, heuristic best-first search, and meta-planning techniques from AI to facilitate the design process. The design process is decomposed into three subprocesses. First, the high-level behavioral specifications are translated into sequences of primitive behavioral operations. Next, primitive operations are grouped to form intermediate-level behavioral functions. Finally, structural function modules are selected to implement these functions. Using model-based reasoning on the primitive behavioral operations level extends the solution space considered in design and provides more opportunity for minimization. Heuristic best-first search and meta-planning techniques control the decision-making in the latter two subprocesses to optimize the final design. They also facilitate system maintenance by separating design strategy from design knowledge.

  19. An automobile air conditioner design model

    SciTech Connect

    Kyle, D M; Mei, V C; Chen, F C

    1992-12-01

    A computer program has been developed to predict the steady-state performance of vapor compression automobile air conditioners and heat pumps. The code is based on the residential heat pump model developed at the Oak Ridge National Laboratory (ORNL). Most calculations are based on fundamental physical principles, in conjunction with generalized correlations available in the research literature. Automobile air conditioning components that can be specified as input to the program include open and hermetic compressors; finned tube condensers; finned tube and plate-fin style evaporators; thermostatic expansion valves (TXV), capillary tube, and short tube expansion devices; refrigerant mass; and evaporator pressure regulator and all interconnecting tubing. Pressure drop, heat transfer rates, and latent capacity ratio for the new plate-fin evaporator submodel are shown to agree well with laboratory data. The program can be used with a variety of refrigerants, including R-134a.

  20. Hotspot detection and design recommendation using silicon calibrated CMP model

    NASA Astrophysics Data System (ADS)

    Hui, Colin; Wang, Xian Bin; Huang, Haigou; Katakamsetty, Ushasree; Economikos, Laertis; Fayaz, Mohammed; Greco, Stephen; Hua, Xiang; Jayathi, Subramanian; Yuan, Chi-Min; Li, Song; Mehrotra, Vikas; Chen, Kuang Han; Gbondo-Tugbawa, Tamba; Smith, Taber

    2009-03-01

    Chemical Mechanical Polishing (CMP) has been used in the manufacturing of the copper (Cu) damascene process. It is well known that dishing and erosion occur during the CMP process, and they strongly depend on metal density and line width. The inherent thickness and topography variations become an increasing concern for today's designs running through advanced process nodes (sub-65 nm). Excessive thickness and topography variations can have major impacts on chip yield and performance; as such, they need to be accounted for during the design stage. In this paper, we demonstrate an accurate physics-based CMP model and its application to CMP-related hotspot detection. Model-based checking capability is most useful for identifying highly environment-sensitive layouts that are prone to early process-window limitation and hence failure. Model-based checking, as opposed to rule-based checking, can identify the weak points in a design more accurately and enable designers to provide improved layouts for the areas with the highest leverage for manufacturability improvement. Further, CMP modeling can provide information on interlevel effects, such as copper puddling from underlying topography, that cannot be captured in Design-for-Manufacturing (DfM) recommended rules. The model has been calibrated against silicon produced with the 45nm process from the Common Platform (IBM-Chartered-Samsung) technology. It is one of the earliest 45nm CMP models available today. We show that CMP-related hotspots often occur around the spaces between analog macros and digital blocks in SoC designs. With the help of CMP model-based prediction, the design, the dummy fill, or the placement of the blocks can be modified to improve planarity and eliminate CMP-related hotspots. The CMP model can be used to pass design recommendations to designers to improve chip yield and performance.

  1. Multiscale Modeling in the Clinic: Drug Design and Development.

    PubMed

    Clancy, Colleen E; An, Gary; Cannon, William R; Liu, Yaling; May, Elebeoba E; Ortoleva, Peter; Popel, Aleksander S; Sluka, James P; Su, Jing; Vicini, Paolo; Zhou, Xiaobo; Eckmann, David M

    2016-09-01

    A wide range of length and time scales are relevant to pharmacology, especially in drug development, drug design and drug delivery. Therefore, multiscale computational modeling and simulation methods and paradigms that advance the linkage of phenomena occurring at these multiple scales have become increasingly important. Multiscale approaches present in silico opportunities to advance laboratory research to bedside clinical applications in pharmaceuticals research. This is achievable through the capability of modeling to reveal phenomena occurring across multiple spatial and temporal scales, which are not otherwise readily accessible to experimentation. The resultant models, when validated, are capable of making testable predictions to guide drug design and delivery. In this review we describe the goals, methods, and opportunities of multiscale modeling in drug design and development. We demonstrate the impact of multiple scales of modeling in this field. We indicate the common mathematical and computational techniques employed for multiscale modeling approaches used in pharmacometric and systems pharmacology models in drug development and present several examples illustrating the current state-of-the-art models for (1) excitable systems and applications in cardiac disease; (2) stem cell driven complex biosystems; (3) nanoparticle delivery, with applications to angiogenesis and cancer therapy; (4) host-pathogen interactions and their use in metabolic disorders, inflammation and sepsis; and (5) computer-aided design of nanomedical systems. We conclude with a focus on barriers to successful clinical translation of drug development, drug design and drug delivery multiscale models. PMID:26885640

  2. Experimental Design and Multiplexed Modeling Using Titrimetry and Spreadsheets

    NASA Astrophysics Data System (ADS)

    Harrington, Peter De B.; Kolbrich, Erin; Cline, Jennifer

    2002-07-01

    The topics of experimental design and modeling are important for inclusion in the undergraduate curriculum. Many general chemistry and quantitative analysis courses introduce students to spreadsheet programs, such as MS Excel. Students in the laboratory sections of these courses use titrimetry as a quantitative measurement method. Unfortunately, the only model that students may be exposed to in introductory chemistry courses is the working curve that uses the linear model. A novel experiment based on a multiplex model has been devised for titrating several vinegar samples at a time. The multiplex titration can be applied to many other routine determinations. An experimental design model is fit to titrimetric measurements using the MS Excel LINEST function to estimate concentration from each sample. This experiment provides valuable lessons in error analysis, Class A glassware tolerances, experimental simulation, statistics, modeling, and experimental design.
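
    A spreadsheet LINEST fit of a multiplex model is ordinary linear least squares, so the same exercise can be sketched with NumPy. The design matrix, aliquot volumes, and concentrations below are hypothetical and only illustrate the idea of recovering several sample concentrations from mixed titrations.

```python
import numpy as np

# Hypothetical multiplex design: each titration mixes aliquots (mL) of three
# vinegar samples; the NaOH volume needed is linear in the unknown acid
# concentrations, so least squares (the spreadsheet LINEST analog) recovers them.
C_NaOH = 0.5  # mol/L titrant

# Rows = titrations, columns = aliquot volume (mL) of samples A, B, C.
aliquots = np.array([
    [10.0,  0.0,  0.0],
    [ 0.0, 10.0,  0.0],
    [ 0.0,  0.0, 10.0],
    [ 5.0,  5.0,  0.0],
    [ 5.0,  0.0,  5.0],
    [ 0.0,  5.0,  5.0],
])

true_conc = np.array([0.85, 0.70, 0.95])  # mol/L acetic acid (assumed)
rng = np.random.default_rng(2)
V_NaOH = aliquots @ true_conc / C_NaOH + rng.normal(0, 0.05, len(aliquots))

# Fit the multiplex model V_NaOH = (aliquots @ conc) / C_NaOH for conc.
conc_est, *_ = np.linalg.lstsq(aliquots / C_NaOH, V_NaOH, rcond=None)
print("estimated concentrations (mol/L):", np.round(conc_est, 3))
```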

  3. Modeling Real-Time Applications with Reusable Design Patterns

    NASA Astrophysics Data System (ADS)

    Rekhis, Saoussen; Bouassida, Nadia; Bouaziz, Rafik

    Real-Time (RT) applications, which manipulate important volumes of data, need to be managed with RT databases that deal with time-constrained data and time-constrained transactions. In spite of their numerous advantages, RT database development remains a complex task, since developers must study many design issues related to the RT domain. In this paper, we tackle this problem by proposing RT design patterns that allow the modeling of structural and behavioral aspects of RT databases. We show how RT design patterns can provide design assistance through architecture reuse for recurring design problems. In addition, we present a UML profile that represents the patterns and further facilitates their reuse. This profile proposes, on one hand, UML extensions allowing the variability of patterns to be modeled in the RT context and, on the other hand, extensions inspired by the MARTE (Modeling and Analysis of Real-Time Embedded systems) profile.

  4. Robust control design verification using the modular modeling system

    SciTech Connect

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being used successfully to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant, i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple-input multiple-output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem.

  5. Modeling and Simulation for Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work System analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work systems design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual method application project: the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  6. Honors biomedical instrumentation--a course model for accelerated design.

    PubMed

    Madhok, Jai; Smith, Ryan J; Thakor, Nitish V

    2009-01-01

    A model for a 16-week Biomedical Instrumentation course is outlined. The course is modeled in such a way that students learn about medical devices and instrumentation through lecture and laboratory sessions while also learning basic design principles. Course material covers a broad range of topics from fundamentals of sensors and instrumentation, guided laboratory design experiments, design projects, and eventual protection of intellectual property, regulatory considerations, and entry into the commercial market. Students eventually complete two design projects in the form of a 'Challenge' design project as well as an 'Honors' design project. Sample problems students solve during the Challenge project and examples of past Honors projects from the course are highlighted. PMID:19964766

  7. Designers' models of the human-computer interface

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Breedin, Sarah D.

    1993-01-01

    Understanding design models of the human-computer interface (HCI) may produce two types of benefits. First, interface development often requires input from two different types of experts: human factors specialists and software developers. Given the differences in their backgrounds and roles, human factors specialists and software developers may have different cognitive models of the HCI. Yet, they have to communicate about the interface as part of the design process. If they have different models, their interactions are likely to involve a certain amount of miscommunication. Second, the design process in general is likely to be guided by designers' cognitive models of the HCI, as well as by their knowledge of the user, tasks, and system. Designers do not start with a blank slate; rather they begin with a general model of the object they are designing. The authors' approach to a design model of the HCI was to have three groups make judgments of categorical similarity about the components of an interface: human factors specialists with HCI design experience, software developers with HCI design experience, and a baseline group of computer users with no experience in HCI design. The components of the user interface included both display components such as windows, text, and graphics, and user interaction concepts, such as command language, editing, and help. The judgments of the three groups were analyzed using hierarchical cluster analysis and Pathfinder. These methods indicated, respectively, how the groups categorized the concepts, and network representations of the concepts for each group. The Pathfinder analysis provides greater information about local, pairwise relations among concepts, whereas the cluster analysis shows global, categorical relations to a greater extent.
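
    The hierarchical cluster analysis step can be sketched with SciPy. The dissimilarity matrix and concept list below are invented for illustration and are not the study's data; the point is only that average-linkage clustering of pairwise judgments yields the kind of categorical groupings the abstract describes.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical aggregate dissimilarity judgments (0 = identical, 1 = unrelated)
# among a handful of interface concepts, loosely in the spirit of the study.
concepts = ["window", "text", "graphics", "command language", "editing", "help"]
D = np.array([
    [0.0, 0.3, 0.2, 0.8, 0.7, 0.6],
    [0.3, 0.0, 0.4, 0.7, 0.5, 0.6],
    [0.2, 0.4, 0.0, 0.9, 0.8, 0.7],
    [0.8, 0.7, 0.9, 0.0, 0.3, 0.4],
    [0.7, 0.5, 0.8, 0.3, 0.0, 0.5],
    [0.6, 0.6, 0.7, 0.4, 0.5, 0.0],
])

# Hierarchical (average-linkage) clustering on the condensed distance matrix.
Z = linkage(squareform(D), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
for concept, lab in zip(concepts, labels):
    print(f"{concept:>18s} -> cluster {lab}")
```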

  8. Modeling and observer design for recombinant Escherichia coli strain.

    PubMed

    Nadri, M; Trezzani, I; Hammouri, H; Dhurjati, P; Longin, R; Lieto, J

    2006-03-01

    A mathematical model for recombinant bacteria which includes foreign protein production is developed. The experimental system consists of an Escherichia coli strain and plasmid pIT34 containing genes for bioluminescence and production of a protein, beta-galactosidase. This recombinant strain is constructed to facilitate on-line estimation and control in a complex bioprocess. Several batch experiments are designed and performed to validate the developed model. The design of a model structure, the identification of the model parameters, and the estimation problem are three parts of a joint design problem. A nonlinear observer is designed and an experimental evaluation is performed on a batch fermentation process to estimate the substrate consumption. PMID:16411071

  9. 11. MOVABLE BED SEDIMENTATION MODELS. AUTOMATIC SEDIMENT FEEDER DESIGNED AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. MOVABLE BED SEDIMENTATION MODELS. AUTOMATIC SEDIMENT FEEDER DESIGNED AND BUILT BY WES. - Waterways Experiment Station, Hydraulics Laboratory, Halls Ferry Road, 2 miles south of I-20, Vicksburg, Warren County, MS

  10. Designing multifocal corneal models to correct presbyopia by laser ablation

    NASA Astrophysics Data System (ADS)

    Alarcón, Aixa; Anera, Rosario G.; Del Barco, Luis Jiménez; Jiménez, José R.

    2012-01-01

    Two multifocal corneal models and an aspheric model designed to correct presbyopia by corneal photoablation were evaluated. The design of each model was optimized to achieve the best possible visual quality for both near and distance vision. In addition, we evaluated the effect of miosis and pupil decentration on visual quality. The corrected model with the central zone for near vision provides better results since it requires less ablated corneal surface area, permits higher addition values, presents more stable visual quality with pupil-size variations, and produces lower high-order aberrations.

  11. Rethinking modeling framework design: object modeling system 3.0

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  12. Groundwater modeling and remedial optimization design using graphical user interfaces

    SciTech Connect

    Deschaine, L.M.

    1997-05-01

    The ability to accurately predict the behavior of chemicals in groundwater systems under natural flow circumstances or remedial screening and design conditions is the cornerstone of the environmental industry. The ability to do this efficiently and to effectively communicate the information to the client and regulators is what differentiates effective consultants from ineffective consultants. Recent advances in groundwater modeling graphical user interfaces (GUIs) are doing for numerical modeling what Windows did for DOS. GUIs facilitate both the modeling process and the information exchange. This Test Drive evaluates the performance of two GUIs--Groundwater Vistas and ModIME--on an actual groundwater model calibration and remedial design optimization project. In the early days of numerical modeling, data input consisted of large arrays of numbers that required intensive labor to input and troubleshoot. Model calibration was also manual, as was interpreting the reams of computer output for each of the tens or hundreds of simulations required to calibrate the model and perform optimal groundwater remedial design. During this period, the majority of the modeler's effort (and budget) was spent just getting the model running, as opposed to solving the environmental challenge at hand. GUIs take the majority of the grunt work out of the modeling process, thereby allowing the modeler to focus on designing optimal solutions.

  13. Toward a Contingency Model for Designing Interorganizational Service Delivery Systems

    ERIC Educational Resources Information Center

    Whetten, David A.

    1977-01-01

    Presents a model for designing interorganizational coordination (IOC) systems, based on the premise that different contexts support varying degrees of interorganizational coordination. Identifies key contextual dimensions that administrators should consider in designing an IOC program and suggests guidelines for selecting the appropriate level of…

  14. Design of spatial experiments: Model fitting and prediction

    SciTech Connect

    Fedorov, V.V.

    1996-03-01

    The main objective of the paper is to describe and develop model oriented methods and algorithms for the design of spatial experiments. Unlike many other publications in this area, the approach proposed here is essentially based on the ideas of convex design theory.

  15. Fourth Generation Instructional Design Model: An Elaboration on Authoring Activities.

    ERIC Educational Resources Information Center

    Christensen, Dean L.

    This paper presents the updated (fourth generation) version of the instructional design (ID) model, noting its emphasis on a scientific, iterative approach based upon research and theory in learning and instruction and upon applied development experience. Another important trend toward a scientific approach to instructional design is the increased…

  16. Organizational Learning and Product Design Management: Towards a Theoretical Model.

    ERIC Educational Resources Information Center

    Chiva-Gomez, Ricardo; Camison-Zornoza, Cesar; Lapiedra-Alcami, Rafael

    2003-01-01

    Case studies of four Spanish ceramics companies were used to construct a theoretical model of 14 factors essential to organizational learning. One set of factors is related to the conceptual-analytical phase of the product design process and the other to the creative-technical phase. All factors contributed to efficient product design management…

  17. Cross-Disciplinary Contributions to E-Learning Design: A Tripartite Design Model

    ERIC Educational Resources Information Center

    Hutchins, Holly M.; Hutchison, Dennis

    2008-01-01

    Purpose: The purpose of this paper is to review cross-disciplinary research on e-learning from workplace learning, educational technology, and instructional communication disciplines to identify relevant e-learning design principles. It aims to use these principles to propose an e-learning model that can guide the design of instructionally sound,…

  18. Creativity in the Training and Practice of Instructional Designers: The Design/Creativity Loops Model

    ERIC Educational Resources Information Center

    Clinton, Gregory; Hokanson, Brad

    2012-01-01

    This article presents a discussion of research and theoretical perspectives on creativity and instructional design, offering a conceptual model of the connection between these two constructs that was originally proposed in the dissertation work of the first author (Clinton, Creativity and design: A study of the learning experience of instructional…

  19. Radiation Belt Modeling for Spacecraft Design: Model Comparisons for Common Orbits

    NASA Technical Reports Server (NTRS)

    Lauenstein, J.-M.; Barth, J. L.

    2005-01-01

    We present the current status of radiation belt modeling, providing model details and comparisons with AP-8 and AE-8 for commonly used orbits. Improved modeling of the particle environment enables smarter space system design.

  20. Control system design for flexible structures using data models

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Frazier, W. Garth; Mitchell, Jerrel R.; Medina, Enrique A.; Bukley, Angelia P.

    1993-01-01

    The dynamics and control of flexible aerospace structures exercise many of the engineering disciplines. In recent years there has been considerable research in developing and tailoring control system design techniques for these structures. This problem involves designing a control system for a multi-input, multi-output (MIMO) system that satisfies various performance criteria, such as vibration suppression, disturbance and noise rejection, attitude control, and slewing control. Considerable progress has been made and demonstrated in control system design techniques for these structures. The key to designing control systems for these structures that meet stringent performance requirements is an accurate model. It has become apparent that theoretically derived and finite-element-generated models do not provide the needed accuracy; almost all successful demonstrations of control system design techniques have involved using test results for fine-tuning a model or for extracting a model using system identification techniques. This paper describes past and ongoing efforts at Ohio University and NASA MSFC to design controllers using 'data models.' The basic philosophy of this approach is to start with a stabilizing controller and frequency response data that describe the plant; then, iteratively vary the free parameters of the controller so that performance measures come closer to satisfying design specifications. The frequency response data can be either experimentally derived or analytically derived. One 'design-with-data' algorithm presented in this paper is called the Compensator Improvement Program (CIP). The current CIP designs controllers for MIMO systems so that classical gain, phase, and attenuation margins are achieved. The centerpiece of the CIP algorithm is the constraint improvement technique, which is used to calculate a parameter change vector that guarantees an improvement in all unsatisfied, feasible performance metrics from iteration to iteration. The paper also

  1. Plasma Response Models for Controller Design on TCV

    NASA Astrophysics Data System (ADS)

    Lister, J. B.; Vyas, P.; Albanese, R.; Ambrosino, G.; Ariola, M.; Villone, F.; Coutlis, A.; Limebeer, D. J. N.; Wainwright, J. P.

    1997-11-01

    The control of the plasma position and shape on present tokamaks is usually based on simple but reliable PID controllers. These controllers are either empirically tuned or based on simplistic models. More detailed models could be exploited by modern control theory to benefit the controller design, since the improvement in performance depends on the accuracy of the model. Linearized models of the plasma shape and position have been developed for TCV limited and diverted plasmas. These include a simple rigid current displacement model for zIp and the CREATE-L model for position and shape. The latter is an a priori phenomenological model which assumes that the plasma is in permanent MHD equilibrium and that the current profile is determined by only l_i, β_p, and I_p. Variations of the CREATE-L model based on different assumptions are also tested. A purely mathematical model developed from experimental observations on TCV was also developed. The accuracy and consistency of these models has been extensively tested on TCV and the CREATE-L model is in excellent agreement with open and closed loop experiments. The implications for controller design on TCV, and the suitability of these models for ITER controller design is assessed.

  2. Early Design Choices: Capture, Model, Integrate, Analyze, Simulate

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2004-01-01

    I. Designs are constructed incrementally to meet requirements and solve problems: a) Requirements types: objectives, scenarios, constraints, ilities, etc. b) Problem/issue types: risk/safety, cost/difficulty, interaction, conflict, etc. II. Capture requirements, problems and solutions: a) Collect design and analysis products and make them accessible for integration and analysis; b) Link changes in design requirements, problems and solutions; and c) Harvest design data for design models and choice structures. III. System designs are constructed by multiple groups designing interacting subsystems: a) Diverse problems, choice criteria, analysis methods and point solutions. IV. Support integration and global analysis of repercussions: a) System implications of point solutions; b) Broad analysis of interactions beyond totals of mass, cost, etc.

  3. Ohio River navigation investment model: Requirements and model design

    SciTech Connect

    Bronzini, M.S.; Curlee, T.R.; Leiby, P.N.; Southworth, F.; Summers, M.S.

    1998-01-01

    Oak Ridge National Laboratory is assisting the US Army Corps of Engineers in improving its economic analysis procedures for evaluation of inland waterway investment projects along the Ohio River System. This paper describes the context and design of an integrated approach to calculating the system-wide benefits from alternative combinations of lock and channel improvements, providing an ability to project the cost savings from proposed waterway improvements in capacity and reliability for up to fifty years into the future. The design contains an in-depth treatment of the levels of risk and uncertainty associated with different multi-year lock and channel improvement plans, including the uncertainty that results from a high degree of interaction between the many different waterway system components.

  4. From Physical Models to Biomechanics: A Design-Based Modeling Approach.

    ERIC Educational Resources Information Center

    Penner, David E.; Schauble, Leona; Lehrer, Richard

    1998-01-01

    A design context for developing children's understanding of the natural world via designing, building, testing, and evaluation of models was studied. Children were asked to design models of human elbows which provided opportunities to develop their understanding of the relations between mathematics and science through the construction and…

  5. Application of zonal model on indoor air sensor network design

    NASA Astrophysics Data System (ADS)

    Chen, Y. Lisa; Wen, Jin

    2007-04-01

    Growing concerns over the safety of the indoor environment have made the use of sensors ubiquitous. Sensors that detect chemical and biological warfare agents can offer early warning of dangerous contaminants. However, current sensor system design is informed more by intuition and experience than by systematic design. To develop a sensor system design methodology, a proper indoor airflow modeling approach is needed. Various indoor airflow modeling techniques, from complicated computational fluid dynamics approaches to simplified multi-zone approaches, exist in the literature. In this study, the effects of two airflow modeling techniques, the multi-zone modeling technique and the zonal modeling technique, on indoor air protection sensor system design are discussed. Common building attack scenarios, using a typical CBW agent, are simulated. Both multi-zone and zonal models are used to predict airflows and contaminant dispersion. A genetic algorithm is then applied to optimize the sensor locations and quantity. Differences in the sensor system design resulting from the two airflow models are discussed for a typical office environment and a large hall environment.
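
    The optimization step can be sketched independently of the airflow model: given a matrix of contaminant arrival times per scenario and zone (which a multi-zone or zonal model would supply), an evolutionary search picks sensor zones that minimize mean detection time. The sketch below uses random synthetic arrival times and a stripped-down selection-plus-mutation loop rather than a full genetic algorithm; all sizes and values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical arrival-time matrix from a zonal/multi-zone dispersion model:
# rows = attack scenarios, columns = zones; entry = time (min) for the agent
# to reach a detectable level in that zone.
n_scenarios, n_zones, n_sensors = 20, 12, 3
arrival = rng.uniform(1, 60, size=(n_scenarios, n_zones))

def fitness(sensor_zones):
    """Negative mean detection time over scenarios (higher is better)."""
    return -arrival[:, list(sensor_zones)].min(axis=1).mean()

def random_layout():
    return tuple(sorted(rng.choice(n_zones, n_sensors, replace=False)))

def mutate(layout):
    layout = list(layout)
    layout[rng.integers(n_sensors)] = rng.integers(n_zones)
    return tuple(sorted(set(layout))) if len(set(layout)) == n_sensors else random_layout()

# A deliberately small evolutionary loop: truncation selection plus mutation.
population = [random_layout() for _ in range(30)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    population = parents + [mutate(parents[rng.integers(len(parents))]) for _ in range(20)]

best = max(population, key=fitness)
print("best sensor zones:", best, " mean detection time:", round(-fitness(best), 1), "min")
```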

  6. Design of single object model of software reuse framework

    NASA Astrophysics Data System (ADS)

    Yan, Liu

    2011-12-01

    To fully realize the reuse benefits of a software reuse framework, this paper analyzes in detail the single object model mentioned in the article "The overall design of software reuse framework" and classifies it into three modes: an add/delete/modify mode, a check mode, and an integrated search/scroll/display mode. Each of the three modes corresponds to its own interface design template, class design, and database design concept. This modeling approach helps developers organize their thinking and speeds up development; even newcomers can complete the development task easily.

  7. Civil tiltrotor transport point design: Model 940A

    NASA Technical Reports Server (NTRS)

    Rogers, Charles; Reisdorfer, Dale

    1993-01-01

    The objective of this effort is to produce a vehicle layout for the civil tiltrotor wing and center fuselage in sufficient detail to obtain aerodynamic and inertia loads for determining member sizing. This report addresses the parametric configuration and loads definition for a 40-passenger civil tiltrotor transport. A preliminary (point) design is developed for the tiltrotor wing box and center fuselage. This summary report provides all design details used in the pre-design, provides adequate detail to allow a preliminary-design finite element model to be developed, and contains guidelines for dynamic constraints.

  8. Gaussian beam ray-equivalent modeling and optical design.

    PubMed

    Herloski, R; Marshall, S; Antos, R

    1983-04-15

    It is shown that the propagation and transformation of a simply astigmatic Gaussian beam by an optical system with a characteristic ABCD matrix can be modeled by relatively simple equations whose terms consist solely of the heights and slopes of two paraxial rays. These equations are derived from the ABCD law of Gaussian beam transformation. They can be used in conjunction with a conventional automatic optical design program to design and optimize Gaussian beam optical systems. Several design examples are given using the CODE-V optical design package. PMID:18195936
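
    The ABCD law the abstract builds on is easy to exercise numerically. The sketch below (not the CODE-V workflow) propagates a stigmatic Gaussian beam's complex q parameter through a hypothetical free-space, thin-lens, free-space train and recovers the spot size from 1/q = 1/R - i*lambda/(pi*w^2); the wavelength, waist, and element spacings are assumed values.

```python
import numpy as np

# Propagate a Gaussian beam's complex parameter q through an optical system
# using the ABCD law: q' = (A q + B) / (C q + D). Hypothetical system:
# free space, a thin lens, then free space toward the focus region.
wavelength = 633e-9      # m (HeNe, assumed)
w0 = 0.5e-3              # input waist radius, m (assumed)
zR = np.pi * w0**2 / wavelength
q = 1j * zR              # beam waist located at the input plane

def free_space(d):
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def propagate(q, *elements):
    M = np.eye(2)
    for el in elements:          # apply elements in the order the beam meets them
        M = el @ M
    A, B, C, D = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
    return (A * q + B) / (C * q + D)

q_out = propagate(q, free_space(0.2), thin_lens(0.1), free_space(0.1))

# Spot size and wavefront curvature at the output plane from
# 1/q = 1/R - i*lambda/(pi*w^2).
inv_q = 1.0 / q_out
w = np.sqrt(-wavelength / (np.pi * inv_q.imag))
print(f"output spot radius: {w*1e6:.1f} um, 1/R = {inv_q.real:.3f} 1/m")
```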

  9. Applying learning theories and instructional design models for effective instruction.

    PubMed

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory. PMID:27068989

  10. Controller design via structural reduced modeling by FETM

    NASA Technical Reports Server (NTRS)

    Yousuff, Ajmal

    1987-01-01

    The Finite Element-Transfer Matrix (FETM) method has been developed to reduce the computations involved in analysis of structures. This widely accepted method, however, has certain limitations, and does not address the issues of control design. To overcome these, a modification of the FETM method has been developed. The new method easily produces reduced models tailored toward subsequent control design. Other features of this method are its ability to: (1) extract open loop frequencies and mode shapes with less computations, (2) overcome limitations of the original FETM method, and (3) simplify the design procedures for output feedback, constrained compensation, and decentralized control. This report presents the development of the new method, generation of reduced models by this method, their properties, and the role of these reduced models in control design. Examples are included to illustrate the methodology.

  11. Evolution of Geometric Sensitivity Derivatives from Computer Aided Design Models

    NASA Technical Reports Server (NTRS)

    Jones, William T.; Lazzara, David; Haimes, Robert

    2010-01-01

    The generation of design parameter sensitivity derivatives is required for gradient-based optimization. Such sensitivity derivatives are elusive at best when working with geometry defined within the solid modeling context of Computer-Aided Design (CAD) systems. Solid modeling CAD systems are often proprietary and always complex, thereby necessitating ad hoc procedures to infer parameter sensitivity. A new perspective is presented that makes direct use of the hierarchical associativity of CAD features to trace their evolution and thereby track design parameter sensitivity. In contrast to ad hoc methods, this method provides a more concise procedure following the model design intent and determining the sensitivity of CAD geometry directly to its respective defining parameters.

  12. A modal test design strategy for model correlation

    SciTech Connect

    Carne, T.G.; Dohrmann, C.R.

    1994-12-01

    When a modal test is to be performed for purposes of correlation with a finite element model, one needs to design the test so that the resulting measurements will provide the data needed for the correlation. There are numerous issues to consider in the design of a modal test; two important ones are the number and location of response sensors, and the number, location, and orientation of input excitation. From a model correlation perspective, one would like to select the response locations to allow a definitive, one-to-one correspondence between the measured modes and the predicted modes. Further, the excitation must be designed to excite all the modes of interest at a sufficiently high level so that the modal estimation algorithms can accurately extract the modal parameters. In this paper these two issues are examined in the context of model correlation with methodologies presented for obtaining an experiment design.

  13. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verify that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit's hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that solid models of planetary suit hard segments for use as a performance design tool are feasible. From a general trend perspective

  14. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Satellite thermal control is the subsystem whose main task is keeping the satellite components at their survival and operating temperatures. The capability of satellite thermal control plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, due to the lack of information provided by companies and designers, this subsystem still does not have a specific design process, even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyzes statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. Next, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified through a case study: comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results shows the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware
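
    The kind of statistical design model the paper extracts can be illustrated with a toy regression: fit a power law relating thermal control subsystem mass to total satellite mass in log-log space. The data points below are synthetic placeholders, not values from the paper's database.

```python
import numpy as np

# Illustrative SDM-style regression: fit a power law m_tcs = a * m_sat^b to
# entirely synthetic satellite data, in the spirit of extracting a statistical
# design model from a database.
sat_mass = np.array([50, 120, 300, 650, 1000, 1500, 2200, 3000])      # kg
tcs_mass = np.array([2.1, 4.8, 10.5, 21.0, 30.0, 43.0, 60.0, 78.0])   # kg

# Linear least squares in log-log space gives the power-law coefficients.
b, log_a = np.polyfit(np.log(sat_mass), np.log(tcs_mass), 1)
a = np.exp(log_a)
print(f"model: m_tcs = {a:.3f} * m_sat^{b:.3f}")

# Use the fitted model as a conceptual-design estimate for a new 800 kg satellite.
print(f"estimated thermal subsystem mass for an 800 kg satellite: {a * 800**b:.1f} kg")
```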

  15. Using Generalized Additive Models to Analyze Single-Case Designs

    ERIC Educational Resources Information Center

    Shadish, William; Sullivan, Kristynn

    2013-01-01

    Many analyses for single-case designs (SCDs)--including nearly all the effect size indicators-- currently assume no trend in the data. Regression and multilevel models allow for trend, but usually test only linear trend and have no principled way of knowing if higher order trends should be represented in the model. This paper shows how Generalized…

  16. PARTICULATE CONTROL HIGHLIGHTS: PERFORMANCE AND DESIGN MODEL FOR SCRUBBERS

    EPA Science Inventory

    The report gives a capsule summary of the best available design models for wet scrubbers and their application to fine particulate control. Details of the models are reported in the Scrubber Handbook and other EPA publications listed in the bibliography. When EPA initiated its We...

  17. Aspect-Oriented Design with Reusable Aspect Models

    NASA Astrophysics Data System (ADS)

    Kienzle, Jörg; Al Abed, Wisam; Fleurey, Franck; Jézéquel, Jean-Marc; Klein, Jacques

    The idea behind Aspect-Oriented Modeling (AOM) is to apply aspect-oriented techniques to (software) models with the aim of modularizing crosscutting concerns. This can be done within different modeling notations, at different levels of abstraction, and at different moments during the software development process. This paper demonstrates the applicability of AOM during the software design phase by presenting parts of an aspect-oriented design of a crisis management system. The design solution proposed in this paper is based on the Reusable Aspect Models (RAM) approach, which allows a modeler to express the structure and behavior of a complex system using class, state and sequence diagrams encapsulated in several aspect models. The paper describes how the model of the "create mission" functionality of the server backend can be decomposed into 23 inter-dependent aspect models. The presentation of the design is followed by a discussion on the lessons learned from the case study. Next, RAM is compared to 8 other AOM approaches according to 6 criteria: language, concern composition, asymmetric and symmetric composition, maturity, and tool support. To conclude the paper, a discussion section points out the features of RAM that specifically support reuse.

  18. Freshman Interest Groups: Designing a Model for Success

    ERIC Educational Resources Information Center

    Ratliff, Gerald Lee

    2008-01-01

    Freshman Interest Groups (FIGS) have become a popular model for academic and student affairs colleagues who are concerned that first-year students learn to reflect on life experiences and daily events as part of the learning process. A well-designed FIG model meets the academic, social and career concerns for first-year students by providing an…

  19. Evaluating Instructional Design Models: A Proposed Research Approach

    ERIC Educational Resources Information Center

    Gropper, George L.

    2015-01-01

    Proliferation of prescriptive models in an "engineering" field is not a sign of its maturity. Quite the opposite. Materials engineering, for example, meets the criterion of parsimony. Sadly, the very large number of models in "instructional design," putatively an engineering field, raises questions about its status. Can the…

  20. Communications network design and costing model programmers manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    Optimization algorithms and techniques used in the communications network design and costing model for least cost route and least cost network problems are examined from the programmer's point of view. All system program modules, the data structures within the model, and the files which make up the data base are described.

  1. Transitioning from Software Requirements Models to Design Models

    NASA Technical Reports Server (NTRS)

    Whittle, Jon

    2004-01-01

    The Scenario Creation and Simulation Process (SCASP) includes the following steps: 1) Write Requirements; 2) Write Use Cases; 3) Prioritize Use Cases; 4) Write Nominal Scenarios; 5) Identify Relationships; 6) Refine/Generalize Scenarios; 7) Transform to State Machines. SCASP provides thorough simulation of use cases before design/implementation, resulting in: 1) Reduced cost; 2) Fewer misunderstandings; 3) Reuse of executable form of use cases. SCASP gives systematic guidelines on how to 1) Separate concerns in use case descriptions; 2) Elicit non-nominal scenarios (alternatives, exceptions, concurrent scenarios, etc.); 3) Transform those scenarios automatically into a set of concurrent state machines; 4) Execute those state machines, i.e., scenario simulation.

  2. Models for hydrologic design of evapotranspiration landfill covers.

    PubMed

    Hauser, Victor L; Gimon, Dianna M; Bonta, James V; Howell, Terry A; Malone, Robert W; Williams, Jimmy R

    2005-09-15

    The technology used in landfill covers is changing, and an alternative cover called the evapotranspiration (ET) landfill cover is coming into use. Important design requirements are prescribed by Federal rules and regulations for conventional landfill covers but not for ET landfill covers. There is no accepted hydrologic model for ET landfill cover design. This paper describes ET cover requirements and design issues, and assesses the accuracy of the EPIC and HELP hydrologic models when used for hydrologic design of ET covers. We tested the models against high-quality field measurements available from lysimeters maintained by the Agricultural Research Service of the U.S. Department of Agriculture at Coshocton, Ohio, and Bushland, Texas. The HELP model produced substantial errors in estimating hydrologic variables. The EPIC model estimated ET and deep percolation with errors less than 7% and 5%, respectively, and accurately matched extreme events with an error of less than 2% of precipitation. The EPIC model is suitable for use in hydrologic design of ET landfill covers. PMID:16201652
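
    The hydrologic bookkeeping behind ET-cover models such as EPIC and HELP reduces, at its simplest, to a daily soil water balance in which precipitation is partitioned among evapotranspiration, storage, and deep percolation. The bucket-model sketch below is a heavily simplified stand-in for either code, with invented forcing data and storage capacity.

```python
import numpy as np

# A minimal bucket-style soil water balance; all values are hypothetical,
# in mm of water per day.
def water_balance(precip, pet, storage_max, storage0=0.0):
    storage, percolation, aet = storage0, [], []
    for P, E in zip(precip, pet):
        storage += P
        et = min(E, storage)                     # actual ET limited by available water
        storage -= et
        perc = max(0.0, storage - storage_max)   # water beyond capacity drains downward
        storage -= perc
        aet.append(et)
        percolation.append(perc)
    return np.array(aet), np.array(percolation)

rng = np.random.default_rng(4)
precip = rng.gamma(0.4, 10.0, 365)                          # skewed daily rainfall
pet = 3.0 + 2.0 * np.sin(np.linspace(0, 2 * np.pi, 365))    # seasonal potential ET
aet, perc = water_balance(precip, pet, storage_max=150.0)
print(f"annual precip {precip.sum():.0f} mm, ET {aet.sum():.0f} mm, "
      f"deep percolation {perc.sum():.0f} mm")
```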

  3. Towards Comprehensive Variation Models for Designing Vehicle Monitoring Systems

    NASA Technical Reports Server (NTRS)

    McAdams, Daniel A.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes in a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. This crucial roadblock makes their implementation in real vehicles (e.g., helicopter transmissions and aircraft engines) difficult, making their operation costly and unreliable. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variations that are not taken into account in the modeling assumptions. In this paper, we address one such source of variations, namely, those caused during the design and manufacturing of rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. Using such models, we develop a methodology to account for design and manufacturing variations, and explore the changes in the vibration response to determine its stochastic nature. We explore the potential of the methodology using a nonlinear cam-follower model, where the spring stiffness values are assumed to follow a normal distribution. The results demonstrate initial feasibility of the method, showing great promise in developing a general methodology for designing more accurate aerospace vehicle monitoring systems.
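
    One way to fold design and manufacturing variation into a vibration feature, in the spirit of the abstract's cam-follower example, is straightforward Monte Carlo: draw the spring stiffness from its assumed normal distribution and examine the induced spread in a response feature. The sketch below uses a simple natural-frequency feature and invented parameter values rather than the authors' nonlinear cam-follower model.

```python
import numpy as np

# Monte Carlo propagation of a manufacturing variation into a vibration feature:
# here the follower's natural frequency f = sqrt(k/m)/(2*pi), with the spring
# stiffness k drawn from a normal distribution as the abstract assumes.
rng = np.random.default_rng(5)
m = 0.25                             # follower mass, kg (hypothetical)
k_nominal, k_sigma = 4.0e4, 2.0e3    # N/m, ~5% manufacturing scatter (assumed)

k_samples = rng.normal(k_nominal, k_sigma, 10_000)
f_samples = np.sqrt(k_samples / m) / (2 * np.pi)

lo, hi = np.percentile(f_samples, [2.5, 97.5])
print(f"natural frequency: {f_samples.mean():.1f} Hz "
      f"(95% of builds within {lo:.1f}-{hi:.1f} Hz)")
# A monitoring threshold set from the nominal design alone would ignore this
# spread and flag healthy units near the tails as faulty.
```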

  4. The community-driven BiG CZ software system for integration and analysis of bio- and geoscience data in the critical zone

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.

    2014-12-01

    Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single

  5. The engineering design process as a model for STEM curriculum design

    NASA Astrophysics Data System (ADS)

    Corbett, Krystal Sno

    Engaging pedagogics have been proven to be effective in the promotion of deep learning for science, technology, engineering, and mathematics (STEM) students. In many cases, academic institutions have shown a desire to improve education by implementing more engaging techniques in the classroom. The research framework established in this dissertation has been governed by the axiom that students should obtain a deep understanding of fundamental topics while being motivated to learn through engaging techniques. This research lays a foundation for future analysis and modeling of the curriculum design process where specific educational research questions can be considered using standard techniques. Further, a clear curriculum design process is a key step towards establishing an axiomatic approach for engineering education. A danger is that poor implementation of engaging techniques will counteract the intended effects. Poor implementation might provide students with a "fun" project, but not the desired deep understanding of the fundamental STEM content. Knowing that proper implementation is essential, this dissertation establishes a model for STEM curriculum design, based on the well-established engineering design process. Using this process as a perspective to model curriculum design allows for a structured approach. Thus, the framework for STEM curriculum design, established here, provides a guided approach for seamless integration of fundamental topics and engaging pedagogics. The main steps, or phases, in engineering design are: Problem Formulation, Solution Generation, Solution Analysis, and Solution Implementation. Layering engineering design with education curriculum theory, this dissertation establishes a clear framework for curriculum design. Through ethnographic engagement by this researcher, several overarching themes are revealed through the creation of curricula using the design process. The application of the framework to specific curricula was part of this

  6. Application of existing design software to problems in neuronal modeling.

    PubMed

    Vranić-Sowers, S; Fleshman, J W

    1994-03-01

    In this communication, we describe the application of the Valid/Analog Design Tools circuit simulation package called PC Workbench to the problem of modeling the electrical behavior of neural tissue. A nerve cell representation as an equivalent electrical circuit using compartmental models is presented. Several types of nonexcitable and excitable membranes are designed, and simulation results for different types of electrical stimuli are compared to the corresponding analytical data. It is shown that the hardware/software platform and the models developed constitute an accurate, flexible, and powerful way to study neural tissue. PMID:8045583
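
    In its simplest passive form, a compartmental representation of the kind described reduces to an RC circuit per compartment. The generic sketch below (not the PC Workbench model) integrates one passive compartment's membrane voltage under a current step and compares it with the analytic steady state; all parameter values are hypothetical.

        import numpy as np

        # Hypothetical passive-membrane parameters.
        C_m = 1.0e-9      # membrane capacitance [F]
        g_leak = 50.0e-9  # leak conductance [S]
        E_leak = -70e-3   # leak reversal potential [V]
        I_stim = 0.5e-9   # injected current [A]

        dt, t_end = 1e-5, 0.1
        n = int(t_end / dt)
        v = np.empty(n)
        v[0] = E_leak

        # Forward-Euler integration of C_m dV/dt = -g_leak*(V - E_leak) + I_stim.
        for i in range(1, n):
            dvdt = (-g_leak * (v[i - 1] - E_leak) + I_stim) / C_m
            v[i] = v[i - 1] + dt * dvdt

        print(f"steady-state V = {v[-1]*1e3:.1f} mV "
              f"(analytic: {(E_leak + I_stim/g_leak)*1e3:.1f} mV)")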

  7. Design modeling of lithium-ion battery performance

    NASA Astrophysics Data System (ADS)

    Nelson, Paul; Bloom, Ira; Amine, Khalil; Henriksen, Gary

    A computer design modeling technique has been developed for lithium-ion batteries to assist in setting goals for cell components, assessing materials requirements, and evaluating thermal management strategies. In this study, the input data for the model included design criteria from Quallion, LLC for Gen-2 18650 cells, which were used to test the accuracy of the dimensional modeling. Performance measurements on these cells were done at the electrochemical analysis and diagnostics laboratory (EADL) at Argonne National Laboratory. The impedance and capacity related criteria were calculated from the EADL measurements. Five batteries were designed for which the number of windings around the cell core was increased for each succeeding battery to study the effect of this variable upon the dimensions, weight, and performance of the batteries. The lumped-parameter battery model values were calculated for these batteries from the laboratory results, with adjustments for the current collection resistance calculated for the individual batteries.
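
    As a generic illustration of a lumped-parameter cell model of the kind referred to above (not the specific Argonne parameterization), the sketch below computes a terminal voltage as open-circuit voltage minus an area-specific-impedance drop and scales the result to a hypothetical series string; every number is made up for the example.

        # Hypothetical lumped-parameter cell model: V = OCV - I * ASI / A,
        # where ASI is the area-specific impedance and A the electrode area.
        def cell_voltage(ocv_v, current_a, asi_ohm_cm2, area_cm2):
            return ocv_v - current_a * asi_ohm_cm2 / area_cm2

        ocv = 3.7          # open-circuit voltage [V] (hypothetical)
        asi = 35.0         # area-specific impedance [ohm*cm^2] (hypothetical)
        area = 850.0       # total positive-electrode area [cm^2] (hypothetical)
        i_discharge = 10.0 # discharge current [A]

        v_cell = cell_voltage(ocv, i_discharge, asi, area)
        n_series = 25      # cells in series (hypothetical pack)
        print(f"cell voltage = {v_cell:.3f} V, pack voltage = {n_series * v_cell:.1f} V, "
              f"pack power = {n_series * v_cell * i_discharge:.0f} W")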

  8. Design modeling of lithium-ion battery performance.

    SciTech Connect

    Nelson, P. A.; Bloom, I.; Amine, K.; Henriksen, G.; Chemical Engineering

    2002-08-22

    A computer design modeling technique has been developed for lithium-ion batteries to assist in setting goals for cell components, assessing materials requirements, and evaluating thermal management strategies. In this study, the input data for the model included design criteria from Quallion, LLC for Gen-2 18650 cells, which were used to test the accuracy of the dimensional modeling. Performance measurements on these cells were done at the electrochemical analysis and diagnostics laboratory (EADL) at Argonne National Laboratory. The impedance and capacity related criteria were calculated from the EADL measurements. Five batteries were designed for which the number of windings around the cell core was increased for each succeeding battery to study the effect of this variable upon the dimensions, weight, and performance of the batteries. The lumped-parameter battery model values were calculated for these batteries from the laboratory results, with adjustments for the current collection resistance calculated for the individual batteries.

  9. Future Modeling Needs in Pulse Detonation Rocket Engine Design

    NASA Technical Reports Server (NTRS)

    Meade, Brian; Talley, Doug; Mueller, Donn; Tew, Dave; Guidos, Mike; Seymour, Dave

    2001-01-01

    This paper presents a performance model for a rocket engine design that takes advantage of pulse detonation to generate thrust. The contents include: 1) An introduction to the Pulse Detonation Rocket Engine (PDRE); 2) PDRE modeling issues and options; 3) A discussion of the PDRE Performance Workshop held at Marshall Space Flight Center; and 4) Identification of needs for an open performance model for Pulse Detonation Rocket Engines. This paper is in viewgraph form.

  10. An aircraft model for the AIAA controls design challenge

    NASA Technical Reports Server (NTRS)

    Brumbaugh, Randal W.

    1991-01-01

    A generic, state-of-the-art, high-performance aircraft model, including detailed, full-envelope, nonlinear aerodynamics, and full-envelope thrust and first-order engine response data, is described. While this model was primarily developed for the AIAA Controls Design Challenge, the availability of such a model provides a common focus for research in aeronautical control theory and methodology. An implementation of this model using the FORTRAN computer language, associated routines furnished with the aircraft model, and techniques for interfacing these routines to external procedures are also described. Figures showing vehicle geometry, surfaces, and sign conventions are included.

  11. An improved numerical model for wave rotor design and analysis

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Wilson, Jack

    1992-01-01

    A numerical model has been developed which can predict both the unsteady flows within a wave rotor and the steady averaged flows in the ports. The model is based on the assumptions of one-dimensional, unsteady, and perfect gas flow. Besides the dominant wave behavior, it is also capable of predicting the effects of finite tube opening time, leakage from the tube ends, and viscosity. The relative simplicity of the model makes it useful for design, optimization, and analysis of wave rotor cycles for any application. This paper discusses some details of the model and presents comparisons between the model and two laboratory wave rotor experiments.

  12. NREL Wind Integrated System Design and Engineering Model

    SciTech Connect

    Ning, S. Andrew; Scott, George; Graf, Peter

    2013-09-30

    NREL_WISDEM is an integrated model for wind turbines and plants developed in Python based on the open-source software OpenMDAO. NREL_WISDEM is a set of wrappers for various wind turbine and plant models that integrates pre-existing models into OpenMDAO. It is organized into groups, each with its own repository, including Plant_CostSE, Plant_EnergySE, Turbine_CostSE, and TurbineSE. The wrappers are designed for both licensed and non-licensed models, though in either case one has to have access to and install the individual models before using them in the overall software platform.

  13. Optimizing experimental design for comparing models of brain function.

    PubMed

    Daunizeau, Jean; Preuschoff, Kerstin; Friston, Karl; Stephan, Klaas

    2011-11-01

    This article presents the first attempt to formalize the optimization of experimental design with the aim of comparing models of brain function based on neuroimaging data. We demonstrate our approach in the context of Dynamic Causal Modelling (DCM), which relates experimental manipulations to observed network dynamics (via hidden neuronal states) and provides an inference framework for selecting among candidate models. Here, we show how to optimize the sensitivity of model selection by choosing among experimental designs according to their respective model selection accuracy. Using Bayesian decision theory, we (i) derive the Laplace-Chernoff risk for model selection, (ii) disclose its relationship with classical design optimality criteria and (iii) assess its sensitivity to basic modelling assumptions. We then evaluate the approach when identifying brain networks using DCM. Monte-Carlo simulations and empirical analyses of fMRI data from a simple bimanual motor task in humans serve to demonstrate the relationship between network identification and the optimal experimental design. For example, we show that deciding whether there is a feedback connection requires shorter epoch durations, relative to asking whether there is experimentally induced change in a connection that is known to be present. Finally, we discuss limitations and potential extensions of this work. PMID:22125485

  14. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling, and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example application for structural design of a conventional aircraft and a high altitude long endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  15. Design of Experiments, Model Calibration and Data Assimilation

    SciTech Connect

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of emulation, calibration and experiment design for computer experiments. Emulation refers to building a statistical surrogate from a carefully selected and limited set of model runs to predict unsampled outputs. The standard kriging approach to emulation of complex computer models is presented. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Markov chain Monte Carlo (MCMC) algorithms are often used to sample the calibrated parameter distribution. Several MCMC algorithms commonly employed in practice are presented, along with a popular diagnostic for evaluating chain behavior. Space-filling approaches to experiment design for selecting model runs to build effective emulators are discussed, including Latin Hypercube Design and extensions based on orthogonal array skeleton designs and imposed symmetry requirements. Optimization criteria that further enforce space-filling, possibly in projections of the input space, are mentioned. Designs to screen for important input variations are summarized and used for variable selection in a nuclear fuels performance application. This is followed by illustration of sequential experiment design strategies for optimization, global prediction, and rare event inference.
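
    The space-filling designs mentioned above, such as the Latin Hypercube Design, can be generated in a few lines. The sketch below is a plain random Latin hypercube generator for selecting computer-model runs, shown only to make the idea concrete; it is not the specific design software used in the presentation, and the input ranges are hypothetical.

        import numpy as np

        def latin_hypercube(n_runs, n_inputs, rng=None):
            """One random Latin hypercube sample on the unit hypercube [0, 1]^n_inputs."""
            rng = np.random.default_rng(rng)
            design = np.empty((n_runs, n_inputs))
            for j in range(n_inputs):
                # One point per equal-probability stratum, randomly placed and permuted.
                strata = (np.arange(n_runs) + rng.random(n_runs)) / n_runs
                design[:, j] = rng.permutation(strata)
            return design

        # 20 model runs over 3 uncertain inputs, rescaled to (hypothetical) physical ranges.
        unit = latin_hypercube(20, 3, rng=42)
        lower = np.array([0.1, 200.0, 1e-4])
        upper = np.array([0.9, 400.0, 1e-2])
        runs = lower + unit * (upper - lower)
        print(runs[:5])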

  16. An analytical model of axial compressor off-design performance

    SciTech Connect

    Camp, T.R.; Horlock, J.H. (Whittle Lab.)

    1994-07-01

    An analysis is presented of the off-design performance of multistage axial-flow compressors. It is based on an analytical solution, valid for small perturbations in operating conditions from the design point, and provides an insight into the effects of choices made during the compressor design process on performance and off-design stage matching. It is shown that the mean design value of stage loading coefficient (ψ = Δh₀/U²) has a dominant effect on off-design performance, whereas the stage-wise distribution of stage loading coefficient and the design value of flow coefficient have little influence. The powerful effects of variable stator vanes on stage-matching are also demonstrated and these results are shown to agree well with previous work. The slope of the working line of a gas turbine engine, overlaid on overall compressor characteristics, is shown to have a strong effect on the off-design stage-matching through the compressor. The model is also used to analyze design changes to the compressor geometry and to show how errors in estimates of annulus blockage, decided during the design process, have less effect on compressor performance than has previously been thought.
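
    For reference, the stage loading and flow coefficients used in this kind of mean-line analysis can be evaluated directly; the short sketch below computes ψ = Δh₀/U² and φ = cx/U for hypothetical stage values, which are not taken from the paper.

        # Hypothetical mean-line stage values.
        delta_h0 = 28.0e3   # stagnation enthalpy rise per stage [J/kg]
        U = 320.0           # mean blade speed [m/s]
        c_x = 165.0         # axial velocity [m/s]

        psi = delta_h0 / U**2   # stage loading coefficient
        phi = c_x / U           # flow coefficient
        print(f"psi = {psi:.3f}, phi = {phi:.3f}")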

  17. Integrating O/S models during conceptual design, part 1

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    The University of Dayton is pleased to submit this report to the National Aeronautics and Space Administration (NASA), Langley Research Center. The report integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations into a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports prepared under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model was developed. This is the first part of a three-part technical report.

  18. Design of a space shuttle structural dynamics model

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A 1/8-scale structural dynamics model of a parallel-burn space shuttle has been designed. The basic objectives were to represent the significant low-frequency structural dynamic characteristics while keeping fabrication costs low. The model was derived from the proposed Grumman Design 619 space shuttle. The design includes an orbiter, two solid rocket motors (SRM), and an external tank (ET). The ET consists of a monocoque LO2 tank, an intertank skirt with three frames to accept SRM attachment members, an LH2 tank with 10 frames of which 3 provide for orbiter attachment members, and an aft skirt with one frame to provide for aft SRM attachment members. The frames designed for the SRM attachments are fitted with transverse struts to take symmetric loads.

  19. Modelling and design of high performance indium phosphide solar cells

    NASA Technical Reports Server (NTRS)

    Rhoads, Sandra L.; Barnett, Allen M.

    1989-01-01

    A first principles pn junction device model has predicted new designs for high voltage, high efficiency InP solar cells. Measured InP material properties were applied and device parameters (thicknesses and doping) were adjusted to obtain optimal performance designs. Results indicate that p/n InP designs will provide higher voltages and higher energy conversion efficiencies than n/p structures. Improvements to n/p structures for increased efficiency are predicted. These new designs exploit the high absorption capabilities, relatively long diffusion lengths, and modest surface recombination velocities characteristic of InP. Predictions of performance indicate achievable open-circuit voltage values as high as 943 mV for InP and a practical maximum AM0 efficiency of 22.5 percent at 1 sun and 27 C. The details of the model, the optimal InP structure and the effect of individual parameter variations on device performance are presented.
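
    The voltage advantage discussed above ultimately traces back to the ideal-diode relation between open-circuit voltage and the photocurrent and saturation current densities. The sketch below evaluates that textbook relation for hypothetical current densities; it is not the authors' first-principles device model.

        import numpy as np

        k_B = 1.380649e-23   # Boltzmann constant [J/K]
        q = 1.602176634e-19  # elementary charge [C]

        def v_oc(j_sc, j_0, temperature_k=300.0, ideality=1.0):
            """Ideal-diode open-circuit voltage [V]."""
            return ideality * k_B * temperature_k / q * np.log(j_sc / j_0 + 1.0)

        # Hypothetical short-circuit and saturation current densities [A/cm^2].
        print(f"Voc = {v_oc(35e-3, 1e-18):.3f} V")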

  20. Groundwater modeling in RCRA assessment, corrective action design and evaluation

    SciTech Connect

    Rybak, I.; Henley, W.

    1995-12-31

    Groundwater modeling was conducted to design, implement, modify, and terminate corrective action at several RCRA sites in EPA Region 4. Groundwater flow, contaminant transport and unsaturated zone air flow models were used depending on the complexity of the site and the corrective action objectives. Software used included Modflow, Modpath, Quickflow, Bioplume 2, and AIR3D. Site assessment data, such as aquifer properties, site description, and surface water characteristics for each facility were used in constructing the models and designing the remedial systems. Modeling, in turn, specified additional site assessment data requirements for the remedial system design. The specific purpose of computer modeling is discussed with several case studies. These consist, among others, of the following: evaluation of the mechanism of the aquifer system and selection of a cost effective remedial option, evaluation of the capture zone of a pumping system, prediction of the system performance for different and difficult hydrogeologic settings, evaluation of the system performance, and trouble-shooting for the remedial system operation. Modeling is presented as a useful tool for corrective action system design, performance, evaluation, and trouble-shooting. The case studies exemplified the integration of diverse data sources, understanding the mechanism of the aquifer system, and evaluation of the performance of alternative remediation systems in a cost-effective manner. Pollutants of concern include metals and PAHs.

  1. Designing visual displays and system models for safe reactor operations

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-12-31

    The material presented in this paper is based on two studies involving the design of visual displays and the user's perspective model of a system. The studies involve a methodology known as Neuro-Linguistic Programming and its use in expanding design choices from the operator's perspective image. This paper focuses on the two studies and how they are applicable to the safety of operating reactors.

  2. Modelling and designing digital control systems with averaged measurements

    NASA Technical Reports Server (NTRS)

    Polites, Michael E.; Beale, Guy O.

    1988-01-01

    An account is given of the control systems engineering methods applicable to the design of digital feedback controllers for aerospace deterministic systems in which the output, rather than being an instantaneous measure of the system at the sampling instants, instead represents an average measure of the system over the time interval between samples. The averaging effect can be included during the modeling of the plant, thereby obviating the iteration of design/simulation phases.
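
    One way to include the averaging effect in the plant model, as the abstract suggests, is to discretize the continuous plant together with an auxiliary integral state so that the sampled output is the interval average of the true output rather than an instantaneous sample. The sketch below does this with a single matrix exponential for a generic plant; it illustrates the idea only and is not the authors' formulation, and the example plant is hypothetical.

        import numpy as np
        from scipy.linalg import expm

        def discretize_with_averaged_output(A, B, C, T):
            """Return (Ad, Bd, Cavg_x, Cavg_u) such that, under zero-order-hold input,
            x[k+1] = Ad x[k] + Bd u[k] and the averaged measurement over one sample
            interval is y_avg[k] = Cavg_x x[k] + Cavg_u u[k]."""
            n, m = A.shape[0], B.shape[1]
            # Augmented state [x; z; u] with z' = x (z accumulates the integral of x)
            # and u held constant over the interval.
            M = np.zeros((2 * n + m, 2 * n + m))
            M[:n, :n] = A
            M[:n, 2 * n:] = B
            M[n:2 * n, :n] = np.eye(n)
            Phi = expm(M * T)
            Ad = Phi[:n, :n]
            Bd = Phi[:n, 2 * n:]
            # Integral of x over the interval, as a function of x[k] and u[k].
            Ix = Phi[n:2 * n, :n]
            Iu = Phi[n:2 * n, 2 * n:]
            return Ad, Bd, C @ Ix / T, C @ Iu / T

        # Hypothetical first-order plant x' = -2x + u, y = x, sampled at T = 0.1 s.
        Ad, Bd, Cx, Cu = discretize_with_averaged_output(
            np.array([[-2.0]]), np.array([[1.0]]), np.array([[1.0]]), 0.1)
        print(Ad, Bd, Cx, Cu)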

  3. PHARAO laser source flight model: Design and performances

    SciTech Connect

    Lévèque, T.; Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P.; Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S.; Laurent, Ph.

    2015-03-15

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  4. PHARAO laser source flight model: Design and performances

    NASA Astrophysics Data System (ADS)

    Lévèque, T.; Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P.; Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S.; Laurent, Ph.

    2015-03-01

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  5. Structural similitude and design of scaled down laminated models

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Rezaeepazhand, J.

    1993-01-01

    The excellent mechanical properties of laminated composite structures make them prime candidates for a wide variety of applications in aerospace, mechanical, and other branches of engineering. The enormous design flexibility of advanced composites is obtained at the cost of a large number of design parameters. Due to the complexity of these systems and the lack of complete design-based information, designers tend to be conservative in their designs. Furthermore, any new design is extensively evaluated experimentally until it achieves the necessary reliability, performance, and safety. However, experimental evaluation of composite structures is costly and time consuming. Consequently, it is extremely useful if a full-scale structure can be replaced by a similar scaled-down model which is much easier to work with. Furthermore, a dramatic reduction in cost and time can be achieved if available experimental data for a specific structure can be used to predict the behavior of a group of similar systems. This study investigates problems associated with the design of scaled models. Such a study is important since it provides the necessary scaling laws and identifies the factors which affect the accuracy of scale models. Similitude theory is employed to develop the necessary similarity conditions (scaling laws). Scaling laws provide the relationship between a full-scale structure and its scale model, and can be used to extrapolate the experimental data of a small, inexpensive, and testable model into design information for a large prototype. Due to the large number of design parameters, identification of the principal scaling laws by the conventional method (dimensional analysis) is tedious. Similitude theory based on the governing equations of the structural system is more direct and simpler in execution. The difficulty of making completely similar scale models often leads to accepting a certain type of distortion from exact duplication of the prototype (partial similarity). Both complete and partial

  6. Probabilistic predictive modelling of carbon nanocomposites for medical implants design.

    PubMed

    Chua, Matthew; Chui, Chee-Kong

    2015-04-01

    Modelling of the mechanical properties of carbon nanocomposites based on input variables like the percentage weight of Carbon Nanotube (CNT) inclusions is important for the design of medical implants and other structural scaffolds. Current constitutive models for the mechanical properties of nanocomposites may not predict well due to differences in conditions, fabrication techniques, and inconsistencies in the properties of reagents used across industries and laboratories. Furthermore, the mechanical properties of the designed products are not deterministic, but exist as a probabilistic range. A predictive model based on a modified probabilistic surface response algorithm is proposed in this paper to address this issue. Tensile testing of three groups of carbon nanocomposite samples with different CNT weight fractions displays scattered stress-strain curves, with the instantaneous stresses assumed to vary according to a normal distribution at a specific strain. From the probability density function of the experimental data, a two-factor Central Composite Design (CCD) experimental matrix based on strain and CNT weight fraction inputs with their corresponding stress distributions was established. Monte Carlo simulation was carried out on this design matrix to generate a predictive probabilistic polynomial equation. The equation and method were subsequently validated with more tensile experiments and Finite Element (FE) studies. The method was then demonstrated in the design of an artificial tracheal implant. Our algorithm provides an effective way to accurately model the mechanical properties of implants of various compositions based on experimental data from samples. PMID:25658876
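
    To make the design-of-experiments step concrete, the sketch below builds a two-factor Central Composite Design matrix and fits a quadratic response surface to synthetic, normally scattered observations by least squares. It is a generic illustration of the approach with made-up data, not the paper's measurements or algorithm.

        import numpy as np

        # Two-factor CCD in coded units: 2^2 factorial + axial (alpha = sqrt(2)) + center.
        a = np.sqrt(2.0)
        ccd = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                        [-a, 0], [a, 0], [0, -a], [0, a], [0, 0]])

        # Synthetic "observed" response with normally distributed scatter (hypothetical).
        rng = np.random.default_rng(1)
        def true_response(x1, x2):
            return 50 + 8*x1 + 5*x2 + 3*x1*x2 - 2*x1**2 + 1.5*x2**2
        y = true_response(ccd[:, 0], ccd[:, 1]) + rng.normal(0.0, 0.5, len(ccd))

        # Quadratic response-surface model: [1, x1, x2, x1*x2, x1^2, x2^2].
        X = np.column_stack([np.ones(len(ccd)), ccd[:, 0], ccd[:, 1],
                             ccd[:, 0]*ccd[:, 1], ccd[:, 0]**2, ccd[:, 1]**2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("fitted coefficients:", np.round(coef, 2))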

  7. Green space system design in Luoyang using Huff model

    NASA Astrophysics Data System (ADS)

    Wang, Shengnan; Li, Meng

    2008-10-01

    A green space system, as part of the urban ecological environment and urban landscape, plays a significant role in protecting the biological diversity of urban eco-systems. During the process of rapid modernization in China, it is evident that, in order to satisfy residents' needs for entertainment and communication effectively, there should be abundant types and an adequate arrangement of green space, and at the same time a comprehensive and stable hierarchical structure of the green space system ought to be established. The Huff Model is widely used in facility location planning and service-area segmentation in business geography, and has potential in urban facility planning and design. This paper aims to evaluate, design, and optimize the urban green space in Luoyang City, Henan Province, using GIS and the Huff Model. Considering the existing location, size, and shape of the green space supply, the spatial distribution of residences, and the urban transportation system, the attractiveness between residences and green spaces is estimated. The spatial pattern and service capability of the green space system are also evaluated critically. Based on the findings, a possible optimization design of the green space system in Luoyang is discussed. Testing with the Huff model shows that the design noticeably improves overall spatial accessibility. The case study shows that GIS technology and the Huff Model have great potential in urban green space evaluation, planning, and design.
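
    The Huff model mentioned above assigns each residential zone a probability of patronizing each green space based on attractiveness (for example, size) and travel cost. The sketch below evaluates the standard Huff probability formula for a small hypothetical set of zones and parks; the exponents and data are illustrative, not those calibrated for Luoyang.

        import numpy as np

        def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
            """P[i, j]: probability that residents of zone i visit green space j,
            P_ij = (S_j^alpha / D_ij^beta) / sum_k (S_k^alpha / D_ik^beta)."""
            utility = attractiveness[np.newaxis, :] ** alpha / distances ** beta
            return utility / utility.sum(axis=1, keepdims=True)

        # Hypothetical data: 3 residential zones, 2 parks.
        park_area_ha = np.array([12.0, 3.5])
        dist_km = np.array([[1.2, 0.4],
                            [2.5, 1.1],
                            [0.8, 2.0]])
        P = huff_probabilities(park_area_ha, dist_km)
        print(np.round(P, 2))
        # Expected visitors per park, given (hypothetical) zone populations.
        print(np.round(np.array([8000, 5000, 12000]) @ P, 0))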

  8. FEM numerical model study of electrosurgical dispersive electrode design parameters.

    PubMed

    Pearce, John A

    2015-08-01

    Electrosurgical dispersive electrodes must safely carry the surgical current in monopolar procedures, such as those used in cutting, coagulation and radio frequency ablation (RFA). Of these, RFA represents the most stringent design constraint since ablation currents are often more than 1 to 2 A rms (continuous) for several minutes, depending on the size of the lesion desired and local heat transfer conditions at the applicator electrode. This stands in contrast to standard surgical activations, which are intermittent, and usually less than 1 A rms, but for several seconds at a time. Dispersive electrode temperature rise is also critically determined by the sub-surface skin anatomy, thicknesses of the subcutaneous and supra-muscular fat, etc. Currently, we lack fundamental engineering design criteria that provide an estimating framework for preliminary designs of these electrodes. The lack of a fundamental design framework means that a large number of experiments must be conducted in order to establish a reasonable design. Previously, an attempt to correlate maximum temperatures in experimental work with the average current density-time product failed to yield a good match. This paper develops and applies a new electrode stress parameter that correlates well with both the previous experimental data and with numerical models of other electrode shapes. The finite element method (FEM) model work was calibrated against experimental RF lesions in porcine skin to establish the fundamental principle underlying dispersive electrode performance. The results can be used in preliminary electrode design calculations, experiment series design and performance evaluation. PMID:26736814

  9. Design verification and cold-flow modeling test report

    SciTech Connect

    Not Available

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  10. An economic model for passive solar designs in commercial environments

    NASA Astrophysics Data System (ADS)

    Powell, J. W.

    1980-06-01

    The model incorporates a life cycle costing approach that focuses on the costs of purchase, installation, maintenance, repairs, replacement, and energy. It includes a detailed analysis of tax laws affecting the use of solar energy in commercial buildings. Possible methods of treating difficult to measure benefits and costs, such as effects of the passive solar design on resale value of the building and on lighting costs, rental income from the building, and the use of commercial space, are presented. The model is illustrated in two case examples of prototypical solar design for low rise commercial buildings in an urban setting.
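
    As a bare-bones illustration of the life cycle costing approach described above (the actual model also treats taxes, replacements, and hard-to-measure benefits), the sketch below discounts annual energy and maintenance costs over a study period and adds them to the first cost, using hypothetical numbers.

        def life_cycle_cost(first_cost, annual_energy_cost, annual_maintenance,
                            years, discount_rate):
            """Present value of owning and operating a design alternative."""
            pv = first_cost
            for t in range(1, years + 1):
                pv += (annual_energy_cost + annual_maintenance) / (1 + discount_rate) ** t
            return pv

        # Hypothetical comparison: conventional vs. passive solar design.
        conventional = life_cycle_cost(120_000, 9_000, 1_500, years=25, discount_rate=0.07)
        passive_solar = life_cycle_cost(135_000, 5_500, 1_800, years=25, discount_rate=0.07)
        print(f"conventional LCC = ${conventional:,.0f}, passive solar LCC = ${passive_solar:,.0f}")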

  11. Design driven test patterns for OPC models calibration

    NASA Astrophysics Data System (ADS)

    Al-Imam, Mohamed

    2009-03-01

    In the modern photolithography process for manufacturing integrated circuits, geometry dimensions much smaller than the exposure wavelength need to be realized on silicon. Thus Resolution Enhancement Techniques (RET) have an indispensable role in the implementation of a successful technology process node. Finding an appropriate RET recipe that answers the needs of a certain fabrication process usually involves intensive computational simulations. These simulations have to reflect how different elements in the lithography process under study will behave. In order to achieve this, accurate models are needed that truly represent the transmission of patterns from mask to silicon. A common practice in calibrating lithography models is to collect data for the dimensions of some test structures created on the exposure mask along with the corresponding dimensions of these test structures on silicon after exposure. This data is used to tune the models for good predictions. The models are guaranteed to accurately predict the test structures that have been used in their tuning. However, real designs might have a much greater variety of structures that might not have been included in the test structures. This paper explores a method for compiling the test structures to be used in the model calibration process using design layouts as an input. The method relies on reducing structures in the design layout to the essential unique structures from the lithography model's point of view, thus ensuring that the test structures represent what the model will actually have to predict during simulation.

  12. Electricity Market Manipulation: How Behavioral Modeling Can Help Market Design

    SciTech Connect

    Gallo, Giulia

    2015-12-18

    The question of how to best design electricity markets to integrate variable and uncertain renewable energy resources is becoming increasingly important as more renewable energy is added to electric power systems. Current markets were designed based on a set of assumptions that are not always valid in scenarios of high penetrations of renewables. In a future where renewables might have a larger impact on market mechanisms as well as financial outcomes, there is a need for modeling tools and power system modeling software that can provide policy makers and industry actors with more realistic representations of wholesale markets. One option includes using agent-based modeling frameworks. This paper discusses how key elements of current and future wholesale power markets can be modeled using an agent-based approach and how this approach may become a useful paradigm that researchers can employ when studying and planning for power systems of the future.

  13. Software Engineering Designs for Super-Modeling Different Versions of CESM Models using DART

    NASA Astrophysics Data System (ADS)

    Kluzek, E. B.; Duane, G. S.; Tribbia, J. J.; Vertenstein, M.

    2013-12-01

    The super-modeling approach connects different models together at run time in order to provide run-time feedbacks between the models and thereby synchronize them. This method thus reduces model bias further than after-the-fact averaging of model output. We explore different designs to connect different configurations and versions of the IPCC-class climate model, the Community Earth System Model (CESM), together. We use the Data Assimilation Research Test-bed (DART) software to provide data assimilation as well as a software framework to link different model configurations together. We also show results from some simple experiments that demonstrate the ability to synchronize different model versions together.

  14. Model-It: A Case Study of Learner-Centered Software Design for Supporting Model Building.

    ERIC Educational Resources Information Center

    Jackson, Shari L.; Stratford, Steven J.; Krajcik, Joseph S.; Soloway, Elliot

    Learner-centered software design (LCSD) guides the design of tasks, tools, and interfaces in order to support the unique needs of learners: growth, diversity and motivation. This paper presents a framework for LCSD and describes a case study of its application to the ScienceWare Model-It, a learner-centered tool to support scientific modeling and…

  15. Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Tsai, F. T.

    2012-12-01

    Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of these uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer frame, where each layer targets a source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of models in each level. To account for uncertainty, we employ chance constrained (CC) programming for stochastic remediation design. Chance constrained programming has traditionally been implemented to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, the HBMA-CC approach was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation, and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances for two reasons. First, considering the single best model, variances that stem from uncertainty in the model structure will be ignored. Second, considering the best model with non
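
    The prediction-variance bookkeeping performed across a model hierarchy follows the usual Bayesian model averaging identities: the averaged variance contains a within-model part plus a between-model (structural) part, which is exactly what is lost when a single best model is chosen. The sketch below combines the means and variances of a few hypothetical model predictions with assumed posterior weights; it illustrates only the averaging step, not the full HBMA-CC design procedure.

        import numpy as np

        def bma_mean_variance(means, variances, weights):
            """Bayesian model averaging of per-model predictive means/variances.
            Total variance = within-model part + between-model (structural) part."""
            weights = np.asarray(weights, dtype=float)
            weights = weights / weights.sum()
            mean = np.dot(weights, means)
            within = np.dot(weights, variances)
            between = np.dot(weights, (np.asarray(means) - mean) ** 2)
            return mean, within + between, within, between

        # Hypothetical chloride-concentration predictions [mg/L] from three models.
        mean, total, within, between = bma_mean_variance(
            means=[180.0, 240.0, 205.0], variances=[400.0, 900.0, 625.0],
            weights=[0.5, 0.2, 0.3])
        print(f"BMA mean = {mean:.0f}, variance = {total:.0f} "
              f"(within = {within:.0f}, between = {between:.0f})")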

  16. Planetary gear profile modification design based on load sharing modelling

    NASA Astrophysics Data System (ADS)

    Iglesias, Miguel; Fernández Del Rincón, Alfonso; De-Juan, Ana Magdalena; Garcia, Pablo; Diez, Alberto; Viadero, Fernando

    2015-07-01

    In order to satisfy the increasing demand for high-performance planetary transmissions, an important line of research is focused on understanding some of the underlying phenomena involved in this mechanical system. Through the development of models capable of reproducing the system behavior, research in this area contributes to improved insight into gear transmissions, helping to develop better maintenance practices and more efficient design processes. A planetary gear model used for the design of profile modifications based on the levelling of the load sharing ratio is presented. The gear profile geometry definition, following a vectorial approach that mimics the real cutting process of gears, is thoroughly described. Teeth undercutting and hypotrochoid definition are implicitly considered, and a procedure for the incorporation of a rounding arc at the tooth tip in order to deal with corner contacts is described. A procedure for the modeling of profile deviations is presented, which can be used for the introduction of both manufacturing errors and designed profile modifications. An easy and flexible implementation of the profile deviation within the planetary model is accomplished based on geometric overlapping. The contact force calculation and dynamic implementation used in the model are also introduced, and parameters from a real transmission for agricultural applications are presented for the application example. A set of reliefs is designed based on the levelling of the load sharing ratio for the example transmission, and finally some other important dynamic factors of the transmission are analyzed to assess the changes in dynamic behavior with respect to the non-modified case. Thus, the main innovative aspect of the proposed planetary transmission model is its capacity to provide a simulated load sharing ratio which serves as the design variable for the calculation of the tooth profile modifications.

  17. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to

  18. Test model designs for advanced refractory ceramic materials

    NASA Technical Reports Server (NTRS)

    Tran, Huy Kim

    1993-01-01

    The next generation of space vehicles will be subjected to severe aerothermal loads and will require an improved thermal protection system (TPS) and other advanced vehicle components. In order to ensure the satisfactory performance of the TPS and other advanced vehicle materials and components, testing is to be performed in environments similar to space flight. The design and fabrication of the test models should be fairly simple but still accomplish the test objectives. In the Advanced Refractory Ceramic Materials test series, the models and model holders will need to withstand the required heat fluxes of 340 to 817 W/sq cm or surface temperatures in the range of 2700 K to 3000 K. The model holders should provide one-dimensional (1-D) heat transfer to the samples and the appropriate flow field without compromising the primary test objectives. The optical properties such as the effective emissivity, catalytic efficiency coefficients, thermal properties, and mass loss measurements are also taken into consideration in the design process. Therefore, it is the intent of this paper to demonstrate the design schemes for different models and model holders that accommodate these test requirements and ensure safe operation in a typical arc jet facility.

  19. A Novel Modeling Framework for Heterogeneous Catalyst Design

    NASA Astrophysics Data System (ADS)

    Katare, Santhoji; Bhan, Aditya; Caruthers, James; Delgass, Nicholas; Lauterbach, Jochen; Venkatasubramanian, Venkat

    2002-03-01

    A systems-oriented, integrated knowledge architecture that enables the use of data from High Throughput Experiments (HTE) for catalyst design is being developed. Higher-level critical reasoning is required to extract information efficiently from the increasingly available HTE data and to develop predictive models that can be used for design purposes. Towards this objective, we have developed a framework that aids the catalyst designer in negotiating the data and model complexities. Traditional kinetic and statistical tools have been systematically implemented and novel artificial intelligence tools have been developed and integrated to speed up the process of modeling catalytic reactions. Multiple nonlinear models that describe CO oxidation on supported metals have been screened using optimization ideas based on qualitative and quantitative features. Physical constraints of the system have been used to select the optimum model parameters from the multiple solutions to the parameter estimation problem. Preliminary results on the selection of catalyst descriptors that match a target performance and on the use of HTE data for refining fundamentals-based models will be discussed.

  20. Simulation models and designs for advanced Fischer-Tropsch technology

    SciTech Connect

    Choi, G.N.; Kramer, S.J.; Tam, S.S.

    1995-12-31

    Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry-bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.

  1. PID controller design for trailer suspension based on linear model

    NASA Astrophysics Data System (ADS)

    Kushairi, S.; Omar, A. R.; Schmidt, R.; Isa, A. A. Mat; Hudha, K.; Azizan, M. A.

    2015-05-01

    A quarter of an active trailer suspension system having the characteristics of a double wishbone type was modeled as a complex multi-body dynamic system in MSC.ADAMS. Due to the complexity of the model, a linearized version is considered in this paper. A model reduction technique is applied to the linear model, resulting in a reduced-order model. Based on this simplified model, a Proportional-Integral-Derivative (PID) controller was designed in MATLAB/Simulink environment; primarily to reduce excessive roll motions and thus improving the ride comfort. Simulation results show that the output signal closely imitates the input signal in multiple cases - demonstrating the effectiveness of the controller.
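
    For context, a discrete PID loop of the kind tuned in this study can be written in a few lines. The sketch below closes a PID loop around a generic first-order plant used as a stand-in for the reduced-order model; the gains and plant dynamics are hypothetical, not the values from the paper.

        # Minimal discrete PID controller driving a first-order plant
        # x' = (-x + u) / tau, used here as a stand-in for the reduced-order model.
        kp, ki, kd = 4.0, 8.0, 0.05      # hypothetical gains
        tau, dt, t_end = 0.5, 0.01, 5.0  # plant time constant, sample time, duration

        setpoint, x = 1.0, 0.0
        integral, prev_error = 0.0, 0.0
        for step in range(int(t_end / dt)):
            error = setpoint - x
            integral += error * dt
            derivative = (error - prev_error) / dt
            u = kp * error + ki * integral + kd * derivative
            prev_error = error
            x += dt * (-x + u) / tau     # forward-Euler plant update

        print(f"output after {t_end:.0f} s: {x:.3f} (setpoint {setpoint})")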

  2. A simplified model for two phase face seal design

    NASA Technical Reports Server (NTRS)

    Lau, S. Y.; Hughes, W. F.; Basu, P.; Beatty, P. A.

    1990-01-01

    A simplified quasi-isothermal low-leakage laminar model for analyzing the stiffness and the stability characteristics of two-phase face seals with real fluids is developed. Sample calculations with this model for low-leakage operations are compared with calculations for high-leakage operations, performed using the adiabatic turbulent model of Beatty and Hughes (1987). It was found that the seal characteristics predicted using the two extreme models tend to overlap with each other, indicating that the simplified laminar model may be a useful tool for seal design. The effect of coning was investigated using the simplified model. The results show that, for the same balance, a coned seal has a higher leakage rate than a parallel face seal.

  3. A novel observer design method for neural mass models

    NASA Astrophysics Data System (ADS)

    Liu, Xian; Miao, Dong-Kai; Gao, Qing; Xu, Shi-Yun

    2015-09-01

    Neural mass models can simulate the generation of electroencephalography (EEG) signals with different rhythms, and therefore the observation of the states of these models plays a significant role in brain research. The structure of neural mass models is special in that they can be expressed as Lurie systems. The developed techniques in Lurie system theory are applicable to these models. We here provide a new observer design method for neural mass models by transforming these models and the corresponding error systems into nonlinear systems with Lurie form. The purpose is to establish appropriate conditions which ensure the convergence of the estimation error. The effectiveness of the proposed method is illustrated by numerical simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 61473245, 61004050, and 51207144).
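
    For orientation, the sketch below implements a plain Luenberger observer for a generic linear system and checks that the estimation error decays. The paper's contribution concerns observers for the Lurie-form nonlinear structure of neural mass models, so this linear example only illustrates the underlying idea; the system matrices and observer poles are hypothetical.

        import numpy as np
        from scipy.signal import place_poles

        # Hypothetical observable linear system x' = A x, y = C x.
        A = np.array([[0.0, 1.0],
                      [-4.0, -0.5]])
        C = np.array([[1.0, 0.0]])

        # Observer gain L chosen so that (A - L C) has fast, stable eigenvalues.
        L = place_poles(A.T, C.T, [-8.0, -9.0]).gain_matrix.T

        dt, n_steps = 1e-3, 5000
        x = np.array([1.0, -0.5])    # true initial state
        x_hat = np.zeros(2)          # observer starts with no knowledge

        for _ in range(n_steps):
            y = C @ x
            # Forward-Euler integration of plant and observer dynamics.
            x = x + dt * (A @ x)
            x_hat = x_hat + dt * (A @ x_hat + (L @ (y - C @ x_hat)).ravel())

        print("estimation error:", np.abs(x - x_hat))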

  4. Designing a Programming-Based Approach for Modelling Scientific Phenomena

    ERIC Educational Resources Information Center

    Simpson, Gordon; Hoyles, Celia; Noss, Richard

    2005-01-01

    We describe an iteratively designed sequence of activities involving the modelling of one-dimensional collisions between moving objects based on programming in ToonTalk. Students aged 13-14 years in two settings (London and Cyprus) investigated a number of collision situations, classified into six classes based on the relative velocities and…

  5. Design Approaches to Support Preservice Teachers in Scientific Modeling

    ERIC Educational Resources Information Center

    Kenyon, Lisa; Davis, Elizabeth A.; Hug, Barbara

    2011-01-01

    Engaging children in scientific practices is hard for beginning teachers. One such scientific practice with which beginning teachers may have limited experience is scientific modeling. We have iteratively designed preservice teacher learning experiences and materials intended to help teachers achieve learning goals associated with scientific…

  6. Performance of Random Effects Model Estimators under Complex Sampling Designs

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  7. Universal Instructional Design as a Model for Educational Programs

    ERIC Educational Resources Information Center

    Higbee, Jeanne L.

    2007-01-01

    This article describes Universal Instructional Design as an inclusive pedagogical model for use in educational programs, whether provided by traditional educational institutions, community-based initiatives, or workplace literacy projects. For the benefit of public relations specialists and classroom educators alike, the article begins with a…

  8. Conflicts Management Model in School: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The object of this study is to evaluate the reasons for conflicts occurring in school according to the perceptions and views of teachers, to evaluate the resolution strategies used for those conflicts, and to build a model based on the results obtained. In the research, an explanatory design including quantitative and qualitative methods has been used. The quantitative part…

  9. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  10. Controller design via structural reduced modeling by FETM

    NASA Technical Reports Server (NTRS)

    Yousuff, A.

    1986-01-01

    The Finite Element-Transfer Matrix (FETM) method has been developed to reduce the computations involved in the analysis of structures. This widely accepted method, however, has certain limitations and does not directly produce reduced models for control design. To overcome these shortcomings, a modification of the FETM method has been developed. The modified FETM method readily produces reduced models that are tailored toward subsequent control design. Other features of this method are its ability to: (1) extract open-loop frequencies and mode shapes with fewer computations, (2) overcome limitations of the original FETM method, and (3) simplify the procedures for output feedback, constrained compensation, and decentralized control. This semiannual report presents the development of the modified FETM and, through an example, illustrates its applicability to output feedback and decentralized control design.