Science.gov

Sample records for community-driven model designed

  1. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    Aiming to connect the Earth science community and accelerate the rate of discovery, NASA Earth Exchange (NEX) has established an online repository and platform so that researchers can publish and share their tools and models with colleagues. In recent years, workflows have become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge (sharable workflows or workflow) is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts is currently very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes. Meanwhile, researchers often do not have time to fully document these processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. Therefore, this project aims to develop a proactive recommendation technology based on collective NEX user behaviors, in order to promote and encourage process and workflow reuse within NEX. Specifically, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in social cognitive theory, which holds that people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties) and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from
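
    The recommendation idea sketched above, treating shared artifacts as nodes in a knowledge network built from collective user behavior, can be illustrated with a small co-usage graph. The sketch below is a minimal, hypothetical illustration (the artifact names, session data, and scoring rule are assumptions, not the NEX implementation): artifacts that frequently appear together in past user sessions are recommended to a user who has already selected some of them.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical usage sessions: each is the set of artifacts one user combined.
sessions = [
    {"ndvi_preproc", "cloud_mask", "trend_model"},
    {"ndvi_preproc", "cloud_mask", "anomaly_detect"},
    {"cloud_mask", "trend_model"},
]

# Build an undirected co-usage graph: edge weight = number of shared sessions.
ties = defaultdict(int)
for s in sessions:
    for a, b in combinations(sorted(s), 2):
        ties[(a, b)] += 1

def recommend(selected, top_n=3):
    """Score unselected artifacts by the strength of their ties to the selection."""
    scores = defaultdict(int)
    for (a, b), w in ties.items():
        if a in selected and b not in selected:
            scores[b] += w
        elif b in selected and a not in selected:
            scores[a] += w
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

print(recommend({"ndvi_preproc"}))  # e.g. [('cloud_mask', 2), ('trend_model', 1), ...]
```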

  2. More than Just Test Scores: Leading for Improvement with an Alternative Community-Driven Accountability Metric

    ERIC Educational Resources Information Center

    Spain, Angeline; McMahon, Kelly

    2016-01-01

    In this case, Sharon Rowley, a veteran principal, volunteers to participate in a new community-driven accountability initiative and encounters dilemmas about what it means to be a "data-driven" instructional leader. This case provides an opportunity for aspiring school leaders to explore and apply data-use theory to the work of leading…

  3. Community-driven research on environmental sources of H. pylori infection in arctic Canada

    PubMed Central

    Hastings, Emily V; Yasui, Yutaka; Hanington, Patrick; Goodman, Karen J; Working Group, The CANHelp

    2014-01-01

    The role of environmental reservoirs in H. pylori transmission remains uncertain due to technical difficulties in detecting the living organism in sources outside the stomach. Residents of some Canadian Arctic communities worry that contamination of the natural environment is responsible for the high prevalence of H. pylori infection in the region. This analysis aims to estimate associations between exposure to potential environmental sources of biological contamination and the prevalence of H. pylori infection in Arctic Canada. Using data from 3 community-driven H. pylori projects in the Northwest and Yukon Territories, we estimated effects of environmental exposures on H. pylori prevalence, using odds ratios (OR) and 95% confidence intervals (CI) from multilevel logistic regression models to adjust for household and community effects. Investigated exposures included: untreated drinking water; livestock; dogs; cats; mice or mouse droppings in the home; and cleaning fish or game. Our analysis did not identify environmental exposures clearly associated with increased H. pylori prevalence, except any exposure to mice or mouse droppings (OR = 4.6, CI = 1.2–18), reported by 11% of participants. Our multilevel models showed H. pylori clustering within households, but environmental exposures accounted for little of this clustering; instead, much of it was accounted for by household composition (especially having infected household members and the number of children). Like the scientific literature on this topic, our results do not clearly implicate or rule out environmental reservoirs of H. pylori; thus, the topic remains a priority for future research. Meanwhile, H. pylori prevention research should seek strategies for reducing direct transmission from person to person. PMID:25483330
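
    The odds ratios and confidence intervals reported above come from multilevel logistic regression, but the basic quantity can be illustrated with the standard large-sample formula for a 2x2 exposure-by-infection table. The numbers below are made up for illustration and ignore the household- and community-level adjustment used in the study.

```python
import math

# Hypothetical 2x2 table (not the study's data):
#                 infected   not infected
# exposed             a=18          b=22
# not exposed         c=60         d=140
a, b, c, d = 18, 22, 60, 140

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)             # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)   # 95% CI bounds
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```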

  4. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    NASA Astrophysics Data System (ADS)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    A complex software project with high standards for code quality generally requires automated tools that help developers with repetitive and tedious tasks such as compiling on different platforms and configurations, running unit and end-to-end tests, and generating distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth sciences benefits even more from CI, as time and resources for software development are often limited. Testing developed code on more than the developer's PC is therefore a task that is often neglected and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, analysis of differences in simulation results between changes, etc.) by the CI system, which automatically responds to the pull request or by email, reporting success or failure in detail and, where necessary, requesting improvements to the modifications. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps the entry barriers to getting involved in the project low and permits an agile development process that concentrates on feature additions rather than software maintenance procedures.
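
    A CI pipeline of the kind described, compile, run unit and end-to-end tests, then report back on the pull request, can be sketched as a simple driver script. The commands, step names, and reporting behavior below are placeholders (the actual OpenGeoSys setup uses GitHub, Travis and Jenkins configurations, not this script).

```python
import subprocess
import sys

# Placeholder build/test steps; a real project would configure these in the CI system.
STEPS = [
    ("configure", ["cmake", "-S", ".", "-B", "build"]),
    ("compile", ["cmake", "--build", "build"]),
    ("unit tests", ["ctest", "--test-dir", "build", "-L", "unit"]),
    ("end-to-end tests", ["ctest", "--test-dir", "build", "-L", "e2e"]),
]

def run_step(cmd) -> bool:
    try:
        return subprocess.run(cmd, capture_output=True, text=True).returncode == 0
    except FileNotFoundError:
        return False  # tool not installed on this machine

def run_pipeline():
    report = []
    for name, cmd in STEPS:
        ok = run_step(cmd)
        report.append((name, ok))
        if not ok:
            break  # fail fast; remaining steps are skipped
    return report

if __name__ == "__main__":
    report = run_pipeline()
    for name, ok in report:
        print(f"{name}: {'OK' if ok else 'FAILED'}")
    # A real CI system would post this summary back to the pull request.
    sys.exit(0 if all(ok for _, ok in report) else 1)
```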

  5. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
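
    A theoretical cost function of the kind described, relating geometric design features to material cost and labor content and then trading cost against weight, can be sketched as follows. The feature set, coefficients, and weighting are invented for illustration and are not the COSTADE formulation.

```python
from dataclasses import dataclass

@dataclass
class PanelDesign:
    area_m2: float        # skin area
    thickness_mm: float   # laminate thickness
    n_stiffeners: int     # stiffener count (drives layup/assembly labor)

# Illustrative coefficients (assumed, not COSTADE values).
MATERIAL_COST_PER_KG = 60.0   # $/kg of composite material
LABOR_RATE = 95.0             # $/hour
DENSITY = 1600.0              # kg/m^3

def weight_kg(d: PanelDesign) -> float:
    return DENSITY * d.area_m2 * d.thickness_mm / 1000.0

def cost_usd(d: PanelDesign) -> float:
    material = MATERIAL_COST_PER_KG * weight_kg(d)
    labor_hours = 0.8 * d.area_m2 + 1.5 * d.n_stiffeners   # process-based labor estimate
    return material + LABOR_RATE * labor_hours

def objective(d: PanelDesign, w: float = 0.5) -> float:
    """Weighted cost/weight objective for design trades (w in [0, 1])."""
    return w * cost_usd(d) + (1.0 - w) * 1000.0 * weight_kg(d)

candidate = PanelDesign(area_m2=4.0, thickness_mm=3.0, n_stiffeners=6)
print(round(weight_kg(candidate), 1), round(cost_usd(candidate), 0), round(objective(candidate), 0))
```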

  6. Fracture design modelling

    SciTech Connect

    Crichlow, H.B.; Crichlow, H.B.

    1980-02-07

    A design tool is discussed whereby the various components that enter the design process of a hydraulic fracturing job are combined to provide a realistic appraisal of a stimulation job in the field. An interactive computer model is used to solve the problem numerically to obtain the effects of various parameters on the overall behavior of the system.

  7. Ligand modeling and design

    SciTech Connect

    Hay, B.P.

    1997-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in the cost-effective removal of specific radionuclides from nuclear waste streams. Organic ligands with metal ion specificity are critical components in the development of solvent extraction and ion exchange processes that are highly selective for targeted radionuclides. The traditional approach to the development of such ligands involves lengthy programs of organic synthesis and testing, which, in the absence of reliable methods for screening compounds before synthesis, results in wasted research effort. The author's approach breaks down and simplifies this costly process with the aid of computer-based molecular modeling techniques. Commercial software for organic molecular modeling is being configured to examine the interactions between organic ligands and metal ions, yielding an inexpensive, readily available computational tool that can be used to predict the structures and energies of ligand-metal complexes. Users will be able to correlate the large body of existing experimental data on structure, solution binding affinity, and metal ion selectivity to develop structural design criteria. These criteria will provide a basis for selecting ligands that can be implemented in separations technologies through collaboration with other DOE national laboratories and private industry. The initial focus will be to select ether-based ligands that can be applied to the recovery and concentration of the alkali and alkaline earth metal ions including cesium, strontium, and radium.

  8. The CECAM Electronic Structure Library: community-driven development of software libraries for electronic structure simulations

    NASA Astrophysics Data System (ADS)

    Oliveira, Micael

    The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software as libraries that can be contributed to and used by the community. Besides making it possible to share the burden of developing and maintaining complex pieces of software, these libraries can also become a target for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how these can be useful for the electronic structure community.

  9. Student Models of Instructional Design

    ERIC Educational Resources Information Center

    Magliaro, Susan G.; Shambaugh, Neal

    2006-01-01

    Mental models are one way that humans represent knowledge (Markman, 1999). Instructional design (ID) is a conceptual model for developing instruction and typically includes analysis, design, development, implementation, and evaluation (i.e., ADDIE model). ID, however, has been viewed differently by practicing teachers and instructional designers…

  10. Ligand modeling and design

    SciTech Connect

    Hay, B.

    1996-10-01

    The purpose of this work is to develop and implement a molecular design basis for selecting organic ligands that would be used in applications for the cost-effective removal of specific radionuclides from nuclear waste streams.

  11. A Community-Driven Intervention in Tuftonboro, New Hampshire, Succeeds in Altering Water Testing Behavior.

    PubMed

    Paul, Michael P; Rigrod, Pierce; Wingate, Steve; Borsuk, Mark E

    2015-12-01

    Maximum contaminant levels created by the U.S. Environmental Protection Agency under the Safe Drinking Water Act do not apply to private wells. Rather, the onus is on individual households to undertake regular water testing. Several barriers exist to testing and treating water from private wells, including a lack of awareness about both well water as a potential source of contaminants and government-recommended water testing schedules; a health literacy level that may not be sufficient to interpret complex environmental health messages; the inconvenience of water testing; the financial costs of testing and treatment; and a myriad of available treatment options. The existence of these barriers is problematic because well water can be a source of hazardous contaminants. This article describes an initiative--undertaken by the Tuftonboro (New Hampshire) Conservation Commission, with support from state agencies and a research program at Dartmouth College--to increase water testing rates in a rural region with a relatively high number of wells. The project prompted more water tests at the state laboratory in one day than in the prior six years. This suggests that community-driven, collaborative efforts to overcome practical barriers could be successful at raising testing rates and ultimately improving public health.

  12. A Community-Driven Intervention in Tuftonboro, New Hampshire, Succeeds in Altering Water Testing Behavior

    PubMed Central

    Paul, Michael P.; Rigrod, Pierce; Wingate, Steve; Borsuk, Mark E.

    2016-01-01

    Maximum contaminant levels created by the U.S. Environmental Protection Agency under the Safe Drinking Water Act do not apply to private wells. Rather, the onus is on individual households to undertake regular water testing. Several barriers exist to testing and treating water from private wells, including a lack of awareness about both well water as a potential source of contaminants and government-recommended water testing schedules; a health literacy level that may not be sufficient to interpret complex environmental health messages; the inconvenience of water testing; the financial costs of testing and treatment; and a myriad of available treatment options. The existence of these barriers is problematic because well water can be a source of hazardous contaminants. This article describes an initiative—undertaken by the Tuftonboro (New Hampshire) Conservation Commission, with support from state agencies and a research program at Dartmouth College—to increase water testing rates in a rural region with a relatively high number of wells. The project prompted more water tests at the state laboratory in one day than in the prior six years. This suggests that community-driven, collaborative efforts to overcome practical barriers could be successful at raising testing rates and ultimately improving public health. PMID:26738316

  14. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
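
    The idea of instantiating a domain model and checking specifications against it can be sketched with a toy example. The domain concepts, attributes, and consistency check below are hypothetical and far simpler than the approach described in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class DomainModel:
    """A tiny application-domain model: entity types and their required attributes."""
    entities: dict[str, set[str]] = field(default_factory=dict)

@dataclass
class Specification:
    """A specification instantiates domain entities with concrete attribute values."""
    instances: dict[str, dict[str, object]] = field(default_factory=dict)

def check_consistency(domain: DomainModel, spec: Specification) -> list[str]:
    """Return a list of violations of the domain model by the specification."""
    problems = []
    for name, attrs in spec.instances.items():
        entity = name.split(":")[0]
        if entity not in domain.entities:
            problems.append(f"unknown entity type: {entity}")
            continue
        missing = domain.entities[entity] - attrs.keys()
        if missing:
            problems.append(f"{name} is missing attributes {sorted(missing)}")
    return problems

# Hypothetical insurance-claims domain.
domain = DomainModel(entities={"Policy": {"holder", "limit"}, "Claim": {"policy", "amount"}})
spec = Specification(instances={"Policy:p1": {"holder": "A. Smith", "limit": 10000},
                                "Claim:c1": {"policy": "p1"}})
print(check_consistency(domain, spec))  # -> one violation: Claim:c1 is missing 'amount'
```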

  15. Family and community driven response to intimate partner violence in post-conflict settings.

    PubMed

    Kohli, Anjalee; Perrin, Nancy; Mpanano, Remy Mitima; Banywesize, Luhazi; Mirindi, Alfred Bacikenge; Banywesize, Jean Heri; Mitima, Clovis Murhula; Binkurhorhwa, Arsène Kajabika; Bufole, Nadine Mwinja; Glass, Nancy

    2015-12-01

    This study explores risk factors, individual and family consequences and community-driven responses to intimate partner violence (IPV) in post-conflict eastern Democratic Republic of Congo (DRC). This qualitative study was conducted in 3 rural villages in South Kivu Province of DRC, an area that has experienced prolonged conflict. Participants included 13 female survivors and 5 male perpetrators of IPV as reported during baseline data collection for the parent study, an impact evaluation of the Congolese-led livestock microfinance program, Pigs for Peace. Participants described social and behavioral circumstances that increase risk for IPV; social, health and economic consequences for women and their families; and resources to protect women and their families. Social and behavioral factors reported by survivors and perpetrators indicate that IPV was linked to the husband's alcohol consumption, household economic instability, the male desire to maintain his position as head of the family and perceived disrespect of the husband by the wife. In addition to the well-known health consequences of IPV, women reported negative social consequences, such as stigma, resulting in barriers to the well-being of the family. Survivors and perpetrators described the impact of IPV on their children, specifically the lack of proper parental guidance and the lack of safety and stability, which could result in the child(ren) misbehaving and using violence in their own relationships, resulting in further stigma towards the child and family. Strategies employed by survivors to protect themselves and their families include placating male behaviors (e.g., not responding to insults, trying to meet household demands). Perpetrators who tried to reduce the impact of IPV reported a preference for social and financial control of their partner rather than physical violence, believing this to be less severe. Participants described community and family based social support systems including couple's mediation, responsible partner and

  17. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments, particularly whether the estimation of copula parameters can be enhanced by optimizing experimental conditions and how robust the parameter estimates are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616
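
    For readers unfamiliar with equivalence theorems, the classical Kiefer-Wolfowitz form for D-optimality, which the paper extends to (bivariate) copula models, gives the kind of check referred to above. The notation below is the standard textbook one and is not taken from the paper itself.

```latex
\[
  M(\xi) = \int_{\mathcal{X}} I(x)\,\xi(\mathrm{d}x), \qquad
  d(x,\xi) = \operatorname{tr}\!\bigl(M(\xi)^{-1} I(x)\bigr),
\]
where $I(x)$ is the Fisher information of one observation at $x$ and $m$ is the
number of model parameters. A design $\xi^{*}$ is D-optimal if and only if
\[
  d(x,\xi^{*}) \le m \quad \text{for all } x \in \mathcal{X},
\]
with equality at the support points of $\xi^{*}$. Evaluating $d(x,\xi)$ over a
grid of candidate points gives the quick optimality check mentioned above.
```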

  18. Electromagnetic modeling in accelerator designs

    SciTech Connect

    Cooper, R.K.; Chan, K.C.D.

    1990-01-01

    Through the years, electromagnetic modeling using computers has proved to be a cost-effective tool for accelerator design. Traditionally, electromagnetic modeling of accelerators has been limited to resonator and magnet designs in two dimensions. In recent years, with the availability of powerful computers, electromagnetic modeling of accelerators has advanced significantly. It is apparent that breakthroughs have been made during the last decade in two important areas: three-dimensional modeling and time-domain simulation. Success in both of these areas has been made possible by the increasing size and speed of computers. In this paper, the advances in these two areas will be described.

  19. Modeling Languages Refine Vehicle Design

    NASA Technical Reports Server (NTRS)

    2009-01-01

    TechnoSoft Inc., of Cincinnati, Ohio, is a leading provider of object-oriented modeling and simulation technology used for commercial and defense applications. With funding from Small Business Innovation Research (SBIR) contracts issued by Langley Research Center, the company continued development on its adaptive modeling language, or AML, originally created for the U.S. Air Force. TechnoSoft then created what is now known as its Integrated Design and Engineering Analysis Environment, or IDEA, which can be used to design a variety of vehicles and machinery. IDEA's customers include clients in green industries, such as designers for power plant exhaust filtration systems and wind turbines.

  20. Object Oriented Modeling and Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    The Object Oriented Modeling and Design seminar is intended for software professionals and students. It covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system; the state model describes the aspects of a system that change over time, as well as control behavior; and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object-oriented programming may benefit the learning of modeling and good design. Specific expectations are: create a class model; read, recognize, and describe a class model; describe association and link; show abstract classes used with multiple inheritance; explain metadata, reification and constraints; group classes into a package; read, recognize, and describe a state model; explain states and transitions; read, recognize, and describe an interaction model; explain use cases and use case relationships; show concurrency in an activity diagram; and show object interactions in a sequence diagram.
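
    As a concrete illustration of the three model kinds, the sketch below shows a tiny class model (an association between two classes), a state model (allowed transitions), and an interaction (one object sending messages to another). The domain and names are invented for the example and are independent of the seminar's notation.

```python
from enum import Enum

class State(Enum):
    IDLE = "idle"
    ACTIVE = "active"
    CLOSED = "closed"

# State model: which transitions are allowed.
TRANSITIONS = {State.IDLE: {State.ACTIVE}, State.ACTIVE: {State.CLOSED}, State.CLOSED: set()}

class Account:                        # class model: Account is associated with Orders
    def __init__(self, owner: str):
        self.owner = owner
        self.orders: list["Order"] = []

class Order:
    def __init__(self, account: Account):
        self.account = account        # link: one end of the association
        self.state = State.IDLE
        account.orders.append(self)   # link: the other end

    def transition(self, new_state: State) -> None:
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

# Interaction model: a sequence of messages between objects.
acct = Account("alice")
order = Order(acct)
order.transition(State.ACTIVE)
order.transition(State.CLOSED)
print(acct.owner, [o.state.name for o in acct.orders])   # alice ['CLOSED']
```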

  1. Protein designs in HP models

    NASA Astrophysics Data System (ADS)

    Gupta, Arvind; Khodabakhshi, Alireza Hadj; Maňuch, Ján; Rafiey, Arash; Stacho, Ladislav

    2009-07-01

    The inverse protein folding problem is that of designing an amino acid sequence which folds into a prescribed shape. This problem arises in drug design, where a particular structure is necessary to ensure proper protein-protein interactions, and could have applications in nanotechnology. A major challenge in designing proteins with native folds that attain a specific shape is to avoid proteins that have multiple native folds (unstable proteins). In this technical note we present our results on protein designs in the variant of the Hydrophobic-Polar (HP) model introduced by Dill [6] on the 2D square lattice. The HP model distinguishes only polar and hydrophobic monomers and counts only the number of hydrophobic contacts in the energy function. To achieve better stability of our designs we use the Hydrophobic-Polar-Cysteine (HPC) model, which distinguishes a third type of monomer, called "cysteine", and also incorporates disulfide bridges (SS-bridges) into the energy function. We present stable designs on the 2D square lattice and the 3D hexagonal prism lattice in the HPC model.
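
    In the HP model, the energy of a fold on the square lattice is (minus) the number of contacts between hydrophobic monomers that are adjacent on the lattice but not consecutive in the sequence. The sketch below computes that energy for a given self-avoiding walk; the sequence and fold are toy examples, and the cysteine/SS-bridge term of the HPC model is omitted.

```python
def hp_energy(sequence: str, fold: list[tuple[int, int]]) -> int:
    """Energy = -(# of H-H lattice contacts between non-consecutive residues)."""
    assert len(sequence) == len(fold) and len(set(fold)) == len(fold)  # self-avoiding
    pos = {p: i for i, p in enumerate(fold)}
    contacts = 0
    for i, (x, y) in enumerate(fold):
        if sequence[i] != "H":
            continue
        for nb in ((x + 1, y), (x, y + 1)):            # count each lattice edge once
            j = pos.get(nb)
            if j is not None and sequence[j] == "H" and abs(i - j) > 1:
                contacts += 1
    return -contacts

# Toy example: a 2x2 square fold of the sequence HPPH.
print(hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)]))   # -1 (one H-H contact)
```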

  2. Generic Model Host System Design

    SciTech Connect

    Chu, Chungming; Wu, Juhao; Qiang, Ji; Shen, Guobao; /Brookhaven

    2012-06-22

    There are many simulation codes for accelerator modelling; each has its own strengths, but none covers everything. A platform that can host multiple modelling tools would be ideal for various purposes. The model platform, along with infrastructure support, can be used not only for online applications but also for offline purposes. A collaboration has been formed to provide such a platform. In order to achieve it, a common set of physics data structures has to be defined. An Application Programming Interface (API) for physics applications should also be defined within a model data provider. A preliminary platform design and prototype are discussed.
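
    The common-data-structure idea can be sketched as a small provider interface: every hosted modelling code exposes its results through the same structures, so client applications stay code-agnostic. The class names, fields, and methods below are hypothetical, not the collaboration's actual API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class ElementModel:
    """Common physics data structure shared by all hosted modelling codes (assumed fields)."""
    name: str
    s_position: float   # longitudinal position [m]
    beta_x: float       # optics functions at the element
    beta_y: float

class ModelProvider(ABC):
    """API each simulation code implements so applications need not know which code ran."""
    @abstractmethod
    def run(self, lattice_file: str) -> None: ...
    @abstractmethod
    def get_elements(self) -> list[ElementModel]: ...

class DummyProvider(ModelProvider):
    """Stand-in provider returning canned values, useful for testing client applications."""
    def run(self, lattice_file: str) -> None:
        self._elements = [ElementModel("Q1", 1.0, 12.3, 4.5), ElementModel("Q2", 2.5, 6.7, 11.0)]
    def get_elements(self) -> list[ElementModel]:
        return self._elements

provider: ModelProvider = DummyProvider()
provider.run("toy.lattice")
print([(e.name, e.beta_x) for e in provider.get_elements()])
```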

  3. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  4. Innovative and Community-Driven Communication Practices of the South Carolina Cancer Prevention and Control Research Network

    PubMed Central

    Brandt, Heather M.; Freedman, Darcy A.; Adams, Swann Arp; Young, Vicki M.; Ureda, John R.; McCracken, James Lyndon; Hébert, James R.

    2014-01-01

    The South Carolina Cancer Prevention and Control Research Network (SC-CPCRN) is 1 of 10 networks funded by the Centers for Disease Control and Prevention and the National Cancer Institute (NCI) that works to reduce cancer-related health disparities. In partnership with federally qualified health centers and community stakeholders, the SC-CPCRN uses evidence-based approaches (eg, NCI Research-tested Intervention Programs) to disseminate and implement cancer prevention and control messages, programs, and interventions. We describe the innovative stakeholder- and community-driven communication efforts conducted by the SC-CPCRN to improve overall health and reduce cancer-related health disparities among high-risk and disparate populations in South Carolina. We describe how our communication efforts are aligned with 5 core values recommended for dissemination and implementation science: 1) rigor and relevance, 2) efficiency and speed, 3) collaboration, 4) improved capacity, and 5) cumulative knowledge. PMID:25058673

  5. Innovative and community-driven communication practices of the South Carolina cancer prevention and control research network.

    PubMed

    Friedman, Daniela B; Brandt, Heather M; Freedman, Darcy A; Adams, Swann Arp; Young, Vicki M; Ureda, John R; McCracken, James Lyndon; Hébert, James R

    2014-07-24

    The South Carolina Cancer Prevention and Control Research Network (SC-CPCRN) is 1 of 10 networks funded by the Centers for Disease Control and Prevention and the National Cancer Institute (NCI) that works to reduce cancer-related health disparities. In partnership with federally qualified health centers and community stakeholders, the SC-CPCRN uses evidence-based approaches (eg, NCI Research-tested Intervention Programs) to disseminate and implement cancer prevention and control messages, programs, and interventions. We describe the innovative stakeholder- and community-driven communication efforts conducted by the SC-CPCRN to improve overall health and reduce cancer-related health disparities among high-risk and disparate populations in South Carolina. We describe how our communication efforts are aligned with 5 core values recommended for dissemination and implementation science: 1) rigor and relevance, 2) efficiency and speed, 3) collaboration, 4) improved capacity, and 5) cumulative knowledge.

  6. Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data

    NASA Astrophysics Data System (ADS)

    Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.

    2007-12-01

    Advances in ecology and environmental science increasingly depend on information from multiple disciplines to tackle broader and more complex questions about the natural world. Such advances, however, are hindered by data heterogeneity, which impedes the ability of researchers to discover, interpret, and integrate relevant data that have been collected by others. Here, we outline two community-building initiatives for improving data interoperability in the ecological and environmental sciences, one that is well established (the Ecological Metadata Language [EML]) and another that is actively underway (a unified model for observations and measurements). EML is a metadata specification developed for the ecology discipline, based on prior work done by the Ecological Society of America and associated efforts to ensure a modular and extensible framework for documenting ecological data. EML "modules" are designed to describe one logical part of the total metadata that should be included with any ecological dataset. EML was developed through a series of working meetings, ongoing discussion forums and email lists, with participation from a broad range of ecological and environmental scientists, as well as computer scientists and software developers. Where possible, EML adopted syntax from existing metadata standards in other disciplines (e.g., Dublin Core, the Content Standard for Digital Geospatial Metadata, and others). Although EML has not yet been ratified through a standards body, it has become the de facto metadata standard for a large range of ecological data management projects, including the Long Term Ecological Research Network, the National Center for Ecological Analysis and Synthesis, and the Ecological Society of America. The second community-building initiative is based on work through the Scientific Environment for Ecological Knowledge (SEEK) as well as a recent workshop on multi-disciplinary data management. This initiative aims at improving
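
    To give a flavour of what a modular metadata record looks like, the sketch below assembles a minimal dataset description as XML. The element names are chosen only to resemble EML's dataset/creator/coverage modules; they are illustrative assumptions, not the normative EML schema.

```python
import xml.etree.ElementTree as ET

# Minimal, illustrative metadata record (element names loosely follow EML-style modules).
dataset = ET.Element("dataset")
ET.SubElement(dataset, "title").text = "Stream temperature, Example Creek, 2006-2007"
creator = ET.SubElement(dataset, "creator")
ET.SubElement(creator, "individualName").text = "J. Researcher"
coverage = ET.SubElement(dataset, "coverage")
ET.SubElement(coverage, "temporalCoverage").text = "2006-01-01/2007-12-31"
ET.SubElement(coverage, "geographicDescription").text = "Example Creek, OR, USA"
attributes = ET.SubElement(dataset, "attributeList")
ET.SubElement(attributes, "attribute", name="water_temp", unit="celsius")

print(ET.tostring(dataset, encoding="unicode"))
```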

  7. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  8. Evaluation of community-driven smallholder irrigation in dryland South Pare Mountains, Tanzania: A case study of Manoo micro dam

    NASA Astrophysics Data System (ADS)

    Makurira, H.; Mul, M. L.; Vyagusa, N. F.; Uhlenbrook, S.; Savenije, H. H. G.

    Water is the main limiting factor for crop production in semi-arid sub-Saharan Africa. This paper presents an evaluation of the effectiveness of community-driven smallholder irrigation schemes using micro dams under current operational practices. The research site is the semi-arid Vudee sub-catchment within the Makanya Catchment, which is part of the Pangani River Basin (Northern Tanzania). A micro dam is presented as a case study. Micro dams are popular in the study area; they have water-sharing systems between upstream and downstream users that were put in place with minimal input from external agencies. The effectiveness of micro dams for dry-spell mitigation is investigated. The significance of dam size, total water diverted per season, system losses and the approximate amounts of water received by each farmer in a given season is analysed. Local smallholder farmers have put up the micro dams to address their need for extra water for agriculture. The capacities of the micro dams are very small; without them there would be insufficient water to allocate at least one irrigation event per farmer in a season, and the dams serve a useful purpose when operated as night storage reservoirs. The study found that the micro dam system, under current operational rules, is inefficient, as the high system losses call into question the wisdom of irrigating over scattered sites as opposed to one common irrigation plot near the dam site where each participant would be allocated a small piece of land to irrigate.
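
    The efficiency question above reduces to simple water accounting: how much of the water diverted into the dam in a season actually reaches farmers' plots after storage and conveyance losses. The volumes below are invented for illustration and are not the study's measurements.

```python
# Illustrative seasonal water balance for a micro dam (all volumes in m^3; made-up numbers).
diverted = 12_000.0        # total water diverted into the dam over the season
seepage_evap = 3_500.0     # storage losses (seepage + evaporation)
conveyance_loss = 4_000.0  # losses in earthen canals to scattered plots
n_farmers = 40

delivered = diverted - seepage_evap - conveyance_loss
efficiency = delivered / diverted        # fraction of diverted water reaching fields
per_farmer = delivered / n_farmers       # average allocation per farmer per season

print(f"scheme efficiency: {efficiency:.0%}, per-farmer delivery: {per_farmer:.0f} m^3")
```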

  9. High and equitable tuberculosis awareness coverage in the community-driven Axshya TB control project in India.

    PubMed

    Thapa, B; Chadha, S S; Das, A; Mohanty, S; Tonsing, J

    2015-03-21

    Data from surveys on knowledge, attitudes and practice (KAP) on tuberculosis (TB) conducted under the Axshya project at two time points (baseline 2010-2011 and mid-line 2012-2013) were analysed for changes in coverage and equity of TB awareness after project interventions. Overall coverage increased from 84% at baseline to 88% at midline (5% increase, P < 0.05). In comparison to baseline results, coverage at the midline survey had significantly increased, from 81% to 87% among the rural population, from 81% to 86% among women, from 73% to 85% in the ⩾55 years age group, from 71% to 80% among illiterates and from 73% to 81% in the south zone (P < 0.05). The equity gap among the different study groups (settlement, sex, age, education and zones) decreased from 6-23% at baseline to 3-11% during the midline survey. The maximum decline was observed for type of settlement (rural vs. urban), from 10% to 3% (P < 0.05). This community-driven TB control project has achieved high and equitable coverage of TB awareness, offering valuable lessons for the global community. PMID:26400604

  10. Course Design Using an Authentic Studio Model

    ERIC Educational Resources Information Center

    Wilson, Jay R.

    2013-01-01

    Educational Technology and Design 879 is a graduate course that introduces students to the basics of video design and production. In an attempt to improve the learning experience for students a redesign of the course was implemented for the summer of 2011 that incorporated an authentic design studio model. The design studio approach is based on…

  11. Improving Health with Science: Exploring Community-Driven Science Education in Kenya

    NASA Astrophysics Data System (ADS)

    Leak, Anne Emerson

    This study examines the role of place-based science education in fostering student-driven health interventions. While literature shows the need to connect science with students' place and community, there is limited understanding of strategies for doing so. Making such connections is important for underrepresented students who tend to perceive learning science in school as disconnected to their experiences out of school (Aikenhead, Calabrese-Barton, & Chinn, 2006). To better understand how students can learn to connect place and community with science and engineering practices in a village in Kenya, I worked with community leaders, teachers, and students to develop and study an education program (a school-based health club) with the goal of improving knowledge of health and sanitation in a Kenyan village. While students selected the health topics and problems they hoped to address through participating in the club, the topics were taught with a focus on providing opportunities for students to learn the practices of science and health applications of these practices. Students learned chemistry, physics, environmental science, and engineering to help them address the health problems they had identified in their community. Surveys, student artifacts, ethnographic field notes, and interview data from six months of field research were used to examine the following questions: (1) In what ways were learning opportunities planned for using science and engineering practices to improve community health? (2) In what ways did students apply science and engineering practices and knowledge learned from the health club in their school, homes, and community? and (3) What factors seemed to influence whether students applied or intended to apply what they learned in the health club? Drawing on place-based science education theory and community-engagement models of health, process and structural coding (Saldana, 2013) were used to determine patterns in students' applications of their

  12. Instructional Design Models: What a Revolution!

    ERIC Educational Resources Information Center

    Faryadi, Qais

    2007-01-01

    This review examines instructional design models and the construction of knowledge. It further seeks to identify the benefits of these models for the inputs and outputs of knowledge transfer. This assessment also attempts to define instructional design models through the eyes and the minds of renowned scholars as well as the most…

  13. Designing control system information models

    NASA Technical Reports Server (NTRS)

    Panin, K. I.; Zinchenko, V. P.

    1973-01-01

    Problems encountered in designing information models are discussed. The data cover the condition and functioning of the object of control and the environment involved in the control. Other parameters needed for the model include: (1) information for forming an image of the real situation, (2) data for analyzing and evaluating an evolving situation, (3) data for planning actions, and (4) data for observing and evaluating the results of model realization.

  14. Challenges in conducting community-driven research created by differing ways of talking and thinking about science: a researcher's perspective.

    PubMed

    Colquhoun, Amy; Geary, Janis; Goodman, Karen J

    2013-01-01

    Increasingly, health scientists are becoming aware that research collaborations that include community partnerships can be an effective way to broaden the scope and enhance the impact of research aimed at improving public health. Such collaborations extend the reach of academic scientists by integrating a variety of perspectives and thus strengthening the applicability of the research. Communication challenges can arise, however, when attempting to address specific research questions in these collaborations. In particular, inconsistencies can exist between scientists and community members in the use and interpretation of words and other language features, particularly when conducting research with a biomedical component. Additional challenges arise from differing perceptions of the investigative process. There may be divergent perceptions about how research questions should and can be answered, and in expectations about requirements of research institutions and research timelines. From these differences, misunderstandings can occur about how the results will ultimately impact the community. These communication issues are particularly challenging when scientists and community members are from different ethnic and linguistic backgrounds that may widen the gap between ways of talking and thinking about science, further complicating the interactions and exchanges that are essential for effective joint research efforts. Community-driven research that aims to describe the burden of disease associated with Helicobacter pylori infection is currently underway in northern Aboriginal communities located in the Yukon and Northwest Territories, Canada, with the goal of identifying effective public health strategies for reducing health risks from this infection. This research links community representatives, faculty from various disciplines at the University of Alberta, as well as territorial health care practitioners and officials. This highly collaborative work will be used to

  16. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  17. Computer Modeling and Visualization in Design Technology: An Instructional Model.

    ERIC Educational Resources Information Center

    Guidera, Stan

    2002-01-01

    Design visualization can increase awareness of issues related to perceptual and psychological aspects of design that computer-assisted design and computer modeling may not allow. A pilot university course developed core skills in modeling and simulation using visualization. Students were consistently able to meet course objectives. (Contains 16…

  18. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The main goal of research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated from statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology of the airplanes in those weight databases. If any new structural technology is to be pursued, or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since responses to changes in geometry, as well as the capability to optimize the shape itself, are essential in the conceptual design of airplanes, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code (ACSYNT) was delivered to NASA Ames.
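
    Physics-based weight estimation with shape sensitivities, as opposed to statistical regression, can be illustrated with an equivalent-plate-style toy model: a wing box idealized as a plate whose weight and a stress measure respond to shape variables, differentiated by finite differences. The sizing relations and numbers are invented; a real equivalent-plate model would use stiffness and load distributions, not these closed forms.

```python
# Toy "physics-based" wing-box model: weight and root stress as functions of shape variables.
RHO = 2800.0      # material density [kg/m^3]
LOAD = 50_000.0   # total lift carried by the box [N]

def box_weight(span_m: float, chord_m: float, skin_t_m: float) -> float:
    return RHO * 2.0 * span_m * chord_m * skin_t_m            # upper + lower skins only

def root_stress(span_m: float, chord_m: float, skin_t_m: float) -> float:
    bending_moment = LOAD * span_m / 4.0                      # crude load estimate
    section_modulus = chord_m * skin_t_m * 0.1                # invented proportionality
    return bending_moment / section_modulus

def fd_sensitivity(f, x, i, h=1e-6):
    """Central finite-difference derivative of f with respect to design variable x[i]."""
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (f(*xp) - f(*xm)) / (2.0 * h)

x = [16.0, 1.2, 0.004]                                        # span, chord, skin thickness
print("dW/dspan  =", fd_sensitivity(box_weight, x, 0))
print("dS/dthick =", fd_sensitivity(root_stress, x, 2))
```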

  19. Space Station Freedom natural environment design models

    NASA Technical Reports Server (NTRS)

    Suggs, Robert M.

    1993-01-01

    The Space Station Freedom program has established a series of natural environment models and databases for utilization in design and operations planning activities. The suite of models and databases that have either been selected from among internationally recognized standards or developed specifically for spacecraft design applications are presented. The models have been integrated with an orbit propagator and employed to compute environmental conditions for planned operations altitudes of Space Station Freedom.

  20. Building a Mobile HIV Prevention App for Men Who Have Sex With Men: An Iterative and Community-Driven Process

    PubMed Central

    McDougal, Sarah J; Sullivan, Patrick S; Stekler, Joanne D; Stephenson, Rob

    2015-01-01

    Background: Gay, bisexual, and other men who have sex with men (MSM) account for a disproportionate burden of new HIV infections in the United States. Mobile technology presents an opportunity for innovative interventions for HIV prevention. Some HIV prevention apps currently exist; however, it is challenging to encourage users to download these apps and use them regularly. An iterative research process that centers on the community’s needs and preferences may increase the uptake, adherence, and ultimate effectiveness of mobile apps for HIV prevention. Objective: The aim of this paper is to provide a case study to illustrate how an iterative community approach to a mobile HIV prevention app can lead to changes in app content to appropriately address the needs and the desires of the target community. Methods: In this three-phase study, we conducted focus group discussions (FGDs) with MSM and HIV testing counselors in Atlanta, Seattle, and US rural regions to learn preferences for building a mobile HIV prevention app. We used data from these groups to build a beta version of the app and theater tested it in additional FGDs. A thematic data analysis examined how this approach addressed preferences and concerns expressed by the participants. Results: There was a greater willingness to use the app during theater testing than during the first phase of FGDs. Many concerns that were identified in phase one (eg, disagreements about reminders for HIV testing, concerns about app privacy) were considered in building the beta version. Participants perceived these features as strengths during theater testing. However, some disagreements were still present, especially regarding the tone and language of the app. Conclusions: These findings highlight the benefits of using an interactive and community-driven process to collect data on app preferences when building a mobile HIV prevention app. Through this process, we learned how to be inclusive of the larger MSM population without

  1. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.
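
    The notion of holding design parameters in a central database, tracking margins, and letting teams treat risk tolerance like any other engineering parameter can be sketched as a small parameter store. The parameters, requirement values, and margin rule below are invented for illustration and do not represent JPL's tools.

```python
from dataclasses import dataclass

@dataclass
class DesignParameter:
    name: str
    requirement: float       # required not-to-exceed value
    current: float           # current design estimate
    reserve_fraction: float  # margin policy tied to the accepted risk level

    def margin(self) -> float:
        """Remaining margin after holding the risk-based reserve."""
        return self.requirement * (1.0 - self.reserve_fraction) - self.current

# A central "database" of crosscutting design parameters (values are made up).
database = [
    DesignParameter("spacecraft mass [kg]", 950.0, 870.0, 0.10),
    DesignParameter("propellant mass [kg]", 300.0, 250.0, 0.10),
]

for p in database:
    status = "OK" if p.margin() >= 0 else "VIOLATED"
    print(f"{p.name:24s} margin = {p.margin():7.1f}  {status}")
```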

  2. Minority Utility Rate Design Assessment Model

    SciTech Connect

    Poyer, David A.; Butler, John G.

    2003-01-20

    Econometric model simulates consumer demand response to various user-supplied, two-part tariff electricity rate designs and assesses their economic welfare impact on black, hispanic, poor and majority households.
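
    A two-part tariff and a simple demand response of the kind such an econometric model simulates can be illustrated with a constant-elasticity sketch. The tariff values, baseline consumption, and elasticity below are invented and are far simpler than the model's household-level econometrics and welfare calculations.

```python
# Illustrative two-part tariff: fixed monthly charge plus a per-kWh energy price.
FIXED_CHARGE = 12.0                 # $/month
OLD_PRICE, NEW_PRICE = 0.12, 0.15   # $/kWh before and after the rate redesign
ELASTICITY = -0.3                   # assumed constant own-price elasticity of demand

def demand(base_kwh: float, old_p: float, new_p: float, eps: float) -> float:
    """Constant-elasticity demand response to the change in marginal price."""
    return base_kwh * (new_p / old_p) ** eps

def monthly_bill(kwh: float, price: float) -> float:
    return FIXED_CHARGE + price * kwh

base_use = 800.0                    # kWh/month before the change
new_use = demand(base_use, OLD_PRICE, NEW_PRICE, ELASTICITY)
print(f"use: {base_use:.0f} -> {new_use:.0f} kWh, "
      f"bill: ${monthly_bill(base_use, OLD_PRICE):.2f} -> ${monthly_bill(new_use, NEW_PRICE):.2f}")
```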

  3. A Reflexive Model for Teaching Instructional Design.

    ERIC Educational Resources Information Center

    Shambaugh, Neal; Magliaro, Susan

    2001-01-01

    Documents a five-year study of two instructors who collaborated on formally studying their teaching of a master's level instructional design course. Outlines their views on learning, teaching, and instructional design (ID), describes the ID course, and explains the reflexive instructional model used, in which the teachers examined their teaching…

  4. A Model for Teaching Information Design

    ERIC Educational Resources Information Center

    Pettersson, Rune

    2011-01-01

    The author presents his views on the teaching of information design. The starting point includes some general aspects of teaching and learning. The multidisciplinary structure and content of information design as well as the combined practical and theoretical components influence studies of the discipline. Experiences from working with a model for…

  5. Modeling User Interactions with Instructional Design Software.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    As one of a series of studies being conducted to develop a useful (predictive) model of the instructional design process that is appropriate to military technical training settings, this study performed initial evaluations on two pieces of instructional design software developed by M. David Merrill and colleagues at Utah State University i.e.,…

  6. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983
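
    A crude version of the idea, pick the design-variable values at which two competing models disagree most, can be sketched for the retention example with exponential and power forgetting curves. The parameter values and the simple disagreement criterion are illustrative assumptions; the paper's method searches over designs using sampling-based utilities rather than this fixed-parameter proxy.

```python
import math
from itertools import combinations

# Two competing retention models (probability of recall after delay t; parameters assumed).
def exponential(t: float, a: float = 0.9, b: float = 0.12) -> float:
    return a * math.exp(-b * t)

def power(t: float, a: float = 0.9, b: float = 0.45) -> float:
    return a * (1.0 + t) ** (-b)

candidate_delays = range(1, 61)   # candidate retention intervals (e.g., seconds)

def discriminability(delays: tuple[int, ...]) -> float:
    """Proxy utility: total |prediction difference| over the chosen test delays."""
    return sum(abs(exponential(t) - power(t)) for t in delays)

# Choose the 3-point design (out of all candidates) that maximizes the proxy utility.
best = max(combinations(candidate_delays, 3), key=discriminability)
print(best, round(discriminability(best), 3))
```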

  7. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1993-01-01

    Over the past several years, it has been the primary goal of this grant to design and implement software to be used in the conceptual design of aerospace vehicles. The work carried out under this grant was performed jointly with members of the Vehicle Analysis Branch (VAB) of NASA LaRC, Computer Sciences Corp., and Vigyan Corp. This has resulted in the development of several packages and design studies. Primary among these are the interactive geometric modeling tool, the Solid Modeling Aerospace Research Tool (SMART), and the integration and execution tools provided by the Environment for Application Software Integration and Execution (EASIE). In addition, it is the purpose of the personnel of this grant to provide consultation in the areas of structural design, algorithm development, and software development and implementation, particularly in the areas of computer aided design, geometric surface representation, and parallel algorithms.

  8. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  9. The Ecosystem Model: Designing Campus Environments.

    ERIC Educational Resources Information Center

    Western Interstate Commission for Higher Education, Boulder, CO.

    This document stresses the increasing awareness in higher education of the impact student/environment transactions have upon the quality of educational life and details a model and design process for creating a better fit between educational environments and students. The ecosystem model uses an interdisciplinary approach for the make-up of its…

  10. Interior Design Research: A Human Ecosystem Model.

    ERIC Educational Resources Information Center

    Guerin, Denise A.

    1992-01-01

    The interior ecosystems model illustrates effects on the human organism of the interaction of the natural, behavioral, and built environment. Examples of interior lighting and household energy consumption show the model's flexibility for organizing study variables in interior design research. (SK)

  11. Instructional Design in Education: New Model

    ERIC Educational Resources Information Center

    Isman, Aytekin

    2011-01-01

    The main goal of the new instructional design model is to organize long term and full learning activities. The new model is based on the theoretical foundation of behaviorism, cognitivism and constructivism. During teaching and learning activities, learners are active and use cognitive, constructivist, or behaviorist learning to construct new…

  12. SR-7A aeroelastic model design report

    NASA Technical Reports Server (NTRS)

    Nagle, D.; Auyeung, S.; Turnberg, J.

    1986-01-01

    A scale model was designed to simulate the aeroelastic characteristics and performance of the 2.74 meter (9 ft.) diameter SR-7L blade. The procedures used in this model blade design are discussed. Included in this synopsis is background information concerning scaling parameters and an explanation of manufacturing limitations. A description of the final composite model blade, made of titanium, fiberglass, and graphite, is provided. Analytical methods for determining the blade stresses, natural frequencies and mode shapes, and stability are discussed at length.

  13. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  14. Propulsion System Models for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2014-01-01

    The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.

  15. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  16. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, resulting in a streamlining of the exchange of data between programs, reducing errors and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  17. Designing and encoding models for synthetic biology

    PubMed Central

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-01-01

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology ‘loop’. PMID:19364720

  18. Global optimization of bilinear engineering design models

    SciTech Connect

    Grossmann, I.; Quesada, I.

    1994-12-31

    Recently Quesada and Grossmann have proposed a global optimization algorithm for solving NLP problems involving linear fractional and bilinear terms. This model has been motivated by a number of applications in process design. The proposed method relies on the derivation of a convex NLP underestimator problem that is used within a spatial branch and bound search. This paper explores the use of alternative bounding approximations for constructing the underestimator problem. These are applied in the global optimization of problems arising in different engineering areas and for which different relaxations are proposed depending on the mathematical structure of the models. These relaxations include linear and nonlinear underestimator problems. Reformulations that generate additional estimator functions are also employed. Examples from process design, structural design, portfolio investment and layout design are presented.
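
    As background, the textbook convex relaxation of a single bilinear term w = xy over the box x ∈ [x^L, x^U], y ∈ [y^L, y^U] is given by the McCormick inequalities below; the underestimators explored in the paper are more refined, so this is only the basic building block such methods start from.

      w \ge x^L y + x y^L - x^L y^L
      w \ge x^U y + x y^U - x^U y^U
      w \le x^U y + x y^L - x^U y^L
      w \le x^L y + x y^U - x^L y^U

    Replacing each bilinear term by these four linear inequalities yields a convex underestimating problem whose optimum bounds the original (minimization) problem from below at each node of the spatial branch-and-bound search.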

  19. Panoramic imaging perimeter sensor design and modeling

    SciTech Connect

    Pritchard, D.A.

    1993-12-31

    This paper describes the conceptual design and preliminary performance modeling of a 360-degree imaging sensor. This sensor combines automatic perimeter intrusion detection with immediate visual assessment and is intended to be used for fast deployment around fixed or temporary high-value assets. The sensor requirements, compiled from various government agencies, are summarized. The conceptual design includes longwave infrared and visible linear array technology. An auxiliary millimeter-wave sensing technology is also considered for use during periods of infrared and visible obscuration. The infrared detectors proposed for the sensor design are similar to the Standard Advanced Dewar Assembly Types Three A and B (SADA-IIIA/B). An overview of the sensor and processor is highlighted. The infrared performance of this sensor design has been predicted using existing thermal imaging system models and is described in the paper. Future plans for developing a prototype are also presented.

  20. A Biologically Inspired Network Design Model

    PubMed Central

    Zhang, Xiaoge; Adamatzky, Andrew; Chan, Felix T.S.; Deng, Yong; Yang, Hai; Yang, Xin-She; Tsompanas, Michail-Antisthenis I.; Sirakoulis, Georgios Ch.; Mahadevan, Sankaran

    2015-01-01

    A network design problem is to select a subset of links in a transport network that satisfies passenger or cargo transportation demands while minimizing the overall cost of transportation. We propose a mathematical model of the foraging behaviour of the slime mould P. polycephalum to solve the network design problem and construct optimal transport networks. In our algorithm, the traffic flow between any two cities is estimated using a gravity model. The flow is imitated by the model of the slime mould. The algorithm converges to a steady state, which represents a solution of the problem. We validate our approach on examples of major transport networks in Mexico and China. By comparing networks developed in our approach with the man-made highways, networks developed by the slime mould, and a cellular automata model inspired by slime mould, we demonstrate the flexibility and efficiency of our approach. PMID:26041508
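
    A minimal sketch of the gravity-model step described above, in Python; the proportionality constant, the distance exponent, and the example inputs are illustrative assumptions, not the values used in the paper.

      import numpy as np

      def gravity_flows(populations, coords, k=1.0, gamma=2.0):
          """Estimate origin-destination traffic T_ij = k * P_i * P_j / d_ij**gamma."""
          pop = np.asarray(populations, dtype=float)
          xy = np.asarray(coords, dtype=float)
          d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)              # no self-flow
          return k * np.outer(pop, pop) / d ** gamma

      # Example: three cities with populations (millions) and planar coordinates (km)
      T = gravity_flows([1.2, 0.4, 2.5], [(0, 0), (100, 0), (250, 40)])
      print(np.round(T, 4))

    In the paper's algorithm the estimated flows are what the slime-mould model is then asked to route, so the gravity step only supplies the demand matrix.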

  1. Human visual performance model for crewstation design

    NASA Astrophysics Data System (ADS)

    Larimer, James O.; Prevost, Michael P.; Arditi, Aries R.; Azueta, Steven; Bergen, James R.; Lubin, Jeffrey

    1991-08-01

    In a cockpit, the crewstation of an airplane, the ability of the pilot to unambiguously perceive rapidly changing information both internal and external to the crewstation is critical. To assess the impact of crewstation design decisions on the pilot's ability to perceive information, the designer needs a means of evaluating the trade-offs that result from different designs. The Visibility Modeling Tool (VMT) provides the designer with a CAD tool for assessing these trade-offs. It combines the technologies of computer graphics, computational geometry, human performance modeling and equipment modeling into a computer-based interactive design tool. Through a simple interactive interface, a designer can manipulate design parameters such as the geometry of the cockpit, environmental factors such as ambient lighting, pilot parameters such as point of regard and adaptation state, and equipment parameters such as the location of displays, their size and the contrast of displayed symbology. VMT provides an end-to-end analysis that answers questions such as "Will the pilot be able to read the display?" Performance data can be projected, in the form of 3D contours, into the crewstation graphic model, providing the designer with a footprint of the operator's visual capabilities, defining, for example, the regions in which fonts of a particular type, size and contrast can be read without error. Geometrical data such as the pilot's volume field of view, occlusions caused by facial geometry, helmet margins, and objects in the crewstation can also be projected into the crewstation graphic model with respect to the coordinates of the aviator's eyes and fixation point. The intersections of the projections with objects in the crewstation delineate the area of coverage, masking, or occlusion associated with the objects. Objects in the crewstation space can be projected onto models of the operator's retinas. These projections can be used to provide the designer with the

  2. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  3. Preliminary shuttle structural dynamics modeling design study

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design and development of a structural dynamics model of the space shuttle are discussed. The model provides for early study of structural dynamics problems, permits evaluation of the accuracy of the structural and hydroelastic analysis methods used on test vehicles, and provides for efficiently evaluating potential cost savings in structural dynamic testing techniques. The discussion is developed around the modes in which major input forces and responses occur and the significant structural details in these modes.

  4. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  5. Community-driven learning activities, creating futures: 30,000 people can't be wrong - can they?

    PubMed

    Dowrick, Peter W

    2007-03-01

    A major vehicle for the practice of community psychology is the organization of community-based activities. My colleagues and I have developed many programs for community learning centers, in-school and after-school programs, and community technology centers. In the last 10 years, 30,000 people (mostly children) have participated in activities designed for enjoyment and learning, with a view to adding protective factors and reducing negative factors in at-risk communities. Development of these programs for literacy, education, and life and work skills has increasingly followed a community-responsive model. Within each program, we created explicit images of future success. That is, people could see themselves being successful where they normally fail: self-modeling with feedforward. Data reports show that individuals generalized and maintained their new skills and attitudes, but the sustainability of programs has been variable. Analysis of the variations indicates the importance of program-level feedforward that brings the future into the present. The discussion includes consideration of how individual-level and community-level practices can inform each other. PMID:17437187

  6. Jovian plasma modeling for mission design

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin

    2015-01-01

    The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1", T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approx. 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by approx. 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of the surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function--an important component of the plasma model necessary to compute the particle fluxes between approx. 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design). The modifications and
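
    For reference, the isotropic kappa distribution in its commonly used form is shown below; this is the generic expression, and the exact parameterization implemented in Nascap-2k may differ.

      f(v) = \frac{n}{(\pi \kappa \theta^{2})^{3/2}} \, \frac{\Gamma(\kappa + 1)}{\Gamma(\kappa - 1/2)} \left(1 + \frac{v^{2}}{\kappa \theta^{2}}\right)^{-(\kappa + 1)}

    where n is the number density, \theta a characteristic thermal speed, and \kappa sets the strength of the suprathermal tail; the Maxwellian is recovered in the limit \kappa \to \infty.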

  7. Jovian Plasma Modeling for Mission Design

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Kim, Wousik; Belland, Brent; Evans, Robin

    2015-01-01

    The purpose of this report is to address uncertainties in the plasma models at Jupiter responsible for surface charging and to update the jovian plasma models using the most recent data available. The updated plasma environment models were then used to evaluate two proposed Europa mission designs for spacecraft charging effects using the Nascap-2k code. The original Divine/Garrett jovian plasma model (or "DG1", T. N. Divine and H. B. Garrett, "Charged particle distributions in Jupiter's magnetosphere," J. Geophys. Res., vol. 88, pp. 6889-6903, 1983) has not been updated in 30 years, and there are known errors in the model. As an example, the cold ion plasma temperatures between approx. 5 and 10 Jupiter radii (Rj) were found by the experimenters who originally published the data to have been underestimated by approx. 2 shortly after publication of the original DG1 model. As knowledge of the plasma environment is critical to any evaluation of the surface charging at Jupiter, the original DG1 model needed to be updated to correct for this and other changes in our interpretation of the data so that charging levels could be properly estimated using the Nascap-2k charging code. As an additional task, the Nascap-2k spacecraft charging tool has been adapted to incorporate the so-called Kappa plasma distribution function--an important component of the plasma model necessary to compute the particle fluxes between approx. 5 keV and 100 keV (at the outset of this study, Nascap-2k did not directly incorporate this common representation of the plasma, thus limiting the accuracy of our charging estimates). The updating of the DG1 model and its integration into the Nascap-2k design tool means that charging concerns can now be more efficiently evaluated and mitigated. (We note that, given the subsequent decision by the Europa project to utilize solar arrays for its baseline design, surface charging effects have become even more of an issue for its mission design). The modifications and

  8. EUV Focus Sensor: Design and Modeling

    SciTech Connect

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3 NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so that the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths) and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  9. EUV focus sensor: design and modeling

    NASA Astrophysics Data System (ADS)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3 NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so that the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths) and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  10. Design and Development Research: A Model Validation Case

    ERIC Educational Resources Information Center

    Tracey, Monica W.

    2009-01-01

    This is a report of one case of a design and development research study that aimed to validate an overlay instructional design model incorporating the theory of multiple intelligences into instructional systems design. After design and expert review model validation, The Multiple Intelligence (MI) Design Model, used with an Instructional Systems…

  11. Modelling, analyses and design of switching converters

    NASA Technical Reports Server (NTRS)

    Cuk, S. M.; Middlebrook, R. D.

    1978-01-01

    A state-space averaging method for modelling switching dc-to-dc converters for both continuous and discontinuous conduction mode is developed. In each case the starting point is the unified state-space representation, and the end result is a complete linear circuit model, for each conduction mode, which correctly represents all essential features, namely, the input, output, and transfer properties (static dc as well as dynamic ac small-signal). While the method is generally applicable to any switching converter, it is extensively illustrated for the three common power stages (buck, boost, and buck-boost). The results for these converters are then easily tabulated owing to the fixed equivalent circuit topology of their canonical circuit model. The insights that emerge from the general state-space modelling approach lead to the design of new converter topologies through the study of generic properties of the cascade connection of basic buck and boost converters.
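
    The averaging step at the heart of the method can be written compactly as below, where d is the duty ratio and subscripts 1 and 2 denote the two switched network topologies within a cycle (continuous conduction mode); this is the standard textbook statement of state-space averaging rather than a reproduction of the paper's derivation.

      \dot{x} = \bigl[\, d A_1 + (1-d) A_2 \,\bigr] x + \bigl[\, d B_1 + (1-d) B_2 \,\bigr] u, \qquad y = \bigl[\, d C_1 + (1-d) C_2 \,\bigr] x

    Setting the derivatives to zero with d = D gives the static dc relations, while perturbing d = D + \hat{d}, x = X + \hat{x} and dropping products of perturbations gives the dynamic ac small-signal model that the canonical circuit model represents.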

  12. Cryogenic Wind Tunnel Models. Design and Fabrication

    NASA Technical Reports Server (NTRS)

    Young, C. P., Jr. (Compiler); Gloss, B. B. (Compiler)

    1983-01-01

    The principal motivating factor was the National Transonic Facility (NTF). Since the NTF can achieve significantly higher Reynolds numbers at transonic speeds than other wind tunnels in the world, and will therefore occupy a unique position among ground test facilities, every effort is being made to ensure that model design and fabrication technology exists to allow researchers to take advantage of this high Reynolds number capability. Since a great deal of experience in designing and fabricating cryogenic wind tunnel models does not exist, and since the experience that does exist is scattered over a number of organizations, there is a need to bring existing experience in these areas together and share it among all interested parties. Representatives from government, the airframe industry, and universities are included.

  13. Human visual performance model for crewstation design

    NASA Technical Reports Server (NTRS)

    Larimer, James; Prevost, Michael; Arditi, Aries; Azueta, Steven; Bergen, James; Lubin, Jeffrey

    1991-01-01

    An account is given of a Visibility Modeling Tool (VMT) which furnishes a crew-station designer with the means to assess configurational tradeoffs, with a view to the impact of various options on the unambiguous access of information to the pilot. The interactive interface of the VMT allows the manipulation of cockpit geometry, ambient lighting, pilot ergonomics, and the displayed symbology. Performance data can be displayed in the form of 3D contours into the crewstation graphic model, thereby yielding an indication of the operator's visual capabilities.

  14. Rainwater harvesting: model-based design evaluation.

    PubMed

    Ward, S; Memon, F A; Butler, D

    2010-01-01

    The rate of uptake of rainwater harvesting (RWH) in the UK has been slow to date, but is expected to gain momentum in the near future. The designs of two different new-build rainwater harvesting systems, based on simple methods, are evaluated using three different design methods, including a continuous simulation modelling approach. The RWH systems are shown to fulfill 36% and 46% of WC demand. Financial analyses reveal that RWH systems within large commercial buildings may be more financially viable than smaller domestic systems. It is identified that design methods based on simple approaches generate tank sizes substantially larger than the continuous simulation. Comparison of the actual tank sizes and those calculated using continuous simulation established that the tanks installed are oversized for their associated demand level and catchment size. Oversizing tanks can lead to excessive system capital costs, which currently hinders the uptake of systems. Furthermore, it is demonstrated that the catchment area size is often overlooked when designing UK-based RWH systems. With respect to these findings, a recommendation for a transition from the use of simple tools to continuous simulation models is made.
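
    A minimal sketch of a continuous-simulation sizing check of the kind argued for here, using a daily yield-after-spillage (YAS) operating rule in Python; the runoff coefficient, demand figure, and rainfall series are illustrative assumptions, and the paper's simulation may use a different rule or time step.

      def yas_fraction_met(rain_mm, roof_area_m2, demand_m3, tank_m3, runoff_coeff=0.85):
          """Daily yield-after-spillage water balance for a rainwater tank.
          Returns the fraction of WC demand met over the simulated series."""
          store = supplied = demanded = 0.0
          for rain in rain_mm:
              inflow = runoff_coeff * roof_area_m2 * rain / 1000.0  # m^3 captured today
              yield_today = min(demand_m3, store)                   # draw before refill
              store = min(store - yield_today + inflow, tank_m3)    # refill, then spill
              supplied += yield_today
              demanded += demand_m3
          return supplied / demanded

      # Example: 100 m^2 roof, 0.12 m^3/day WC demand, 2 m^3 tank, sparse rainfall
      rain = [2.0 if day % 3 == 0 else 0.0 for day in range(365)]
      print(f"WC demand met: {yas_fraction_met(rain, 100, 0.12, 2.0):.0%}")

    Repeating the simulation over a range of tank volumes shows where additional storage stops buying extra yield, which is precisely the oversizing effect the comparison above identifies.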

  15. Thermal Transport Model for Heat Sink Design

    NASA Technical Reports Server (NTRS)

    Chervenak, James A.; Kelley, Richard L.; Brown, Ari D.; Smith, Stephen J.; Kilbourne, Caroline a.

    2009-01-01

    A document discusses the development of a finite element model for describing thermal transport through microcalorimeter arrays in order to assist in heat-sinking design. A multi-absorber transition edge sensor (PoST) was designed and fabricated in order to reduce device wiring density by a factor of four. The finite element model consists of breaking the microcalorimeter array into separate elements, including the transition edge sensor (TES) and the silicon substrate on which the sensor is deposited. Each element is then broken up into subelements, whose surface area subtends 10 × 10 microns. The heat capacity per unit temperature, thermal conductance, and thermal diffusivity of each subelement are the model inputs, as are the temperatures of each subelement. Numerical integration of the thermal diffusion equation using the Forward-in-Time, Centered-in-Space (FTCS) algorithm is then performed in order to obtain the temporal evolution of the subelement temperature. Thermal transport across interfaces is modeled using a thermal boundary resistance obtained using the acoustic mismatch model. The document concludes with a discussion of the PoST fabrication. PoSTs are novel because they enable incident x-ray position sensitivity with good energy resolution and low wiring density.
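
    A minimal 1-D illustration of the explicit FTCS update referred to above, in Python; the real model is two-dimensional over the 10 × 10 micron subelements and includes interface boundary resistances, which this sketch omits, and the material numbers used here are placeholders.

      import numpy as np

      def ftcs_1d(T, alpha, dx, dt, steps):
          """Explicit Forward-Time Centered-Space update for dT/dt = alpha * d2T/dx2.
          Stable only for dt <= dx**2 / (2 * alpha); end points held fixed (heat sink)."""
          r = alpha * dt / dx ** 2
          assert r <= 0.5, "FTCS stability limit violated"
          T = np.asarray(T, dtype=float).copy()
          for _ in range(steps):
              T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
          return T

      # Example: 100 subelements at 10-micron pitch, a hot spot relaxing toward 0.1 K rails
      T0 = np.full(100, 0.1); T0[50] = 0.2
      print(ftcs_1d(T0, alpha=1e-4, dx=10e-6, dt=2e-10, steps=1000)[45:56])

    The assert encodes the usual FTCS stability limit dt <= dx^2 / (2 alpha), which in practice forces very small time steps on such a fine spatial grid.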

  16. Understanding backward design to strengthen curricular models.

    PubMed

    Emory, Jan

    2014-01-01

    Nurse educators have responded to the call for transformation in education. Challenges remain in planning curricular implementation to facilitate understanding of essential content for student success on licensure examinations and in professional practice. The conceptual framework Backward Design (BD) can support and guide curriculum decisions. Using BD principles in conjunction with educational models can strengthen and improve curricula. This article defines and describes the BD process, and identifies reported benefits for nursing education. PMID:24743175

  17. Design and modelling of a SMES coil

    NASA Astrophysics Data System (ADS)

    Yuan, Weijia; Campbell, A. M.; Coombs, T. A.

    2010-06-01

    The design of a Superconducting Magnetic Energy Storage (SMES) coil wound with coated conductors is presented. Based on an existing model for coated conductor pancake coils, this paper analysed the magnetic field and current density distribution of the coil at two different operating temperatures, 77 K and 22 K. A comparison table of the critical currents and AC losses at these two temperatures is presented. Several steps to improve the transport current of the coil are suggested as well.
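
    For context, the stored energy of any SMES coil follows the inductor relation below; the numerical example is generic and not the coil designed in this work.

      E = \tfrac{1}{2} L I^{2}

    so, for instance, a 1 H coil carrying 200 A stores 20 kJ, and because the critical current of coated conductors rises sharply on cooling from 77 K to 22 K, operating at the lower temperature lets the same winding store substantially more energy.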

  18. Computerized design of controllers using data models

    NASA Technical Reports Server (NTRS)

    Irwin, Dennis; Mitchell, Jerrel; Medina, Enrique; Allwine, Dan; Frazier, Garth; Duncan, Mark

    1995-01-01

    The major contributions of the grant effort have been the enhancement of the Compensator Improvement Program (CIP), which resulted in the Ohio University CIP (OUCIP) package, and the development of the Model and Data-Oriented Computer Aided Design System (MADCADS). Incorporation of direct z-domain designs into CIP was tested and determined to be numerically ill-conditioned for the type of lightly damped problems for which the development was intended. Therefore, it was decided to pursue the development of z-plane designs in the w-plane, and to make this conversion transparent to the user. The analytical development needed for this feature, as well as that needed for including compensator damping ratios and DC gain specifications, closed loop stability requirements, and closed loop disturbance rejection specifications into OUCIP are all contained in Section 3. OUCIP was successfully tested with several example systems to verify proper operation of existing and new features. The extension of the CIP philosophy and algorithmic approach to handle modern multivariable controller design criteria was implemented and tested. Several new algorithms for implementing the search approach to modern multivariable control system design were developed and tested. This analytical development, most of which was incorporated into the MADCADS software package, is described in Section 4, which also includes results of the application of MADCADS to the MSFC ACES facility and the Hubble Space Telescope.

  19. Models of Design: Envisioning a Future Design Education

    ERIC Educational Resources Information Center

    Friedman, Ken

    2012-01-01

    This article offers a large-scale view of how design fits in the world economy today, and the role of design education in preparing designers for their economic and professional role. The current context of design involves broad-based historical changes including a major redistribution of geopolitical and industrial power from the West to the…

  20. Model Reduction for Control System Design

    NASA Technical Reports Server (NTRS)

    Enns, D. F.

    1985-01-01

    An approach and a technique for effectively obtaining reduced order mathematical models of a given large order model for the purposes of synthesis, analysis and implementation of control systems is developed. This approach involves the use of an error criterion which is the H-infinity norm of a frequency weighted error between the full and reduced order models. The weightings are chosen to take into account the purpose for which the reduced order model is intended. A previously unknown error bound in the H-infinity norm for reduced order models obtained from internally balanced realizations was obtained. This motivated further development of the balancing technique to include the frequency dependent weightings. This resulted in the frequency weighted balanced realization and a new model reduction technique. Two approaches to designing reduced order controllers were developed. The first involves reducing the order of a high order controller with an appropriate weighting. The second involves linear quadratic Gaussian synthesis based on a reduced order model obtained with an appropriate weighting.
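
    A compact sketch of (unweighted) square-root balanced truncation in Python for a stable, minimal system; the report's contribution is the frequency-weighted extension, which this sketch deliberately omits, and the example system is random rather than taken from the report.

      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

      def balanced_truncation(A, B, C, r):
          """Reduce (A, B, C) to order r by square-root balanced truncation.
          Assumes A stable and the realization minimal (Gramians positive definite)."""
          P = solve_continuous_lyapunov(A, -B @ B.T)        # controllability Gramian
          Q = solve_continuous_lyapunov(A.T, -C.T @ C)      # observability Gramian
          Lp = cholesky(P, lower=True)
          Lq = cholesky(Q, lower=True)
          U, s, Vt = svd(Lq.T @ Lp)                         # s = Hankel singular values
          S = np.diag(s[:r] ** -0.5)
          T = Lp @ Vt[:r].T @ S                             # right projection
          Ti = S @ U[:, :r].T @ Lq.T                        # left projection (Ti @ T = I)
          return Ti @ A @ T, Ti @ B, C @ T, s

      # Example: reduce a random stable 6-state model to 2 states
      rng = np.random.default_rng(1)
      A = rng.standard_normal((6, 6))
      A -= (np.max(np.linalg.eigvals(A).real) + 1.0) * np.eye(6)   # shift to make A stable
      B, C = rng.standard_normal((6, 1)), rng.standard_normal((1, 6))
      Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
      print("Hankel singular values:", np.round(hsv, 4))

    The neglected Hankel singular values give the familiar twice-their-sum bound on the H-infinity error of the unweighted reduction (the kind of bound discussed in the abstract); the frequency-weighted variant replaces the Gramians with weighted ones.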

  1. Model-Based Design of Biochemical Microreactors

    PubMed Central

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M.; Voll, Lars M.; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P
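
    In generic form, the kind of model described here couples diffusion in each compartment to flux boundary conditions; the equations below are a schematic statement of that structure, not the paper's exact system.

      \partial_t c_i = D_i \, \Delta c_i \quad \text{in each compartment}, \qquad
      -D_i \, \nabla c_i \cdot \mathbf{n} = g_i^{\mathrm{mem}}(c) \ \text{on membranes}, \qquad
      -D_i \, \nabla c_i \cdot \mathbf{n} = g_i^{\mathrm{bead}}(c) \ \text{on bead surfaces}

    where the c_i are metabolite concentrations, the D_i their diffusion coefficients, and the boundary terms g_i^mem and g_i^bead stand for transporter kinetics across the membranes and enzymatic reaction rates at the nano-bead surfaces, respectively, together with prescribed initial concentrations at t = 0.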

  2. Model-Based Design of Biochemical Microreactors.

    PubMed

    Elbinger, Tobias; Gahn, Markus; Neuss-Radu, Maria; Hante, Falk M; Voll, Lars M; Leugering, Günter; Knabner, Peter

    2016-01-01

    Mathematical modeling of biochemical pathways is an important resource in Synthetic Biology, as the predictive power of simulating synthetic pathways represents an important step in the design of synthetic metabolons. In this paper, we are concerned with the mathematical modeling, simulation, and optimization of metabolic processes in biochemical microreactors able to carry out enzymatic reactions and to exchange metabolites with their surrounding medium. The results of the reported modeling approach are incorporated in the design of the first microreactor prototypes that are under construction. These microreactors consist of compartments separated by membranes carrying specific transporters for the input of substrates and export of products. Inside the compartments of the reactor multienzyme complexes assembled on nano-beads by peptide adapters are used to carry out metabolic reactions. The spatially resolved mathematical model describing the ongoing processes consists of a system of diffusion equations together with boundary and initial conditions. The boundary conditions model the exchange of metabolites with the neighboring compartments and the reactions at the surface of the nano-beads carrying the multienzyme complexes. Efficient and accurate approaches for numerical simulation of the mathematical model and for optimal design of the microreactor are developed. As a proof-of-concept scenario, a synthetic pathway for the conversion of sucrose to glucose-6-phosphate (G6P) was chosen. In this context, the mathematical model is employed to compute the spatio-temporal distributions of the metabolite concentrations, as well as application relevant quantities like the outflow rate of G6P. These computations are performed for different scenarios, where the number of beads as well as their loading capacity are varied. The computed metabolite distributions show spatial patterns, which differ for different experimental arrangements. Furthermore, the total output of G6P

  3. Design of the UCLA general circulation model

    NASA Technical Reports Server (NTRS)

    Arakawa, A.

    1972-01-01

    An edited version of notes distributed at the Summer Workshop on the UCLA General Circulation Model in June 1971 is reported. It presents the computational schemes of the UCLA model, along with the mathematical and physical principles on which these schemes are based. Included are the finite difference schemes for the governing fluid-dynamical equations, designed to maintain the important integral constraints and dispersion characteristics of the motion. Also given are the principles of parameterization of cumulus convection by an ensemble of identical clouds. A model of the ground hydrology, involving the liquid, ice and snow states of water, is included. A short summary is given of the scheme for computing solar and infrared radiation transfers through clear and cloudy air.

  4. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  5. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  6. Making designer mutants in model organisms.

    PubMed

    Peng, Ying; Clark, Karl J; Campbell, Jarryd M; Panetta, Magdalena R; Guo, Yi; Ekker, Stephen C

    2014-11-01

    Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. In this Review, we summarize the common approaches and applications of these still-evolving tools as they are being used in the most popular model developmental systems. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.

  7. Terahertz metamaterials: design, implementation, modeling and applications

    NASA Astrophysics Data System (ADS)

    Hokmabadi, Mohammad P.; Balci, Soner; Kim, Juhyung; Philip, Elizabath; Rivera, Elmer; Zhu, Muliang; Kung, Patrick; Kim, Seongsin M.

    2016-04-01

    Sub-wavelength metamaterial structures are of great fundamental and practical interest because of their ability to manipulate the propagation of electromagnetic waves. We review here our recent work on the design, simulation, implementation and equivalent circuit modeling of metamaterial devices operating at Terahertz frequencies. THz metamaterials exhibiting plasmon-induced transparency are realized through the hybridization of double split ring resonators on either silicon or flexible polymer substrates and exhibiting slow light properties. THz metamaterials perfect absorbers and stereometamaterials are realized with multifunctional specifications such as broadband absorbing, switching, and incident light polarization selectivity.

  8. Generalized mathematical models in design optimization

    NASA Technical Reports Server (NTRS)

    Papalambros, Panos Y.; Rao, J. R. Jagannatha

    1989-01-01

    The theory of optimality conditions of extremal problems can be extended to problems continuously deformed by an input vector. The connection between the sensitivity, well-posedness, stability and approximation of optimization problems is steadily emerging. The authors believe that the important realization here is that the underlying basis of all such work is still the study of point-to-set maps and of small perturbations, yet what has been identified previously as being just related to solution procedures is now being extended to study modeling itself in its own right. Many important studies related to the theoretical issues of parametric programming and large deformation in nonlinear programming have been reported in the last few years, and the challenge now seems to be in devising effective computational tools for solving these generalized design optimization models.

  9. Model to Design Drip Hose Lateral Line

    NASA Astrophysics Data System (ADS)

    Ludwig, Rafael; Cury Saad, João Carlos

    2014-05-01

    Introduction: The design criterion for non-pressure-compensating drip hose is normally to allow 10% flow variation (Δq) along the lateral line, corresponding to 20% head pressure variation (ΔH). Longer lateral lines in drip irrigation systems using conventional drippers reduce cost, but the uniformity of irrigation must be maintained [1]. Using higher Δq levels can allow longer lateral lines. [4] proposes the use of a 30% Δq and found that this value resulted in distribution uniformity over 80%. [1] considered it possible to extend the lateral line length by using two emitter spacings in different sections. He assumed that the spacing change point would be at 40% of the total length, because this is approximately the location of the average flow according to [2]. [3] found that, for practical purposes, the average pressure is located at 40% of the length of the lateral line and that up to this point 75% of the total pressure head loss (hf) has already been consumed. The challenge for designers is therefore to obtain longer lateral lines with high values of uniformity.
    Objective: The objective of this study was to develop a model to design longer lateral lines using non-pressure-compensating drip hose. Using the developed model, the hypotheses to be evaluated were: a) the use of two different emitter spacings in the same lateral line allows a longer length; b) it is possible to obtain longer lateral lines using high values of pressure variation in the lateral line provided the distribution uniformity stays within allowable limits.
    Methodology: A computer program was developed in Delphi® based on the developed model; it is able to design level lateral lines using non-pressure-compensating drip hose. The input data are: desired distribution uniformity (DU); initial and final pressure in the lateral line; coefficients of the relationship between emitter discharge and pressure head; hose internal diameter; pipe cross-sectional area
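
    A minimal sketch of the kind of back-step hydraulic calculation such a design model performs, in Python; the Hazen-Williams friction formula, the emitter discharge relation q = k H^x, and all numerical inputs below are illustrative assumptions, and the actual program additionally handles the two-spacing design and the DU target described above.

      def lateral_line_profile(n_emitters, spacing_m, d_mm, k, x, h_end_m, c_hw=140.0):
          """Back-step computation along a level lateral, from the closed distal end.
          Emitter discharge q = k * H**x (L/h); friction from Hazen-Williams (SI units)."""
          d_m = d_mm / 1000.0
          H = h_end_m                        # pressure head at the last emitter (m)
          Q = 0.0                            # cumulative flow toward the inlet (L/h)
          flows = []
          for _ in range(n_emitters):
              q = k * H ** x                 # discharge of this emitter
              flows.append(q)
              Q += q
              q_m3s = Q / 1000.0 / 3600.0    # L/h -> m^3/s
              hf = 10.67 * spacing_m * (q_m3s / c_hw) ** 1.852 / d_m ** 4.871
              H += hf                        # head grows toward the inlet
          dq_pct = (max(flows) - min(flows)) / max(flows) * 100.0
          return H, dq_pct                   # required inlet head (m), flow variation (%)

      # Example: 150 emitters, 0.3 m spacing, 16 mm hose, q = 1.0 * H**0.5, 8 m distal head
      print(lateral_line_profile(150, 0.3, 16.0, 1.0, 0.5, 8.0))

    Sweeping the emitter count (or switching to a second spacing partway along the line) then shows how long the lateral can be made before Δq exceeds the chosen limit.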

  10. Modeling and control design of a wind tunnel model support

    NASA Technical Reports Server (NTRS)

    Howe, David A.

    1990-01-01

    The 12-Foot Pressure Wind Tunnel at Ames Research Center is being restored. A major part of the restoration is the complete redesign of the aircraft model supports and their associated control systems. An accurate trajectory control servo system capable of positioning a model (with no measurable overshoot) is needed. Extremely small errors in scaled-model pitch angle can increase airline fuel costs for the final aircraft configuration by millions of dollars. In order to make a mechanism sufficiently accurate in pitch, a detailed structural and control-system model must be created and then simulated on a digital computer. The model must contain linear representations of the mechanical system, including masses, springs, and damping in order to determine system modes. Electrical components, both analog and digital, linear and nonlinear must also be simulated. The model of the entire closed-loop system must then be tuned to control the modes of the flexible model-support structure. The development of a system model, the control modal analysis, and the control-system design are discussed.
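
    A toy version of the first modeling step described above (lumped masses, springs, and damping assembled into a state-space model whose eigenvalues give the structural modes), written in Python; the two-mass values are placeholders, not the 12-Foot Pressure Wind Tunnel hardware.

      import numpy as np

      # Drive inertia coupled to the model through a flexible support strut (illustrative values).
      m = np.diag([50.0, 8.0])                               # kg
      k = np.array([[4.0e5, -4.0e5], [-4.0e5, 4.0e5]])       # N/m
      c = 2e-4 * k                                           # light proportional damping

      # First-order form: state = [positions; velocities]
      A = np.block([[np.zeros((2, 2)), np.eye(2)],
                    [-np.linalg.solve(m, k), -np.linalg.solve(m, c)]])
      eig = np.linalg.eigvals(A)
      modes_hz = np.sort(eig.imag[eig.imag > 0]) / (2 * np.pi)
      print("Structural mode frequencies (Hz):", np.round(modes_hz, 2))

    Once the flexible modes are known in such a model, the position-loop gains can be tuned against it so that the closed-loop pitch response settles without overshoot or excitation of those modes.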

  11. Making designer mutants in model organisms

    PubMed Central

    Peng, Ying; Clark, Karl J.; Campbell, Jarryd M.; Panetta, Magdalena R.; Guo, Yi; Ekker, Stephen C.

    2014-01-01

    Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. In this Review, we summarize the common approaches and applications of these still-evolving tools as they are being used in the most popular model developmental systems. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms. PMID:25336735

  12. Xylose fermentation: Analysis, modelling, and design

    SciTech Connect

    Slininger, P.J.W.

    1988-01-01

    Ethanolic fermentation is a means of utilizing xylose-rich industrial wastes, but an optimized bioprocess is lacking. Pachysolen tannophilus NRRL Y-7124 was the first yeast discovered capable of significant ethanol production from xylose and has served as a model for studies of other yeasts mediating this conversion. However, a comparative evaluation of strains led the authors to focus on Pichia stipitis NRRL Y-7124 as the yeast with highest potential for application. Given 150 g/l xylose in complex medium, strain Y-7124 functioned optimally at 25-26 C and pH 4-7 to accumulate 56 g/l ethanol with negligible xylitol production. Dissolved oxygen concentration was critical to cell growth, and in order to measure it accurately, a colorimetric assay was developed to allow calibration of electrodes based on oxygen solubility in media of varying composition. Specific growth rate was a Monod function of limiting substrate concentration (oxygen and/or xylose). Both specific ethanol productivity and oxygen uptake rate were growth-associated, but only the former was maintenance-associated. Both growth and fermentation were inhibited by high xylose and ethanol concentrations. Carbon and cofactor balances supported modelling xylose metabolism as a combination of four processes: assimilation, pentose phosphate oxidation, respiration, and ethanolic fermentation. A mathematical model describing the stoichiometry and kinetics was constructed, and its predictive capacity was confirmed by comparing simulated and experimental batch cultures. Consideration of example processes indicated that this model constitutes an important tool for designing the optimum bioprocess for utilizing xylose-rich wastes.
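
    The kinetic forms named here (Monod growth, growth-associated oxygen uptake, and growth- plus maintenance-associated ethanol production) are conventionally written as below; the symbols are generic, and the thesis model additionally includes the xylose and ethanol inhibition terms mentioned in the abstract.

      \mu = \mu_{\max} \, \frac{S}{K_S + S}, \qquad q_{O_2} = Y_{O_2/x}\, \mu, \qquad q_p = \alpha\, \mu + \beta

    where S is the limiting substrate concentration (xylose and/or dissolved oxygen), q_p the specific ethanol productivity with growth-associated coefficient \alpha and maintenance coefficient \beta, and q_{O_2} the purely growth-associated specific oxygen uptake rate.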

  13. Micromachined accelerometer design, modeling and validation

    SciTech Connect

    Davies, B.R.; Bateman, V.I.; Brown, F.A.; Montague, S.; Murray, J.R.; Rey, D.; Smith, J.H.

    1998-04-01

    Micromachining technologies enable the development of low-cost devices capable of sensing motion in a reliable and accurate manner. The development of various surface micromachined accelerometers and gyroscopes to sense motion is an ongoing activity at Sandia National Laboratories. In addition, Sandia has developed a fabrication process for integrating both the micromechanical structures and microelectronics circuitry of Micro-Electro-Mechanical Systems (MEMS) on the same chip. This integrated surface micromachining process provides substantial performance and reliability advantages in the development of MEMS accelerometers and gyros. A Sandia MEMS team developed a single-axis, micromachined silicon accelerometer capable of surviving and measuring very high accelerations, up to 50,000 times the acceleration due to gravity or 50 k-G (actually measured to 46,000 G). The Sandia integrated surface micromachining process was selected for fabrication of the sensor due to the extreme measurement sensitivity potential associated with integrated microelectronics. Measurement electronics capable of measuring attofarad (10^-18 farad) changes in capacitance were required due to the very small accelerometer proof mass (< 200 × 10^-9 gram) used in this surface micromachining process. The small proof mass corresponded to small sensor deflections, which in turn required very sensitive electronics to enable accurate acceleration measurement over a range of 1 to 50 k-G. A prototype sensor, based on a suspended plate mass configuration, was developed and the details of the design, modeling, and validation of the device will be presented in this paper. The device was analyzed using both conventional lumped parameter modeling techniques and finite element analysis tools. The device was tested and performed well over its design range.
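
    To see why attofarad-level readout matters, the quasi-static relations for a generic capacitive proof-mass sensor are shown below; they are textbook parallel-plate approximations, not the Sandia device's actual geometry or values.

      x = \frac{m\,a}{k_{\mathrm{spring}}}, \qquad \Delta C \approx \varepsilon_0 A \left( \frac{1}{g - x} - \frac{1}{g} \right) \approx \frac{\varepsilon_0 A}{g^{2}}\, x

    With a proof mass below 200 nanograms the deflection x stays extremely small even at tens of thousands of G, so the capacitance change at the low end of the 1 to 50 k-G range is a vanishingly small fraction of an already tiny full-scale signal, hence the attofarad-class electronics.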

  14. The SAFE project: community-driven partnerships in health, mental health, and education to prevent early school failure.

    PubMed

    Poole, D L

    1997-11-01

    This article presents a case study of an innovative school-based health and mental health project that prevents early school failure in one county in Oklahoma. Success is attributed to social work development of broad-based partnerships involving families, schools, communities, and public policy officials. Citizen-driven, these partnerships have meshed previously fixed institutional boundaries in health, mental health, and education to prevent early school failure. The article describes school-family partnerships that form the core of the project's service intervention model. Statistics on service activities and outcomes are presented, along with a discussion of lessons learned for implementation of the project.

  15. A Design Model: The Autism Spectrum Disorder Classroom Design Kit

    ERIC Educational Resources Information Center

    McAllister, Keith; Maguire, Barry

    2012-01-01

    Architects and designers have a responsibility to provide an inclusive built environment. However, for those with a diagnosis of autism spectrum disorder (ASD), the built environment can be a frightening and confusing place, difficult to negotiate and tolerate. The challenge of integrating more fully into society is denied by an alienating built…

  16. Advancing Long Tail Data Capture and Access Through Trusted, Community-Driven Data Services at the IEDA Data Facility

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Carbotte, S. M.; Ferrini, V.; Hsu, L.; Arko, R. A.; Walker, J. D.; O'hara, S. H.

    2012-12-01

    Substantial volumes of data in the Earth Sciences are collected in small- to medium-size projects by individual investigators or small research teams, known as the 'Long Tail' of science. Traditionally, these data have largely stayed 'in the dark', i.e. they have not been properly archived, and have therefore been inaccessible and underutilized. The primary reason has been the lack of appropriate infrastructure, from adequate repositories to resources and support for investigators to properly manage their data, to community standards and best practices. Lack of credit for data management and for the data themselves has contributed to the reluctance of investigators to share their data. IEDA (Integrated Earth Data Applications), an NSF-funded data facility for solid earth geoscience data, has developed a comprehensive suite of data services that are designed to address the concerns and needs of investigators. IEDA's data publication service registers datasets with DOI and ensures their proper citation and attribution. IEDA is working with publishers on advanced linkages between datasets in the IEDA repository and scientific online articles to facilitate access to the data, enhance their visibility, and augment their use and citation. IEDA's investigator support ranges from individual support for data management to tools, tutorials, and virtual or face-to-face workshops that guide and assist investigators with data management planning, data submission, and data documentation. A critical aspect of IEDA's concept has been the disciplinary expertise within the team and its strong liaison with the science community, as well as a community-based governance. These have been fundamental to gaining the trust and support of the community, which has led to significantly improved data preservation and access in the communities served by IEDA.

  17. Graphical Models for Quasi-Experimental Designs

    ERIC Educational Resources Information Center

    Kim, Yongnam; Steiner, Peter M.; Hall, Courtney E.; Su, Dan

    2016-01-01

    Experimental and quasi-experimental designs play a central role in estimating cause-effect relationships in education, psychology, and many other fields of the social and behavioral sciences. This paper presents and discusses the causal graphs of experimental and quasi-experimental designs. For quasi-experimental designs the authors demonstrate…

  18. Regional hyperthermia applicator design using FDTD modelling.

    PubMed

    Kroeze, H; Van de Kamer, J B; De Leeuw, A A; Lagendijk, J J

    2001-07-01

    Recently published results confirm the positive effect of regional hyperthermia combined with external radiotherapy on pelvic tumours. Several studies have been published on the improvement of RF annular array applicator systems with dipoles and a closed water bolus. This study investigates the performance of a next-generation applicator system for regional hyperthermia with a multi-ring annular array of antennas and an open water bolus. A cavity slot antenna is introduced to enhance the directivity and reduce mutual coupling between the antennas. Several design parameters, i.e. dimensions, number of antennas and operating frequency, have been evaluated using several patient models. Performance indices have been defined to evaluate the effect of parameter variation on the specific absorption rate (SAR) distribution. The performance of the new applicator type is compared with the Coaxial TEM. Operating frequency appears to be the main parameter with a positive influence on the performance. A SAR increase in tumour of 1.7 relative to the Coaxial TEM system can be obtained with a three-ring, six-antenna per ring cavity slot applicator operating at 150 MHz. PMID:11474934
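
    The applicator study rests on FDTD modelling of the fields; as a reminder of what that numerical scheme looks like, here is a bare-bones 1D Yee leapfrog update in normalized units. It is a generic textbook sketch, not the paper's 3D patient-model solver, and the grid size, step count and source are arbitrary.

      # Bare-bones 1D FDTD (Yee leapfrog) sketch in free space -- only meant to show
      # the update scheme underlying applicator simulations, not the paper's 3D solver.
      import numpy as np

      nz, nsteps = 400, 800
      ez = np.zeros(nz)
      hy = np.zeros(nz - 1)
      courant = 0.5                      # normalized time step (stability requires <= 1 in 1D)

      for n in range(nsteps):
          hy += courant * (ez[1:] - ez[:-1])              # update H from the curl of E
          ez[1:-1] += courant * (hy[1:] - hy[:-1])        # update E from the curl of H
          ez[nz // 2] += np.exp(-((n - 60) / 20.0) ** 2)  # soft Gaussian source at the grid centre

      print("peak |Ez| =", np.abs(ez).max())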

  19. Employing ISRU Models to Improve Hardware Design

    NASA Technical Reports Server (NTRS)

    Linne, Diane L.

    2010-01-01

    An analytical model for hydrogen reduction of regolith was used to investigate the effects of several key variables on the energy and mass performance of reactors for a lunar in-situ resource utilization oxygen production plant. Reactor geometry, reaction time, number of reactors, heat recuperation, heat loss, and operating pressure were all studied to guide hardware designers who are developing future prototype reactors. The effects of heat recuperation, where the incoming regolith is pre-heated by the hot spent regolith before transfer, were also investigated for the first time. In general, longer reaction times per batch provide a lower overall energy, but also result in larger and heavier reactors. Three reactors with long heat-up times result in similar energy requirements as a two-reactor system with all other parameters the same. Three reactors with heat recuperation result in energy reductions of 20 to 40 percent compared to a three-reactor system with no heat recuperation. Increasing operating pressure can provide similar energy reductions as heat recuperation for the same reaction times.
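
    The reported 20 to 40 percent energy savings from heat recuperation can be illustrated with a toy per-batch energy balance in which incoming regolith is preheated by the spent regolith with a given effectiveness. All numbers below (specific heat, batch mass, temperatures) are assumed for illustration and are not taken from the NASA model.

      # Toy batch energy balance (hypothetical numbers, not the NASA model) showing how
      # recuperating heat from spent regolith reduces the per-batch heating energy.
      cp = 1000.0                     # J/(kg*K), assumed regolith specific heat
      batch = 20.0                    # kg regolith per batch (assumed)
      t_in, t_react = 300.0, 1200.0   # K, inlet and reaction temperatures (assumed)

      def batch_heating_energy(recuperation_eff):
          # Incoming regolith is preheated by spent regolith with the given effectiveness.
          t_preheated = t_in + recuperation_eff * (t_react - t_in)
          return batch * cp * (t_react - t_preheated)   # J still to be supplied

      for eff in (0.0, 0.2, 0.4):
          print(f"recuperation {eff:.0%}: {batch_heating_energy(eff)/1e6:.1f} MJ per batch")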

  1. Reshaping the Boundaries of Community Engagement in Design Education: Global and Local Explorations

    ERIC Educational Resources Information Center

    Hicks, Travis L.; Radtke, Rebekah Ison

    2015-01-01

    Community-driven design is a current movement in the forefront of many designers' practices and on university campuses in design programs. The authors examine work from their respective public state universities' design programs as examples of best practices. In these case studies, the authors share experiences using community-based design…

  2. Rapid Modeling, Assembly and Simulation in Design Optimization

    NASA Technical Reports Server (NTRS)

    Housner, Jerry

    1997-01-01

    A new capability for design is reviewed. This capability provides for rapid assembly of detail finite element models early in the design process where costs are most effectively impacted. This creates an engineering environment which enables comprehensive analysis and design optimization early in the design process. Graphical interactive computing makes it possible for the engineer to interact with the design while performing comprehensive design studies. This rapid assembly capability is enabled by the use of Interface Technology, to couple independently created models which can be archived and made accessible to the designer. Results are presented to demonstrate the capability.

  3. A Workforce Design Model: Providing Energy to Organizations in Transition

    ERIC Educational Resources Information Center

    Halm, Barry J.

    2011-01-01

    The purpose of this qualitative study was to examine the change in performance realized by a professional services organization, which resulted in the Life Giving Workforce Design (LGWD) model through a grounded theory research design. This study produced a workforce design model characterized as an organizational blueprint that provides virtuous…

  4. Construction of an Instructional Design Model for Undergraduate Chemistry Laboratory Design: A Delphi Approach

    ERIC Educational Resources Information Center

    Bunag, Tara

    2012-01-01

    The purpose of this study was to construct an instructional systems design model for chemistry teaching laboratories at the undergraduate level to accurately depict the current practices of design experts. This required identifying the variables considered during design, prioritizing and ordering these variables, and constructing a model. Experts…

  5. Designing Electronic Performance Support Systems: Models and Instructional Strategies Employed

    ERIC Educational Resources Information Center

    Nekvinda, Christopher D.

    2011-01-01

    The purpose of this qualitative study was to determine whether instructional designers and performance technologists utilize instructional design models when designing and developing electronic performance support systems (EPSS). The study also explored if these same designers were utilizing instructional strategies within their EPSS to support…

  6. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  7. Non-Linear Instructional Design Model: Eternal, Synergistic Design and Development

    ERIC Educational Resources Information Center

    Crawford, Caroline

    2004-01-01

    Instructional design is at the heart of each educational endeavour. This process revolves around the steps through which the thoughtful productions of superior products are created. The ADDIE generic instructional design model emphasises five basic steps within the instructional design process: analyse, design, develop, implement and evaluate. The…

  8. Designing and Evaluating Representations to Model Pedagogy

    ERIC Educational Resources Information Center

    Masterman, Elizabeth; Craft, Brock

    2013-01-01

    This article presents the case for a theory-informed approach to designing and evaluating representations for implementation in digital tools to support Learning Design, using the framework of epistemic efficacy as an example. This framework, which is rooted in the literature of cognitive psychology, is operationalised through dimensions of fit…

  9. Aftbody Closure Model Design: Lessons Learned

    NASA Technical Reports Server (NTRS)

    Capone, Francis J.

    1999-01-01

    An Aftbody Closure Test Program is necessary in order to provide aftbody drag increments that can be added to the drag polars produced by testing the performance models (models 2a and 2b). These models had a truncated fuselage; thus, drag was measured for an incomplete configuration. In addition, trim characteristics cannot be determined with a model with a truncated fuselage. The stability and control tests were conducted with a model (model 20) having a flared aftbody. This type of aftbody was needed in order to provide additional clearance between the base of the model and the sting. This was necessary because the high loads imposed on the model for stability and control tests result in large model deflections. For this case, the aftbody model will be used to validate stability and control performance.

  10. Shuttle passenger couch. [design and performance of engineering model

    NASA Technical Reports Server (NTRS)

    Rosener, A. A.; Stephenson, M. L.

    1974-01-01

    Conceptual design and fabrication of a full scale shuttle passenger couch engineering model are reported. The model was utilized to verify anthropometric dimensions, reach dimensions, ingress/egress, couch operation, storage space, restraint locations, and crew acceptability. These data were then incorporated in the design of the passenger couch verification model that underwent performance tests.

  11. Integrating Surface Modeling into the Engineering Design Graphics Curriculum

    ERIC Educational Resources Information Center

    Hartman, Nathan W.

    2006-01-01

    It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…

  12. Integrating Science into Design Technology Projects: Using a Standard Model in the Design Process.

    ERIC Educational Resources Information Center

    Zubrowski, Bernard

    2002-01-01

    Fourth graders built a model windmill using a three-step process: (1) open exploration of designs; (2) application of a standard model incorporating features of suggested designs; and (3) refinement of preliminary models. The approach required math, science, and technology teacher collaboration and adequate time. (Contains 21 references.) (SK)

  13. Simulation Tools Model Icing for Aircraft Design

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Here's a simple science experiment to try: Place an unopened bottle of distilled water in your freezer. After 2-3 hours, if the water is pure enough, you will notice that it has not frozen. Carefully pour the water into a bowl with a piece of ice in it. When it strikes the ice, the water will instantly freeze. One of the most basic and commonly known scientific facts is that water freezes at around 32 °F. But this is not always the case. Water lacking any impurities for ice crystals to form around can be supercooled to even lower temperatures without freezing. High in the atmosphere, water droplets can achieve this delicate, supercooled state. When a plane flies through clouds containing these droplets, the water can strike the airframe and, like the supercooled water hitting the ice in the experiment above, freeze instantly. The ice buildup alters the aerodynamics of the plane - reducing lift and increasing drag - affecting its performance and presenting a safety issue if the plane can no longer fly effectively. In certain circumstances, ice can form inside aircraft engines, another potential hazard. NASA has long studied ways of detecting and countering atmospheric icing conditions as part of the Agency's efforts to enhance aviation safety. To do this, the Icing Branch at Glenn Research Center utilizes a number of world-class tools, including the Center's Icing Research Tunnel and the NASA 607 icing research aircraft, a "flying laboratory" for studying icing conditions. The branch has also developed a suite of software programs to help aircraft and icing protection system designers understand the behavior of ice accumulation on various surfaces and in various conditions. One of these innovations is the LEWICE ice accretion simulation software. Initially developed in the 1980s (when Glenn was known as Lewis Research Center), LEWICE has become one of the most widely used tools in icing research and aircraft design and certification. LEWICE has been transformed over

  14. A statistical model for telecommunication link design

    NASA Technical Reports Server (NTRS)

    Yuen, J. H.

    1975-01-01

    An evaluation is conducted of the current telecommunication link design technique and a description is presented of an alternative method, called the probability distribution method (PDM), which is free of the disadvantages of the current technique while retaining its advantages. The PDM preserves the simplicity of the design control table (DCT) format. The use of the DCT as a management design control tool is continued. The telecommunication link margin probability density function used presents the probability of achieving any particular value of link performance. It is, therefore, possible to assess the performance risk and other tradeoffs.
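
    The probability distribution method described above treats the link margin as a random variable rather than a worst-case stack-up of adverse tolerances. A minimal Monte Carlo sketch of that idea, with invented dB terms and tolerances (not the paper's design control table), is:

      # Minimal Monte Carlo sketch of a probabilistic link budget (illustrative values only):
      # sum nominal dB terms plus random tolerances and look at the margin distribution.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      # (nominal_dB, 1-sigma_dB) for a few link-budget entries -- all hypothetical
      terms = [(+43.0, 0.5),   # EIRP
               (-210.0, 1.0),  # path loss
               (+55.0, 0.7),   # receive G/T and bandwidth factors
               (+120.0, 0.3)]  # remaining gains/constants lumped together
      required_snr_db = 5.0

      samples = sum(rng.normal(nom, sig, n) for nom, sig in terms)
      margin = samples - required_snr_db
      print(f"mean margin {margin.mean():.2f} dB, P(margin > 0) = {(margin > 0).mean():.3f}")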

  15. Communications network design and costing model users manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.

  16. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivities models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility to generate such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.

  17. Enhancing the support of interdisciplinary product design by using design object-oriented modelling

    SciTech Connect

    Yan, Xiu-Tian; MacCallum, K.J.

    1996-12-31

    This paper addresses the modelling difficulties faced by a designer in the development of new interdisciplinary products. It describes a novel approach to tackling these problems. The underlying methodology is based on object-oriented technology. A unified design object based representation and modelling method is proposed to enhance the scope of product modelling to many phases of the design process, and to simplify the modelling complexity associated with the increased number of modelling methods directly adopted from individual disciplines. This unified modelling representation method encompasses several low level modelling methods and it can integrate the dynamic energy and information system modelling of mechatronic products. These design object models will greatly facilitate designers, especially those who work on the development of interdisciplinary products, such as mechatronic systems. The paper concludes that a design object-oriented modelling method based on a hybrid modelling representation encompassing bond graph notation, block diagram, and Yourdon diagram, is desirable and feasible for mechatronic system design and modelling.

  18. The Training Wheel. A Simple Model for Instructional Design.

    ERIC Educational Resources Information Center

    Rogoff, Rosalind L.

    1984-01-01

    The author developed an instructional-design model consisting of four simple steps. The model is in a circular format, rather than the usual linear series form, so it is named the training wheel. (SSH)

  19. Data Base Design Using Entity-Relationship Models.

    ERIC Educational Resources Information Center

    Davis, Kathi Hogshead

    1983-01-01

    The entity-relationship (ER) approach to database design is defined, and a specific example of an ER model (personnel-payroll) is examined. The requirements for converting ER models into specific database management systems are discussed. (Author/MSE)

  20. Free-form Design in Solid Modelling

    NASA Technical Reports Server (NTRS)

    Pratt, M. J.

    1985-01-01

    Solid modelling is developed as a means of representing the shapes of components used in the less specialized mechanical engineering industries. Solids can now be modelled with free form surfaces. In some cases parametric geometry is used exclusively, while in others there is mixed use of parametric and implicit geometry. A method is suggested and discussed for free form solids modelling. The method has several advantages, one of which is that it avoids the use of detached surfaces, Boolean operations and surface intersection computations. It involves only minor topological changes to the model and is therefore computationally efficient.
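
    As a concrete example of the parametric geometry the abstract refers to, the following short sketch evaluates a point on a tensor-product cubic Bezier patch. This is the generic textbook construction, not the specific free-form method proposed in the paper, and the control points are arbitrary.

      # Small sketch of parametric free-form geometry: evaluating a point on a
      # tensor-product cubic Bezier patch (generic construction, illustrative control net).
      import numpy as np

      def bernstein3(t):
          return np.array([(1 - t)**3, 3*t*(1 - t)**2, 3*t**2*(1 - t), t**3])

      # 4x4 grid of control points (x, y, z) -- arbitrary illustrative values
      ctrl = np.array([[[i, j, np.sin(i) * np.cos(j)] for j in range(4)] for i in range(4)],
                      dtype=float)

      def patch_point(u, v):
          bu, bv = bernstein3(u), bernstein3(v)
          return np.einsum("i,j,ijk->k", bu, bv, ctrl)   # weighted sum of control points

      print(patch_point(0.5, 0.25))   # a point in the interior of the patch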

  1. Humus and humility in ecosystem model design

    NASA Astrophysics Data System (ADS)

    Rowe, Ed

    2015-04-01

    Prediction is central to science. Empirical scientists couch their predictions as hypotheses and tend to deal with simple models such as regressions, but they are modellers just as much as those who combine mechanistic hypotheses into more complex models. There are two main challenges for both groups: to strive for accurate predictions, and to ensure that the work is relevant to wider society. There is a role for blue-sky research, but the multiple environmental changes that characterise the 21st century place an onus on ecosystem scientists to develop tools for understanding environmental change and planning responses. Authors such as Funtowicz and Ravetz (1990) have argued that this situation represents "post-normal" science and that scientists should see themselves more humbly as actors within a societal process rather than as arbiters of truth. Modellers aim for generality, e.g. to accurately simulate the responses of a variety of ecosystems to several different environmental drivers. More accurate predictions can usually be achieved by including more explanatory factors or mechanisms in a model, even though this often results in a less efficient, less parsimonious model. This drives models towards ever-increasing complexity, and many models grow until they are effectively unusable beyond their development team. An alternative way forward is to focus on developing component models. Technologies for integrating dynamic models emphasise the removal of the model engine (algorithms) from code which handles time-stepping and the user interface. Developing components also requires some humility on the part of modellers, since collaboration will be needed to represent the whole system, and also because the idea that a simple component can or should represent the entire understanding of a scientific discipline is often difficult to accept. Policy-makers and land managers typically have questions different to those posed by scientists working within a specialism, and models
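
    The component-model idea sketched in the abstract, separating the model engine (algorithms) from the code that handles time-stepping and interfaces, can be illustrated with a small example. The component and its rate equation below are invented placeholders; the point is only the separation of concerns.

      # Hedged sketch of the "component model" idea: the model engine (here a trivial
      # soil-carbon pool) knows nothing about time-stepping, which lives in a generic driver.
      # Names and equations are illustrative only.
      from dataclasses import dataclass

      @dataclass
      class SoilCarbonPool:            # component: pure state + rate equation
          carbon: float                # kg C / m^2
          decay_rate: float            # 1/yr

          def rate(self, litter_input: float) -> float:
              return litter_input - self.decay_rate * self.carbon

      def run(component, inputs, dt=1.0):
          """Generic forward-Euler driver; any component with .rate() and .carbon fits."""
          for u in inputs:
              component.carbon += dt * component.rate(u)
          return component.carbon

      pool = SoilCarbonPool(carbon=5.0, decay_rate=0.02)
      print(f"carbon after 10 yr: {run(pool, [0.3] * 10):.2f} kg C/m^2")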

  2. Designing Games for Sport Education: Curricular Models

    ERIC Educational Resources Information Center

    Holt, Brett

    2005-01-01

    Sports Education is becoming a popular alternative curricular model in physical education, opposing the more traditional Multi-activity model. Physical education classes are slowly changing to include sport education. The change comes with the support of the community in the form of Sport Education in Physical Education Program (SEPEP). However,…

  3. Evaluating Models for Partially Clustered Designs

    ERIC Educational Resources Information Center

    Baldwin, Scott A.; Bauer, Daniel J.; Stice, Eric; Rohde, Paul

    2011-01-01

    Partially clustered designs, where clustering occurs in some conditions and not others, are common in psychology, particularly in prevention and intervention trials. This article reports results from a simulation comparing 5 approaches to analyzing partially clustered data, including Type I errors, parameter bias, efficiency, and power. Results…

  4. A Practical Design Model for Novice Teachers

    ERIC Educational Resources Information Center

    Brantley-Dias, Laurie; Calandra, Brendan

    2007-01-01

    Novice teachers encounter a variety of challenges and uncertainties, not limited to classroom management, cultural diversity, subject matter expertise, integrating technology, and instructional design. The act of planning, both physically and mentally, is a way to diminish these uncertainties. The purpose of this article is to suggest a design…

  5. A Review of Research on Universal Design Educational Models

    ERIC Educational Resources Information Center

    Rao, Kavita; Ok, Min Wook; Bryant, Brian R.

    2014-01-01

    Universal design for learning (UDL) has gained considerable attention in the field of special education, acclaimed for its promise to promote inclusion by supporting access to the general curriculum. In addition to UDL, there are two other universal design (UD) educational models referenced in the literature, universal design of instruction (UDI)…

  6. Designing Public Library Websites for Teens: A Conceptual Model

    ERIC Educational Resources Information Center

    Naughton, Robin Amanda

    2012-01-01

    The main goal of this research study was to develop a conceptual model for the design of public library websites for teens (TLWs) that would enable designers and librarians to create library websites that better suit teens' information needs and practices. It bridges a gap in the research literature between user interface design in…

  7. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  8. A Constructivist Design and Learning Model: Time for a Graphic.

    ERIC Educational Resources Information Center

    Rogers, Patricia L.; Mack, Michael

    At the University of Minnesota, a model, visual representation or "graphic" that incorporated both a systematic design process and a constructivist approach was used as a framework for course design. This paper describes experiences of applying the Instructional Context Design (ICD) framework in both the K-12 and higher education settings. The…

  9. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  10. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE (trademark) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  11. Multimedia Learning Design Pedagogy: A Hybrid Learning Model

    ERIC Educational Resources Information Center

    Tsoi, Mun Fie; Goh, Ngoh Khang; Chia, Lian Sai

    2005-01-01

    This paper provides insights on a hybrid learning model for multimedia learning design conceptualized from the Piagetian science learning cycle model and Kolb's experiential learning model. This model represents learning as a cognitive process in a cycle of four phases, namely, Translating, Sculpting, Operationalizing, and Integrating and is…

  12. Conceptual Model Learning Objects and Design Recommendations for Small Screens

    ERIC Educational Resources Information Center

    Churchill, Daniel

    2011-01-01

    This article presents recommendations for the design of conceptual models for applications via handheld devices such as personal digital assistants and some mobile phones. The recommendations were developed over a number of years through experience that involves design of conceptual models, and applications of these multimedia representations with…

  13. Feedback Model to Support Designers of Blended-Learning Courses

    ERIC Educational Resources Information Center

    Hummel, Hans G. K.

    2006-01-01

    Although extensive research has been carried out, describing the role of feedback in education, and many theoretical models are yet available, procedures and guidelines for actually designing and implementing feedback in practice have remained scarce so far. This explorative study presents a preliminary six-phase design model for feedback…

  14. STELLA Experiment: Design and Model Predictions

    SciTech Connect

    Kimura, W. D.; Babzien, M.; Ben-Zvi, I.; Campbell, L. P.; Cline, D. B.; Fiorito, R. B.; Gallardo, J. C.; Gottschalk, S. C.; He, P.; Kusche, K. P.; Liu, Y.; Pantell, R. H.; Pogorelsky, I. V.; Quimby, D. C.; Robinson, K. E.; Rule, D. W.; Sandweiss, J.; Skaritka, J.; van Steenbergen, A.; Steinhauer, L. C.; Yakimenko, V.

    1998-07-05

    The STaged ELectron Laser Acceleration (STELLA) experiment will be one of the first to examine the critical issue of staging the laser acceleration process. The BNL inverse free electron laser (IFEL) will serve as a prebuncher to generate ~1 µm long microbunches. These microbunches will be accelerated by an inverse Cerenkov acceleration (ICA) stage. A comprehensive model of the STELLA experiment is described. This model includes the IFEL prebunching, drift and focusing of the microbunches into the ICA stage, and their subsequent acceleration. The model predictions will be presented including the results of a system error study to determine the sensitivity to uncertainties in various system parameters.

  15. Irradiation Design for an Experimental Murine Model

    SciTech Connect

    Ballesteros-Zebadua, P.; Moreno-Jimenez, S.; Suarez-Campos, J. E.; Celis, M. A.; Larraga-Gutierrez, J. M.; Garcia-Garduno, O. A.; Rubio-Osornio, M. C.; Custodio-Ramirez, V.; Paz, C.

    2010-12-07

    In radiotherapy and stereotactic radiosurgery, small animal experimental models are frequently used, since there are still a lot of unsolved questions about the biological and biochemical effects of ionizing radiation. This work presents a method for small-animal brain radiotherapy compatible with a dedicated 6 MV Linac. This rodent model is focused on the research of the inflammatory effects produced by ionizing radiation in the brain. In this work, comparisons between Pencil Beam and Monte Carlo techniques were used to evaluate the accuracy of the dose calculated with a commercial planning system. Challenges in this murine model are discussed.

  16. The 3 "C" Design Model for Networked Collaborative E-Learning: A Tool for Novice Designers

    ERIC Educational Resources Information Center

    Bird, Len

    2007-01-01

    This paper outlines a model for online course design aimed at the mainstream majority of university academics rather than at the early adopters of technology. It has been developed from work at Coventry Business School where tutors have been called upon to design online modules for the first time. Like many good tools, the model's key strength is…

  17. Agent-based models in robotized manufacturing cells designing

    NASA Astrophysics Data System (ADS)

    Sekala, A.; Gwiazda, A.; Foit, K.; Banas, W.; Hryniewicz, P.; Kost, G.

    2015-11-01

    The complexity of the components present in robotized manufacturing workcells means that, already at the design phase, it is necessary to develop models presenting various aspects of their structure and functioning. These models are simplified representations of real systems and allow one, among other things, to systematize knowledge about the designed manufacturing workcell. They also facilitate defining and analyzing the interrelationships between its particular components. This paper proposes an agent-based approach to designing robotized manufacturing cells.

  18. Modelling and design for PM/EM magnetic bearings

    NASA Technical Reports Server (NTRS)

    Pang, D.; Kirk, J. A.; Anand, D. K.; Johnson, R. G.; Zmood, R. B.

    1992-01-01

    A mathematical model of a permanent magnet/electromagnet (PM/EM) radially active bearing is presented. The bearing is represented by both a reluctance model and a stiffness model. The reluctance model analyzes the magnetic circuit of the PM/EM bearings. By combining the two models, the performance of the bearing can be predicted given geometric dimensions, permanent magnet strength, and the parameters of the EM coils. The overall bearing design, including the PM and EM designs, is subject to performance requirements and physical constraints. A study of these requirements and constraints is discussed. The PM design is based on the required magnetic flux for proper geometric dimensions and magnet strength. The EM design is based on stability and force slew rate considerations, and dictates the number of turns for the EM coils and the voltage and current of the power amplifier. An overall PM/EM bearing design methodology is proposed and a case study is also demonstrated.
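
    A hedged sketch of the kind of reluctance-model calculation the abstract describes: a single magnetic loop in which the MMF divided by the air-gap reluctance gives the flux, and the attractive force follows from F = B^2 A / (2 mu0). The geometry and MMF values are invented, iron reluctance and leakage are ignored, and this is not the paper's bearing model.

      # One-loop magnetic-circuit sketch (hypothetical geometry, not the paper's bearing):
      # MMF divided by air-gap reluctance gives flux; force follows from F = B^2 * A / (2*mu0).
      MU0 = 4e-7 * 3.141592653589793

      def gap_force(mmf_At, gap_m, area_m2):
          reluctance = gap_m / (MU0 * area_m2)   # ignore iron reluctance and leakage
          flux = mmf_At / reluctance             # Wb
          b = flux / area_m2                     # T
          return b, b**2 * area_m2 / (2 * MU0)   # flux density and attractive force (N)

      # Example: PM bias plus EM coil contribution across a 0.5 mm gap, 1 cm^2 pole face
      b, f = gap_force(mmf_At=400.0, gap_m=0.5e-3, area_m2=1e-4)
      print(f"B = {b:.2f} T, force = {f:.1f} N")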

  19. Design and modeling of the multibeam injector

    NASA Astrophysics Data System (ADS)

    Grote, D. P.; Kwan, J. W.; Westenskow, G. A.

    2007-07-01

    For both HIF and ion driven HEDP experiments, the beams a driver produces must have both high brightness and high current. The US HIF program has had extensive experience and success working with single, monolithic ion sources for accelerator experiments with moderate injector current requirements. Such sources produce up to hundreds of milliamperes with normalized emittances less than 1 π mm-mrad. However, with a need for higher-current (up to and over 1 A) and more compact sources, monolithic sources begin to suffer from the poor scaling of source area with current. That is, by combining the limits of space-charge limited emission, voltage breakdown, and good optics, the source area is seen to scale as a high power of the current, A∝I. A means of bypassing the scaling, leading to a much more compact injector, is the use of multiple beamlets, each of which can have a much higher current density than a larger monolithic beam. The beamlets are merged near the end of the injector. A major challenge is the inherent emittance growth that occurs as the beamlets merge. In this paper, the design of such an injector will be presented along with simulations used to study and validate the design. This design offers other advantages, and some disadvantages, that will be described. Finally, comparisons will be made to experimental results from the merging beamlet experimental campaign on STS-500.

  1. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. PMID:27292581

  2. THE RHIC/AGS ONLINE MODEL ENVIRONMENT: DESIGN AND OVERVIEW.

    SciTech Connect

    SATOGATA,T.; BROWN,K.; PILAT,F.; TAFTI,A.A.; TEPIKIAN,S.; VAN ZEIJTS,J.

    1999-03-29

    An integrated online modeling environment is currently under development for use by AGS and RHIC physicists and commissioners. This environment combines the modeling efforts of both groups in a CDEV [1] client-server design, providing access to expected machine optics and physics parameters based on live and design machine settings. An abstract modeling interface has been designed as a set of adapters [2] around core computational modeling engines such as MAD and UAL/Teapot++ [3]. This approach allows us to leverage existing survey, lattice, and magnet infrastructure, as well as easily incorporate new model engine developments. This paper describes the architecture of the RHIC/AGS modeling environment, including the application interface through CDEV and general tools for graphical interaction with the model using Tcl/Tk. Separate papers at this conference address the specifics of implementation and modeling experience for AGS and RHIC.
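
    The "set of adapters around core computational modeling engines" can be pictured with a small adapter-pattern sketch. The real environment is a CDEV client-server system in front of MAD and UAL/Teapot++; the classes, method names and numbers below are hypothetical stand-ins for illustration only.

      # Hedged sketch of the adapter idea described in the abstract: one common optics
      # interface wrapped around different engine back ends. The real environment is
      # CDEV/C++ around MAD and UAL/Teapot++; the classes and values here are invented.
      from abc import ABC, abstractmethod

      class OpticsModel(ABC):
          @abstractmethod
          def twiss(self, element: str) -> dict: ...

      class MadAdapter(OpticsModel):
          def twiss(self, element: str) -> dict:
              # would call out to a MAD process/parser; stubbed for illustration
              return {"engine": "MAD", "element": element, "beta_x": 10.0}

      class TeapotAdapter(OpticsModel):
          def twiss(self, element: str) -> dict:
              return {"engine": "UAL/Teapot++", "element": element, "beta_x": 10.2}

      def report(model: OpticsModel, element: str) -> str:
          t = model.twiss(element)                      # client code sees one interface
          return f'{t["engine"]}: beta_x({element}) = {t["beta_x"]}'

      for backend in (MadAdapter(), TeapotAdapter()):
          print(report(backend, "QF1"))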

  3. Improving digital human modelling for proactive ergonomics in design.

    PubMed

    Chaffin, D B

    2005-04-15

    This paper presents the need to improve existing digital human models (DHMs) so they are better able to serve as effective ergonomics analysis and design tools. Existing DHMs are meant to be used by a designer early in a product development process when attempting to improve the physical design of vehicle interiors and manufacturing workplaces. The emphasis in this paper is placed on developing future DHMs that include valid posture and motion prediction models for various populations. It is argued that existing posture and motion prediction models now used in DHMs must be changed so that they are based on real motion data, to assure validity for complex dynamic task simulations. It is further speculated that if valid human posture and motion prediction models are developed and used, these can be combined with psychophysical and biomechanical models to provide a much greater understanding of dynamic human performance and population-specific limitations, and that these new DHM models will ultimately provide a powerful ergonomics design tool.

  4. Meta-modelling the Medical Record: Design and Application

    PubMed Central

    Huet, Bernard; Lesueur, Bruno; Lebeux, Pierre; Blain, Gilles

    2000-01-01

    This project is based on a user-oriented design for the medical record, using a meta-model able to generate various models for an application domain. The meta-model is built from basic concepts: User Semantic Group, sentence-type, variable, and graph. An initial implementation covers the echocardiography report; the advantages are a very thorough personalization of the document for the user, and a greater independence of the design diagram from the technological platform.

  5. Designing Effective Learning Environments: Cognitive Apprenticeship Models.

    ERIC Educational Resources Information Center

    Berryman, Sue E.

    1991-01-01

    Using cognitive science as the knowledge base for the discussion, this paper reviews why many school learning situations are ineffective and introduces cognitive apprenticeship models that suggest what effective learning situations might look like. Five wrong assumptions about learning are examined: (1) people transfer learning from one situation…

  6. Stimulus design for model selection and validation in cell signaling.

    PubMed

    Apgar, Joshua F; Toettcher, Jared E; Endy, Drew; White, Forest M; Tidor, Bruce

    2008-02-01

    Mechanism-based chemical kinetic models are increasingly being used to describe biological signaling. Such models serve to encapsulate current understanding of pathways and to enable insight into complex biological processes. One challenge in model development is that, with limited experimental data, multiple models can be consistent with known mechanisms and existing data. Here, we address the problem of model ambiguity by providing a method for designing dynamic stimuli that, in stimulus-response experiments, distinguish among parameterized models with different topologies, i.e., reaction mechanisms, in which only some of the species can be measured. We develop the approach by presenting two formulations of a model-based controller that is used to design the dynamic stimulus. In both formulations, an input signal is designed for each candidate model and parameterization so as to drive the model outputs through a target trajectory. The quality of a model is then assessed by the ability of the corresponding controller, informed by that model, to drive the experimental system. We evaluated our method on models of antibody-ligand binding, mitogen-activated protein kinase (MAPK) phosphorylation and de-phosphorylation, and larger models of the epidermal growth factor receptor (EGFR) pathway. For each of these systems, the controller informed by the correct model is the most successful at designing a stimulus to produce the desired behavior. Using these stimuli we were able to distinguish between models with subtle mechanistic differences or where input and outputs were multiple reactions removed from the model differences. An advantage of this method of model discrimination is that it does not require novel reagents, or altered measurement techniques; the only change to the experiment is the time course of stimulation. Taken together, these results provide a strong basis for using designed input stimuli as a tool for the development of cell signaling models. PMID
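
    The discrimination idea, designing for each candidate model the stimulus that drives that model's output through a target trajectory and then scoring how well the stimulus works on the true system, can be shown with a toy linear example. The first-order models, target trajectory, and exact-inversion controller below are invented simplifications, not the MAPK/EGFR models or the controller formulations in the paper.

      # Toy version (not the paper's MAPK/EGFR models) of the discrimination idea:
      # for each candidate model, design the input that makes *that* model track a
      # target trajectory, apply it to the "true" system, and score the tracking error.
      import numpy as np

      def design_input(a, b, target, y0=0.0):
          """Exact model inversion for y[k+1] = a*y[k] + b*u[k]."""
          u, y = [], y0
          for r in target[1:]:
              u.append((r - a * y) / b)
              y = r
          return np.array(u)

      def simulate(a, b, u, y0=0.0):
          y = [y0]
          for uk in u:
              y.append(a * y[-1] + b * uk)
          return np.array(y)

      target = 1.0 - np.exp(-np.linspace(0, 5, 50))        # desired response
      candidates = {"model A": (0.90, 0.10), "model B": (0.70, 0.30)}
      true_a, true_b = 0.90, 0.10                          # the "experimental system" matches model A

      for name, (a, b) in candidates.items():
          u = design_input(a, b, target)
          err = np.linalg.norm(simulate(true_a, true_b, u) - target)
          print(f"{name}: tracking error on the true system = {err:.3f}")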

  7. Designers Workbench: Towards Real-Time Immersive Modeling

    SciTech Connect

    Kuester, F; Duchaineau, M A; Hamann, B; Joy, K I; Ma, K L

    2001-10-03

    This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or 'digital gap' experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.

  8. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.

  9. On Optimal Input Design and Model Selection for Communication Channels

    SciTech Connect

    Li, Yanyan; Djouadi, Seddik M; Olama, Mohammed M

    2013-01-01

    In this paper, the optimal model (structure) selection and input design which minimize the worst case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. Kolmogorov n-width is used to characterize the representation error introduced by model selection, while Gel'fand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input which minimizes the worst case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and the optimal input is an impulse at the start of the observation interval. FIR models are widely popular in communication systems, such as in Orthogonal Frequency Division Multiplexing (OFDM) systems.
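
    A small illustration of why an impulse at the start of the observation interval pairs naturally with an FIR model: the noiseless impulse response is exactly the tap sequence. The channel taps, noise level and window length below are arbitrary, and this sketch does not reproduce the paper's worst-case n-width analysis.

      # Minimal illustration (not the paper's worst-case analysis) of FIR identification
      # with an impulse input: the response directly reveals the tap sequence plus noise.
      import numpy as np

      rng = np.random.default_rng(1)
      true_taps = np.array([0.5, 1.0, -0.3, 0.1])          # unknown FIR channel (assumed)
      n_obs = 16

      u = np.zeros(n_obs)
      u[0] = 1.0                                           # impulse at the start of the window
      y = np.convolve(u, true_taps)[:n_obs] + 0.02 * rng.standard_normal(n_obs)

      estimated = y[:len(true_taps)]                       # impulse response read off directly
      print("true taps     :", true_taps)
      print("estimated taps:", np.round(estimated, 3))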

  10. Using Storyboards to Integrate Models and Informal Design Knowledge

    NASA Astrophysics Data System (ADS)

    Haesen, Mieke; van den Bergh, Jan; Meskens, Jan; Luyten, Kris; Degrandsart, Sylvain; Demeyer, Serge; Coninx, Karin

    Model-driven development of user interfaces has become increasingly powerful in recent years. Unfortunately, model-driven approaches have the inherent limitation that they cannot handle the informal nature of some of the artifacts used in truly multidisciplinary user interface development such as storyboards, sketches, scenarios and personas. In this chapter, we present an approach and tool support for multidisciplinary user interface development bridging informal and formal artifacts in the design and development process. Key features of the approach are the usage of annotated storyboards, which can be connected to other models through an underlying meta-model, and cross-toolkit design support based on an abstract user interface model.

  11. Design Approaches to Support Preservice Teachers in Scientific Modeling

    NASA Astrophysics Data System (ADS)

    Kenyon, Lisa; Davis, Elizabeth A.; Hug, Barbara

    2011-02-01

    Engaging children in scientific practices is hard for beginning teachers. One such scientific practice with which beginning teachers may have limited experience is scientific modeling. We have iteratively designed preservice teacher learning experiences and materials intended to help teachers achieve learning goals associated with scientific modeling. Our work has taken place across multiple years at three university sites, with preservice teachers focused on early childhood, elementary, and middle school teaching. Based on results from our empirical studies supporting these design decisions, we discuss design features of our modeling instruction in each iteration. Our results suggest some successes in supporting preservice teachers in engaging students in modeling practice. We propose design principles that can guide science teacher educators in incorporating modeling in teacher education.

  12. Computer Integrated Manufacturing: Physical Modelling Systems Design. A Personal View.

    ERIC Educational Resources Information Center

    Baker, Richard

    A computer-integrated manufacturing (CIM) Physical Modeling Systems Design project was undertaken in a time of rapid change in the industrial, business, technological, training, and educational areas in Australia. A specification of a manufacturing physical modeling system was drawn up. Physical modeling provides a flexibility and configurability…

  13. Modeling Programs Increase Aircraft Design Safety

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Flutter may sound like a benign word when associated with a flag in a breeze, a butterfly, or seaweed in an ocean current. When used in the context of aerodynamics, however, it describes a highly dangerous, potentially deadly condition. Consider the case of the Lockheed L-188 Electra Turboprop, an airliner that first took to the skies in 1957. Two years later, an Electra plummeted to the ground en route from Houston to Dallas. Within another year, a second Electra crashed. In both cases, all crew and passengers died. Lockheed engineers were at a loss as to why the planes' wings were tearing off in midair. For an answer, the company turned to NASA's Transonic Dynamics Tunnel (TDT) at Langley Research Center. At the time, the newly renovated wind tunnel offered engineers the capability of testing aeroelastic qualities in aircraft flying at transonic speeds near or just below the speed of sound. (Aeroelasticity is the interaction between aerodynamic forces and the structural dynamics of an aircraft or other structure.) Through round-the-clock testing in the TDT, NASA and industry researchers discovered the cause: flutter. Flutter occurs when aerodynamic forces acting on a wing cause it to vibrate. As the aircraft moves faster, certain conditions can cause that vibration to multiply and feed off itself, building to greater amplitudes until the flutter causes severe damage or even the destruction of the aircraft. Flutter can impact other structures as well. Famous film footage of the Tacoma Narrows Bridge in Washington in 1940 shows the main span of the bridge collapsing after strong winds generated powerful flutter forces. In the Electra's case, faulty engine mounts allowed a type of flutter known as whirl flutter, generated by the spinning propellers, to transfer to the wings, causing them to vibrate violently enough to tear off. Thanks to the NASA testing, Lockheed was able to correct the Electra's design flaws that led to the flutter conditions and return the

  14. A Model-Based Expert System For Digital Systems Design

    NASA Astrophysics Data System (ADS)

    Wu, J. G.; Ho, W. P. C.; Hu, Y. H.; Yun, D. Y. Y.; Parng, T. M.

    1987-05-01

    In this paper, we present a model-based expert system for automatic digital systems design. The goal of digital systems design is to generate a workable and efficient design from high level specifications. The formalization of the design process is a necessity for building an efficient automatic CAD system. Our approach combines model-based, heuristic best-first search, and meta-planning techniques from AI to facilitate the design process. The design process is decomposed into three subprocesses. First, the high-level behavioral specifications are translated into sequences of primitive behavioral operations. Next, primitive operations are grouped to form intermediate-level behavioral functions. Finally, structural function modules are selected to implement these functions. Using model-based reasoning on the primitive behavioral operations level extends the solution space considered in design and provides more opportunity for minimization. Heuristic best-first search and meta-planning techniques control the decision-making in the latter two subprocesses to optimize the final design. They also facilitate system maintenance by separating design strategy from design knowledge.
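
    The role of heuristic best-first search in the module-selection subprocess can be illustrated with a generic best-first (uniform-cost) sketch over a toy library of structural modules. The functions, modules and costs are invented placeholders, not the system's actual knowledge base.

      # Generic best-first search sketch (toy costs, not the paper's knowledge base):
      # pick one structure module per required function, expanding the cheapest
      # partial design first.
      import heapq

      functions = ["add", "store", "compare"]                        # required behaviors
      modules = {"add": [("ALU", 5), ("adder", 2)],                  # (module, cost) options
                 "store": [("register", 1), ("RAM", 4)],
                 "compare": [("ALU", 0), ("comparator", 2)]}         # reusing the ALU costs nothing extra

      def best_first(functions, modules):
          frontier = [(0, 0, [])]                  # (cost so far, depth, chosen modules)
          while frontier:
              cost, depth, chosen = heapq.heappop(frontier)
              if depth == len(functions):
                  return cost, chosen
              for mod, c in modules[functions[depth]]:
                  heapq.heappush(frontier, (cost + c, depth + 1, chosen + [mod]))

      print(best_first(functions, modules))   # cheapest complete implementation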

  15. Computer-based creativity enhanced conceptual design model for non-routine design of mechanical systems

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.

    2014-11-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. By choosing building blocks from the database and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. To enhance the capability to introduce new design variables into the conceptual design process and to uncover more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and by making full use of the main and secondary functions of each basic structure when reconstructing the physical structures, new design variables and variants are introduced into the reconstruction process, and a large number of simpler physical structure schemes that organically accomplish the overall function are obtained. The creativity enhanced conceptual design model presented has a strong capability for introducing new design variables in the function domain and for uncovering simpler physical structures that accomplish the overall function, and it can therefore be used to solve non-routine conceptual design problems.

  16. Hotspot detection and design recommendation using silicon calibrated CMP model

    NASA Astrophysics Data System (ADS)

    Hui, Colin; Wang, Xian Bin; Huang, Haigou; Katakamsetty, Ushasree; Economikos, Laertis; Fayaz, Mohammed; Greco, Stephen; Hua, Xiang; Jayathi, Subramanian; Yuan, Chi-Min; Li, Song; Mehrotra, Vikas; Chen, Kuang Han; Gbondo-Tugbawa, Tamba; Smith, Taber

    2009-03-01

    Chemical Mechanical Polishing (CMP) has been used in the manufacturing of the copper (Cu) damascene process. It is well known that dishing and erosion occur during the CMP process, and they strongly depend on metal density and line width. The inherent thickness and topography variations become an increasing concern for today's designs running through advanced process nodes (sub 65nm). Excessive thickness and topography variations can have major impacts on chip yield and performance; as such, they need to be accounted for during the design stage. In this paper, we demonstrate an accurate physics-based CMP model and its application for CMP-related hotspot detection. Model-based checking is most useful for identifying highly environment-sensitive layouts that are prone to early process-window limitation and hence failure. Model-based checking, as opposed to rule-based checking, can identify the weak points in a design more accurately and enable designers to provide improved layout for the areas with the highest leverage for manufacturability improvement. Further, CMP modeling has the ability to provide information on interlevel effects, such as copper puddling from underlying topography, that cannot be captured in Design-for-Manufacturing (DfM) recommended rules. The model has been calibrated against silicon produced with the 45nm process from Common Platform (IBM-Chartered-Samsung) technology. It is one of the earliest 45nm CMP models available today. We show that CMP-related hotspots can often occur around the spaces between analog macros and digital blocks in SoC designs. With the help of the CMP model-based prediction, the design, the dummy fill, or the placement of the blocks can be modified to improve planarity and eliminate CMP-related hotspots. The CMP model can be used to pass design recommendations to designers to improve chip yield and performance.

  17. Multiscale Modeling in the Clinic: Drug Design and Development.

    PubMed

    Clancy, Colleen E; An, Gary; Cannon, William R; Liu, Yaling; May, Elebeoba E; Ortoleva, Peter; Popel, Aleksander S; Sluka, James P; Su, Jing; Vicini, Paolo; Zhou, Xiaobo; Eckmann, David M

    2016-09-01

    A wide range of length and time scales are relevant to pharmacology, especially in drug development, drug design and drug delivery. Therefore, multiscale computational modeling and simulation methods and paradigms that advance the linkage of phenomena occurring at these multiple scales have become increasingly important. Multiscale approaches present in silico opportunities to advance laboratory research to bedside clinical applications in pharmaceuticals research. This is achievable through the capability of modeling to reveal phenomena occurring across multiple spatial and temporal scales, which are not otherwise readily accessible to experimentation. The resultant models, when validated, are capable of making testable predictions to guide drug design and delivery. In this review we describe the goals, methods, and opportunities of multiscale modeling in drug design and development. We demonstrate the impact of multiple scales of modeling in this field. We indicate the common mathematical and computational techniques employed for multiscale modeling approaches used in pharmacometric and systems pharmacology models in drug development and present several examples illustrating the current state-of-the-art models for (1) excitable systems and applications in cardiac disease; (2) stem cell driven complex biosystems; (3) nanoparticle delivery, with applications to angiogenesis and cancer therapy; (4) host-pathogen interactions and their use in metabolic disorders, inflammation and sepsis; and (5) computer-aided design of nanomedical systems. We conclude with a focus on barriers to successful clinical translation of drug development, drug design and drug delivery multiscale models. PMID:26885640

  19. A Bayesian A-optimal and model robust design criterion.

    PubMed

    Zhou, Xiaojie; Joseph, Lawrence; Wolfson, David B; Bélisle, Patrick

    2003-12-01

    Suppose that the true model underlying a set of data is one of a finite set of candidate models, and that parameter estimation for this model is of primary interest. With this goal, optimal design must depend on a loss function across all possible models. A common method that accounts for model uncertainty is to average the loss over all models; this is the basis of what is known as Läuter's criterion. We generalize Läuter's criterion and show that it can be placed in a Bayesian decision theoretic framework, by extending the definition of Bayesian A-optimality. We use this generalized A-optimality to find optimal design points in an environmental safety setting. In estimating the smallest detectable trace limit in a water contamination problem, we obtain optimal designs that are quite different from those suggested by standard A-optimality.
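
    As a rough illustration of the model-averaged loss described above, the sketch below computes an A-optimality loss (the trace of the inverse information matrix) for each candidate regression model and averages it with prior model weights. The candidate models, design points, and weights are invented for illustration and are not taken from the paper.

```python
import numpy as np

def a_optimality(X):
    """A-optimality loss: trace of (X^T X)^{-1}, i.e. total variance of the estimates."""
    return np.trace(np.linalg.inv(X.T @ X))

def model_averaged_loss(design_points, model_builders, prior_weights):
    """Average the A-optimality loss over candidate models (a Lauter-style criterion)."""
    losses = [a_optimality(build(design_points)) for build in model_builders]
    return float(np.dot(prior_weights, losses))

# Hypothetical candidate models: linear and quadratic regression in one factor.
linear    = lambda x: np.column_stack([np.ones_like(x), x])
quadratic = lambda x: np.column_stack([np.ones_like(x), x, x**2])

design_a = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # evenly spread points
design_b = np.array([0.0, 0.0, 0.5, 1.0, 1.0])     # replicated end points

for name, d in [("evenly spread", design_a), ("replicated ends", design_b)]:
    loss = model_averaged_loss(d, [linear, quadratic], prior_weights=[0.5, 0.5])
    print(f"{name}: model-averaged A-loss = {loss:.3f}")
```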

  20. Experimental Design and Multiplexed Modeling Using Titrimetry and Spreadsheets

    NASA Astrophysics Data System (ADS)

    Harrington, Peter De B.; Kolbrich, Erin; Cline, Jennifer

    2002-07-01

    The topics of experimental design and modeling are important for inclusion in the undergraduate curriculum. Many general chemistry and quantitative analysis courses introduce students to spreadsheet programs, such as MS Excel. Students in the laboratory sections of these courses use titrimetry as a quantitative measurement method. Unfortunately, the only model that students may be exposed to in introductory chemistry courses is the working curve that uses the linear model. A novel experiment based on a multiplex model has been devised for titrating several vinegar samples at a time. The multiplex titration can be applied to many other routine determinations. An experimental design model is fit to titrimetric measurements using the MS Excel LINEST function to estimate concentration from each sample. This experiment provides valuable lessons in error analysis, Class A glassware tolerances, experimental simulation, statistics, modeling, and experimental design.
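
    The LINEST fit described above is an ordinary least-squares regression. The sketch below shows the same calculation in Python for a hypothetical multiplexed vinegar titration, where each titrated mixture combines known volumes of several samples and the individual acid concentrations are recovered by regression; all volumes and titrant amounts are invented placeholders.

```python
import numpy as np

# Hypothetical multiplex design: each row gives the volume (mL) of each vinegar
# sample included in one titrated mixture.
V = np.array([
    [10.0,  0.0,  0.0],
    [ 0.0, 10.0,  0.0],
    [ 0.0,  0.0, 10.0],
    [ 5.0,  5.0,  0.0],
    [ 5.0,  0.0,  5.0],
    [ 0.0,  5.0,  5.0],
])

# Millimoles of NaOH consumed by each mixture (titrant molarity x endpoint volume).
n_naoh = np.array([8.2, 8.9, 7.5, 8.6, 7.9, 8.1])

# Least-squares estimate of acetic acid concentration (mol/L) in each sample,
# the same normal-equations fit that LINEST performs in a spreadsheet.
conc, residuals, rank, _ = np.linalg.lstsq(V, n_naoh, rcond=None)
for i, c in enumerate(conc, start=1):
    print(f"sample {i}: {c:.3f} M")
```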

  1. Modeling Real-Time Applications with Reusable Design Patterns

    NASA Astrophysics Data System (ADS)

    Rekhis, Saoussen; Bouassida, Nadia; Bouaziz, Rafik

    Real-Time (RT) applications, which manipulate important volumes of data, need to be managed with RT databases that deal with time-constrained data and time-constrained transactions. In spite of their numerous advantages, RT database development remains a complex task, since developers must study many design issues related to the RT domain. In this paper, we tackle this problem by proposing RT design patterns that allow the modeling of structural and behavioral aspects of RT databases. We show how RT design patterns can provide design assistance through architecture reuse for recurring design problems. In addition, we present a UML profile that represents the patterns and further facilitates their reuse. This profile proposes, on the one hand, UML extensions for modeling the variability of patterns in the RT context and, on the other hand, extensions inspired by the MARTE (Modeling and Analysis of Real-Time Embedded systems) profile.

  2. Honors biomedical instrumentation--a course model for accelerated design.

    PubMed

    Madhok, Jai; Smith, Ryan J; Thakor, Nitish V

    2009-01-01

    A model for a 16-week Biomedical Instrumentation course is outlined. The course is structured so that students learn about medical devices and instrumentation through lecture and laboratory sessions while also learning basic design principles. Course material covers a broad range of topics, from the fundamentals of sensors and instrumentation, through guided laboratory design experiments and design projects, to protection of intellectual property, regulatory considerations, and entry into the commercial market. Students eventually complete two design projects: a 'Challenge' design project and an 'Honors' design project. Sample problems students solve during the Challenge project and examples of past Honors projects from the course are highlighted. PMID:19964766

  3. Modeling and Simulation for Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work System analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work systems design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual method application project, the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  4. Designers' models of the human-computer interface

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Breedin, Sarah D.

    1993-01-01

    Understanding design models of the human-computer interface (HCI) may produce two types of benefits. First, interface development often requires input from two different types of experts: human factors specialists and software developers. Given the differences in their backgrounds and roles, human factors specialists and software developers may have different cognitive models of the HCI. Yet, they have to communicate about the interface as part of the design process. If they have different models, their interactions are likely to involve a certain amount of miscommunication. Second, the design process in general is likely to be guided by designers' cognitive models of the HCI, as well as by their knowledge of the user, tasks, and system. Designers do not start with a blank slate; rather they begin with a general model of the object they are designing. The author's approach to a design model of the HCI was to have three groups make judgments of categorical similarity about the components of an interface: human factors specialists with HCI design experience, software developers with HCI design experience, and a baseline group of computer users with no experience in HCI design. The components of the user interface included both display components such as windows, text, and graphics, and user interaction concepts, such as command language, editing, and help. The judgments of the three groups were analyzed using hierarchical cluster analysis and Pathfinder. These methods indicated, respectively, how the groups categorized the concepts, and network representations of the concepts for each group. The Pathfinder analysis provides greater information about local, pairwise relations among concepts, whereas the cluster analysis shows global, categorical relations to a greater extent.
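
    As a rough sketch of the analysis pipeline described above, the code below turns a matrix of group-averaged similarity judgments into a hierarchical clustering. The interface components and similarity values are invented placeholders, and the Pathfinder network scaling step is not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

components = ["window", "text", "graphics", "command language", "editing", "help"]

# Hypothetical group-averaged similarity ratings (0 = unrelated, 1 = identical).
S = np.array([
    [1.0, 0.6, 0.7, 0.2, 0.3, 0.3],
    [0.6, 1.0, 0.5, 0.3, 0.4, 0.4],
    [0.7, 0.5, 1.0, 0.2, 0.3, 0.3],
    [0.2, 0.3, 0.2, 1.0, 0.7, 0.5],
    [0.3, 0.4, 0.3, 0.7, 1.0, 0.5],
    [0.3, 0.4, 0.3, 0.5, 0.5, 1.0],
])

# Convert similarity to distance, condense, and cluster with average linkage.
D = 1.0 - S
Z = linkage(squareform(D, checks=False), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
for comp, lab in zip(components, labels):
    print(f"{comp}: cluster {lab}")
```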

  5. 11. MOVABLE BED SEDIMENTATION MODELS. AUTOMATIC SEDIMENT FEEDER DESIGNED AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. MOVABLE BED SEDIMENTATION MODELS. AUTOMATIC SEDIMENT FEEDER DESIGNED AND BUILT BY WES. - Waterways Experiment Station, Hydraulics Laboratory, Halls Ferry Road, 2 miles south of I-20, Vicksburg, Warren County, MS

  6. Modeling and observer design for recombinant Escherichia coli strain.

    PubMed

    Nadri, M; Trezzani, I; Hammouri, H; Dhurjati, P; Longin, R; Lieto, J

    2006-03-01

    A mathematical model for recombinant bacteria which includes foreign protein production is developed. The experimental system consists of an Escherichia coli strain and plasmid pIT34 containing genes for bioluminescence and production of a protein, beta-galactosidase. This recombinant strain is constructed to facilitate on-line estimation and control in a complex bioprocess. Several batch experiments are designed and performed to validate the developed model. The design of a model structure, the identification of the model parameters, and the estimation problem are three parts of a joint design problem. A nonlinear observer is designed and an experimental evaluation is performed on a batch fermentation process to estimate the substrate consumption. PMID:16411071

  7. Radiation Belt Modeling for Spacecraft Design: Model Comparisons for Common Orbits

    NASA Technical Reports Server (NTRS)

    Lauenstein, J.-M.; Barth, J. L.

    2005-01-01

    We present the current status of radiation belt modeling, providing model details and comparisons with AP-8 and AE-8 for commonly used orbits. Improved modeling of the particle environment enables smarter space system design.

  8. Rethinking modeling framework design: object modeling system 3.0

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  9. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
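
    The two-step architecture described above can be illustrated with a generic coarse-then-fine search. In the sketch below the circuit "models" are simple placeholder scoring functions rather than the paper's characterized part models, and the full branch-and-bound pruning is reduced to keeping a shortlist ranked by the cheap model.

```python
import itertools

def coarse_score(circuit):
    """Cheap, low-fidelity estimate of circuit behavior (hypothetical placeholder)."""
    return sum(part["gain"] for part in circuit)

def fine_score(circuit):
    """Expensive, nonlinear model evaluated only on surviving candidates (placeholder)."""
    g = 1.0
    for part in circuit:
        g *= part["gain"] / (1.0 + 0.05 * part["leak"])
    return g

library = [
    {"name": "p1", "gain": 1.2, "leak": 0.4},
    {"name": "p2", "gain": 0.9, "leak": 0.1},
    {"name": "p3", "gain": 1.5, "leak": 0.9},
    {"name": "p4", "gain": 1.1, "leak": 0.2},
]

# Step 1: enumerate candidate 2-part circuits and keep the best few by the cheap model.
candidates = list(itertools.combinations(library, 2))
shortlist = sorted(candidates, key=coarse_score, reverse=True)[:3]

# Step 2: rank the reduced solution space with the expensive model.
best = max(shortlist, key=fine_score)
print("selected parts:", [part["name"] for part in best])
```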

  10. Organizational Learning and Product Design Management: Towards a Theoretical Model.

    ERIC Educational Resources Information Center

    Chiva-Gomez, Ricardo; Camison-Zornoza, Cesar; Lapiedra-Alcami, Rafael

    2003-01-01

    Case studies of four Spanish ceramics companies were used to construct a theoretical model of 14 factors essential to organizational learning. One set of factors is related to the conceptual-analytical phase of the product design process and the other to the creative-technical phase. All factors contributed to efficient product design management…

  11. [Design principles of the POLIMAG-01 basic model].

    PubMed

    Kabishev, V N; Kadyrkov, A P; Makarov, V V; Mikheev, A A; Panin, N I; Solomakha, V N

    2007-01-01

    The design principles and functional capabilities of the basic model of the POLIMAG-01 magnetotherapy apparatus are described. Technical characteristics of the apparatus are specified. Original engineering solutions involved in the apparatus design provide local, distributed, and general magnetotherapy. PMID:18277402

  12. Design of spatial experiments: Model fitting and prediction

    SciTech Connect

    Fedorov, V.V.

    1996-03-01

    The main objective of the paper is to describe and develop model oriented methods and algorithms for the design of spatial experiments. Unlike many other publications in this area, the approach proposed here is essentially based on the ideas of convex design theory.

  13. Groundwater modeling and remedial optimization design using graphical user interfaces

    SciTech Connect

    Deschaine, L.M.

    1997-05-01

    The ability to accurately predict the behavior of chemicals in groundwater systems under natural flow circumstances or remedial screening and design conditions is the cornerstone of the environmental industry. The ability to do this efficiently, and to communicate the information effectively to the client and regulators, is what differentiates effective consultants from ineffective consultants. Recent advances in groundwater modeling graphical user interfaces (GUIs) are doing for numerical modeling what Windows™ did for DOS™. GUIs facilitate both the modeling process and the information exchange. This Test Drive evaluates the performance of two GUIs, Groundwater Vistas and ModIME, on an actual groundwater model calibration and remedial design optimization project. In the early days of numerical modeling, data input consisted of large arrays of numbers that required intensive labor to input and troubleshoot. Model calibration was also manual, as was interpreting the reams of computer output for each of the tens or hundreds of simulations required to calibrate and perform optimal groundwater remedial design. During this period, the majority of the modeler's effort (and budget) was spent just getting the model running, as opposed to solving the environmental challenge at hand. GUIs take the majority of the grunt work out of the modeling process, thereby allowing the modeler to focus on designing optimal solutions.

  14. Control system design for flexible structures using data models

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Frazier, W. Garth; Mitchell, Jerrel R.; Medina, Enrique A.; Bukley, Angelia P.

    1993-01-01

    The dynamics and control of flexible aerospace structures exercise many of the engineering disciplines. In recent years there has been considerable research in developing and tailoring control system design techniques for these structures. This problem involves designing a control system for a multi-input, multi-output (MIMO) system that satisfies various performance criteria, such as vibration suppression, disturbance and noise rejection, attitude control, and slewing control. Considerable progress has been made and demonstrated in control system design techniques for these structures. The key to designing control systems for these structures that meet stringent performance requirements is an accurate model. It has become apparent that theoretically derived and finite-element-generated models do not provide the needed accuracy; almost all successful demonstrations of control system design techniques have involved using test results for fine-tuning a model or for extracting a model using system ID techniques. This paper describes past and ongoing efforts at Ohio University and NASA MSFC to design controllers using 'data models.' The basic philosophy of this approach is to start with a stabilizing controller and frequency response data that describe the plant; then, iteratively vary the free parameters of the controller so that performance measures come closer to satisfying design specifications. The frequency response data can be either experimentally derived or analytically derived. One 'design-with-data' algorithm presented in this paper is called the Compensator Improvement Program (CIP). The current CIP designs controllers for MIMO systems so that classical gain, phase, and attenuation margins are achieved. The centerpiece of the CIP algorithm is the constraint improvement technique, which is used to calculate a parameter change vector that guarantees an improvement in all unsatisfied, feasible performance metrics from iteration to iteration. The paper also

  15. Creativity in the Training and Practice of Instructional Designers: The Design/Creativity Loops Model

    ERIC Educational Resources Information Center

    Clinton, Gregory; Hokanson, Brad

    2012-01-01

    This article presents a discussion of research and theoretical perspectives on creativity and instructional design, offering a conceptual model of the connection between these two constructs that was originally proposed in the dissertation work of the first author (Clinton, Creativity and design: A study of the learning experience of instructional…

  16. Ohio River navigation investment model: Requirements and model design

    SciTech Connect

    Bronzini, M.S.; Curlee, T.R.; Leiby, P.N.; Southworth, F.; Summers, M.S.

    1998-01-01

    Oak Ridge National Laboratory is assisting the US Army Corps of Engineers in improving its economic analysis procedures for evaluation of inland waterway investment projects along the Ohio River System. This paper describes the context and design of an integrated approach to calculating the system-wide benefits from alternative combinations of lock and channel improvements, providing an ability to project the cost savings from proposed waterway improvements in capacity and reliability for up to fifty years into the future. The design contains an in-depth treatment of the levels of risk and uncertainty associated with different multi-year lock and channel improvement plans, including the uncertainty that results from a high degree of interaction between the many different waterway system components.

  17. A MODEL AND CONTROLLER REDUCTION METHOD FOR ROBUST CONTROL DESIGN.

    SciTech Connect

    YUE,M.; SCHLUETER,R.

    2003-10-20

    A bifurcation subsystem based model and controller reduction approach is presented. Using this approach, a robust μ-synthesis SVC control is designed for interarea oscillation and voltage control based on a small reduced-order bifurcation subsystem model of the full system. The control synthesis problem is posed by structured uncertainty modeling and control configuration formulation using the bifurcation subsystem knowledge of the nature of the interarea oscillation caused by a specific uncertainty parameter. The bifurcation subsystem method plays a key role in this paper because it provides (1) a bifurcation parameter for uncertainty modeling; (2) a criterion to reduce the order of the resulting MSVC control; and (3) a low-order model for a bifurcation subsystem based SVC (BMSVC) design. The use of the model of the bifurcation subsystem to produce a low-order controller simplifies the control design and reduces the computational effort so significantly that the robust μ-synthesis control can be applied to large systems where the computation otherwise makes robust control design impractical. The RGA analysis and time simulation show that the reduced BMSVC control design captures the center manifold dynamics and uncertainty structure of the full system model and is capable of stabilizing the full system and achieving satisfactory control performance.

  18. Using the ADDIE model in designing library instruction.

    PubMed

    Reinbold, Sarah

    2013-01-01

    Librarians are increasingly placed in the role of instructor and are required to design and deliver effective instruction. This article highlights ADDIE, an instructional design model that librarians at Weill Cornell Medical College used to redesign an evidence-based medicine course taken by first-year medical students. The ADDIE model incorporates the five phases of analysis, design, development, implementation, and evaluation. It was found that the application of ADDIE can result in instruction that focuses on learning outcomes relevant to students, meets students' needs, and facilitates active learning.

  19. Civil tiltrotor transport point design: Model 940A

    NASA Technical Reports Server (NTRS)

    Rogers, Charles; Reisdorfer, Dale

    1993-01-01

    The objective of this effort is to produce a vehicle layout for the civil tiltrotor wing and center fuselage in sufficient detail to obtain aerodynamic and inertia loads for determining member sizing. This report addresses the parametric configuration and loads definition for a 40 passenger civil tilt rotor transport. A preliminary (point) design is developed for the tiltrotor wing box and center fuselage. This summary report provides all design details used in the pre-design; provides adequate detail to allow a preliminary design finite element model to be developed; and contains guidelines for dynamic constraints.

  20. Applying learning theories and instructional design models for effective instruction.

    PubMed

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory.

  1. Evolution of Geometric Sensitivity Derivatives from Computer Aided Design Models

    NASA Technical Reports Server (NTRS)

    Jones, William T.; Lazzara, David; Haimes, Robert

    2010-01-01

    The generation of design parameter sensitivity derivatives is required for gradient-based optimization. Such sensitivity derivatives are elusive at best when working with geometry defined within the solid modeling context of Computer-Aided Design (CAD) systems. Solid modeling CAD systems are often proprietary and always complex, thereby necessitating ad hoc procedures to infer parameter sensitivity. A new perspective is presented that makes direct use of the hierarchical associativity of CAD features to trace their evolution and thereby track design parameter sensitivity. In contrast to ad hoc methods, this method provides a more concise procedure following the model design intent and determining the sensitivity of CAD geometry directly to its respective defining parameters.

  2. A modal test design strategy for model correlation

    SciTech Connect

    Carne, T.G.; Dohrmann, C.R.

    1994-12-01

    When a modal test is to be performed for purposes of correlation with a finite element model, one needs to design the test so that the resulting measurements will provide the data needed for the correlation. There are numerous issues to consider in the design of a modal test; two important ones are the number and location of response sensors, and the number, location, and orientation of input excitation. From a model correlation perspective, one would like to select the response locations to allow a definitive, one-to-one correspondence between the measured modes and the predicted modes. Further, the excitation must be designed to excite all the modes of interest at a sufficiently high level so that the modal estimation algorithms can accurately extract the modal parameters. In this paper these two issues are examined in the context of model correlation with methodologies presented for obtaining an experiment design.
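
    One common way to quantify the one-to-one correspondence between measured and predicted modes mentioned above is the Modal Assurance Criterion (MAC). The MAC is not named in the abstract, so the sketch below is only an illustrative check, and the mode-shape matrices are invented.

```python
import numpy as np

def mac(phi_test, phi_fem):
    """Modal Assurance Criterion matrix between measured and predicted mode shapes.

    phi_test -- (n_sensors x n_test_modes) measured mode shapes
    phi_fem  -- (n_sensors x n_fem_modes) finite element mode shapes at sensor DOFs
    """
    num = np.abs(phi_test.T @ phi_fem) ** 2
    den = np.outer(np.sum(phi_test**2, axis=0), np.sum(phi_fem**2, axis=0))
    return num / den

# Hypothetical 4-sensor, 3-mode example: values near 1 on the diagonal and near 0
# off-diagonal indicate the chosen sensor set distinguishes the modes cleanly.
phi_fem = np.array([[ 1.0,  1.0,  1.0],
                    [ 0.7,  0.0, -0.7],
                    [ 0.0, -1.0,  0.0],
                    [-0.7,  0.0,  0.7]])
phi_test = phi_fem + 0.05 * np.random.default_rng(0).standard_normal(phi_fem.shape)
print(np.round(mac(phi_test, phi_fem), 2))
```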

  3. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verify that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit's hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that using solid models of planetary suit hard segments as a performance design tool is feasible. From a general trend perspective

  4. The community-driven BiG CZ software system for integration and analysis of bio- and geoscience data in the critical zone

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.

    2014-12-01

    Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single

  5. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Satellite thermal control is the subsystem whose main task is keeping the satellite components within their survival and operating temperature ranges. The thermal control capability plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, owing to the limited information released by companies and designers, this subsystem still lacks a specific design process even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyzes statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. Next, different statistical models are described and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is verified with a case study: comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results shows the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware
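
    As an illustration of the kind of statistical model such a procedure might extract, the sketch below fits a power-law trend of thermal-control-subsystem mass against total satellite mass. The data points are invented placeholders, not values from the paper's database.

```python
import numpy as np

# Hypothetical database entries: (satellite wet mass [kg], thermal subsystem mass [kg]).
sat_mass = np.array([  50.0,  150.0,  400.0, 1000.0, 2500.0])
tcs_mass = np.array([   1.8,    5.0,   13.0,   31.0,   70.0])

# Fit a power law  m_tcs = a * m_sat^b  by linear least squares in log-log space,
# the same kind of trend line one would extract from a statistical scatter plot.
b, log_a = np.polyfit(np.log(sat_mass), np.log(tcs_mass), 1)
a = np.exp(log_a)
print(f"m_tcs ~ {a:.3f} * m_sat^{b:.2f}")

# Use the extracted model as a conceptual-design estimate for a new 800 kg satellite.
print(f"estimate for an 800 kg satellite: {a * 800.0**b:.1f} kg")
```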

  6. Communications network design and costing model programmers manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    Optimization algorithms and techniques used in the communications network design and costing model for least-cost route and least-cost network problems are examined from the programmer's point of view. All system program modules, the data structures within the model, and the files which make up the data base are described.

  7. Evaluating Instructional Design Models: A Proposed Research Approach

    ERIC Educational Resources Information Center

    Gropper, George L.

    2015-01-01

    Proliferation of prescriptive models in an "engineering" field is not a sign of its maturity. Quite the opposite. Materials engineering, for example, meets the criterion of parsimony. Sadly, the very large number of models in "instructional design," putatively an engineering field, raises questions about its status. Can the…

  8. Using Generalized Additive Models to Analyze Single-Case Designs

    ERIC Educational Resources Information Center

    Shadish, William; Sullivan, Kristynn

    2013-01-01

    Many analyses for single-case designs (SCDs)--including nearly all the effect size indicators-- currently assume no trend in the data. Regression and multilevel models allow for trend, but usually test only linear trend and have no principled way of knowing if higher order trends should be represented in the model. This paper shows how Generalized…

  9. Aspect-Oriented Design with Reusable Aspect Models

    NASA Astrophysics Data System (ADS)

    Kienzle, Jörg; Al Abed, Wisam; Fleurey, Franck; Jézéquel, Jean-Marc; Klein, Jacques

    The idea behind Aspect-Oriented Modeling (AOM) is to apply aspect-oriented techniques to (software) models with the aim of modularizing crosscutting concerns. This can be done within different modeling notations, at different levels of abstraction, and at different moments during the software development process. This paper demonstrates the applicability of AOM during the software design phase by presenting parts of an aspect-oriented design of a crisis management system. The design solution proposed in this paper is based on the Reusable Aspect Models (RAM) approach, which allows a modeler to express the structure and behavior of a complex system using class, state and sequence diagrams encapsulated in several aspect models. The paper describes how the model of the "create mission" functionality of the server backend can be decomposed into 23 inter-dependent aspect models. The presentation of the design is followed by a discussion on the lessons learned from the case study. Next, RAM is compared to 8 other AOM approaches according to 6 criteria: language, concern composition, asymmetric and symmetric composition, maturity, and tool support. To conclude the paper, a discussion section points out the features of RAM that specifically support reuse.

  10. Freshman Interest Groups: Designing a Model for Success

    ERIC Educational Resources Information Center

    Ratliff, Gerald Lee

    2008-01-01

    Freshman Interest Groups (FIGS) have become a popular model for academic and student affairs colleagues who are concerned that first-year students learn to reflect on life experiences and daily events as part of the learning process. A well-designed FIG model meets the academic, social and career concerns for first-year students by providing an…

  11. Towards Comprehensive Variation Models for Designing Vehicle Monitoring Systems

    NASA Technical Reports Server (NTRS)

    McAdams, Daniel A.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes in a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. This crucial roadblock makes their implementation in real vehicles (e.g., helicopter transmissions and aircraft engines) difficult, making their operation costly and unreliable. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variations that are not taken into account in the modeling assumptions. In this paper, we address one such source of variations, namely, those caused during the design and manufacturing of rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. Using such models, we develop a methodology to account for design and manufacturing variations, and explore the changes in the vibration response to determine its stochastic nature. We explore the potential of the methodology using a nonlinear cam-follower model, where the spring stiffness values are assumed to follow a normal distribution. The results demonstrate initial feasibility of the method, showing great promise in developing a general methodology for designing more accurate aerospace vehicle monitoring systems.
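
    A minimal sketch of the probabilistic treatment described above: sample the spring stiffness from a normal distribution and propagate it to a simple vibration feature. Here a single-degree-of-freedom natural frequency stands in for the nonlinear cam-follower response, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical nominal follower parameters.
mass = 0.5                      # kg
k_mean, k_std = 2.0e4, 1.5e3    # spring stiffness mean / std dev (N/m)

# Monte Carlo over design and manufacturing variation in stiffness.
k_samples = rng.normal(k_mean, k_std, size=10_000)
k_samples = k_samples[k_samples > 0]             # guard against nonphysical draws

f_n = np.sqrt(k_samples / mass) / (2.0 * np.pi)  # natural frequency (Hz)

print(f"natural frequency: mean = {f_n.mean():.1f} Hz, std = {f_n.std():.1f} Hz")
print(f"95% interval: {np.percentile(f_n, 2.5):.1f} - {np.percentile(f_n, 97.5):.1f} Hz")
```

    The spread in the resulting frequency distribution is the kind of statistical variation that, if ignored, shows up in a monitoring system as false alarms or missed detections.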

  12. Models for hydrologic design of evapotranspiration landfill covers.

    PubMed

    Hauser, Victor L; Gimon, Dianna M; Bonta, James V; Howell, Terry A; Malone, Robert W; Williams, Jimmy R

    2005-09-15

    The technology used in landfill covers is changing, and an alternative cover called the evapotranspiration (ET) landfill cover is coming into use. Important design requirements are prescribed by Federal rules and regulations for conventional landfill covers but not for ET landfill covers. There is no accepted hydrologic model for ET landfill cover design. This paper describes ET cover requirements and design issues, and assesses the accuracy of the EPIC and HELP hydrologic models when used for hydrologic design of ET covers. We tested the models against high-quality field measurements available from lysimeters maintained by the Agricultural Research Service of the U.S. Department of Agriculture at Coshocton, Ohio, and Bushland, Texas. The HELP model produced substantial errors in estimating hydrologic variables. The EPIC model estimated ET and deep percolation with errors less than 7% and 5%, respectively, and accurately matched extreme events with an error of less than 2% of precipitation. The EPIC model is suitable for use in hydrologic design of ET landfill covers.

  13. Collocation methods for distillation design. 1: Model description and testing

    SciTech Connect

    Huss, R.S.; Westerberg, A.W.

    1996-05-01

    Fast and accurate distillation design requires a model that significantly reduces the problem size while accurately approximating a full-order distillation column model. This collocation model builds on the concepts of past collocation models for design of complex real-world separation systems. Two variable transformations make this method unique. Polynomials cannot accurately fit trajectories which flatten out. In columns, flat sections occur in the middle of large column sections or where concentrations go to 0 or 1. With an exponential transformation of the tray number which maps zero to an infinite number of trays onto the range 0--1, four collocation trays can accurately simulate a large column section. With a hyperbolic tangent transformation of the mole fractions, the model can simulate columns which reach high purities. Furthermore, this model uses multiple collocation elements for a column section, which is more accurate than a single high-order collocation section.
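
    The abstract does not give the exact functional forms of the two transformations, so the sketch below uses assumed forms that capture the stated intent: an exponential map that collapses tray number onto the interval 0-1, and an inverse-hyperbolic-tangent stretch of the mole fraction (whose inverse is a hyperbolic tangent) that opens up the nearly flat high-purity regions. Both forms and the constants in them are assumptions for illustration only.

```python
import numpy as np

def tray_to_unit(n, alpha=0.15):
    """Assumed exponential transform: tray number n in [0, inf) -> s in [0, 1)."""
    return 1.0 - np.exp(-alpha * n)

def stretch_mole_fraction(x, eps=1e-6):
    """Assumed transform: map x in (0, 1) to an unbounded variable via artanh,
    stretching the nearly flat high-purity regions near x = 0 and x = 1."""
    x = np.clip(x, eps, 1.0 - eps)
    return np.arctanh(2.0 * x - 1.0)

def unstretch(y):
    """Inverse map (a hyperbolic tangent) back to a physical mole fraction."""
    return 0.5 * (np.tanh(y) + 1.0)

# A 60-tray section collapses onto the unit interval, so a handful of collocation
# points in s can represent profiles that flatten out over many trays.
trays = np.arange(0, 61, 10)
print(np.round(tray_to_unit(trays), 3))

# High-purity compositions (x near 0 or 1) are spread out by the transform,
# so polynomials can follow them.
x = np.array([0.001, 0.01, 0.5, 0.99, 0.999])
print(np.round(stretch_mole_fraction(x), 3))
print(np.round(unstretch(stretch_mole_fraction(x)), 4))
```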

  14. The influence of model design on comparisons of single point measurements with grid-model predictions

    SciTech Connect

    Dennis, R.L.; Byun, D.W.; Seilkop, S.K.

    1994-12-31

    A principal use of air quality models is to extrapolate air concentrations or deposition to future conditions outside historically observed conditions. We depend on the science in models to provide the best extrapolation. The objective of a model evaluation is to establish, as well as possible, that the model is not flawed and that it is functioning properly as designed. This is not the same as saying the objective is to achieve uniformly good comparison with data. Discrepancies can arise when model design artifacts result in some sort of bias or mask a bias. This paper addresses this issue as related to the horizontal design of a regional model.

  15. Future Modeling Needs in Pulse Detonation Rocket Engine Design

    NASA Technical Reports Server (NTRS)

    Meade, Brian; Talley, Doug; Mueller, Donn; Tew, Dave; Guidos, Mike; Seymour, Dave

    2001-01-01

    This paper presents a performance model for a rocket engine design that takes advantage of pulse detonation to generate thrust. The contents include: 1) Introduction to the Pulse Detonation Rocket Engine (PDRE); 2) PDRE modeling issues and options; 3) Discussion of the PDRE Performance Workshop held at Marshall Space Flight Center; and 4) Identification of needs involving an open performance model for Pulse Detonation Rocket Engines. This paper is in viewgraph form.

  16. An aircraft model for the AIAA controls design challenge

    NASA Technical Reports Server (NTRS)

    Brumbaugh, Randal W.

    1991-01-01

    A generic, state-of-the-art, high-performance aircraft model, including detailed, full-envelope, nonlinear aerodynamics, and full-envelope thrust and first-order engine response data, is described. While this model was primarily developed for the AIAA Controls Design Challenge, the availability of such a model provides a common focus for research in aeronautical control theory and methodology. An implementation of this model using the FORTRAN computer language, associated routines furnished with the aircraft model, and techniques for interfacing these routines to external procedures are also described. Figures showing vehicle geometry, surfaces, and sign conventions are included.

  17. NREL Wind Integrated System Design and Engineering Model

    SciTech Connect

    Ning, S. Andrew; Scott, George; Graf, Peter

    2013-09-30

    NREL_WISDEM is an integrated model for wind turbines and plants developed in Python based on the open-source software OpenMDAO. NREL_WISDEM is a set of wrappers for various wind turbine and plant models that integrates pre-existing models into OpenMDAO. It is organized into groups, each with its own repository, including Plant_CostSE, Plant_EnergySE, Turbine_CostSE, and TurbineSE. The wrappers are designed for both licensed and non-licensed models, though in either case one must have access to and install the individual models before using them in the overall software platform.
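
    The wrapping pattern described above can be sketched with a minimal OpenMDAO component. The class, variable names, and power formula below are hypothetical stand-ins and are not WISDEM's actual wrappers.

```python
import openmdao.api as om

class RotorPowerWrapper(om.ExplicitComponent):
    """Hypothetical wrapper exposing a pre-existing power model to OpenMDAO."""

    def setup(self):
        self.add_input("rotor_diameter", val=100.0, units="m")
        self.add_input("wind_speed", val=10.0, units="m/s")
        self.add_output("power", val=0.0, units="W")

    def compute(self, inputs, outputs):
        # Stand-in for a call into an external, possibly licensed, model.
        rho, cp = 1.225, 0.45
        area = 3.14159 * (inputs["rotor_diameter"] / 2.0) ** 2
        outputs["power"] = 0.5 * rho * area * cp * inputs["wind_speed"] ** 3

prob = om.Problem()
prob.model.add_subsystem("rotor", RotorPowerWrapper(), promotes=["*"])
prob.setup()
prob.run_model()
print(prob.get_val("power", units="MW"))
```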

  18. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high-fidelity, geometry-based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line, and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing studies, and weight optimization. The high-quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high-fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  19. Design of Experiments, Model Calibration and Data Assimilation

    SciTech Connect

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of emulation, calibration and experiment design for computer experiments. Emulation refers to building a statistical surrogate from a carefully selected and limited set of model runs to predict unsampled outputs. The standard kriging approach to emulation of complex computer models is presented. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Markov chain Monte Carlo (MCMC) algorithms are often used to sample the calibrated parameter distribution. Several MCMC algorithms commonly employed in practice are presented, along with a popular diagnostic for evaluating chain behavior. Space-filling approaches to experiment design for selecting model runs to build effective emulators are discussed, including Latin Hypercube Design and extensions based on orthogonal array skeleton designs and imposed symmetry requirements. Optimization criteria that further enforce space-filling, possibly in projections of the input space, are mentioned. Designs to screen for important input variations are summarized and used for variable selection in a nuclear fuels performance application. This is followed by illustration of sequential experiment design strategies for optimization, global prediction, and rare event inference.
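
    A minimal sketch tying two of these ideas together, a space-filling Latin Hypercube design and a kriging-style emulator: the "simulator" below is a cheap stand-in function, and scikit-learn's Gaussian process regressor plays the role of the kriging emulator.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(x):
    """Stand-in for an expensive computer model with two inputs."""
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# Space-filling Latin Hypercube design of 20 runs in the 2-D unit cube.
sampler = qmc.LatinHypercube(d=2, seed=1)
X_train = sampler.random(n=20)
y_train = simulator(X_train)

# Kriging-style emulator: a Gaussian process fit to the limited set of model runs.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.2),
                              normalize_y=True)
gp.fit(X_train, y_train)

# Predict (with uncertainty) at unsampled inputs.
X_new = sampler.random(n=5)
mean, std = gp.predict(X_new, return_std=True)
for m, s in zip(mean, std):
    print(f"emulator prediction: {m:.3f} +/- {s:.3f}")
```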

  20. Optimal experiment design for model selection in biochemical networks

    PubMed Central

    2014-01-01

    Background Mathematical modeling is often used to formalize hypotheses on how a biochemical network operates by discriminating between competing models. Bayesian model selection offers a way to determine the amount of evidence that data provides to support one model over the other while favoring simple models. In practice, the amount of experimental data is often insufficient to make a clear distinction between competing models. Often one would like to perform a new experiment which would discriminate between competing hypotheses. Results We developed a novel method to perform Optimal Experiment Design to predict which experiments would most effectively allow model selection. A Bayesian approach is applied to infer model parameter distributions. These distributions are sampled and used to simulate from multivariate predictive densities. The method is based on a k-Nearest Neighbor estimate of the Jensen Shannon divergence between the multivariate predictive densities of competing models. Conclusions We show that the method successfully uses predictive differences to enable model selection by applying it to several test cases. Because the design criterion is based on predictive distributions, which can be computed for a wide range of model quantities, the approach is very flexible. The method reveals specific combinations of experiments which improve discriminability even in cases where data is scarce. The proposed approach can be used in conjunction with existing Bayesian methodologies where (approximate) posteriors have been determined, making use of relations that exist within the inferred posteriors. PMID:24555498
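
    The paper's criterion is a k-nearest-neighbor estimate of the Jensen-Shannon divergence between multivariate predictive densities. The sketch below shows a simplified one-dimensional, histogram-based version of the same comparison, with synthetic predictive samples standing in for the model simulations; a candidate experiment whose predictive densities are farther apart (larger divergence) is more informative for model selection.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(7)

# Synthetic predictive samples for one measurable quantity under two competing models.
pred_model_a = rng.normal(loc=1.0, scale=0.3, size=5000)
pred_model_b = rng.normal(loc=1.4, scale=0.3, size=5000)

# Histogram both predictive densities on a shared grid and compare them.
edges = np.linspace(min(pred_model_a.min(), pred_model_b.min()),
                    max(pred_model_a.max(), pred_model_b.max()), 60)
p, _ = np.histogram(pred_model_a, bins=edges, density=True)
q, _ = np.histogram(pred_model_b, bins=edges, density=True)

# scipy returns the JS *distance*; square it to get the divergence (base 2 -> [0, 1]).
jsd = jensenshannon(p, q, base=2) ** 2
print(f"Jensen-Shannon divergence between predictive densities: {jsd:.3f}")
```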

  1. Integrating O/S models during conceptual design, part 1

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    The University of Dayton is pleased to submit to the National Aeronautics and Space Administration (NASA), Langley Research Center, this report, which integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations in arriving at a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports accomplished under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model was developed. This is the first part of a three-part technical report.

  2. Modelling and design of high performance indium phosphide solar cells

    NASA Technical Reports Server (NTRS)

    Rhoads, Sandra L.; Barnett, Allen M.

    1989-01-01

    A first principles pn junction device model has predicted new designs for high voltage, high efficiency InP solar cells. Measured InP material properties were applied and device parameters (thicknesses and doping) were adjusted to obtain optimal performance designs. Results indicate that p/n InP designs will provide higher voltages and higher energy conversion efficiencies than n/p structures. Improvements to n/p structures for increased efficiency are predicted. These new designs exploit the high absorption capabilities, relatively long diffusion lengths, and modest surface recombination velocities characteristic of InP. Predictions of performance indicate achievable open-circuit voltage values as high as 943 mV for InP and a practical maximum AM0 efficiency of 22.5 percent at 1 sun and 27 C. The details of the model, the optimal InP structure and the effect of individual parameter variations on device performance are presented.

  3. Designing visual displays and system models for safe reactor operations

    SciTech Connect

    Brown-VanHoozer, S.A.

    1995-12-31

    The material presented in this paper is based on two studies involving the design of visual displays and the user's perspective model of a system. The studies involve a methodology known as Neuro-Linguistic Programming and its use in expanding design choices from the operator's perspective image. The paper focuses on these two studies and how they are applicable to the safety of operating reactors.

  4. PHARAO laser source flight model: Design and performances

    SciTech Connect

    Lévèque, T.; Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P.; Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S.; Laurent, Ph.

    2015-03-15

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  5. Structural similitude and design of scaled down laminated models

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Rezaeepazhand, J.

    1993-01-01

    The excellent mechanical properties of laminated composite structures make them prime candidates for a wide variety of applications in aerospace, mechanical and other branches of engineering. The enormous design flexibility of advanced composites is obtained at the cost of a large number of design parameters. Due to the complexity of these systems and the lack of complete design-based information, designers tend to be conservative in their designs. Furthermore, any new design is extensively evaluated experimentally until it achieves the necessary reliability, performance and safety. However, experimental evaluation of composite structures is costly and time consuming. Consequently, it is extremely useful if a full-scale structure can be replaced by a similar scaled-down model which is much easier to work with. Furthermore, a dramatic reduction in cost and time can be achieved if available experimental data for a specific structure can be used to predict the behavior of a group of similar systems. This study investigates problems associated with the design of scaled models. Such a study is important because it provides the necessary scaling laws and identifies the factors which affect the accuracy of the scale models. Similitude theory is employed to develop the necessary similarity conditions (scaling laws). Scaling laws provide the relationship between a full-scale structure and its scale model, and can be used to extrapolate the experimental data of a small, inexpensive, and testable model into design information for a large prototype. Due to the large number of design parameters, identification of the principal scaling laws by the conventional method (dimensional analysis) is tedious. Similitude theory based on the governing equations of the structural system is more direct and simpler in execution. The difficulty of making completely similar scale models often leads to accepting a certain type of distortion from exact duplication of the prototype (partial similarity). Both complete and partial

  6. Green space system design in Luoyang using Huff model

    NASA Astrophysics Data System (ADS)

    Wang, Shengnan; Li, Meng

    2008-10-01

    Green space system, as part of the urban ecological environment and urban landscape, plays a significant role in the protection of biological diversity of urban eco-systems. During the process of rapid modernization in China, it is evident that, in order to satisfy residents' needs for entertainment and communication effectively, there should be abundant types and an adequate arrangement of green space, and at the same time a comprehensive and stable hierarchical structure of the green space system ought to be established. The Huff Model is widely used in facility location planning and service area segmentation in business geography, and has potential in urban facility planning and design. This paper aims to evaluate, design and optimize the urban green space in Luoyang City, Henan Province, using GIS and the Huff Model. Considering the existing location, size and shape of the green space supply, the spatial distribution of residences and the urban transportation system, the attractiveness between residences and green space is estimated. The spatial pattern and service capability of the green space system are also evaluated critically. Based on the findings, a possible optimization design of the green space system in Luoyang is discussed. The Huff Model test shows that the design improves overall spatial accessibility noticeably. The case study shows that GIS technology and the Huff Model have great potential in urban green space evaluation, planning and design.
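
    The accessibility calculation at the heart of this kind of analysis can be sketched with the standard Huff formulation, in which the probability that residents of zone i patronize green space j grows with its attractiveness and decays with distance. The snippet below is a generic sketch: the attractiveness proxy (park area), the exponents, and the zone data are illustrative, not the calibrated values from the Luoyang study.

    ```python
    import numpy as np

    def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
        """Huff model: probability that residents in zone i visit green space j.

        attractiveness : (J,)  e.g. park area
        distances      : (I, J) travel distance or time from zone i to park j
        """
        utility = attractiveness[None, :] ** alpha / distances ** beta
        return utility / utility.sum(axis=1, keepdims=True)

    area = np.array([12.0, 3.5, 8.0])                   # hectares (illustrative)
    dist = np.array([[1.2, 0.6, 2.5],
                     [2.0, 1.5, 0.8]])                  # km from two residential zones
    p = huff_probabilities(area, dist)
    expected_visitors = p * np.array([[5000], [8000]])  # allocate zone populations to parks
    service_load = expected_visitors.sum(axis=0)        # rough service capability check per park
    ```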

  7. An economic model for passive solar designs in commercial environments

    NASA Astrophysics Data System (ADS)

    Powell, J. W.

    1980-06-01

    The model incorporates a life cycle costing approach that focuses on the costs of purchase, installation, maintenance, repairs, replacement, and energy. It includes a detailed analysis of tax laws affecting the use of solar energy in commercial buildings. Possible methods of treating difficult-to-measure benefits and costs, such as effects of the passive solar design on resale value of the building and on lighting costs, rental income from the building, and the use of commercial space, are presented. The model is illustrated in two case examples of prototypical solar design for low-rise commercial buildings in an urban setting.
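
    The life-cycle-costing logic reduces to discounting each year's energy and maintenance outlays to present value and comparing design alternatives. The sketch below shows that calculation with generic inputs; it omits the tax-law treatment and the hard-to-measure benefits discussed in the report, and all numbers are placeholders.

    ```python
    def life_cycle_cost(first_cost, annual_energy, annual_maint,
                        years=25, discount=0.07, energy_escalation=0.03):
        """Present value of owning a design alternative over the study period."""
        pv = first_cost
        for t in range(1, years + 1):
            energy_t = annual_energy * (1.0 + energy_escalation) ** t
            pv += (energy_t + annual_maint) / (1.0 + discount) ** t
        return pv

    # Compare a passive solar option against a conventional baseline (illustrative numbers).
    baseline = life_cycle_cost(first_cost=0.0,     annual_energy=18000.0, annual_maint=500.0)
    solar    = life_cycle_cost(first_cost=60000.0, annual_energy=11000.0, annual_maint=900.0)
    print(solar - baseline)   # negative value favours the passive solar design
    ```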

  8. Design verification and cold-flow modeling test report

    SciTech Connect

    Not Available

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  9. Design driven test patterns for OPC models calibration

    NASA Astrophysics Data System (ADS)

    Al-Imam, Mohamed

    2009-03-01

    In the modern photolithography process for manufacturing integrated circuits, geometry dimensions need to be realized on silicon which are much smaller than the exposure wavelength. Thus Resolution Enhancement Techniques (RET) have an indispensable role in the implementation of a successful technology process node. Finding an appropriate RET recipe that answers the needs of a certain fabrication process usually involves intensive computational simulations. These simulations have to reflect how different elements in the lithography process under study will behave. In order to achieve this, accurate models are needed that truly represent the transmission of patterns from mask to silicon. A common practice in calibrating lithography models is to collect data for the dimensions of some test structures created on the exposure mask along with the corresponding dimensions of these test structures on silicon after exposure. This data is used to tune the models for good predictions. The models are guaranteed to accurately predict the test structures that have been used in their tuning. However, real designs might have a much greater variety of structures that might not have been included in the test structures. This paper explores a method for compiling the test structures to be used in the model calibration process using design layouts as an input. The method relies on reducing structures in the design layout to the essential unique structures from the lithography model's point of view, thus ensuring that the test structures represent what the model would actually have to predict during the simulations.

  10. Software Engineering Designs for Super-Modeling Different Versions of CESM Models using DART

    NASA Astrophysics Data System (ADS)

    Kluzek, E. B.; Duane, G. S.; Tribbia, J. J.; Vertenstein, M.

    2013-12-01

    The super-modeling approach connects different models together at run time in order to provide run-time feedbacks between the models and therefore synchronize them. This method thus reduces model bias further than after-the-fact averaging of model output. We explore different designs to connect different configurations and versions of the IPCC-class climate model, the Community Earth System Model (CESM). We use the Data Assimilation Research Test-bed (DART) software to provide data assimilation as well as a software framework to link different model configurations together. We also show results from some simple experiments that demonstrate the ability to synchronize different model versions.

  11. Designing Multi-target Compound Libraries with Gaussian Process Models.

    PubMed

    Bieler, Michael; Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Kriegl, Jan M; Schneider, Gisbert

    2016-05-01

    We present the application of machine learning models to selecting G protein-coupled receptor (GPCR)-focused compound libraries. The library design process was realized by ant colony optimization. A proprietary Boehringer-Ingelheim reference set consisting of 3519 compounds tested in dose-response assays at 11 GPCR targets served as training data for machine learning and activity prediction. We compared the usability of the proprietary data with a public data set from ChEMBL. Gaussian process models were trained to prioritize compounds from a virtual combinatorial library. We obtained meaningful models for three of the targets (5-HT2c, MCH, A1), which were experimentally confirmed for 12 of 15 selected and synthesized or purchased compounds. Overall, the models trained on the public data predicted the observed assay results more accurately. The results of this study motivate the use of Gaussian process regression on public data for virtual screening and target-focused compound library design. PMID:27492085
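
    The Gaussian process step (train on assay data, then rank members of a virtual library by predicted activity) can be sketched as follows. This is a generic illustration, not the study's workflow: the descriptors and labels are random placeholders rather than the proprietary or ChEMBL sets, and the ant colony optimization used for library assembly is omitted.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)

    # Placeholder descriptor matrix (e.g. fingerprints / physchem features) and activity labels.
    X_train = rng.normal(size=(300, 32))
    y_train = rng.normal(loc=6.0, scale=1.0, size=300)      # e.g. pIC50 values

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=5.0) + WhiteKernel(),
                                  normalize_y=True)
    gp.fit(X_train, y_train)

    # Score an enumerated virtual combinatorial library and keep the top candidates.
    X_library = rng.normal(size=(5000, 32))
    mean, std = gp.predict(X_library, return_std=True)
    top = np.argsort(mean)[::-1][:50]          # highest predicted activity
    # std[top] can be used to balance exploitation against model uncertainty.
    ```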

  12. Electricity Market Manipulation: How Behavioral Modeling Can Help Market Design

    SciTech Connect

    Gallo, Giulia

    2015-12-18

    The question of how to best design electricity markets to integrate variable and uncertain renewable energy resources is becoming increasingly important as more renewable energy is added to electric power systems. Current markets were designed based on a set of assumptions that are not always valid in scenarios of high penetrations of renewables. In a future where renewables might have a larger impact on market mechanisms as well as financial outcomes, there is a need for modeling tools and power system modeling software that can provide policy makers and industry actors with more realistic representations of wholesale markets. One option includes using agent-based modeling frameworks. This paper discusses how key elements of current and future wholesale power markets can be modeled using an agent-based approach and how this approach may become a useful paradigm that researchers can employ when studying and planning for power systems of the future.

  14. A model for designing functionally gradient material joints

    SciTech Connect

    Messler, R.W. Jr.; Jou, M.; Orling, T.T.

    1995-05-01

    An analytical, thin-plate layer model was developed to assist research and development engineers in the design of functionally gradient material (FGM) joints consisting of discrete steps between end elements of dissimilar materials. Such joints have long been produced by diffusion bonding using intermediates or multiple interlayers; welding, brazing or soldering using multiple transition pieces; and glass-to-glass or glass-to-metal bonding using multiple layers to produce matched seals. More recently, FGM joints produced by self-propagating high-temperature synthesis (SHS) are attracting the attention of researchers. The model calculates temperature distributions and associated thermally induced stresses, assuming elastic behavior, for any number of layers of any thickness or composition, accounting for critically important thermophysical properties in each layer as functions of temperature. It is useful for assuring that cured-in fabrication stresses from thermal expansion mismatches will not prevent quality joint production. The model's utility is demonstrated with general design cases.
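
    The spirit of such a layer model can be conveyed with a much simpler elastic approximation: if every layer in the stack is forced to a common in-plane strain and the net in-plane force is zero, the thermal-expansion mismatch fixes the stress in each layer. The sketch below uses that equal-strain assumption with placeholder properties; it is not the report's formulation and ignores bending and temperature-dependent properties.

    ```python
    import numpy as np

    def layer_thermal_stresses(E, alpha, thickness, delta_T):
        """Elastic in-plane stresses in a stack of bonded layers after a uniform temperature
        change, assuming a common strain in all layers and zero net in-plane force."""
        E, alpha, t = map(np.asarray, (E, alpha, thickness))
        eps = np.sum(E * t * alpha * delta_T) / np.sum(E * t)   # common strain
        return E * (eps - alpha * delta_T)                      # stress in each layer (Pa)

    # Three-step graded joint between a metal and a ceramic (illustrative properties), cooled 500 K.
    sigma = layer_thermal_stresses(E=[200e9, 150e9, 100e9],
                                   alpha=[12e-6, 9e-6, 6e-6],
                                   thickness=[1e-3, 1e-3, 1e-3],
                                   delta_T=-500.0)
    # sigma[0] > 0: the high-expansion end layer ends up in tension after cooling.
    ```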

  15. Model-It: A Case Study of Learner-Centered Software Design for Supporting Model Building.

    ERIC Educational Resources Information Center

    Jackson, Shari L.; Stratford, Steven J.; Krajcik, Joseph S.; Soloway, Elliot

    Learner-centered software design (LCSD) guides the design of tasks, tools, and interfaces in order to support the unique needs of learners: growth, diversity and motivation. This paper presents a framework for LCSD and describes a case study of its application to the ScienceWare Model-It, a learner-centered tool to support scientific modeling and…

  16. Design and Development of Physics Module Based on Learning Style and Appropriate Technology by Employing Isman Instructional Design Model

    ERIC Educational Resources Information Center

    Alias, Norlidah; Siraj, Saedah

    2012-01-01

    The study was aimed at designing and developing a Physics module based on learning style and appropriate technology in secondary educational setting by employing Isman Instructional Design Model and to test the effectiveness of the module. The paper draws attention to the design principles which employs Isman Instructional Design Model. The…

  17. Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Tsai, F. T.

    2012-12-01

    Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of these uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer framework, where each layer targets a source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of models in each level. To account for uncertainty, we employ chance constrained (CC) programming for stochastic remediation design. Chance constrained programming has traditionally been used to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, HBMA-CC was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances for two reasons. First, considering only the single best model, variances that stem from uncertainty in the model structure will be ignored. Second, considering the best model with non
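
    The bookkeeping behind (hierarchical) Bayesian model averaging is the weighted combination of per-model predictions, with the total prediction variance split into a within-model part and a between-model part. The sketch below shows a single-level version of that calculation with made-up weights, means, and variances; the HBMA framework nests this computation over several hierarchy levels. Dropping all but the single best model discards the between-model term, which is the error mode the abstract warns about.

    ```python
    import numpy as np

    def bma_prediction(means, variances, weights):
        """Model-averaged mean and variance (within-model + between-model terms)."""
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        mu = np.asarray(means, dtype=float)
        var = np.asarray(variances, dtype=float)
        mean_bma = np.sum(w * mu)
        within = np.sum(w * var)                     # propagated parameter/interpolation uncertainty
        between = np.sum(w * (mu - mean_bma) ** 2)   # model-structure uncertainty
        return mean_bma, within + between, between / (within + between)

    # Predicted chloride concentration at a production well under three model structures (illustrative).
    mean, total_var, structural_fraction = bma_prediction(
        means=[180.0, 240.0, 205.0], variances=[400.0, 900.0, 650.0], weights=[0.5, 0.2, 0.3])
    ```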

  18. Planetary gear profile modification design based on load sharing modelling

    NASA Astrophysics Data System (ADS)

    Iglesias, Miguel; Fernández Del Rincón, Alfonso; De-Juan, Ana Magdalena; Garcia, Pablo; Diez, Alberto; Viadero, Fernando

    2015-07-01

    In order to satisfy the increasing demand for high-performance planetary transmissions, an important line of research is focused on understanding some of the underlying phenomena involved in this mechanical system. Through the development of models capable of reproducing the system behavior, research in this area contributes to improved gear transmission insight, helping to develop better maintenance practices and more efficient design processes. A planetary gear model used for the design of profile modifications based on levelling of the load sharing ratio is presented. The gear profile geometry definition, following a vectorial approach that mimics the real cutting process of gears, is thoroughly described. Teeth undercutting and hypotrochoid definition are implicitly considered, and a procedure for the incorporation of a rounding arc at the tooth tip in order to deal with corner contacts is described. A procedure for the modeling of profile deviations is presented, which can be used for the introduction of both manufacturing errors and designed profile modifications. An easy and flexible implementation of the profile deviation within the planetary model is accomplished based on the geometric overlapping. The contact force calculation and dynamic implementation used in the model are also introduced, and parameters from a real transmission for agricultural applications are presented for the application example. A set of reliefs is designed based on the levelling of the load sharing ratio for the example transmission, and finally some other important dynamic factors of the transmission are analyzed to assess the changes in the dynamic behavior with respect to the non-modified case. Thus, the main innovative aspect of the proposed planetary transmission model is its capacity to provide a simulated load sharing ratio which serves as a design variable for the calculation of the tooth profile modifications.

  19. Process Cost Modeling for Multi-Disciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to

  20. Test model designs for advanced refractory ceramic materials

    NASA Technical Reports Server (NTRS)

    Tran, Huy Kim

    1993-01-01

    The next generation of space vehicles will be subjected to severe aerothermal loads and will require an improved thermal protection system (TPS) and other advanced vehicle components. In order to ensure the satisfactory performance of the TPS and other advanced vehicle materials and components, testing is to be performed in environments similar to space flight. The design and fabrication of the test models should be fairly simple but still accomplish test objectives. In the Advanced Refractory Ceramic Materials test series, the models and model holders will need to withstand the required heat fluxes of 340 to 817 W/sq cm or surface temperatures in the range of 2700 K to 3000 K. The model holders should provide one-dimensional (1-D) heat transfer to the samples and the appropriate flow field without compromising the primary test objectives. The optical properties such as the effective emissivity, catalytic efficiency coefficients, thermal properties, and mass loss measurements are also taken into consideration in the design process. Therefore, it is the intent of this paper to demonstrate the design schemes for different models and model holders that would accommodate these test requirements and ensure safe operation in a typical arc jet facility.

  1. A Novel Modeling Framework for Heterogeneous Catalyst Design

    NASA Astrophysics Data System (ADS)

    Katare, Santhoji; Bhan, Aditya; Caruthers, James; Delgass, Nicholas; Lauterbach, Jochen; Venkatasubramanian, Venkat

    2002-03-01

    A systems-oriented, integrated knowledge architecture that enables the use of data from High Throughput Experiments (HTE) for catalyst design is being developed. Higher-level critical reasoning is required to extract information efficiently from the increasingly available HTE data and to develop predictive models that can be used for design purposes. Towards this objective, we have developed a framework that aids the catalyst designer in negotiating the data and model complexities. Traditional kinetic and statistical tools have been systematically implemented and novel artificial intelligence tools have been developed and integrated to speed up the process of modeling catalytic reactions. Multiple nonlinear models that describe CO oxidation on supported metals have been screened using qualitative and quantitative feature-based optimization ideas. Physical constraints of the system have been used to select the optimum model parameters from the multiple solutions to the parameter estimation problem. Preliminary results about the selection of catalyst descriptors that match a target performance and the use of HTE data for refining fundamentals-based models will be discussed.

  2. Modeling and Validation of a Propellant Mixer for Controller Design

    NASA Technical Reports Server (NTRS)

    Richter, Hanz; Barbieri, Enrique; Figueroa, Fernando

    2003-01-01

    A mixing chamber used in rocket engine testing at the NASA Stennis Space Center is modelled by a system of two nonlinear ordinary differential equations. The mixer is used to condition the thermodynamic properties of cryogenic liquid propellant by controlled injection of the same substance in the gaseous phase. The three inputs of the mixer are the positions of the valves regulating the liquid and gas flows at the inlets, and the position of the exit valve regulating the flow of conditioned propellant. Mixer operation during a test requires the regulation of its internal pressure, exit mass flow, and exit temperature. A mathematical model is developed to facilitate subsequent controller designs. The model must be simple enough to lend itself to subsequent feedback controller design, yet its accuracy must be tested against real data. For this reason, the model includes function calls to thermodynamic property data. Some structural properties of the resulting model that pertain to controller design, such as uniqueness of the equilibrium point, feedback linearizability and local stability are shown to hold under conditions having direct physical interpretation. The existence of fixed valve positions that attain a desired operating condition is also shown. Validation of the model against real data is likewise provided.

  3. Simulation models and designs for advanced Fischer-Tropsch technology

    SciTech Connect

    Choi, G.N.; Kramer, S.J.; Tam, S.S.

    1995-12-31

    Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.

  4. Implementing a Community-Driven Cyberinfrastructure Platform for the Paleo- and Rock Magnetic Scientific Fields that Generalizes to Other Geoscience Disciplines

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Jarboe, N.; Koppers, A. A.; Tauxe, L.; Constable, C.

    2013-12-01

    EarthRef.org is a geoscience umbrella website for several databases and data and model repository portals. These portals, unified in the mandate to preserve their respective data and promote scientific collaboration in their fields, are also disparate in their schemata. The Magnetics Information Consortium (http://earthref.org/MagIC/) is a grass-roots cyberinfrastructure effort envisioned by the paleo- and rock magnetic scientific community to archive their wealth of peer-reviewed raw data and interpretations from studies on natural and synthetic samples and relies on a partially strict subsumptive hierarchical data model. The Geochemical Earth Reference Model (http://earthref.org/GERM/) portal focuses on the chemical characterization of the Earth and relies on two data schemata: a repository of peer-reviewed reservoir geochemistry, and a database of partition coefficients for rocks, minerals, and elements. The Seamount Biogeosciences Network (http://earthref.org/SBN/) encourages the collaboration between the diverse disciplines involved in seamount research and includes the Seamount Catalog (http://earthref.org/SC/) of bathymetry and morphology. All of these portals also depend on the EarthRef Reference Database (http://earthref.org/ERR/) for publication reference metadata and the EarthRef Digital Archive (http://earthref.org/ERDA/), a generic repository of data objects and their metadata. The development of the new MagIC Search Interface (http://earthref.org/MagIC/search/) centers on a reusable platform designed to be flexible enough for largely heterogeneous datasets and to scale up to datasets with tens of millions of records. The HTML5 web application and Oracle 11g database residing at the San Diego Supercomputer Center (SDSC) support the online contribution and editing of complex datasets in a spreadsheet environment and the browsing and filtering of these contributions in the context of thousands of other datasets. EarthRef.org is in the process of

  5. Application of Absorption Modeling in Rational Design of Drug Product Under Quality-by-Design Paradigm.

    PubMed

    Kesisoglou, Filippos; Mitra, Amitava

    2015-09-01

    Physiologically based absorption models can be an important tool in understanding product performance and hence implementation of Quality by Design (QbD) in drug product development. In this report, we show several case studies to demonstrate the potential application of absorption modeling in rational design of drug product under the QbD paradigm. The examples include application of absorption modeling—(1) prior to first-in-human studies to guide development of a formulation with minimal sensitivity to higher gastric pH and hence reduced interaction when co-administered with PPIs and/or H2RAs, (2) design of a controlled release formulation with optimal release rate to meet trough plasma concentrations and enable QD dosing, (3) understanding the impact of API particle size distribution on tablet bioavailability and guide formulation design in late-stage development, (4) assess impact of API phase change on product performance to guide specification setting, and (5) investigate the effect of dissolution rate changes on formulation bioperformance and enable appropriate specification setting. These case studies are meant to highlight the utility of physiologically based absorption modeling in gaining a thorough understanding of the product performance and the critical factors impacting performance to drive design of a robust drug product that would deliver the optimal benefit to the patients.
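
    A full physiologically based absorption model resolves many more compartments than this, but the basic idea of simulating a plasma-concentration profile from absorption and elimination rate constants can be sketched with a one-compartment model; all parameters below are generic placeholders, not values from the case studies.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def one_compartment(t, y, ka, ke, V):
        """y = [amount remaining in gut (mg), plasma concentration (mg/L)]."""
        a_gut, c = y
        return [-ka * a_gut, ka * a_gut / V - ke * c]

    dose = 100.0                                     # mg, immediate-release (illustrative)
    sol = solve_ivp(one_compartment, (0.0, 24.0), [dose, 0.0],
                    args=(1.2, 0.15, 40.0), dense_output=True)
    t = np.linspace(0.0, 24.0, 241)
    c_plasma = sol.sol(t)[1]
    cmax, tmax = c_plasma.max(), t[c_plasma.argmax()]
    print(cmax, tmax)
    # Slowing ka (e.g. a controlled-release design) lowers Cmax and shifts Tmax later.
    ```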

  6. PID controller design for trailer suspension based on linear model

    NASA Astrophysics Data System (ADS)

    Kushairi, S.; Omar, A. R.; Schmidt, R.; Isa, A. A. Mat; Hudha, K.; Azizan, M. A.

    2015-05-01

    A quarter of an active trailer suspension system having the characteristics of a double wishbone type was modeled as a complex multi-body dynamic system in MSC.ADAMS. Due to the complexity of the model, a linearized version is considered in this paper. A model reduction technique is applied to the linear model, resulting in a reduced-order model. Based on this simplified model, a Proportional-Integral-Derivative (PID) controller was designed in the MATLAB/Simulink environment, primarily to reduce excessive roll motions and thus improve ride comfort. Simulation results show that the output signal closely imitates the input signal in multiple cases, demonstrating the effectiveness of the controller.
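
    The controller design step can be sketched with a generic reduced-order roll model and a discrete PID loop, as below; the second-order plant and the gains are illustrative stand-ins, not the ADAMS-derived linear model or the tuned values from the paper.

    ```python
    import numpy as np

    # Generic second-order roll model J*phi'' + c*phi' + k*phi = u (stand-in for the reduced model).
    J, c, k = 2.0, 1.5, 30.0
    Kp, Ki, Kd = 120.0, 40.0, 15.0
    dt, T = 0.001, 5.0

    phi = dphi = integral = prev_err = 0.0
    setpoint = 0.0                                       # regulate roll angle to zero
    history = []
    for step in range(int(T / dt)):
        disturbance = 5.0 if step * dt < 0.1 else 0.0    # short roll disturbance torque
        err = setpoint - phi
        integral += err * dt
        derivative = (err - prev_err) / dt
        u = Kp * err + Ki * integral + Kd * derivative   # PID control torque
        prev_err = err
        ddphi = (u + disturbance - c * dphi - k * phi) / J
        dphi += ddphi * dt
        phi += dphi * dt
        history.append(phi)                              # roll angle decays back toward zero
    ```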

  7. A simplified model for two phase face seal design

    NASA Technical Reports Server (NTRS)

    Lau, S. Y.; Hughes, W. F.; Basu, P.; Beatty, P. A.

    1990-01-01

    A simplified quasi-isothermal low-leakage laminar model for analyzing the stiffness and the stability characteristics of two-phase face seals with real fluids is developed. Sample calculations with this model for low-leakage operations are compared with calculations for high-leakage operations, performed using the adiabatic turbulent model of Beatty and Hughes (1987). It was found that the seal characteristics predicted using the two extreme models tend to overlap with each other, indicating that the simplified laminar model may be a useful tool for seal design. The effect of coning was investigated using the simplified model. The results show that, for the same balance, a coned seal has a higher leakage rate than a parallel face seal.

  8. Model Design for Water Wheel Control System of Heumgyeonggaknu

    NASA Astrophysics Data System (ADS)

    Kim, Sang Hyuk; Ham, Seon Young; Lee, Yong Sam

    2016-03-01

    Heumgyeonggaknu is powered by a water-hammering-type water wheel. The technique that maintains the constant speed of the water wheel is assumed to be the one used in the Cheonhyeong apparatus in Shui Yun Yi Xiang Tai, made by the Northern Song dynasty in the 11th century. We investigated the history of the development and characteristics of the Cheonhyeong apparatus, and we analyzed ways to transmit the power of Heumgyeonggaknu. In addition, we carried out a conceptual design to systematically examine the power control system. Based on the conceptual design, we built a model for a water wheel control system that could be used in experiments by drawing a 3D model and a basic design.

  9. Design, modeling and control for a stratospheric telecommunication platform

    NASA Astrophysics Data System (ADS)

    Yang, Yueneng; Wu, Jie; Zheng, Wei

    2012-11-01

    The stratospheric airship provides a unique and promising platform for broadband telecommunication relay missions, which combine the advantages of both terrestrial and satellite communication. In this paper, the conceptual design, dynamics modeling and attitude control of the stratospheric telecommunication platform are presented. First, the stratospheric telecommunication platform is introduced, including conceptual design, configuration, energy sources, propeller and payload. Second, the dynamics model of the platform is derived from the Newton-Euler formulation and the station-keeping attitude control problem is formulated. Then, an adaptive fuzzy sliding mode control scheme is proposed to develop the attitude-tracking control system, with a particular focus on parametric uncertainties and external disturbances. Finally, simulation results verify the effectiveness and robustness of the proposed control scheme. The proposed conceptual design and control scheme provide a promising approach for telecommunication relay missions using the stratospheric station-keeping airship.

  10. Conflicts Management Model in School: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The object of this study is to evaluate the reasons for conflicts occurring in school according to perceptions and views of teachers and resolution strategies used for conflicts and to build a model based on the results obtained. In the research, explanatory design including quantitative and qualitative methods has been used. The quantitative part…

  11. Universal Instructional Design as a Model for Educational Programs

    ERIC Educational Resources Information Center

    Higbee, Jeanne L.

    2007-01-01

    This article describes Universal Instructional Design as an inclusive pedagogical model for use in educational programs, whether provided by traditional educational institutions, community-based initiatives, or workplace literacy projects. For the benefit of public relations specialists and classroom educators alike, the article begins with a…

  12. Performance of Random Effects Model Estimators under Complex Sampling Designs

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  13. Designing a Programming-Based Approach for Modelling Scientific Phenomena

    ERIC Educational Resources Information Center

    Simpson, Gordon; Hoyles, Celia; Noss, Richard

    2005-01-01

    We describe an iteratively designed sequence of activities involving the modelling of one-dimensional collisions between moving objects based on programming in ToonTalk. Students aged 13-14 years in two settings (London and Cyprus) investigated a number of collision situations, classified into six classes based on the relative velocities and…

  14. Instructional Design Models and the Representation of Knowledge and Skills.

    ERIC Educational Resources Information Center

    Dijkstra, Sanne

    1991-01-01

    Discussion of learning how to solve problems focuses on instructional design models and how students construct their knowledge and learn skills. Topics discussed include conceptual knowledge; procedural knowledge and skills; knowledge representation; causal knowledge; types of knowledge and relevant skills; knowledge acquisition; prediction and…

  15. Model assessment of protective barrier designs: Part 2

    SciTech Connect

    Fayer, M.J.

    1987-11-01

    Protective barriers are being considered for use at the Hanford Site to enhance the isolation of radioactive wastes from water, plant, and animal intrusion. This study assesses the effectiveness of protective barriers for isolation of wastes from water. In this report, barrier designs are reviewed and several barrier modeling assumptions are tested. 20 refs., 16 figs., 6 tabs.

  16. P-hub protection models for survivable hub network design

    NASA Astrophysics Data System (ADS)

    Kim, Hyun

    2012-10-01

    The design of survivable networks has been a significant issue in network-based infrastructure in transportation, electric power systems, and telecommunications. In telecommunications networks, hubs and backbones are the most critical assets to be protected from any network failure because many network flows use these facilities, resulting in an intensive concentration of flows at these facilities. This paper addresses a series of new hub and spoke network models as survivable network designs, which are termed p-hub protection models (PHPRO). The PHPRO aim to build networks that maximize the total potential interacting traffic over a set of origin-destination nodes based on different routing assumptions, including multiple assignments and back-up hub routes with distance restrictions. Empirical analyses are presented using telecommunication networks in the United States, and the vulnerabilities of networks based on possible disruption scenarios are examined. The results reveal that PROBA, the model with a back-up routing scheme, considerably enhances the network resilience and even the network performance, indicating that the model is a candidate for a strong survivable hub network design. An extension, PROBA-D, also shows that applying a distance restriction can be strategically used for designing back-up hub routes if a network can trade off between network performance and network cost, which results from the reduced length of back-up routings.

  17. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  18. Computational protein design is a challenge for implicit solvation models.

    PubMed

    Jaramillo, Alfonso; Wodak, Shoshana J

    2005-01-01

    Increasingly complex schemes for representing solvent effects in an implicit fashion are being used in computational analyses of biological macromolecules. These schemes speed up the calculations by orders of magnitude and are assumed to compromise little on essential features of the solvation phenomenon. In this work we examine this assumption. Five implicit solvation models, a surface area-based empirical model, two models that approximate the generalized Born treatment and a finite difference Poisson-Boltzmann method are challenged in situations differing from those where these models were calibrated. These situations are encountered in automatic protein design procedures, whose job is to select sequences, which stabilize a given protein 3D structure, from a large number of alternatives. To this end we evaluate the energetic cost of burying amino acids in thousands of environments with different solvent exposures belonging, respectively, to decoys built with random sequences and to native protein crystal structures. In addition we perform actual sequence design calculations. Except for the crudest surface area-based procedure, all the tested models tend to favor the burial of polar amino acids in the protein interior over nonpolar ones, a behavior that leads to poor performance in protein design calculations. We show, on the other hand, that three of the examined models are nonetheless capable of discriminating between the native fold and many nonnative alternatives, a test commonly used to validate force fields. It is concluded that protein design is a particularly challenging test for implicit solvation models because it requires accurate estimates of the solvation contribution of individual residues. This contrasts with native recognition, which depends less on solvation and more on other nonbonded contributions.

  19. Transparent composite model for DCT coefficients: design and analysis.

    PubMed

    Yang, En-Hui; Yu, Xiang; Meng, Jin; Sun, Chang

    2014-03-01

    The distributions of discrete cosine transform (DCT) coefficients of images are revisited on a per-image basis. To better handle the heavy tail phenomenon commonly seen in DCT coefficients, a new model dubbed a transparent composite model (TCM) is proposed and justified for both modeling accuracy and an additional data reduction capability. Given a sequence of DCT coefficients, a TCM first separates the tail from the main body of the sequence. Then, a uniform distribution is used to model the DCT coefficients in the heavy tail, whereas a different parametric distribution is used to model data in the main body. The separation boundary and other parameters of the TCM can be estimated via maximum likelihood estimation. Efficient online algorithms are proposed for parameter estimation and their convergence is also proved. Experimental results based on Kullback-Leibler divergence and the χ² test show that for real-valued continuous AC coefficients, the TCM based on a truncated Laplacian offers the best tradeoff between modeling accuracy and complexity. For discrete or integer DCT coefficients, the discrete TCM based on truncated geometric distributions (GMTCM) models the AC coefficients more accurately than pure Laplacian models and generalized Gaussian models in the majority of cases while having simplicity and practicality similar to those of pure Laplacian models. In addition, it is demonstrated that the GMTCM also exhibits a good capability of data reduction or feature extraction: the DCT coefficients in the heavy tail identified by the GMTCM are truly outliers, and these outliers represent an outlier image revealing some unique global features of the image. Overall, the modeling performance and the data reduction feature of the GMTCM make it a desirable choice for modeling discrete or integer DCT coefficients in real-world image or video applications, as summarized in a few of our further studies on quantization design, entropy coding design, and image understanding
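
    The separation idea can be illustrated numerically: for each candidate boundary, fit a Laplacian to the main body and a uniform density to the tail, and keep the boundary with the highest composite log-likelihood. The sketch below is a simplified version that ignores the truncation correction and the discrete (geometric) variant developed in the paper; the synthetic coefficients and the boundary grid are assumptions.

    ```python
    import numpy as np

    def fit_tcm(coeffs, boundaries):
        """Pick the body/tail boundary maximizing a simplified composite likelihood:
        folded Laplacian (exponential in |x|) for the body, uniform for the tail."""
        x = np.abs(np.asarray(coeffs, dtype=float))
        x_max = x.max()
        best = (-np.inf, None, None)
        for yc in boundaries:
            body, tail = x[x < yc], x[x >= yc]
            if len(body) < 2 or len(tail) < 1 or x_max <= yc:
                continue
            b = body.mean()                                  # ML scale of the folded Laplacian
            ll = np.sum(-np.log(b) - body / b)               # body log-likelihood (truncation ignored)
            ll += len(tail) * -np.log(x_max - yc)            # uniform tail on [yc, x_max]
            ll += len(body) * np.log(len(body) / len(x))     # mixture weights
            ll += len(tail) * np.log(len(tail) / len(x))
            if ll > best[0]:
                best = (ll, yc, b)
        return best   # (log-likelihood, boundary, Laplacian scale)

    rng = np.random.default_rng(0)
    ac = np.concatenate([rng.laplace(0, 4.0, 5000), rng.uniform(-300, 300, 60)])  # synthetic AC data
    print(fit_tcm(ac, boundaries=np.linspace(10, 200, 40)))
    ```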

  20. V/STOL model fan stage rig design report

    NASA Technical Reports Server (NTRS)

    Cheatham, J. G.; Creason, T. L.

    1983-01-01

    A model single-stage fan with variable inlet guide vanes (VIGV) was designed to demonstrate efficient point operation while providing flow and pressure ratio modulation capability required for a V/STOL propulsion system. The fan stage incorporates a split-flap VIGV with an independently actuated ID flap to permit independent modulation of fan and core engine airstreams, a flow splitter integrally designed into the blade and vanes to completely segregate fan and core airstreams in order to maximize core stream supercharging for V/STOL operation, and an EGV with a variable leading edge fan flap for rig performance optimization. The stage was designed for a maximum flow size of 37.4 kg/s (82.3 lb/s) for compatibility with LeRC test facility requirements. Design values at maximum flow for blade tip velocity and stage pressure ratio are 472 m/s (1550 ft/s) and 1.68, respectively.

  1. Model-Based Design of Tree WSNs for Decentralized Detection.

    PubMed

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-01-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches. PMID:26307989

  2. Digital Modeling in Design Foundation Coursework: An Exploratory Study of the Effectiveness of Conceptual Design Software

    ERIC Educational Resources Information Center

    Guidera, Stan; MacPherson, D. Scot

    2008-01-01

    This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…

  3. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.

  4. Effect of attenuation models on communication system design

    NASA Technical Reports Server (NTRS)

    Shimabukuro, Fred I.

    1995-01-01

    The atmosphere has a significant impact on the design of a global communication system operating at 20 GHz. The system under consideration has a total atmospheric link attenuation budget that is less than 6 dB. For this relatively small link margin, rain, cloud, and molecular attenuation have to be taken into account. For an assessment of system performance on a global basis, attenuation models are utilized. There is concern whether current models can adequately describe the atmospheric effects such that a system planner can properly allocate his resources for superior overall system performance. The atmospheric attenuation as predicted by models will be examined.

  5. Design and Implementation of an Experimental Segway Model

    NASA Astrophysics Data System (ADS)

    Younis, Wael; Abdelati, Mohammed

    2009-03-01

    The segway is the first transportation product to stand, balance, and move in the same way we do. It is a truly 21st-century idea. The aim of this research is to study the theory behind building segway vehicles based on the stabilization of an inverted pendulum. An experimental model has been designed and implemented through this study. The model has been tested for its balance by running a Proportional Derivative (PD) algorithm on a microprocessor chip. The model has been identified in order to serve as an educational experimental platform for segways.
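
    The balancing principle can be sketched with a planar pendulum whose pivot (the wheel axle) is accelerated by a PD law acting on the tilt angle. The dynamics below are a simplified textbook model and the length and gains are illustrative; they are not the parameters of the experimental platform described here.

    ```python
    import numpy as np

    g, l = 9.81, 0.5                      # gravity, pivot-to-centre-of-mass distance (illustrative)
    Kp, Kd = 30.0, 6.0                    # PD gains on tilt angle (rad) and tilt rate
    dt = 0.002

    theta, omega = np.deg2rad(8.0), 0.0   # initial lean of 8 degrees
    trace = []
    for _ in range(3000):
        a_cmd = Kp * theta + Kd * omega                           # commanded pivot (wheel) acceleration
        alpha = (g * np.sin(theta) - a_cmd * np.cos(theta)) / l   # pendulum angular acceleration
        omega += alpha * dt
        theta += omega * dt
        trace.append(theta)
    # theta decays toward zero: the platform returns upright after the initial lean.
    ```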

  6. Full spatially resolved laser modeling and design using GLOSS

    NASA Astrophysics Data System (ADS)

    Hudock, Jared; Decker, Mark; Koroshetz, John

    2016-05-01

    L-3 ALST has developed a Generalized Laser and Optics Simulation Suite (GLOSS) to quickly and reliably design high-performance laser transmitters. GLOSS uses state-of-the-art wave propagation based algorithms to rigorously simulate the dynamics of laser oscillation. Laser pulse energy, pulse width, beam size, beam shape, and divergence are among the many key performance parameters GLOSS models have the capability to predict. The GLOSS modeling methodology will be discussed and examples of its powerful capability will be demonstrated. Model predictions within 10-15% of actual laser performance data from a sample of experimental lasers will also be shown.

  7. Thinking outside ISD: A management model for instructional design

    NASA Astrophysics Data System (ADS)

    Taylor, Tony Dewayne

    The purpose of this study was to examine the effectiveness of an instructional system management-level model proposed by the author and designed to orchestrate the efficient development and implementation of customer-requested curriculum. The three phases of the systems-based model, designed to ensure delivery of high-quality and timely instruction, are: (1) the assessment and documentation of organizational training requirements; (2) project management control of curriculum development; and (3) the implementation of relevant instruction by competent instructors. This model also provides (4) measurable and quantifiable course evaluation results to justify return on investment and validate its importance with respect to the customer's organizational strategic objectives. The theoretical approach for this study was systems theory-based due to the nature of the instructional systems design model and the systematic design of the management model. The study was accomplished using a single-case study application of the qualitative style of inquiry as described by Patton (2002). Qualitative inquiry was selected to collect and analyze participants' holistic assessments of the effectiveness, relevance, and timeliness of the instructional design management model. Participants for this study included five managers, five subject matter experts, and six students assigned to a military organization responsible for the collection of hydrographic data for the U.S. Navy. Triangulation of data sources within the qualitative framework of the study across the three participant groups---managers, SMEs, and students---incorporated multiple views of the course development and implementation to validate the findings and remove researcher bias. Qualitative coding was accomplished by importing transcribed interviews into Microsoft Excel and sorting them using Auto-Filter. The coded interviews indicated effective functionality in the views of the model from each of the three participant groups

  8. Integrated predictive modelling simulations of burning plasma experiment designs

    NASA Astrophysics Data System (ADS)

    Bateman, Glenn; Onjun, Thawatchai; Kritz, Arnold H.

    2003-11-01

    Models for the height of the pedestal at the edge of H-mode plasmas (Onjun T et al 2002 Phys. Plasmas 9 5018) are used together with the Multi-Mode core transport model (Bateman G et al 1998 Phys. Plasmas 5 1793) in the BALDUR integrated predictive modelling code to predict the performance of the ITER (Aymar A et al 2002 Plasma Phys. Control. Fusion 44 519), FIRE (Meade D M et al 2001 Fusion Technol. 39 336), and IGNITOR (Coppi B et al 2001 Nucl. Fusion 41 1253) fusion reactor designs. The simulation protocol used in this paper is tested by comparing predicted temperature and density profiles against experimental data from 33 H-mode discharges in the JET (Rebut P H et al 1985 Nucl. Fusion 25 1011) and DIII-D (Luxon J L et al 1985 Fusion Technol. 8 441) tokamaks. The sensitivities of the predictions are evaluated for the burning plasma experimental designs by using variations of the pedestal temperature model that are one standard deviation above and below the standard model. Simulations of the fusion reactor designs are carried out for scans in which the plasma density and auxiliary heating power are varied.

  9. Finite state aeroelastic model for use in rotor design optimization

    NASA Technical Reports Server (NTRS)

    He, Chengjian; Peters, David A.

    1993-01-01

    In this article, a rotor aeroelastic model based on a newly developed finite state dynamic wake, coupled with blade finite element analysis, is described. The analysis is intended for application in rotor blade design optimization. A coupled simultaneous system of differential equations combining blade structural dynamics and aerodynamics is established in a formulation well-suited for design sensitivity computation. Each blade is assumed to be an elastic beam undergoing flap bending, lead-lag bending, elastic twist, and axial deflections. Aerodynamic loads are computed from unsteady blade element theory where the rotor three-dimensional unsteady wake is described by a generalized dynamic wake model. Correlation of results obtained from the analysis with flight test data is provided to assess model accuracy.

  10. Design and modeling of flower like microring resonator

    NASA Astrophysics Data System (ADS)

    Razaghi, Mohammad; Laleh, Mohammad Sayfi

    2016-05-01

    This paper presents a novel multi-channel optical filter structure. The proposed design is based on using a set of microring resonators (MRRs) in a new formation, named a flower-like arrangement. It is shown that instead of using 18 MRRs, the same filtering operation can be achieved by using only 5 MRRs in the recommended formation. It is shown that with this structure, six filters and four integrated demultiplexers (DEMUXs) are obtained. The simplicity, extensibility and compactness of this structure make it usable in wavelength division multiplexing (WDM) networks. Filter characteristics such as the shape factor (SF), free spectral range (FSR) and stopband rejection ratio can be designed by adjusting the microrings' radii and coupling coefficients. To model this structure, a signal flow graph (SFG) method based on Mason's rule is used. The modeling method is discussed in depth. Furthermore, the accuracy and applicability of this method are verified through examples and comparison with other modeling schemes.
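
    The building block of such a filter is a single microring coupled to a bus waveguide; its all-pass transfer function, which the SFG/Mason analysis reproduces as the one-loop special case, can be evaluated directly. The radius, effective index, loss, and coupling values below are illustrative, not the design values of the flower-like structure.

    ```python
    import numpy as np

    def ring_all_pass(wavelength, radius=50e-6, n_eff=2.4, t=0.95, a=0.98):
        """Power transmission of a single all-pass microring.
        t is the self-coupling coefficient, a the round-trip amplitude transmission."""
        L = 2.0 * np.pi * radius                                   # ring circumference
        phi = 2.0 * np.pi * n_eff * L / wavelength                 # round-trip phase
        H = (t - a * np.exp(1j * phi)) / (1.0 - t * a * np.exp(1j * phi))
        return np.abs(H) ** 2

    lam = np.linspace(1.54e-6, 1.56e-6, 20001)
    T = ring_all_pass(lam)   # resonance dips spaced by roughly lambda^2 / (n_eff * L), the FSR
    ```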

  11. An economic model for evaluating high-speed aircraft designs

    NASA Technical Reports Server (NTRS)

    Vandervelden, Alexander J. M.

    1989-01-01

    A Class 1 method for determining whether further development of a new aircraft design is desirable from all viewpoints is presented. For the manufacturer the model gives an estimate of the total cost of research and development from the preliminary design to the first production aircraft. Using Wright's law of production, one can derive the average cost per aircraft produced for a given break-even number. The model will also provide the airline with a good estimate of the direct and indirect operating costs. From the viewpoint of the passenger, the model proposes a tradeoff between ticket price and cruise speed. Finally all of these viewpoints are combined in a Comparative Aircraft Seat-kilometer Economic Index.
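
    The Wright's-law step can be sketched numerically. The first-unit cost, learning rate, and break-even quantity below are placeholder assumptions, not values from the model.

    ```python
    import numpy as np

    # Wright's law: unit cost falls by a fixed fraction each time
    # cumulative production doubles. All numbers are illustrative.
    first_unit_cost = 250e6            # cost of aircraft #1 [$]
    learning = 0.85                    # 85% learning curve
    break_even = 400                   # break-even number of aircraft

    b = np.log(learning) / np.log(2.0)          # learning exponent
    units = np.arange(1, break_even + 1)
    unit_costs = first_unit_cost * units ** b   # cost of the n-th aircraft

    print(f"average cost per aircraft: ${unit_costs.mean() / 1e6:.1f}M")
    ```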

  12. Underwater Stirling engine design with modified one-dimensional model

    NASA Astrophysics Data System (ADS)

    Li, Daijin; Qin, Kan; Luo, Kai

    2015-05-01

    Stirling engines are regarded as an efficient and promising power system for underwater devices. One-dimensional models are widely used to evaluate the thermodynamic performance of Stirling engines, but some aspects, such as mechanical loss and auxiliary power, still lack proper mathematical models. In this paper, a four-cylinder double-acting Stirling engine for Unmanned Underwater Vehicles (UUVs) is discussed, and a one-dimensional model incorporating empirical equations for mechanical loss and auxiliary power obtained from experiments is derived with reference to the Stirling engine computer model of the National Aeronautics and Space Administration (NASA). The P-40 Stirling engine, for which NASA published extensive test results, is used to validate the accuracy of this one-dimensional model: the maximum error in predicted output power is less than 18% relative to the test data, and the maximum error in input power is no more than 9%. Finally, a Stirling engine for UUVs is designed with the Schmidt analysis method and the modified one-dimensional model, and the results indicate that the designed engine delivers the desired output power.

  14. Bayesian experimental design for identification of model propositions and conceptual model uncertainty reduction

    NASA Astrophysics Data System (ADS)

    Pham, Hai V.; Tsai, Frank T.-C.

    2015-09-01

    The lack of hydrogeological data and knowledge often results in different propositions (or alternatives) to represent uncertain model components and creates many candidate groundwater models using the same data. Uncertainty of groundwater head prediction may become unnecessarily high. This study introduces an experimental design to identify propositions in each uncertain model component and decrease the prediction uncertainty by reducing conceptual model uncertainty. A discrimination criterion is developed based on posterior model probability that directly uses data to evaluate model importance. Bayesian model averaging (BMA) is used to predict future observation data. The experimental design aims to find the optimal number and location of future observations and the number of sampling rounds such that the desired discrimination criterion is met. Hierarchical Bayesian model averaging (HBMA) is adopted to assess whether highly probable propositions can be identified and the conceptual model uncertainty can be reduced by the experimental design. The experimental design is applied to a groundwater study in the Baton Rouge area, Louisiana. We design a new groundwater head observation network based on existing USGS observation wells. The sources of uncertainty that create multiple groundwater models are geological architecture, boundary condition, and fault permeability architecture. All possible design solutions are enumerated using a multi-core supercomputer. Several design solutions are found to achieve an 80%-identifiable groundwater model in 5 years by using six or more existing USGS wells. The HBMA result shows that each highly probable proposition can be identified for each uncertain model component once the discrimination criterion is achieved. The variances of groundwater head predictions are significantly decreased by reducing posterior model probabilities of unimportant propositions.
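
    A toy numerical sketch of the discrimination criterion and the BMA prediction is given below; the model likelihoods, priors, and head predictions are invented values, not results from the Baton Rouge study.

    ```python
    import numpy as np

    # Posterior model probabilities from (assumed) model likelihoods and
    # equal priors, followed by the 80% discrimination check and a
    # probability-weighted BMA prediction.
    log_likelihoods = np.array([-12.1, -14.8, -16.3, -17.0])  # assumed
    prior = np.full(4, 0.25)

    w = np.exp(log_likelihoods - log_likelihoods.max()) * prior
    posterior = w / w.sum()                                   # P(M_k | data)

    criterion = 0.80
    print("posterior model probabilities:", np.round(posterior, 3))
    print("design criterion met:", bool(posterior.max() >= criterion))

    model_predictions = np.array([3.2, 3.6, 2.9, 4.1])        # heads [m], assumed
    print(f"BMA predicted head: {posterior @ model_predictions:.2f} m")
    ```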

  15. A Simplified Model of ARIS for Optimal Controller Design

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey S.; Hampton, R. David; Kross, Denny (Technical Monitor)

    2001-01-01

    Many space-science experiments require active vibration isolation. Boeing's Active Rack Isolation System (ARIS) isolates experiments at the rack (vs. experiment or sub-experiment) level, with multiple experiments per rack. An ARIS-isolated rack typically employs eight actuators and thirteen umbilicals; the umbilicals provide services such as power, data transmission, and cooling. Hampton et al. used "Kane's method" to develop an analytical, nonlinear, rigid-body model of ARIS that includes full actuator dynamics (inertias). This model, less the umbilicals, was first implemented for simulation by Beech and Hampton; they developed and tested their model using two commercial-off-the-shelf (COTS) software packages. Rupert et al. added umbilical-transmitted disturbances to this nonlinear model. Because the nonlinear model, even for the untethered system, is both exceedingly complex and "encapsulated" inside these COTS tools, it is largely inaccessible to ARIS controller designers. This paper shows that ISPR rattle-space constraints and small ARIS actuator masses permit considerable model simplification, without significant loss of fidelity. First, for various loading conditions, comparisons are made between the dynamic responses of the nonlinear model (untethered) and a truth model. Then comparisons are made among nonlinear, linearized, and linearized reduced-mass models. It is concluded that these three models all capture the significant system rigid-body dynamics, with the third being preferred due to its relative simplicity.

  16. Dedicated spectrometers based on diffractive optics: design, modelling and evaluation

    NASA Astrophysics Data System (ADS)

    Løvhaugen, O.; Johansen, I.-R.; Bakke, K. A. H.; Fismen, B. G.; Nicolas, S.

    The described design of diffractive optical elements for low-cost IR spectrometers gives a built-in wavelength reference and allows 'spectral arithmetic' to be implemented in the optical performance of the DOE. The diffractive element combines the function of the lenses and the grating and eliminates the need for alignment of those components in the standard scanned grating spectrometer design. The element gives out a set of foci, each with one spectral component, which are scanned across a detector, thus relaxing the demands for scan angle control. It can thus be regarded as an alternative solution to a beam splitter and band pass filter instrument. Software tools have been designed to ease the adaptation of the design to different applications. To model the performance of the spectrometers, we have implemented a scalar Rayleigh-Sommerfeld diffraction model. The gold-coated elements are produced by injection moulding using a compact disc (CD) moulding technique and mould inlays mastered by e-beam lithography. The optimized selection of wavelength bands and the classification of the measured signal use a combination of principal component analysis and robust statistical methods. Typical applications will be material characterization of recycled plastics and gas monitoring. Spectrometers for two different applications have been built and tested. Comparisons between the design goals and the measured performance have been made and show good agreement.

  17. NREL Wind Integrated System Design and Engineering Model

    2013-09-30

    NREL_WISDEM is an integrated model for wind turbines and plants developed in Python on top of the open-source software OpenMDAO. NREL_WISDEM is a set of wrappers that integrate pre-existing wind turbine and plant models into OpenMDAO. It is organized into groups, each with its own repository, including Plant_CostSE, Plant_EnergySE, Turbine_CostSE, and TurbineSE. The wrappers are designed for both licensed and non-licensed models, though in both cases one must obtain and install the individual models before using them in the overall software platform.
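
    The wrapper idea can be illustrated with a small component written against the current OpenMDAO API (WISDEM in 2013 targeted an older interface); the component name, inputs, and cost relation below are invented placeholders rather than actual WISDEM code.

    ```python
    import openmdao.api as om

    class ToyTurbineCost(om.ExplicitComponent):
        """Illustrative wrapper exposing a pre-existing cost relation to OpenMDAO."""

        def setup(self):
            self.add_input('rotor_diameter', val=100.0, units='m')
            self.add_input('machine_rating', val=2000.0, units='kW')
            self.add_output('turbine_cost', val=0.0)

        def compute(self, inputs, outputs):
            # stand-in for a call into an external (possibly licensed) model
            outputs['turbine_cost'] = (600.0 * inputs['machine_rating']
                                       + 2.0e3 * inputs['rotor_diameter'])

    prob = om.Problem()
    prob.model.add_subsystem('cost', ToyTurbineCost(), promotes=['*'])
    prob.setup()
    prob.set_val('rotor_diameter', 120.0)
    prob.run_model()
    print(prob.get_val('turbine_cost'))
    ```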

  18. Digital flight control design using implicit model following.

    NASA Technical Reports Server (NTRS)

    Stengel, R. F.

    1973-01-01

    A design procedure for determining the control gains of a discrete-time ('digital') control system is presented. The method is separable into four distinct steps: (1) the definition of closed-loop response criteria, (2) the choice of a discrete-time model which provides the desired response, (3) the determination of control gains which implicitly force the actual system to follow the desired response, and (4) the reduction of the measurement state by the introduction of an 'observer' (a form of integral-differential compensation). It is shown that a single desired response does not completely define the 'ideal' system. The response criterion generally leaves some parameters of the model unspecified, allowing two courses for improving the model: (1) definition of additional response criteria, or (2) redefinition of the discrete-time model for improved implicit model-following with the actual closed-loop system.
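
    A minimal numerical sketch of step (3), under the simplifying assumption that the gains are chosen by least squares so that the closed-loop transition matrix approximates the desired discrete-time model; the plant and model matrices are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    # Discrete-time plant: x[k+1] = Phi @ x[k] + Gamma @ u[k]   (assumed values)
    Phi = np.array([[1.00, 0.10],
                    [0.00, 0.95]])
    Gamma = np.array([[0.005],
                      [0.100]])

    # Desired ("ideal") closed-loop model: x[k+1] = Phi_m @ x[k]  (assumed values)
    Phi_m = np.array([[0.98, 0.09],
                      [-0.04, 0.85]])

    # Implicit model following, least-squares form: choose K so that
    # Phi - Gamma @ K is as close as possible to Phi_m.
    K = np.linalg.pinv(Gamma) @ (Phi - Phi_m)

    print("feedback gains K:", K)
    print("closed-loop matrix:\n", Phi - Gamma @ K)
    ```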

  19. A Design for Composing and Extending Vehicle Models

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.; Neuhaus, Jason R.

    2003-01-01

    The Systems Development Branch (SDB) at NASA Langley Research Center (LaRC) creates simulation software products for research. Each product consists of an aircraft model with experiment extensions. SDB treats its aircraft models as reusable components, upon which experiments can be built. SDB has evolved aircraft model design with the following goals: 1. Avoid polluting the aircraft model with experiment code. 2. Discourage the copy and tailor method of reuse. The current evolution of that architecture accomplishes these goals by reducing experiment creation to extend and compose. The architecture mechanizes the operational concerns of the model's subsystems and encapsulates them in an interface inherited by all subsystems. Generic operational code exercises the subsystems through the shared interface. An experiment is thus defined by the collection of subsystems that it creates ("compose"). Teams can modify the aircraft subsystems for the experiment using inheritance and polymorphism to create variants ("extend").
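
    A schematic Python rendering of the "extend and compose" idea is sketched below; the original SDB code is not public, and the class and method names here are invented.

    ```python
    from abc import ABC, abstractmethod

    class Subsystem(ABC):
        """Shared operational interface inherited by every aircraft subsystem."""
        @abstractmethod
        def update(self, t: float, dt: float) -> None: ...

    class BaselineEngine(Subsystem):
        def update(self, t, dt):
            pass  # reusable aircraft-model dynamics live here

    class ExperimentEngine(BaselineEngine):
        """'Extend': an experiment-specific variant created via inheritance."""
        def update(self, t, dt):
            super().update(t, dt)
            # experiment-only behaviour goes here, not in the aircraft model

    class Simulation:
        """Generic operational code exercising subsystems through the interface."""
        def __init__(self, subsystems):
            # 'Compose': the experiment is defined by what it instantiates.
            self.subsystems = list(subsystems)

        def step(self, t, dt):
            for s in self.subsystems:
                s.update(t, dt)

    sim = Simulation([ExperimentEngine()])
    sim.step(0.0, 0.01)
    ```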

  20. Design, analysis, and modeling of giant magnetostrictive transducers

    NASA Astrophysics Data System (ADS)

    Calkins, Frederick Theodore

    The increased use of giant magnetostrictive, Terfenol-D transducers in a wide variety of applications has led to a need for greater understanding of the material's performance. This dissertation attempts to add to the Terfenol-D transducer body of knowledge by providing an in-depth analysis and modeling of an experimental transducer. A description of the magnetostriction process related to Terfenol-D includes a discussion of material properties, production methods, and the effect of mechanical stress, magnetization, and temperature on the material performance. The understanding of the Terfenol-D material performance provides the basis for an analysis of the performance of a Terfenol-D transducer. Issues related to the design and utilization of the Terfenol-D material in the transducers are considered, including the magnetic circuit, application of mechanical prestress, and tuning of the mechanical resonance. Experimental results from two broadband, Tonpilz design transducers show the effects of operating conditions (prestress, magnetic bias, AC magnetization amplitude, and frequency) on performance. In an effort to understand and utilize the rich performance space described by the experimental results, a variety of models are considered. An overview of models applicable to Terfenol-D and Terfenol-D transducers is provided, including a discussion of modeling criteria. The Jiles-Atherton model of ferromagnetic hysteresis is employed to describe the quasi-static transducer performance. This model requires the estimation of only six physically-based parameters to accurately simulate performance. The model is shown to be robust with respect to model parameters over a range of mechanical prestress, magnetic biases, and AC magnetic field amplitudes, allowing predictive capability within these ranges. An additional model, based on electroacoustics theory, explains trends in the frequency domain and facilitates an analysis of efficiency based on impedance and admittance

  1. Coherent underwater acoustic communication: Channel modeling and receiver design

    NASA Astrophysics Data System (ADS)

    Song, Aijun; Badiey, Mohsen

    2012-11-01

    High frequency acoustic communication (8-50 kHz) has attracted much attention recently due to increasing scientific and commercial activities in the marine environment. Based on multiple at-sea experiments, we have developed a number of communication techniques to address the receiver design challenges, resulting in significant data rate increases in the dynamic ocean. Our studies also show that it is possible to simulate the acoustic communication channel for its intensity profile, temporal coherence, and Doppler spread, leading to realistic representations of the at-sea measurements. We present our recent results in these two aspects, receiver design and channel modeling, for the mentioned frequency band.

  2. Data warehouse model design technology analysis and research

    NASA Astrophysics Data System (ADS)

    Jiang, Wenhua; Li, Qingshui

    2011-12-01

    Existing data storage formats cannot meet the needs of information analysis, which has brought the data warehouse onto the historical stage: a data warehouse is a specially designed data collection created to support business decision making. With a data warehouse, a company stores all of the information it collects in one place, organized so that the information is easy to access and retains its value. This paper focuses on the establishment and analysis of data warehouses, examines two data warehouse design models, and compares them.

  4. Modeling and simulation role in designing a Teleradiology system.

    PubMed

    Alaoui, Adil; Vasilescu, Eugen; Lindisch, David; Subbiah, Nishant; Mun, Seong K

    2003-01-01

    In designing complex systems, engineers, developers, and systems architects always have to make quantitative assumptions in order to satisfy anticipated loads and expectations for the final product. Before any complex system is designed, many questions arise concerning system performance, infrastructure and component configuration, behavior prediction, and bottleneck fixes. All of these questions can be answered using modeling and simulation tools that allow engineers to predict system behavior in different settings and to optimize systems in production by identifying bottlenecks and flaws in the infrastructure or workflow.

  5. Mooring Design for the Floating Oscillating Water Column Reference Model

    SciTech Connect

    Brefort, Dorian; Bull, Diana L.

    2014-09-01

    To reduce the price of the reference Backward Bent Duct Buoy (BBDB), a study was done analyzing the effects of reducing the mooring line length, and a new mooring design was developed. It was found that the overall length of the mooring lines could be reduced by 1290 meters, allowing a significant price reduction of the system. In this paper, we will first give a description of the model and the storm environment it will be subject to. We will then give a recommendation for the new mooring system, followed by a discussion of the severe weather simulation results, and an analysis of the conservative and aggressive aspects of the design.

  6. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. The IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace in hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycle and costs. This goal has three elements: reactor scale model, feature level model, and database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk would aim at the description of various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to reactor model and computational surface science. This coupling poses challenging issues of orders of magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large scale computational chemistry efforts are critical as experiments alone cannot meet database needs due to the difficulties associated with such

  7. Alloy Design Workbench-Surface Modeling Package Developed

    NASA Technical Reports Server (NTRS)

    Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.

    2003-01-01

    NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform-independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
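
    The Monte Carlo mode can be illustrated with a generic Metropolis acceptance loop; the one-dimensional lattice, pair-energy table, and temperature below are placeholders and have nothing to do with the BFS energetics used by the actual package.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy binary alloy on a 1-D ring with assumed pair energies [eV].
    n_sites = 50
    pair_energy = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): -0.4, (1, 0): -0.4}
    kT = 0.05                                   # temperature [eV], assumed
    config = rng.integers(0, 2, n_sites)

    def total_energy(c):
        return sum(pair_energy[(int(c[i]), int(c[(i + 1) % n_sites]))]
                   for i in range(n_sites))

    E = total_energy(config)
    for _ in range(20000):
        i, j = rng.integers(0, n_sites, 2)      # propose swapping two sites
        trial = config.copy()
        trial[i], trial[j] = trial[j], trial[i]
        dE = total_energy(trial) - E
        if dE <= 0 or rng.random() < np.exp(-dE / kT):   # Metropolis criterion
            config, E = trial, E + dE

    print(f"lowest-energy configuration found: E = {E:.2f} eV")
    ```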

  8. An enhanced BSIM modeling framework for self-heating aware circuit design

    NASA Astrophysics Data System (ADS)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.

  9. A review of design and modeling of magnetorheological valve

    NASA Astrophysics Data System (ADS)

    Abd Fatah, Abdul Yasser; Mazlan, Saiful Amri; Koga, Tsuyoshi; Zamzuri, Hairi; Zeinali, Mohammadjavad; Imaduddin, Fitrian

    2015-01-01

    Following the recent rapid development of research into magnetorheological (MR) fluid, a smart material whose apparent viscosity can be changed instantaneously under magnetic control, many applications have been established to exploit its benefits and advantages. One of the most important MR fluid devices is the MR valve, which uses the flow (valve) mode, the most popular of the fluid's available working modes. As such, the MR valve is widely applied in hydraulic actuation and vibration reduction devices, among them dampers, actuators, and shock absorbers. This paper presents a review of the MR valve and discusses several design configurations and the associated mathematical modeling. The review classifies MR valves based on the coil configuration and the geometrical arrangement of the valve, and focuses on four mathematical models for calculating yield stress and pressure drop in the MR valve: the Bingham plastic, Herschel-Bulkley, bi-viscous, and Herschel-Bulkley with pre-yield viscosity (HBPV) models. Design challenges and opportunities for the application of MR fluid and MR valves are also highlighted. This review aims to provide basic knowledge on the design and modeling of MR valves, complementing other reviews on MR fluid, its applications, and technologies.
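
    As a numerical illustration of the Bingham-plastic case, the sketch below evaluates the widely quoted annular-duct approximation for the pressure drop across an MR valve (a viscous term plus a field-dependent yield term); the geometry, fluid properties, and correction factor are assumed values.

    ```python
    # Bingham-plastic approximation of the MR valve pressure drop:
    # dp = 12*eta*Q*L/(g^3*w) + c*tau_y*L/g. All values are illustrative.
    eta = 0.1        # off-state (plastic) viscosity [Pa.s]
    tau_y = 30e3     # field-induced yield stress [Pa]
    L = 20e-3        # active valve length [m]
    g = 1.0e-3       # annular gap [m]
    w = 60e-3        # mean circumference of the gap [m]
    Q = 2.0e-5       # volumetric flow rate [m^3/s]
    c = 2.5          # flow-profile correction factor, typically 2 to 3

    dp_viscous = 12 * eta * Q * L / (g ** 3 * w)
    dp_yield = c * tau_y * L / g

    print(f"viscous pressure drop: {dp_viscous / 1e3:.1f} kPa")
    print(f"yield pressure drop:   {dp_yield / 1e3:.1f} kPa")
    print(f"total pressure drop:   {(dp_viscous + dp_yield) / 1e3:.1f} kPa")
    ```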

  10. A multiobjective two-layer transportation network design model

    SciTech Connect

    Figlali, A.; Koc, T.

    1994-12-31

    In this paper, a two-layer transportation network design model with three objectives is presented. The objectives are: (1) to minimize the primary path length (or travelling time) from a predetermined starting node to a predetermined terminus node; (2) to minimize the total distance traversed by the demand to reach the primary path; (3) to minimize the total road construction cost. The problem is formulated as an integer linear programming model and solved as a relaxed continuous linear programming model. If fractional values for the variables occur, a branch-and-bound algorithm may be employed to obtain 0-1 variable values. Three different solution procedures are suggested, depending on the preference position of the decision maker. Such problems are encountered in the construction of a new rail line between two major cities of a developing country, in the construction of networks of highways and unimproved roads, in the design of airline routes, and in the design of energy distribution systems.

  11. A model for designing functionally gradient material joints

    SciTech Connect

    Jou, M.; Messler, R.W.; Orling, T.T.

    1994-12-31

    Joining of dissimilar materials into hybrid structures to meet severe design and service requirements is becoming more necessary and common. Joints between heat-resisting or refractory metals and refractory or corrosion resistant ceramics and intermetallics are especially in demand. Before resorting to a more complicated but versatile finite element analysis (FEA) model, a simpler, more user-friendly analytical layer-model based on a thin plate assumption was developed and tested. The model has been successfully used to design simple FGM joints between Ni-base superalloys or Mo and SiC, Ni{sub 3}Al or Al{sub 2}O{sub 3} using self-propagating high-temperature or pressurized composition synthesis for joining. Cases are presented to demonstrate capability for: (1) varying processing temperature excursions or service gradients; (2) varying overall joint thickness for a fixed number of uniform composition steps; (3) varying the number of uniform steps for a particular overall joint thickness; (4) varying the thickness and/or composition of individual steps for a constant overall thickness; and (5) altering the constitutive law for mixed-material composition steps. The model provides a useful joint design tool for process R&D.

  12. Type-2 fuzzy model based controller design for neutralization processes.

    PubMed

    Kumbasar, Tufan; Eksin, Ibrahim; Guzelkaya, Mujde; Yesil, Engin

    2012-03-01

    In this study, an inverse controller based on a type-2 fuzzy model control design strategy is introduced and this main controller is embedded within an internal model control structure. Then, the overall proposed control structure is implemented in a pH neutralization experimental setup. The inverse fuzzy control signal generation is handled as an optimization problem and solved at each sampling time in an online manner. Although, inverse fuzzy model controllers may produce perfect control in perfect model match case and/or non-existence of disturbances, this open loop control would not be sufficient in the case of modeling mismatches or disturbances. Therefore, an internal model control structure is proposed to compensate these errors in order to overcome this deficiency where the basic controller is an inverse type-2 fuzzy model. This feature improves the closed-loop performance to disturbance rejection as shown through the real-time control of the pH neutralization process. Experimental results demonstrate the superiority of the inverse type-2 fuzzy model controller structure compared to the inverse type-1 fuzzy model controller and conventional control structures. PMID:22036014

  13. Designing Experiments to Discriminate Families of Logic Models

    PubMed Central

    Videla, Santiago; Konokotina, Irina; Alexopoulos, Leonidas G.; Saez-Rodriguez, Julio; Schaub, Torsten; Siegel, Anne; Guziolowski, Carito

    2015-01-01

    Logic models of signaling pathways are a promising way of building effective in silico functional models of a cell, in particular of signaling pathways. The automated learning of Boolean logic models describing signaling pathways can be achieved by training to phosphoproteomics data, which is particularly useful if it is measured upon different combinations of perturbations in a high-throughput fashion. However, in practice, the number and type of allowed perturbations are not exhaustive. Moreover, experimental data are unavoidably subjected to noise. As a result, the learning process results in a family of feasible logical networks rather than in a single model. This family is composed of logic models implementing different internal wirings for the system and therefore the predictions of experiments from this family may present a significant level of variability, and hence uncertainty. In this paper, we introduce a method based on Answer Set Programming to propose an optimal experimental design that aims to narrow down the variability (in terms of input–output behaviors) within families of logical models learned from experimental data. We study how the fitness with respect to the data can be improved after an optimal selection of signaling perturbations and how we learn optimal logic models with minimal number of experiments. The methods are applied on signaling pathways in human liver cells and phosphoproteomics experimental data. Using 25% of the experiments, we obtained logical models with fitness scores (mean square error) 15% close to the ones obtained using all experiments, illustrating the impact that our approach can have on the design of experiments for efficient model calibration. PMID:26389116

  14. Designing Experiments to Discriminate Families of Logic Models.

    PubMed

    Videla, Santiago; Konokotina, Irina; Alexopoulos, Leonidas G; Saez-Rodriguez, Julio; Schaub, Torsten; Siegel, Anne; Guziolowski, Carito

    2015-01-01

    Logic models of signaling pathways are a promising way of building effective in silico functional models of a cell, in particular of signaling pathways. The automated learning of Boolean logic models describing signaling pathways can be achieved by training to phosphoproteomics data, which is particularly useful if it is measured upon different combinations of perturbations in a high-throughput fashion. However, in practice, the number and type of allowed perturbations are not exhaustive. Moreover, experimental data are unavoidably subjected to noise. As a result, the learning process results in a family of feasible logical networks rather than in a single model. This family is composed of logic models implementing different internal wirings for the system and therefore the predictions of experiments from this family may present a significant level of variability, and hence uncertainty. In this paper, we introduce a method based on Answer Set Programming to propose an optimal experimental design that aims to narrow down the variability (in terms of input-output behaviors) within families of logical models learned from experimental data. We study how the fitness with respect to the data can be improved after an optimal selection of signaling perturbations and how we learn optimal logic models with minimal number of experiments. The methods are applied on signaling pathways in human liver cells and phosphoproteomics experimental data. Using 25% of the experiments, we obtained logical models with fitness scores (mean square error) 15% close to the ones obtained using all experiments, illustrating the impact that our approach can have on the design of experiments for efficient model calibration.

  15. Computational design of patterned interfaces using reduced order models

    PubMed Central

    Vattré, A. J.; Abdolrahim, N.; Kolluri, K.; Demkowicz, M. J.

    2014-01-01

    Patterning is a familiar approach for imparting novel functionalities to free surfaces. We extend the patterning paradigm to interfaces between crystalline solids. Many interfaces have non-uniform internal structures comprised of misfit dislocations, which in turn govern interface properties. We develop and validate a computational strategy for designing interfaces with controlled misfit dislocation patterns by tailoring interface crystallography and composition. Our approach relies on a novel method for predicting the internal structure of interfaces: rather than obtaining it from resource-intensive atomistic simulations, we compute it using an efficient reduced order model based on anisotropic elasticity theory. Moreover, our strategy incorporates interface synthesis as a constraint on the design process. As an illustration, we apply our approach to the design of interfaces with rapid, 1-D point defect diffusion. Patterned interfaces may be integrated into the microstructure of composite materials, markedly improving performance. PMID:25169868

  16. A Design of Product Collaborative Online Configuration Model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoguo; Zheng, Jin; Zeng, Qian

    Addressing the actual needs of mass customization, product personalization, and collaborative design, this paper analyzes the working mechanism of module-based product configuration technology and puts forward an information model of a modular product family. Combining case-based reasoning (CBR) with constraint satisfaction problem (CSP) solving techniques, we design and study an algorithm for product configuration and analyze its time complexity. Taking a car chassis as the application object, we provide a prototype system for online configuration. With this system, designers can make appropriate changes to existing configurations according to demand, which accelerates all aspects of product development and shortens the product cycle. The system also provides strong technical support for enterprises to improve their market competitiveness.

  17. MEMS 3-DoF gyroscope design, modeling and simulation through equivalent circuit lumped parameter model

    SciTech Connect

    Mian, Muhammad Umer; Khir, M. H. Md.; Tang, T. B.; Dennis, John Ojur; Riaz, Kashif; Iqbal, Abid; Bazaz, Shafaat A.

    2015-07-22

    Pre-fabrication behavioural and performance analysis with computer-aided design (CAD) tools is a common and cost-effective practice. In light of this, we present a simulation methodology for a dual-mass-oscillator-based 3 Degree of Freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components that are counterparts of the respective mechanical components used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling, and the simulations are presented in this paper. The behaviour of the equivalent lumped models derived for the proposed device design is simulated in MEMSPRO T-SPICE software. Simulations are carried out with design specifications following the design rules of the MetalMUMPS fabrication process. The drive mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz, respectively, which are close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be time efficient for the analysis of complex MEMS devices such as 3-DoF gyroscopes, and offers an alternative to the complex and time-consuming coupled-field Finite Element Analysis (FEA) used previously.

  18. MEMS 3-DoF gyroscope design, modeling and simulation through equivalent circuit lumped parameter model

    NASA Astrophysics Data System (ADS)

    Mian, Muhammad Umer; Dennis, John Ojur; Khir, M. H. Md.; Riaz, Kashif; Iqbal, Abid; Bazaz, Shafaat A.; Tang, T. B.

    2015-07-01

    Pre-fabrication behavioural and performance analysis with computer-aided design (CAD) tools is a common and cost-effective practice. In light of this, we present a simulation methodology for a dual-mass-oscillator-based 3 Degree of Freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components that are counterparts of the respective mechanical components used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling, and the simulations are presented in this paper. The behaviour of the equivalent lumped models derived for the proposed device design is simulated in MEMSPRO T-SPICE software. Simulations are carried out with design specifications following the design rules of the MetalMUMPS fabrication process. The drive mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz, respectively, which are close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be time efficient for the analysis of complex MEMS devices such as 3-DoF gyroscopes, and offers an alternative to the complex and time-consuming coupled-field Finite Element Analysis (FEA) used previously.
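
    The quoted drive-mass frequencies follow from the usual lumped spring-mass relation f = sqrt(k/m)/(2*pi); the sketch below just evaluates it for stiffness and mass values chosen for illustration, not the actual MetalMUMPS dimensions.

    ```python
    import numpy as np

    # Lumped-parameter resonance of MEMS proof masses.
    # Stiffness [N/m] and mass [kg] values are illustrative assumptions.
    lumped = {"drive mass 1": (4.0, 4.0e-8),
              "drive mass 2": (4.2, 2.5e-8)}

    for name, (k, m) in lumped.items():
        f = np.sqrt(k / m) / (2 * np.pi)
        print(f"{name}: f = {f / 1e3:.2f} kHz")
    ```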

  19. Dynamic model for controller design of condensate throttling systems.

    PubMed

    Hu, Yong; Zeng, De-Liang; Liu, Ji-Zhen; Zhao, Zheng; Li, Ya-zhe

    2015-09-01

    Improving the load adjustment rate of coal-fired power plants in China is very important because of grid load fluctuations and the construction of new large-scale power plants connected to the country's power grid. In this paper, a new application of condensate throttling system for rapid load adjustment is proposed on the basis of the characteristics of turbine-stored energy. To ensure effective and safe operation of the condensate throttling system, a non-linear control model is derived through reasonable simplifications of fundamental physical laws, and the model parameters are identified using experimental data from a 660 MW supercritical coal-fired power plant. The model outputs are compared with actual measured data for different unit loads. Results show that the established model's responses strongly correlate with the actual unit's responses and can be used for controller design. PMID:26206068

  1. Trilevel interaction design model for pilot part-task training

    SciTech Connect

    Roman, J.H.; Pistone, R.A.; Stoddard, M.L.

    1986-01-01

    Development of effective, scenario-driven training exercises requires both an instructional design and a delivery system that match the subject domain and needs of the students. The Training Research Team at Los Alamos National Laboratory conducts research and development of prototype training systems. One of the Team's efforts is a joint research project, supported with funding and behavioral science guidance from the Army Research Institute, to develop a prototype part-task trainer for student helicopter pilots. The Team designed a ''trilevel interaction'' model and a Level III interactive videodisc delivery system for this project. The model, founded on instructional and psychological theory, should be transferable to other domains where part-task training is appropriate.

  2. Dynamic Modeling in Solid-Oxide Fuel Cells Controller Design

    SciTech Connect

    Lu, Ning; Li, Qinghe; Sun, Xin; Khaleel, Mohammad A.

    2007-06-28

    In this paper, a dynamic model of the solid-oxide fuel cell (SOFC) power unit is developed for the purpose of designing a controller to regulate fuel flow rate, fuel temperature, air flow rate, and air temperature to maintain the SOFC stack temperature, fuel utilization rate, and voltage within operation limits. A lumped model is used to consider the thermal dynamics and the electrochemical dynamics inside an SOFC power unit. The fluid dynamics at the fuel and air inlets are considered by using the in-flow ramp-rates.

  3. Business model design for a wearable biofeedback system.

    PubMed

    Hidefjäll, Patrik; Titkova, Dina

    2015-01-01

    Wearable sensor technologies used to track daily activities have become successful in the consumer market. In order for wearable sensor technology to offer added value in the more challenging areas of stress-rehab care and occupational health, stress-related biofeedback parameters need to be monitored, and more elaborate business models are needed. To identify probable success factors for a wearable biofeedback system (Affective Health) in the two mentioned market segments in a Swedish setting, we conducted literature studies and interviews with relevant representatives. Data were collected and used first to describe the two market segments and then to define likely feasible business model designs, according to the Business Model Canvas framework. Needs of stakeholders were identified as inputs to business model design. Value propositions, a key building block of a business model, were defined for each segment. The value proposition for occupational health was defined as "A tool that can both identify employees at risk of stress-related disorders and reinforce healthy sustainable behavior" and for healthcare as: "Providing therapists with objective data about the patient's emotional state and motivating patients to better engage in the treatment process".

  4. Magnet designation: a model for home healthcare practice.

    PubMed

    Browning, Sarah Via; Clark, Rebecca Culver

    2015-01-01

    Nurses at 1 hospital-affiliated home healthcare agency (HHA) found that being a department of a Magnet-accredited hospital had a significant impact on the culture of their HHA. Important lessons were learned in conjunction with the Magnet designation journey. In this article, the authors describe the history of the Magnet recognition program, the components of the Magnet model, and how these are applicable to nursing practice within HHAs. PMID:25654343

  5. Design, modelling and simulation aspects of an ankle rehabilitation device

    NASA Astrophysics Data System (ADS)

    Racu, C. M.; Doroftei, I.

    2016-08-01

    Ankle injuries are amongst the most common injuries of the lower limb. Besides initial treatment, rehabilitation of the patients plays a crucial role for future activities and proper functionality of the foot. Traditionally, ankle injuries are rehabilitated via physiotherapy, using simple equipment like elastic bands and rollers, requiring intensive efforts of therapists and patients. Thus, the need of robotic devices emerges. In this paper, the design concept and some modelling and simulation aspects of a novel ankle rehabilitation device are presented.

  6. Monitoring Network Design for Discriminating and Reducing Models in Bayesian Model Averaging Paradigm

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Pham, H. V.

    2013-12-01

    Bayesian model averaging (BMA) is often adopted to quantify model prediction and uncertainty using multiple models generated from various sources of uncertainty. Due to the lack of data and knowledge, the number of models with non-dominant posterior model probabilities can be overwhelming. Conducting prediction and uncertainty analysis using a great deal of computationally intensive simulation models (e.g., groundwater models) can become intractable under the BMA framework. Moreover, prediction results using the BMA can be useless when prediction uncertainty is very high. This study implements a monitoring network design under the BMA framework to discriminate groundwater models and in turn reduce the number of models. The posterior model probabilities are re-evaluated by using BMA prediction as 'future observation data' and historical data. Given a design criterion of posterior model probability (e.g. 85%), the monitoring network design aims to find the optimal number and location of monitoring wells at existing wells for continuous observation. If using existing wells cannot achieve the design criterion, then exploration of new monitoring well locations is necessary. Once the design criterion is met, other models will be discriminated from the best model. Between-model variance will be significantly reduced. We use the monitoring network design to discriminate 18 complex groundwater models that include the '1,200-foot', '1,500-foot', and '1,700-foot' sands in the Baton Rouge area, southeastern Louisiana. The sources of uncertainty that create the groundwater models are hydrostratigraphic architecture, fault permeability architecture, and boundary conditions. To speed up model calibration, we develop a parallel version of CMA-ES and implement it on the SuperMike II cluster at Louisiana State University. Results show that in the model calibration period from 1975 to 2010, eleven models have posterior model probabilities ranging from 3.5% to 17.4%. The purpose of

  7. Design, modeling and simulation of MEMS-based silicon Microneedles

    NASA Astrophysics Data System (ADS)

    Amin, F.; Ahmed, S.

    2013-06-01

    The advancement of semiconductor process engineering and nano-scale fabrication technology has made it convenient to transport specific biological fluids into or out of human skin with minimal discomfort. Transdermal fluid delivery systems such as microneedle arrays are one such emerging and exciting Micro-Electro-Mechanical System (MEMS) application, which could lead to painless fluid delivery into skin with controllability and desirable yield. In this study, we revisit the problem through the modeling, design, and simulation of MEMS-based hollow out-of-plane silicon microneedle arrays for biomedical applications, particularly transdermal drug delivery. Microneedles approximately 200 μm long with a 40 μm lumen diameter are shown to be formed by isotropic and anisotropic etching techniques using the MEMS Pro design tool. These microneedles are arranged in a 2 × 4 array with a center-to-center spacing of 750 μm. Furthermore, the fluid flow characteristics through these microneedle channels are modeled with and without the contribution of gravitational forces using mathematical models derived from the Bernoulli equation. Physical process simulations have also been performed in SILVACO TCAD to optimize the design of these microneedles, aligned with standard Si-fabrication lines.
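
    A simple way to see the gravity term's contribution in a Bernoulli-type flow model is sketched below; the pressure difference, fluid density, and needle geometry are assumed values rather than those used in the paper.

    ```python
    import numpy as np

    # Bernoulli estimate of exit velocity through a microneedle lumen,
    # with and without the hydrostatic (gravity) contribution.
    rho = 1000.0          # fluid density [kg/m^3], assumed
    dP = 10e3             # applied pressure difference [Pa], assumed
    h = 200e-6            # fluid column height ~ needle length [m]
    g = 9.81              # gravitational acceleration [m/s^2]
    lumen_d = 40e-6       # lumen diameter [m]

    v_no_gravity = np.sqrt(2 * dP / rho)
    v_with_gravity = np.sqrt(2 * dP / rho + 2 * g * h)
    q = v_with_gravity * np.pi * (lumen_d / 2) ** 2   # flow per needle [m^3/s]

    print(f"exit velocity without gravity: {v_no_gravity:.3f} m/s")
    print(f"exit velocity with gravity:    {v_with_gravity:.3f} m/s")
    print(f"flow rate per needle:          {q * 1e9:.2f} uL/s")
    ```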

  8. Data Mining Approaches for Modeling Complex Electronic Circuit Design Activities

    SciTech Connect

    Kwon, Yongjin; Omitaomu, Olufemi A; Wang, Gi-Nam

    2008-01-01

    A printed circuit board (PCB) is an essential part of modern electronic circuits. It is made of a flat panel of insulating materials with patterned copper foils that act as electric pathways for various components such as ICs, diodes, capacitors, resistors, and coils. The size of PCBs has been shrinking over the years, while the number of components mounted on these boards has increased considerably. This trend makes the design and fabrication of PCBs ever more difficult. At the beginning of design cycles, it is important to accurately estimate the time to complete the required steps, based on many factors such as the required parts, approximate board size and shape, and a rough sketch of schematics. The current approach uses the multiple linear regression (MLR) technique for time and cost estimation. However, the need for accurate predictive models continues to grow as the technology becomes more advanced. In this paper, we analyze a large volume of historical PCB design data, extract some important variables, and develop predictive models based on the extracted variables using a data mining approach. The data mining approach uses an adaptive support vector regression (ASVR) technique; the benchmark model used is the MLR technique currently being used in the industry. The strengths of SVR for this data include its ability to represent data in high-dimensional space through kernel functions. The computational results show that a data mining approach is a better prediction technique for this data. Our approach reduces computation time and enhances the practical applications of the SVR technique.
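
    A minimal scikit-learn sketch of the comparison described (support vector regression versus multiple linear regression) is given below; the synthetic feature matrix is an invented stand-in for the proprietary PCB design history.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)

    # Synthetic stand-in for historical PCB records: part count, board
    # area, layer count -> design time in hours (relationship is invented).
    X = rng.uniform([50, 20, 2], [2000, 400, 12], size=(300, 3))
    y = 0.02 * X[:, 0] + 0.05 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 4, 300)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    mlr = LinearRegression().fit(X_tr, y_tr)
    svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0)).fit(X_tr, y_tr)

    print("MLR MAE:", mean_absolute_error(y_te, mlr.predict(X_te)))
    print("SVR MAE:", mean_absolute_error(y_te, svr.predict(X_te)))
    ```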

  9. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    ERIC Educational Resources Information Center

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  10. Design Models as Emergent Features: An Empirical Study in Communication and Shared Mental Models in Instructional

    ERIC Educational Resources Information Center

    Botturi, Luca

    2006-01-01

    This paper reports the results of an empirical study that investigated the instructional design process of three teams involved in the development of an e-learning unit. The teams declared they were using the same fast-prototyping design and development model, and were composed of the same roles (although with a different number of SMEs).…

  11. An Integral ASIE ID Model: The 21st Century Instructional Design Model for Teachers

    ERIC Educational Resources Information Center

    Zain, Ismail Md.; Muniandy, Balakrishnan; Hashim, Wahid

    2016-01-01

    Design of instruction is an important feature in teacher education at fulfilling the needs of 4Cs ("critical thinker," "communicator," "collaborator," "creator") developing "a globally competitive learners". As Instructional design models (ID) need to move from adopting a standard approach to…

  12. An uncertain multidisciplinary design optimization method using interval convex models

    NASA Astrophysics Data System (ADS)

    Li, Fangyi; Luo, Zhen; Sun, Guangyong; Zhang, Nong

    2013-06-01

    This article proposes an uncertain multi-objective multidisciplinary design optimization methodology, which employs the interval model to represent the uncertainties of uncertain-but-bounded parameters. The interval number programming method is applied to transform each uncertain objective function into two deterministic objective functions, and a satisfaction degree of intervals is used to convert both the uncertain inequality and equality constraints to deterministic inequality constraints. In doing so, an unconstrained deterministic optimization problem will be constructed in association with the penalty function method. The design will be finally formulated as a nested three-loop optimization, a class of highly challenging problems in the area of engineering design optimization. An advanced hierarchical optimization scheme is developed to solve the proposed optimization problem based on the multidisciplinary feasible strategy, which is a well-studied method able to reduce the dimensions of multidisciplinary design optimization problems by using the design variables as independent optimization variables. In the hierarchical optimization system, the non-dominated sorting genetic algorithm II, sequential quadratic programming method and Gauss-Seidel iterative approach are applied to the outer, middle and inner loops of the optimization problem, respectively. Typical numerical examples are used to demonstrate the effectiveness of the proposed methodology.
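
    The interval-programming step can be sketched as follows: an uncertain objective evaluated over an interval parameter is replaced by two deterministic objectives, the interval midpoint and the interval radius. The objective function and parameter bounds below are invented for illustration.

    ```python
    import numpy as np

    # Interval number programming: replace an uncertain objective f(x, p),
    # with p only known to lie in [p_lo, p_hi], by two deterministic
    # objectives, the interval midpoint and radius. Values are illustrative.
    p_lo, p_hi = 0.9, 1.3

    def f(x, p):
        return p * x[0] ** 2 + (2.0 - p) * x[1] ** 2

    def interval_objectives(x, n_samples=201):
        ps = np.linspace(p_lo, p_hi, n_samples)   # crude interval-bound scan
        vals = np.array([f(x, p) for p in ps])
        lo, hi = vals.min(), vals.max()
        return 0.5 * (hi + lo), 0.5 * (hi - lo)   # midpoint, radius

    print(interval_objectives(np.array([1.0, 2.0])))
    ```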

  13. Design and modeling of a compact imaging spectrometer

    NASA Astrophysics Data System (ADS)

    Feng, Chen; Ahmad, Anees

    1995-11-01

    A novel low-f-number, wide-field-of-view imaging spectrometer has been designed for measuring the day-glow spectrum over the wavelength range of 260 to 870 nm with spectral resolutions of 0.5 and 0.03 nm. The zero-obstruction all-reflective design is an f/2.0 imaging spectrograph using commercial gratings. The field of view along the spatial direction is 6 deg, with a spatial resolution of 0.1 mrad. The spectrometer is designed to work with a commercially available 1037 X 1340 CCD detector with 6.8 X 6.8-micrometers pixel size. The imaging spectrometer optics consists of an aspheric toroidal telescope, a slit, an aspheric toroidal collimator, a planar reflective grating, and three off-axis higher-order aspheric imaging mirrors. Significant improvements in the performance have been achieved by introducing aspheric toroidal elements in the design. The weight and size have been reduced by a factor of 20 as compared to previous similar instruments. A virtual prototype of the instrument has also been modeled by using integrated optical and mechanical design software.

  15. Predictive Model for the Design of Zwitterionic Polymer Brushes: A Statistical Design of Experiments Approach.

    PubMed

    Kumar, Ramya; Lahann, Joerg

    2016-07-01

    The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %. PMID:27268965
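
    As a rough illustration of the DOE modeling idea, the sketch below fits a quadratic response-surface model to coded factor settings by ordinary least squares; the factor names, run settings, and thickness values are hypothetical and are not the study's data.

        # Hypothetical sketch: quadratic response-surface fit to DOE runs.
        import numpy as np

        # Coded factor settings, e.g. x1 = catalyst amount, x2 = polymerization time
        X_runs = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                           [0, 0], [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], float)
        thickness = np.array([45., 120., 60., 180., 95., 100., 70., 150., 80., 130.])

        def design_matrix(X):
            x1, x2 = X[:, 0], X[:, 1]
            # intercept, main effects, interaction, and curvature terms
            return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

        beta, *_ = np.linalg.lstsq(design_matrix(X_runs), thickness, rcond=None)

        def predict(x1, x2):
            return float(design_matrix(np.array([[x1, x2]])) @ beta)

        print(predict(0.5, 0.0))   # predicted brush thickness at an intermediate setting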

  16. A biomimetic approach for designing stent-graft structures: Caterpillar cuticle as design model.

    PubMed

    Singh, Charanpreet; Wang, Xungai

    2014-02-01

    Stent-graft (SG) induced biomechanical mismatch at the aortic repair site forms the major reason behind postoperative hemodynamic complications. These complications arise from mismatched radial compliance and stiffness of the repair device relative to native aortic mechanics. The inability of an exoskeleton SG design (an externally stented rigid polyester graft) to achieve an optimum balance between structural robustness and flexibility constrains its biomechanical performance limits. Therefore, a new SG design capable of dynamically controlling its stiffness and flexibility has been proposed in this study. The new design is adopted from the segmented hydroskeleton structure of a caterpillar cuticle and comprises high-performance polymeric filaments constructed in a segmented knit architecture. Initially, conceptual design models of the caterpillar and the SG were developed and later translated into an experimental SG prototype. The in-vitro biomechanical evaluation (compliance, bending moment, migration intensity, and viscoelasticity) revealed significantly better performance of the hydroskeleton structure than a commercial SG device (Zenith(™) Flex SG) and a woven Dacron(®) graft-prosthesis. Structural segmentation improved the biomechanical behaviour of the new SG by inducing a three-dimensional volumetric expansion property when the SG was subjected to hoop stresses. Interestingly, this behaviour matches the orthotropic elastic property of the native aorta and hence suggests segmented hydroskeleton structures as a promising design approach for future aortic repair devices. PMID:24216309

  17. Software Engineering Designs for Super-Modeling Different Versions of CESM Models using DART

    NASA Astrophysics Data System (ADS)

    Kluzek, Erik; Duane, Gregory; Tribbia, Joe; Vertenstein, Mariana

    2014-05-01

    The super-modeling approach connects different models together at run time in order to provide run time feedbacks between the models and thus synchronize the models. This method reduces model bias further than after-the-fact averaging of model outputs. We explore different designs to connect different configurations and versions of an IPCC class climate model - the Community Earth System Model (CESM). We build on the Data Assimilation Research Test-bed (DART) software to provide data assimilation from truth as well as to provide a software framework to link different model configurations together. We show a system building on DART that uses a Python script to do simple nudging between three versions of the atmosphere model in CESM (the Community Atmosphere Model (CAM) versions three, four and five).
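
    The nudging idea itself can be sketched in a few lines; the snippet below relaxes each member model's state toward the mean of the other members at every coupling interval, with invented state vectors and nudging coefficient, and is not the CESM/DART coupling code.

        # Minimal sketch of inter-model nudging for a super-model ensemble. The state
        # vectors, nudging coefficient, and stepping routine are invented stand-ins;
        # the actual coupling operates on full CAM model states through DART.
        import numpy as np

        def nudge(states, coeff=0.1):
            """Relax each member's state toward the mean of the other members."""
            states = np.asarray(states, dtype=float)
            nudged = []
            for i, s in enumerate(states):
                others = np.delete(states, i, axis=0).mean(axis=0)
                nudged.append(s + coeff * (others - s))
            return np.array(nudged)

        def coupling_cycle(states, step_model, dt):
            """Advance every member independently, then exchange nudging increments."""
            advanced = [step_model(i, s, dt) for i, s in enumerate(states)]
            return nudge(advanced)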

  18. "What's the science behind it?" Models and modeling in a design for science classroom

    NASA Astrophysics Data System (ADS)

    Leonard, Mary J.

    "Design for Science" curricula have sprung up in middle- and high-school science classrooms in recent years, attracted by the promise of increased student motivation and improved learning outcomes to be gained from providing real-world, engineering problem-solving contexts for science concepts. But while engineering and science are deeply interrelated domains of practice, they have epistemological differences that may create difficulties for students (and teachers) engaged in such activities. This study investigated how an enacted "design for science" activity afforded and constrained development of science conceptual knowledge. The study focused on approximately two weeks of video from a middle-school science classroom in which students were challenged to design and build a balloon-powered model car for the purpose of learning about forces. The study was grounded in socio-cultural theory, employing activity theory and discourse analysis in an ethnographically-grounded approach. The study revealed that although it was not specifically addressed in the curriculum, the enacted activity requires students and teachers to engage in developing and communicating models of balloon car motion, and furthermore, that models and modeling have the potential for bridging designs and science concepts. The study contributes an epistemological framework for investigating such "design for science" activities, furthers our understanding of what happens in such classrooms, and offers a models and modeling construct as a promising way to create more effective engineering design contexts for science learning.

  19. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  20. Jovian Proton and Heavy Ion Models for Spacecraft Design

    NASA Astrophysics Data System (ADS)

    Garrett, H. B.; Evans, R. W.; Jun, I.; Kim, W.

    2015-12-01

    This presentation will review the results of the latest modeling at the Jet Propulsion Laboratory of the high energy proton and ion environments at Jupiter. The existing models of the proton and ion environments at Jupiter have been revised and extended from the original 12 jovian radii out to 50 jovian radii using the latest Galileo data. In addition to the physical significance of these particle populations, the new models will be important in the evaluation and design of solar arrays at Jupiter as they can affect the radiation damage to the solar array cells and cover glass. The new models represent an important update to the tools currently being used to study the effects of the jovian environment on spacecraft. These models (part of the GIRE family of electron and proton models) are currently used worldwide to describe that jovian environment and are the main tools used by NASA to determine the effects of this environment on spacecraft systems and instruments. The update to be presented is the first significant revision (extending the proton and ion models from 12 Rj to 50 Rj) to the GIRE proton environment since 1983 and fills an important gap in our understanding.

  1. SDU 6 MODELING STUDY TO SUPPORT DESIGN DEVELOPMENT

    SciTech Connect

    Smith, F.

    2012-05-02

    In response to Technical Task Request (TTR) HLW-SSF-TTR-2012-0017 (1), SRNL performed modeling studies to evaluate alternative design features for the 32 million gallon Saltstone Disposal Unit (SDU) referred to as SDU 6. This initial modeling study was intended to assess the performance of major components of the structure that are most significant to the PA. Information provided by the modeling will support the development of an SDU 6 Preliminary Design Model and Recommendation Report to be written by SRR Closure and Waste Disposal Authority. Key inputs and assumptions for the modeling were provided to SRNL in SRR-SPT-2011-00113 (2). A table reiterates the base case and four sensitivity case studies requested in this reference. In general, as shown in Table 4, when compared to Vault 2 Case A, the Base Case SDU 6 design produced higher peak fluxes to the water table during the 10,000-year period of analysis but lower peak fluxes within a 15,000- to 20,000-year time frame. SDU 6 will contain approximately ten times the inventory of a single Vault 2 and the SDU 6 footprint is comparable to that of a group of four Vault 2 disposal units. Therefore, the radionuclide flux from SDU 6 and that from a single Vault 2 are not directly comparable. A more direct comparison would be to compare the maximum dose obtained at the 100 m boundary from the seven SDUs that will replace the 64 FDCs analyzed in the 2009 PA. This analysis will be performed in the next set of calculations planned for SDU design evaluation. Aquifer transport and dose calculations were not intended to be part of this initial scoping study. However, results from this study do indicate that replacement of the FDC design with SDUs would not yield significantly higher peak doses. If the thickness of the SDU 6 floor is increased, peak doses would not occur during the 10,000-year period of analysis.

  2. Statistical mechanics of simple models of protein folding and design.

    PubMed Central

    Pande, V S; Grosberg, A Y; Tanaka, T

    1997-01-01

    It is now believed that the primary equilibrium aspects of simple models of protein folding are understood theoretically. However, current theories often resort to rather heavy mathematics to overcome some technical difficulties inherent in the problem or start from a phenomenological model. To this end, we take a new approach in this pedagogical review of the statistical mechanics of protein folding. The benefit of our approach is a drastic mathematical simplification of the theory, without resort to any new approximations or phenomenological prescriptions. Indeed, the results we obtain agree precisely with previous calculations. Because of this simplification, we are able to present here a thorough and self-contained treatment of the problem. Topics discussed include the statistical mechanics of the random energy model (REM), tests of the validity of REM as a model for heteropolymer freezing, freezing transition of random sequences, phase diagram of designed ("minimally frustrated") sequences, and the degree to which errors in the interactions employed in simulations of either folding or design can still lead to correct folding behavior. PMID:9414231
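
    The REM freezing behavior discussed in the review can be illustrated numerically with invented parameters: independent Gaussian state energies are drawn, and the participation ratio shows the Boltzmann weight concentrating onto a handful of states as the temperature is lowered.

        # Illustrative numerical experiment with the random energy model (REM), with
        # invented parameters: independent Gaussian state energies, and the effective
        # number of occupied states (participation ratio) shrinking as T is lowered.
        import numpy as np

        rng = np.random.default_rng(0)
        n_states = 100_000
        energies = rng.normal(0.0, 1.0, n_states)          # independent random energies

        for T in (2.0, 1.0, 0.5, 0.2):
            w = np.exp(-(energies - energies.min()) / T)   # shift for numerical stability
            p = w / w.sum()
            n_eff = 1.0 / np.sum(p**2)                     # participation ratio
            print(f"T = {T:3.1f}: effective number of occupied states ~ {n_eff:,.0f}")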

  3. Equivalent plate modeling for conceptual design of aircraft wing structures

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.

    1995-01-01

    This paper describes an analysis method that generates conceptual-level design data for aircraft wing structures. A key requirement is that this data must be produced in a timely manner so that it can be used effectively by multidisciplinary synthesis codes for performing systems studies. Such a capability is being developed by enhancing an equivalent plate structural analysis computer code to provide a more comprehensive, robust and user-friendly analysis tool. The paper focuses on recent enhancements to the Equivalent Laminated Plate Solution (ELAPS) analysis code that significantly expand the modeling capability and improve the accuracy of results. Modeling additions include use of out-of-plane plate segments for representing winglets and advanced wing concepts such as C-wings along with a new capability for modeling the internal rib and spar structure. The accuracy of calculated results is improved by including transverse shear effects in the formulation and by using multiple sets of assumed displacement functions in the analysis. Typical results are presented to demonstrate these new features. Example configurations include a C-wing transport aircraft, a representative fighter wing and a blended-wing-body transport. These applications are intended to demonstrate and quantify the benefits of using equivalent plate modeling of wing structures during conceptual design.

  4. Blueprint for design: creating models that direct change.

    PubMed

    Wolf, Gail A; Greenhouse, Pamela K

    2007-09-01

    The need for healthcare system change is overwhelming: broken systems, an inadequate workforce, patient safety failures, and lack of access to healthcare are but a few of the significant problems confronting healthcare today. Most people agree that things need to change, but they are not sure what to change or how to change it. A well-developed model can serve as the currency for that change. By considering the evidence of what has worked well in the past and marrying that with what the future will require, we can create a strong, sustainable model for professional practice. The authors explore that evidence and discuss the structure, process, and outcomes that should be considered in designing care delivery models for the future. PMID:17823570

  5. Model Free Gate Design and Calibration For Superconducting Qubits

    NASA Astrophysics Data System (ADS)

    Egger, Daniel; Wilhelm, Frank

    2014-03-01

    Gates for superconducting qubits are realized by time-dependent control pulses. The pulse shape for a specific gate depends on the parameters of the superconducting qubits, e.g. frequency and non-linearity. Based on one's knowledge of these parameters and using a specific model, the pulse shape is determined either analytically or numerically using optimal control [arXiv:1306.6894, arXiv:1306.2279]. However, the performance of the pulse is limited by the accuracy of the model. For a pulse with few parameters this is generally not a problem since it can be ``debugged'' manually. Here we present an automated method for calibrating multiparameter pulses. We use the Nelder-Mead simplex method to close the control loop. This scheme uses the experiment as feedback and thus does not need a model. It requires few iterations and circumvents process tomography, therefore making it a fast and versatile tool for gate design.
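
    A minimal sketch of this closed-loop calibration, assuming the experiment can be wrapped as a black-box cost function; measure_gate_error is a hypothetical stand-in for the experimental feedback (for example a benchmarking-based error estimate), and the parameter names are illustrative.

        # Sketch of model-free calibration: the experiment is treated as a black-box
        # cost and Nelder-Mead adjusts the pulse parameters. measure_gate_error is a
        # hypothetical stand-in for the experimental feedback; here it is a synthetic
        # quadratic bowl plus measurement noise.
        import numpy as np
        from scipy.optimize import minimize

        def measure_gate_error(params):
            target = np.array([0.82, 40.0, -0.3])          # "true" amplitude, duration, DRAG
            return float(np.sum((params - target) ** 2) + 1e-4 * np.random.rand())

        initial_guess = np.array([0.8, 38.0, 0.0])         # from an approximate model
        result = minimize(measure_gate_error, initial_guess, method="Nelder-Mead",
                          options={"xatol": 1e-3, "fatol": 1e-5, "maxiter": 200})
        print(result.x, result.fun)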

  6. An airfoil pitch apparatus-modeling and control design

    NASA Astrophysics Data System (ADS)

    Andrews, Daniel R.

    1989-03-01

    The study of dynamic stall of rapidly pitching airfoils is being conducted at NASA Ames Research Center. Understanding this physical phenomenon will aid in improving the maneuverability of fighter aircraft as well as civilian aircraft. A wind tunnel device which can linearly pitch and control an airfoil with rapid dynamic response is needed for such tests. To develop a mechanism capable of high accelerations, an accurate model and control system is created. The model contains mathematical representations of the mechanical system, including mass, spring, and damping characteristics for each structural element, as well as coulomb friction and servovalve saturation. Electrical components, both digital and analog, linear and nonlinear, are simulated. The implementation of such a high-performance system requires detailed control design as well as state-of-the-art components. This paper describes the system model, states the system requirements, and presents results of its theoretical performance which maximizes the structural and hydraulic aspects of this system.

  7. Experimental Verification of Structural-Acoustic Modelling and Design Optimization

    NASA Astrophysics Data System (ADS)

    MARBURG, S.; BEER, H.-J.; GIER, J.; HARDTKE, H.-J.; RENNERT, R.; PERRET, F.

    2002-05-01

    A number of papers have been published on the simulation of structural-acoustic design optimization. However, extensive work is required to verify these results in practical applications. Herein, a steel box of 1.0×1.1×1.5 m with an external beam structure welded onto three surface plates was investigated. This investigation included experimental modal analysis and experimental measurements of certain noise transfer functions (sound pressure at points inside the box due to force excitation at the beam structure). Using these experimental data, the finite element model of the structure was tuned to provide similar results. With a first structural mode at less than 20 Hz, the reliable frequency range was identified as up to about 60 Hz. Obviously, the finite element model could not be further improved only by mesh refinement. The tuning process is explained in detail, since there were a number of changes that helped to improve the structure. Other changes did not improve the structure. Although this box could be expected to be a rather simple structure, it can be considered a complex structure for simulation purposes. A defined modification of the physical model verified the simulation model. In a final step, the optimal location of stiffening beam structures was predicted by simulation. Their effect on the noise transfer function was experimentally verified. This paper critically discusses modelling techniques that are applied for structural-acoustic simulation of sedan bodies.

  8. Design theoretic analysis of three system modeling frameworks.

    SciTech Connect

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures from the context of modeling scalability to address System of System (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Flying Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  9. An Integrated Magnetic Circuit Model and Finite Element Model Approach to Magnetic Bearing Design

    NASA Technical Reports Server (NTRS)

    Provenza, Andrew J.; Kenny, Andrew; Palazzolo, Alan B.

    2003-01-01

    A code for designing magnetic bearings is described. The code generates curves from magnetic circuit equations relating important bearing performance parameters. Bearing parameters selected from the curves by a designer to meet the requirements of a particular application are input directly by the code into a three-dimensional finite element analysis preprocessor. This means that a three-dimensional computer model of the bearing being developed is immediately available for viewing. The finite element model solution can be used to show areas of magnetic saturation and make more accurate predictions of the bearing load capacity, current stiffness, position stiffness, and inductance than the magnetic circuit equations did at the start of the design process. In summary, the code combines one-dimensional and three-dimensional modeling methods for designing magnetic bearings.
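
    The kind of magnetic circuit relation such a code sweeps can be sketched as below; the turn count and pole area are invented, and iron reluctance, leakage, and saturation are ignored, so this is only a schematic of the first design step, not the code's actual equations.

        # Back-of-the-envelope magnetic circuit sweep of the kind such a design code
        # performs; turn count and pole area are invented, and iron reluctance,
        # leakage, and saturation are ignored.
        import numpy as np

        MU0 = 4e-7 * np.pi              # permeability of free space [H/m]
        n_turns = 200                   # coil turns
        pole_area = 6e-4                # pole face area [m^2]

        def pole_force(current, gap):
            """Attractive force of one horseshoe actuator (two pole faces) [N]."""
            b = MU0 * n_turns * current / (2.0 * gap)    # gap flux density [T]
            return b**2 * pole_area / MU0                # two faces: B^2 A / mu0

        for gap_mm in (0.3, 0.5, 1.0):
            print(f"gap {gap_mm} mm -> ~{pole_force(5.0, gap_mm * 1e-3):.0f} N at 5 A")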

  10. Design and mathematical modelling of a synthetic symbiotic ecosystem.

    PubMed

    Kambam, P K R; Henson, M A; Sun, L

    2008-01-01

    Artificial microbial ecosystems have been increasingly used to understand principles of ecology. These systems offer unique capabilities to mimic a variety of ecological interactions that otherwise would be difficult to study experimentally in a reasonable period of time. However, the elucidation of the genetic bases for these interactions remains a daunting challenge. To address this issue, we have designed and analysed a synthetic symbiotic ecosystem in which the genetic nature of the microbial interactions is defined explicitly. A mathematical model of the gene regulatory network in each species and their interaction through quorum sensing mediated intercellular signalling was derived to investigate the effect of system components on cooperative behaviour. Dynamic simulation and bifurcation analysis showed that the designed system admits a stable coexistence steady state for sufficiently large initial cell concentrations of the two species. The steady-state fraction of each species could be altered by varying model parameters associated with gene transcription and signalling molecule synthesis rates. The design also admitted a stable steady state corresponding to extinction of the two species for low initial cell concentrations and stable periodic solutions over certain domains of parameter space. The mathematical analysis was shown to provide insights into natural microbial ecosystems and to allow identification of molecular targets for engineering system behaviour.
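
    A toy surrogate for this sort of engineered mutualism is sketched below; it replaces the gene regulatory network with a signal-dependent growth law and invented rate constants, but it reproduces the qualitative result that coexistence is reached only above a threshold initial cell concentration.

        # Toy surrogate (not the paper's gene-network model): each species grows only
        # when the quorum signal produced by its partner is present, so the designed
        # consortium survives only above a threshold inoculum. Rates are invented.
        from scipy.integrate import solve_ivp

        def ecosystem(t, y, mu=1.0, death=0.3, k_sig=1.0, k_dec=0.5, K=0.5, cap=2.0):
            n1, n2, s1, s2 = y                      # cell densities and signal molecules
            grow1 = mu * s2 / (K + s2) * (1.0 - (n1 + n2) / cap)
            grow2 = mu * s1 / (K + s1) * (1.0 - (n1 + n2) / cap)
            return [n1 * (grow1 - death),
                    n2 * (grow2 - death),
                    k_sig * n1 - k_dec * s1,
                    k_sig * n2 - k_dec * s2]

        for n0 in (0.05, 0.5):                      # small vs. large inoculum
            sol = solve_ivp(ecosystem, (0.0, 100.0), [n0, n0, 0.0, 0.0], rtol=1e-8)
            print(f"n0 = {n0}: final densities {sol.y[0, -1]:.3f}, {sol.y[1, -1]:.3f}")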

  11. Optimal Observation Network Design for Model Discrimination using Information Theory and Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Pham, H. V.; Tsai, F. T. C.

    2014-12-01

    Groundwater systems are complex and subject to multiple interpretations and conceptualizations due to a lack of sufficient information. As a result, multiple conceptual models are often developed and their mean predictions are preferably used to avoid biased predictions from using a single conceptual model. Yet considering too many conceptual models may lead to high prediction uncertainty and may lose the purpose of model development. In order to reduce the number of models, an optimal observation network design is proposed based on maximizing the Kullback-Leibler (KL) information to discriminate competing models. The KL discrimination function derived by Box and Hill [1967] for one additional observation datum at a time is expanded to account for multiple independent spatiotemporal observations. The Bayesian model averaging (BMA) method is used to incorporate existing data and quantify future observation uncertainty arising from conceptual and parametric uncertainties in the discrimination function. To consider the future observation uncertainty, the Monte Carlo realizations of BMA predicted future observations are used to calculate the mean and variance of posterior model probabilities of the competing models. The goal of the optimal observation network design is to find the number and location of observation wells and sampling rounds such that the highest posterior model probability of a model is larger than a desired probability criterion (e.g., 95%). The optimal observation network design is implemented to a groundwater study in the Baton Rouge area, Louisiana to collect new groundwater heads from USGS wells. The considered sources of uncertainty that create multiple groundwater models are the geological architecture, the boundary condition, and the fault permeability architecture. All possible design solutions are enumerated using high performance computing systems. Results show that total model variance (the sum of within-model variance and between-model
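
    The scoring of a single candidate observation location under such a criterion can be sketched as follows; the prior model probabilities, predictive means and spreads, and the 0.95 criterion are hypothetical placeholders rather than outputs of the Baton Rouge models.

        # Simplified sketch of scoring one candidate observation well: future heads are
        # simulated under Bayesian model averaging, posterior model probabilities are
        # updated for each realization, and the design is scored by how often one model
        # exceeds the desired probability criterion. Values are hypothetical.
        import numpy as np

        rng = np.random.default_rng(1)
        prior = np.array([0.4, 0.35, 0.25])          # current posterior model probabilities
        pred_mean = np.array([12.0, 13.5, 11.0])     # predicted head at the candidate well [m]
        pred_sd = np.array([0.8, 0.8, 1.0])          # conceptual + parametric spread per model

        def posterior_given_obs(obs, sd_err=0.3):
            total_sd = np.sqrt(pred_sd**2 + sd_err**2)
            lik = np.exp(-0.5 * ((obs - pred_mean) / total_sd) ** 2) / total_sd
            post = prior * lik
            return post / post.sum()

        def design_score(n_mc=5000, criterion=0.95):
            hits = 0
            for _ in range(n_mc):
                model = rng.choice(len(prior), p=prior)               # BMA: pick a model
                obs = rng.normal(pred_mean[model], pred_sd[model])    # simulate the future datum
                if posterior_given_obs(obs).max() >= criterion:
                    hits += 1
            return hits / n_mc    # fraction of futures in which the data discriminate

        print(design_score())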

  12. Reference Model 2: "Rev 0" Rotor Design.

    SciTech Connect

    Barone, Matthew F.; Berg, Jonathan Charles; Griffith, Daniel

    2011-12-01

    The preliminary design for a three-bladed cross-flow rotor for a reference marine hydrokinetic turbine is presented. A rotor performance design code is described, along with modifications to the code to allow prediction of blade support strut drag as well as interference between two counter-rotating rotors. The rotor is designed to operate in a reference site corresponding to a riverine environment. Basic rotor performance and rigid-body loads calculations are performed to size the rotor elements and select the operating speed range. The preliminary design is verified with a simple finite element model that provides estimates of bending stresses during operation. A concept for joining the blades and support struts is developed and analyzed with a separate finite element analysis. Rotor mass, production costs, and annual energy capture are estimated in order to allow calculations of system cost-of-energy.

  13. Design, modeling, and fabrication of piezoelectric polymer actuators

    NASA Astrophysics Data System (ADS)

    Fu, Yao; Harvey, Erol C.; Ghantasala, Muralidhar K.; Spinks, Geoff

    2004-04-01

    Piezoelectric polymers are a class of materials with great potential and promise for many applications. Because of their ideally suitable characteristics, they make good candidates for actuators. However, the difficulty of forming structures and shapes has limited the range of mechanical design. In this work, the design and fabrication of a unimorph piezoelectric cantilever actuator using piezoelectric polymer PVDF with an electroplated layer of nickel alloy has been described. The modeling and simulation of the composite cantilever with planar and microstructured surfaces has been performed by CoventorWare to optimize the design parameters in order to achieve large tip deflections. These simulation results indicated that a microstructured cantilever could produce 25 percent higher deflection compared to a simple planar cantilever surface. The tip deflection of the composite cantilever with a length of 6mm and a width of 1mm can reach up to 100μm. A PVDF polymer with a specifically designed shape was punched out along the elongation direction on the embossing machine at room temperature. The nickel alloy layer was electroplated on one side of the PVDF to form a composite cantilever. The tip deflection of the cantilever was observed and measured under an optical microscope. The experimental result is in agreement with the theoretical analysis.

  14. A Model of Reading Teaching for University EFL Students: Need Analysis and Model Design

    ERIC Educational Resources Information Center

    Hamra, Arifuddin; Syatriana, Eny

    2012-01-01

    This study designed a model of teaching reading for university EFL students based on the English curriculum at the Faculty of Languages and Literature and the concept of the team-based learning in order to improve the reading comprehension of the students. What kind of teaching model can help students to improve their reading comprehension? The…

  15. Analyst-centered models for systems design, analysis, and development

    NASA Technical Reports Server (NTRS)

    Bukley, A. P.; Pritchard, Richard H.; Burke, Steven M.; Kiss, P. A.

    1988-01-01

    Much has been written about the possible use of Expert Systems (ES) technology for strategic defense system applications, particularly for battle management algorithms and mission planning. It is proposed that ES (or more accurately, Knowledge-Based System (KBS)) technology can be used in situations for which no human expert exists, namely to create design and analysis environments that allow an analyst to rapidly pose many different possible problem resolutions in a game-like fashion and to then work through the solution space in search of the optimal solution. Portions of such an environment exist for expensive AI hardware/software combinations such as the Xerox LOOPS and Intellicorp KEE systems. Efforts are discussed to build an analyst-centered model (ACM) using an ES programming environment, ExperOPS5, for a simple missile system tradeoff study. By analyst-centered, it is meant that the focus of learning is for the benefit of the analyst, not the model. The model's environment allows the analyst to pose a variety of 'what if' questions without resorting to programming changes. Although not an ES per se, the ACM would allow for a design and analysis environment that is much superior to that of current technologies.

  16. Design Considerations, Modeling and Analysis for the Multispectral Thermal Imager

    SciTech Connect

    Borel, C.C.; Clodius, W.B.; Cooke, B.J.; Smith, B.W.; Weber, P.G.

    1999-02-01

    The design of remote sensing systems is driven by the need to provide cost-effective, substantive answers to questions posed by our customers. This is especially important for space-based systems, which tend to be expensive, and which generally cannot be changed after they are launched. We report here on the approach we employed in developing the desired attributes of a satellite mission, namely the Multispectral Thermal Imager. After an initial scoping study, we applied a procedure which we call: "End-to-end modeling and analysis (EEM)." We began with target attributes, translated to observable signatures and then propagated the signatures through the atmosphere to the sensor location. We modeled the sensor attributes to yield a simulated data stream, which was then analyzed to retrieve information about the original target. The retrieved signature was then compared to the original to obtain a figure of merit: hence the term "end-to-end modeling and analysis." We base the EEM in physics to ensure high fidelity and to permit scaling. As the actual design of the payload evolves, and as real hardware is tested, we can update the EEM to facilitate trade studies, and to judge, for example, whether components that deviate from specifications are acceptable.

  17. An Integrated Framework Advancing Membrane Protein Modeling and Design

    PubMed Central

    Weitzner, Brian D.; Duran, Amanda M.; Tilley, Drew C.; Elazar, Assaf; Gray, Jeffrey J.

    2015-01-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  18. A Strip-Type Microthrottle Pump: Modeling, Design and Fabrication

    PubMed Central

    Pečar, Borut; Vrtačnik, Danilo; Resnik, Drago; Možek, Matej; Aljančič, Uroš; Dolžan, Tine; Amon, Slavko; Križaj, Dejan

    2013-01-01

    A novel design for a strip-type microthrottle pump with a rectangular actuator geometry is proposed, with more efficient chip surface consumption compared to existing micropumps with circular actuators. Due to the complex structure and operation of the proposed device, determination of detailed structural parameters is essential. Therefore, we developed an advanced, fully coupled 3D electro-fluid-solid mechanics simulation model in COMSOL that includes fluid inertial effects and a hyperelastic model for PDMS and no-slip boundary condition in fluid-wall interface. Numerical simulation resulted in accurate virtual prototyping of the proposed device only after inclusion of all mentioned effects. Here, we provide analysis of device operation at various frequencies which describes the basic pumping effects, role of excitation amplitude and backpressure and provides optimization of critical design parameters such as optimal position and height of the microthrottles. Micropump prototypes were then fabricated and characterized. Measured characteristics proved expected micropump operation, achieving maximal flow-rate 0.43 mL·min−1 and maximal backpressure 12.4 kPa at 300 V excitation. Good agreement between simulation and measurements on fabricated devices confirmed the correctness of the developed simulation model. PMID:23459391

  19. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    PubMed

    Alford, Rebecca F; Koehler Leman, Julia; Weitzner, Brian D; Duran, Amanda M; Tilley, Drew C; Elazar, Assaf; Gray, Jeffrey J

    2015-09-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  20. Amplified energy harvester from footsteps: design, modeling, and experimental analysis

    NASA Astrophysics Data System (ADS)

    Wang, Ya; Chen, Wusi; Guzman, Plinio; Zuo, Lei

    2014-04-01

    This paper presents the design, modeling and experimental analysis of an amplified footstep energy harvester. With the unique design of an amplified piezoelectric stack harvester, the kinetic energy generated by footsteps can be effectively captured and converted into usable DC power that could potentially be used to power many electric devices, such as smart phones, sensors, and monitoring cameras. This doormat-like energy harvester can be used in crowded places such as train stations, malls, concerts, and airport escalator/elevator/stairs entrances, or anywhere large groups of people walk. The harvested energy provides an alternative source of renewable power that offsets demand on grids running largely on polluting, global-warming-inducing fossil fuels. In this paper, two modeling approaches are compared to calculate power output. The first method is derived from the single degree of freedom (SDOF) constitutive equations, with a correction factor then applied to the resulting electromechanically coupled equations of motion. The second approach is to derive the coupled equations of motion with Hamilton's principle and the constitutive equations, and then formulate them with the finite element method (FEM). Experimental testing results are presented to validate both modeling approaches. Simulation results from both approaches agree very well with the experimental results, with percentage errors of 2.09% for FEM and 4.31% for SDOF.
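
    The SDOF electromechanical model mentioned above can be sketched as a pair of coupled equations integrated numerically; the parameter values are invented and no correction factor is applied, so this is only an illustration of the modeling approach, not the prototype's identified model.

        # Sketch of the SDOF electromechanically coupled harvester equations under
        # harmonic base excitation; all parameter values are illustrative.
        import numpy as np
        from scipy.integrate import solve_ivp

        m, c, k = 0.05, 2.0, 5.0e4          # mass [kg], damping [N s/m], stiffness [N/m]
        theta = 1.0e-3                      # electromechanical coupling [N/V]
        Cp, R = 50e-9, 1.0e5                # piezo capacitance [F], load resistance [ohm]
        Y0, w = 1e-4, np.sqrt(k / m)        # base amplitude [m], drive at resonance [rad/s]

        def rhs(t, y):
            z, zdot, v = y                  # relative displacement, velocity, voltage
            zddot = (-c * zdot - k * z - theta * v + m * Y0 * w**2 * np.sin(w * t)) / m
            vdot = (theta * zdot - v / R) / Cp
            return [zdot, zddot, vdot]

        sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, 0.0], max_step=1e-4)
        v = sol.y[2][sol.t > 1.0]           # discard the start-up transient
        print(f"average power ~ {np.mean(v**2 / R) * 1e3:.2f} mW")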

  1. The Design of Model-Based Training Programs

    NASA Technical Reports Server (NTRS)

    Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer-aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model underlying the instructional program, which simulates the effects of pilot entries and the behavior of the avionics, is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.

  2. Multiscale modeling for materials design: Molecular square catalysts

    NASA Astrophysics Data System (ADS)

    Majumder, Debarshi

    In a wide variety of materials, including a number of heterogeneous catalysts, the properties manifested at the process scale are a consequence of phenomena that occur at different time and length scales. Recent experimental developments allow materials to be designed precisely at the nanometer scale. However, the optimum design of such materials requires capabilities to predict the properties at the process scale based on the phenomena occurring at the relevant scales. The thesis research reported here addresses this need to develop multiscale modeling strategies for the design of new materials. As a model system, a new system of materials called molecular squares was studied in this research. Both serial and parallel multiscale strategies and their components were developed as parts of this work. As a serial component, a parameter estimation tool was developed that uses a hierarchical protocol and consists of two different search elements: a global search method implemented using a genetic algorithm that is capable of exploring large parametric space, and a local search method using gradient search techniques that accurately finds the optimum in a localized space. As an essential component of parallel multiscale modeling, different standard as well as specialized computational fluid dynamics (CFD) techniques were explored and developed in order to identify a technique that is best suited to solve a membrane reactor model employing layered films of molecular squares as the heterogeneous catalyst. The coupled set of non-linear partial differential equations (PDEs) representing the continuum model was solved numerically using three different classes of methods: a split-step method using finite difference (FD); domain decomposition in two different forms, one involving three overlapping subdomains and the other involving a gap-tooth scheme; and the multiple-timestep method that was developed in this research. The parallel multiscale approach coupled continuum

  3. Dynamic Model Tests of Models of the McDonnell Design of Project Mercury Capsule

    NASA Technical Reports Server (NTRS)

    1961-01-01

    Dynamic Model Tests of Models of the McDonnell Design of Project Mercury Capsule in the Langley 20-Foot Free-Spinning Tunnel. On 10 June 1961, 33 tests of the aerodynamic response of the McDonnell model Mercury capsule were conducted. Variables included spin, different parachute tethers, and the addition of baffles. [Entire movie available on DVD from CASI as Doc ID 20070030951. Contact help@sti.nasa.gov]

  4. Models in towing basins/design of floats for seaplanes

    NASA Technical Reports Server (NTRS)

    1931-01-01

    Towing carriage gear housing in the Tow Tank. Starr Truscott described the Towing Carriage in NACA TR No. 470: 'The structure of the towing carriage is of carbon-steel tube, with all joints welded. In order to insure accuracy of alinement (sic.) in the girders forming the car structure, the ends of all the tubes meeting at a joint were milled to fit snugly before welding and all welding was done with the structure in a massive jig. The gear cases also are of welded construction.' (p. 537) The Tow Tank was designed by Starr Truscott. Construction was authorized in March 1929 and the tank was dedicated on May 27, 1931 during the Annual Manufacturer's Inspection meeting. The tank's initial cost was $649,000. It was used by the NACA to study hydrodynamics problems until 1959, when the facility was turned over to the U.S. Navy. The tank was extended to 2,960 feet in 1936. In addition to increasing the length of the tank, a new high-speed (80-mph) carriage was installed in 1936-1937. Construction of pontoon hull model to be tested in Tow Tank No. 1. This is a 1/6 full-size model of the hull of a Navy PH-1 flying boat. Two typical tests were conducted and included in the first report describing the High-Speed Towing Basin. From TR 470: 'A survey of the information available regarding the application of the results of tests of models in towing basins to the design of floats for seaplanes was made by the National Advisory Committee for Aeronautics in 1929. It was found that the development of flying boats and seaplanes had been assisted very much in the United States, and possibly more in other countries, by tests of models in towing basins or tanks. Some tanks already existed which were designed especially for testing models of seaplane floats and the construction of other tanks for this special purpose was projected. There was no such tank in the United States; in fact, there were only two tanks, both constructed before the appearance of the seaplane and designed originally to test

  5. A Robust Control Design Framework for Substructure Models

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    1994-01-01

    A framework for designing control systems directly from substructure models and uncertainties is proposed. The technique is based on combining a set of substructure robust control problems by an interface stiffness matrix which appears as a constant gain feedback. Variations of uncertainties in the interface stiffness are treated as a parametric uncertainty. It is shown that multivariable robust control can be applied to generate centralized or decentralized controllers that guarantee performance with respect to uncertainties in the interface stiffness, reduced component modes and external disturbances. The technique is particularly suited for large, complex, and weakly coupled flexible structures.

  6. Interfacial Micromechanics in Fibrous Composites: Design, Evaluation, and Models

    PubMed Central

    Lei, Zhenkun; Li, Xuan; Qin, Fuyong; Qiu, Wei

    2014-01-01

    Recent advances in the interfacial micromechanics of fiber-reinforced composites studied using micro-Raman spectroscopy are reviewed. The mechanical problems faced in interface design for fibrous composites are discussed from three optimization perspectives: material, interface, and computation. Reasons are given why current interfacial evaluation methods find it difficult to guarantee integrity, repeatability, and consistency. Micro-Raman studies of fiber interface failure behavior and the main interface mechanical problems in fibrous composites are summarized, including interfacial stress transfer, strength criteria for interface debonding and failure, fiber bridging, frictional slip, slip transition, and friction reloading. Theoretical models for the above interface mechanical problems are given. PMID:24977189

  7. Integrating Cloud-Computing-Specific Model into Aircraft Design

    NASA Astrophysics Data System (ADS)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable the companies spreading this technology to open the door to Web 3.0. The new categories of services it introduces will slowly replace many types of computational resources currently in use. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper integrates a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale and expensive software such as CFD (Computational Fluid Dynamics) tools, UG, and CATIA.

  8. Interfacial micromechanics in fibrous composites: design, evaluation, and models.

    PubMed

    Lei, Zhenkun; Li, Xuan; Qin, Fuyong; Qiu, Wei

    2014-01-01

    Recent advances in the interfacial micromechanics of fiber-reinforced composites studied using micro-Raman spectroscopy are reviewed. The mechanical problems faced in interface design for fibrous composites are discussed from three optimization perspectives: material, interface, and computation. Reasons are given why current interfacial evaluation methods find it difficult to guarantee integrity, repeatability, and consistency. Micro-Raman studies of fiber interface failure behavior and the main interface mechanical problems in fibrous composites are summarized, including interfacial stress transfer, strength criteria for interface debonding and failure, fiber bridging, frictional slip, slip transition, and friction reloading. Theoretical models for the above interface mechanical problems are given. PMID:24977189

  9. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    USGS Publications Warehouse

    Edwards, T.C.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, G.G.

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. © 2006 Elsevier B.V. All rights reserved.
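
    The contrast between resubstitution and cross-validation accuracy, and the weaker generalization of a non-probabilistic sample, can be illustrated on synthetic data; in the toy setup below the convenience sample restricts the range of a key predictor, and all variables are invented rather than drawn from the lichen surveys.

        # Synthetic illustration (not the lichen data): resubstitution accuracy is
        # optimistic relative to cross-validation, and a convenience sample that
        # restricts a key predictor's range tends to generalize worse than a random
        # sample when judged on an independent evaluation set.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X_all = rng.uniform(0, 1, size=(5000, 4))                 # e.g. climate predictors
        y_all = (X_all[:, 0] + 0.5 * X_all[:, 1] + rng.normal(0, 0.2, 5000) > 0.9).astype(int)

        design_idx = rng.choice(5000, 300, replace=False)         # probabilistic survey
        purposive_idx = np.where(X_all[:, 0] > 0.6)[0][:300]      # convenience sample
        eval_idx = rng.choice(5000, 1000, replace=False)          # independent evaluation

        for name, idx in [("DESIGN", design_idx), ("PURPOSIVE", purposive_idx)]:
            tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_all[idx], y_all[idx])
            resub = tree.score(X_all[idx], y_all[idx])
            cv = cross_val_score(tree, X_all[idx], y_all[idx], cv=5).mean()
            true_acc = tree.score(X_all[eval_idx], y_all[eval_idx])
            print(f"{name}: resubstitution {resub:.2f}, cross-validation {cv:.2f}, evaluation {true_acc:.2f}")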

  10. Designing a model to minimize inequities in hemodialysis facilities distribution.

    PubMed

    Salgado, Teresa M; Moles, Rebekah; Benrimoj, Shalom I; Fernandez-Llimos, Fernando

    2011-11-01

    Portugal has an uneven, city-centered bias in the distribution of hemodialysis centers found to contribute to health care inequities. A model has been developed with the aim of minimizing access inequity through the identification of the best possible localization of new hemodialysis facilities. The model was designed under the assumption that individuals from different geographic areas, ceteris paribus, present the same likelihood of requiring hemodialysis in the future. Distances to reach the closest hemodialysis facility were calculated for every municipality lacking one. Regions were scored by aggregating weights of the "individual burden", defined as the burden for an individual living in a region lacking a hemodialysis center to reach one as often as needed, and the "population burden", defined as the burden for the total population living in such a region. The model revealed that the average travelling distance for inhabitants in municipalities without a hemodialysis center is 32 km and that 145,551 inhabitants (1.5%) live more than 60 min away from a hemodialysis center, while 1,393,770 (13.8%) live 30-60 min away. Multivariate analysis showed that the current localization of hemodialysis facilities is associated with major urban areas. The model developed recommends 12 locations for establishing hemodialysis centers that would result in drastically reduced travel for 34 other municipalities, leaving only six (34,800 people) with over 60 min of travel. The application of this model should facilitate the planning of future hemodialysis services as it takes into consideration the potential impact of travel time for individuals in need of dialysis, as well as the logistic arrangements required to transport all patients with end-stage renal disease. The model is applicable in any country and health care planners can opt to weigh these two elements differently in the model according to their priorities. PMID:22109858
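
    The burden-scoring idea can be sketched as a weighted combination of an individual travel burden and a population burden for each municipality lacking a facility; the municipality names, distances, populations, and weights below are invented for illustration.

        # Hypothetical sketch of the scoring idea: combine a per-person travel burden
        # with a population-level burden for each municipality lacking a facility,
        # then rank candidate locations. All values are invented for illustration.
        municipalities = {
            #  name:   (km to nearest centre, population)
            "Alfa":    (75, 12_000),
            "Beta":    (40, 30_000),
            "Gamma":   (95,  4_000),
            "Delta":   (20, 55_000),
        }

        W_INDIVIDUAL, W_POPULATION = 0.5, 0.5      # planner-chosen weights

        def burden(distance_km, population):
            individual = distance_km                       # burden of one return trip
            population_burden = distance_km * population   # summed over all residents
            return W_INDIVIDUAL * individual + W_POPULATION * population_burden / 1_000

        scores = {name: burden(d, p) for name, (d, p) in municipalities.items()}
        for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{name:6s} burden score {score:8.1f}")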

  11. Using the Continuum of Design Modelling Techniques to Aid the Development of CAD Modeling Skills in First Year Industrial Design Students

    ERIC Educational Resources Information Center

    Storer, I. J.; Campbell, R. I.

    2012-01-01

    Industrial Designers need to understand and command a number of modelling techniques to communicate their ideas to themselves and others. Verbal explanations, sketches, engineering drawings, computer aided design (CAD) models and physical prototypes are the most commonly used communication techniques. Within design, unlike some disciplines,…

  12. The design of the Model V transmission fluorimeter

    USGS Publications Warehouse

    Fletcher, Mary H.; May, Irving; Anderson, Joseph W.

    1950-01-01

    The transmission fluorimeter for the measurement of the fluorescence of uranium in fluoride melts is described. The instrument incorporates several improved features which have not been published previously. Unlike the earliest models, the design of the new fluorimeter, with its close machining of parts, reduces the possibility of light leakage and also increases considerably the ease with which the various components of the instrument may be assembled and adjusted. The Model V fluorimeter is a very rugged instrument with a compact arrangement of parts. It possesses great flexibility, so that various phototubes, measuring devices, light sources, and filter combinations may be used interchangeably. Detailed shop drawings are given for the construction of the fluorimeter.

  13. Designing a training tool for imaging mental models

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and the links among them become visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.

  14. Response Surface Model Building and Multidisciplinary Optimization Using D-Optimal Designs

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Lepsch, Roger A.; McMillin, Mark L.

    1998-01-01

    This paper discusses response surface methods for approximation model building and multidisciplinary design optimization. The response surface methods discussed are central composite designs, Bayesian methods and D-optimal designs. An over-determined D-optimal design is applied to a configuration design and optimization study of a wing-body launch vehicle. Results suggest that over-determined D-optimal designs may provide an efficient approach for approximation model building and for multidisciplinary design optimization.
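
    One common way to construct a D-optimal design, greedily adding the candidate run that most increases det(X'X) for the assumed model, is sketched below; the candidate grid, quadratic model terms, and run budget are illustrative and not the launch-vehicle study's settings.

        # Sketch of D-optimal design selection: from a candidate grid, greedily pick
        # the run that most increases det(X'X) for a quadratic response-surface model.
        # The candidate set, model terms, and run budget are illustrative.
        import itertools
        import numpy as np

        def model_terms(x1, x2):
            return np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

        levels = np.linspace(-1, 1, 5)
        candidates = np.array([model_terms(a, b) for a, b in itertools.product(levels, levels)])

        def d_optimal(candidates, n_runs, ridge=1e-8):
            chosen = []
            for _ in range(n_runs):
                best, best_det = None, -np.inf
                for i in range(len(candidates)):
                    trial = candidates[chosen + [i]]
                    det = np.linalg.det(trial.T @ trial + ridge * np.eye(trial.shape[1]))
                    if det > best_det:
                        best, best_det = i, det
                chosen.append(best)
            return chosen

        runs = d_optimal(candidates, n_runs=10)
        print(runs)   # indices of the selected candidate runs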

  15. Multilevel models for repeated measures research designs in psychophysiology: an introduction to growth curve modeling.

    PubMed

    Kristjansson, Sean D; Kircher, John C; Webb, Andrea K

    2007-09-01

    Psychophysiologists often use repeated measures analysis of variance (RMANOVA) and multivariate analysis of variance (MANOVA) to analyze data collected in repeated measures research designs. ANOVA and MANOVA are nomothetic approaches that focus on group means. Newer multilevel modeling techniques are more informative than ANOVA because they characterize both group-level (nomothetic) and individual-level (idiographic) effects, yielding a more complete understanding of the phenomena under study. This article was written as an introduction to growth curve modeling for applied researchers. A growth model is defined that can be used in place of RMANOVAs and MANOVAs for single-group and mixed repeated measures designs. The model is expanded to test and control for the effects of baseline levels of physiological activity on stimulus-specific responses. Practical, conceptual, and statistical advantages of growth curve modeling are discussed. PMID:17596179
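
    As a concrete illustration of the growth curve approach described above, the sketch below fits a multilevel model with a fixed linear trend plus random intercepts and slopes per subject using statsmodels. The simulated habituation data, effect sizes, and variable names are hypothetical; the paper's models additionally control for baseline physiological activity, which is not shown here.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulate repeated physiological responses: 30 subjects x 8 trials, each subject
    # with its own baseline level and linear habituation slope (values are illustrative).
    rng = np.random.default_rng(42)
    rows = []
    for s in range(30):
        intercept = 5.0 + rng.normal(0, 1.0)   # subject-specific baseline
        slope = -0.3 + rng.normal(0, 0.1)      # subject-specific habituation rate
        for t in range(8):
            rows.append({"subject": s, "trial": t,
                         "response": intercept + slope * t + rng.normal(0, 0.5)})
    data = pd.DataFrame(rows)

    # Growth curve model: fixed linear trend over trials plus random intercepts and
    # random slopes per subject (the multilevel analogue of a repeated measures ANOVA).
    model = smf.mixedlm("response ~ trial", data, groups=data["subject"], re_formula="~trial")
    print(model.fit().summary())
    ```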

  16. Sequence design in lattice models by graph theoretical methods

    NASA Astrophysics Data System (ADS)

    Sanjeev, B. S.; Patra, S. M.; Vishveshwara, S.

    2001-01-01

    A general strategy has been developed based on graph theoretical methods, for finding amino acid sequences that take up a desired conformation as the native state. This problem of inverse design has been addressed by assigning topological indices for the monomer sites (vertices) of the polymer on a 3×3×3 cubic lattice. This is a simple design strategy, which takes into account only the topology of the target protein and identifies the best sequence for a given composition. The procedure allows the design of a good sequence for a target native state by assigning weights for the vertices on a lattice site in a given conformation. It is seen across a variety of conformations that the predicted sequences perform well both in sequence and in conformation space, in identifying the target conformation as native state for a fixed composition of amino acids. Although the method is tested in the framework of the HP model [K. F. Lau and K. A. Dill, Macromolecules 22, 3986 (1989)] it can be used in any context if proper potential functions are available, since the procedure derives unique weights for all the sites (vertices, nodes) of the polymer chain of a chosen conformation (graph).
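
    A minimal sketch of topology-driven sequence design in the HP model is given below: it uses the contact degree of each lattice site as a crude stand-in for the vertex weights discussed above, and assigns the hydrophobic residues of a fixed composition to the most connected sites. The example conformation and the degree-based index are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np

    def contact_degree(coords):
        """Count non-bonded lattice contacts (Manhattan distance 1) for each monomer site."""
        coords = np.asarray(coords)
        n = len(coords)
        deg = np.zeros(n, dtype=int)
        for i in range(n):
            for j in range(n):
                if abs(i - j) > 1 and np.abs(coords[i] - coords[j]).sum() == 1:
                    deg[i] += 1
        return deg

    def design_hp_sequence(coords, n_hydrophobic):
        """Assign H to the n_hydrophobic most buried (highest-contact) sites, P elsewhere."""
        order = np.argsort(-contact_degree(coords))
        seq = ['P'] * len(coords)
        for i in order[:n_hydrophobic]:
            seq[i] = 'H'
        return ''.join(seq)

    # A short self-avoiding walk on the cubic lattice as the target conformation
    conf = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 1, 1), (1, 1, 1), (1, 0, 1), (0, 0, 1)]
    print(design_hp_sequence(conf, n_hydrophobic=4))
    ```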

  17. Missile guidance law design using adaptive cerebellar model articulation controller.

    PubMed

    Lin, Chih-Min; Peng, Ya-Fu

    2005-05-01

    An adaptive cerebellar model articulation controller (CMAC) is proposed for command to line-of-sight (CLOS) missile guidance law design. In this design, the three-dimensional (3-D) CLOS guidance problem is formulated as a tracking problem of a time-varying nonlinear system. The adaptive CMAC control system is comprised of a CMAC and a compensation controller. The CMAC control is used to imitate a feedback linearization control law and the compensation controller is utilized to compensate the difference between the feedback linearization control law and the CMAC control. The online adaptive law is derived based on the Lyapunov stability theorem to learn the weights of receptive-field basis functions in CMAC control. In addition, in order to relax the requirement of approximation error bound, an estimation law is derived to estimate the error bound. Then the adaptive CMAC control system is designed to achieve satisfactory tracking performance. Simulation results for different engagement scenarios illustrate the validity of the proposed adaptive CMAC-based guidance law.
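
    The sketch below shows a generic CMAC with overlapping tilings learning to imitate an unknown one-dimensional control law through online error-driven weight updates. It illustrates only the receptive-field/weight-update structure; the Lyapunov-derived adaptive law, the compensation controller, and the 3-D engagement dynamics of the paper are not reproduced, and all parameters are illustrative.

    ```python
    import numpy as np

    class CMAC:
        """Minimal tile-coding CMAC: overlapping tilings map an input to a sparse set of weights."""
        def __init__(self, n_tilings=8, tiles_per_tiling=16, x_range=(-1.0, 1.0), lr=0.1):
            self.n_tilings, self.tiles = n_tilings, tiles_per_tiling
            self.lo, self.hi = x_range
            self.lr = lr
            self.w = np.zeros((n_tilings, tiles_per_tiling))

        def _active(self, x):
            # Each tiling is shifted by a fraction of a tile width, giving overlapping receptive fields
            width = (self.hi - self.lo) / self.tiles
            idx = []
            for t in range(self.n_tilings):
                offset = t * width / self.n_tilings
                i = int((x - self.lo + offset) / width)
                idx.append(min(max(i, 0), self.tiles - 1))
            return idx

        def predict(self, x):
            return sum(self.w[t, i] for t, i in enumerate(self._active(x)))

        def update(self, x, target):
            err = target - self.predict(x)
            for t, i in enumerate(self._active(x)):
                self.w[t, i] += self.lr * err / self.n_tilings

    # Train the CMAC online to imitate an unknown control law (a sine curve as a stand-in)
    cmac = CMAC()
    rng = np.random.default_rng(0)
    for _ in range(2000):
        x = rng.uniform(-1, 1)
        cmac.update(x, np.sin(np.pi * x))
    print(cmac.predict(0.5), np.sin(np.pi * 0.5))
    ```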

  18. Design and modeling of a prototype fiber scanning CARS endoscope

    NASA Astrophysics Data System (ADS)

    Veilleux, Israël; Doucet, Michel; Coté, Patrice; Verreault, Sonia; Fortin, Michel; Paradis, Patrick; Leclair, Sébastien; Da Costa, Ralph S.; Wilson, Brian C.; Seibel, Eric; Mermut, Ozzy; Cormier, Jean-François

    2010-02-01

    An endoscope capable of Coherent Anti-Stokes Raman scattering (CARS) imaging would be of significant clinical value for improving early detection of endoluminal cancers. However, developing this technology is challenging for many reasons. First, nonlinear imaging techniques such as CARS are single point measurements, thus requiring fast scanning in a small footprint if video rate is to be achieved. Moreover, the intrinsic nonlinearity of this modality imposes several technical constraints and limitations, mainly related to pulse and beam distortions that occur within the optical fiber and the focusing objective. Here, we describe the design and report modeling results of a new CARS endoscope. The miniature microscope objective design and its anticipated performance are presented, along with its compatibility with a new spiral scanning fiber imaging technology developed at the University of Washington. This technology has ideal attributes for clinical use, with its small footprint, adjustable field-of-view and high spatial resolution. This compact hybrid fiber-based endoscopic CARS imaging design is anticipated to have a wide clinical applicability.

  19. Using Modeling to Design new Rheology Modifiers for Paints

    NASA Astrophysics Data System (ADS)

    Ginzburg, Valeriy

    2013-03-01

    Since their invention in the 1970s, hydrophobically ethoxylated urethanes (HEUR) have been actively used as rheology modifiers for paints. The thermodynamic and rheological behavior of HEUR molecules in aqueous solutions is now very well understood and is based on the concept of a transient network (TN), where the association of hydrophobic groups into networks of flower micelles causes viscosity to increase dramatically as a function of polymer concentration. The behavior of complex mixtures containing water, HEUR, and latex ("binder") particles, however, is understood less well, even though it is of utmost importance in paint formulation design. In this talk, we discuss how the adsorption of HEUR chains onto latex particles results in the formation of complex viscoelastic networks with temporary bridges between particles. We then utilize a Self-Consistent Field Theory (SCFT) model to compute effective adsorption isotherms (thickener-on-latex) and develop a rheological theory describing the steady-shear viscosity of such mixtures. The model is able to qualitatively describe many important features of water/latex/HEUR mixtures, such as strong shear thinning. The proposed approach could potentially lead to the design of new HEUR structures with improved rheological performance. This work was supported by the Dow Chemical Company.

  20. Robust design and model validation of nonlinear compliant micromechanisms.

    SciTech Connect

    Howell, Larry L.; Baker, Michael Sean; Wittwer, Jonathan W.

    2005-02-01

    Although the use of compliance or elastic flexibility in microelectromechanical systems (MEMS) helps eliminate friction, wear, and backlash, compliant MEMS are known to be sensitive to variations in material properties and feature geometry, resulting in large uncertainties in performance. This paper proposes an approach for design-stage uncertainty analysis, model validation, and robust optimization of nonlinear MEMS to account for critical process uncertainties including residual stress, layer thicknesses, edge bias, and material stiffness. A fully compliant bistable micromechanism (FCBM) is used as an example, demonstrating that the approach can be used to handle complex devices involving nonlinear finite element models. The general shape of the force-displacement curve is validated by comparing the uncertainty predictions to measurements obtained from in situ force gauges. A robust design is presented, where simulations show that the estimated force variation at the point of interest may be reduced from ±47 μN to ±3 μN. The reduced sensitivity to process variations is experimentally validated by measuring the second stable position at multiple locations on a wafer.
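
    A design-stage uncertainty analysis of the kind described above can be sketched as simple Monte Carlo propagation of process variations through a performance model. The toy force model, nominal values, and tolerances below are illustrative assumptions, not the FCBM finite element model or measured process data.

    ```python
    import numpy as np

    # Monte Carlo propagation of process uncertainties through a toy performance model
    rng = np.random.default_rng(0)
    n = 100_000

    thickness = rng.normal(2.0, 0.1, n)     # structural layer thickness, um (illustrative)
    edge_bias = rng.normal(0.0, 0.05, n)    # lithographic edge bias, um (illustrative)
    modulus = rng.normal(160.0, 8.0, n)     # Young's modulus, GPa (illustrative)

    width = 3.0 + 2 * edge_bias             # beam width after edge bias, um

    # Toy force model: bending stiffness of a guided beam scales as E * w * t^3
    force = modulus * width * thickness ** 3 * 1e-3

    lo, hi = np.percentile(force, [2.5, 97.5])
    print(f"force: mean {force.mean():.1f}, 95% interval [{lo:.1f}, {hi:.1f}] (arbitrary units)")
    ```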

  1. Improved transcranial magnetic stimulation coil design with realistic head modeling

    NASA Astrophysics Data System (ADS)

    Crowther, Lawrence; Hadimani, Ravi; Jiles, David

    2013-03-01

    We are investigating transcranial magnetic stimulation (TMS), a noninvasive technique based on electromagnetic induction that causes stimulation of neurons in the brain. TMS can be used as a pain-free alternative to conventional electroconvulsive therapy (ECT), which is still widely implemented for the treatment of major depression. Development of improved TMS coils capable of stimulating subcortical regions could also allow TMS to replace invasive deep brain stimulation (DBS), which requires surgical implantation of electrodes in the brain. Our new designs would allow the technique to be established for a variety of diagnostic and therapeutic applications in psychiatric disorders and neurological diseases. Calculation of the fields generated inside the head is vital for the use of this method for treatment. In prior work we have implemented a realistic head model, incorporating inhomogeneous tissue structures and electrical conductivities, allowing the site of neuronal activation to be accurately calculated. We will show how we utilize this model in the development of novel TMS coil designs to improve the depth of penetration and localization of stimulation produced by stimulator coils.

  2. Design and testing of a model CELSS chamber robot

    NASA Technical Reports Server (NTRS)

    Davis, Mark; Dezego, Shawn; Jones, Kinzy; Kewley, Christopher; Langlais, Mike; Mccarthy, John; Penny, Damon; Bonner, Tom; Funderburke, C. Ashley; Hailey, Ruth

    1994-01-01

    A robot system for use in an enclosed environment was designed and tested. The conceptual design will be used to assist in research performed by the Controlled Ecological Life Support System (CELSS) project. Design specifications include maximum load capacity, operation at specified environmental conditions, low maintenance, and safety. The robot system must not be hazardous to the sealed environment, and be capable of stowing and deploying within a minimum area of the CELSS chamber facility. This design consists of a telescoping robot arm that slides vertically on a shaft positioned in the center of the CELSS chamber. The telescoping robot arm consists of a series of links which can be fully extended to a length equal to the radius of the working envelope of the CELSS chamber. The vertical motion of the robot arm is achieved through the use of a combination ball screw/ball spline actuator system. The robot arm rotates cylindrically about the vertical axis through use of a turntable bearing attached to a central mounting structure fitted to the actuator shaft. The shaft is installed in an overhead rail system allowing the entire structure to be stowed and deployed within the CELSS chamber. The overhead rail system is located above the chamber's upper lamps and extends to the center of the CELSS chamber. The mounting interface of the actuator shaft and rail system allows the entire actuator shaft to be detached and removed from the CELSS chamber. When the actuator shaft is deployed, it is held fixed at the bottom of the chamber by placing a square knob on the bottom of the shaft into a recessed square fitting in the bottom of the chamber floor. A support boot ensures the rigidity of the shaft. Three student teams combined into one group designed a model of the CELSS chamber robot that they could build. They investigated materials, availability, and strength in their design. After the model arm and stand were built, the class performed pre-tests on the entire system

  3. A reference model for model-based design of critical infrastructure protection systems

    NASA Astrophysics Data System (ADS)

    Shin, Young Don; Park, Cheol Young; Lee, Jae-Chon

    2015-05-01

    Today's war field environment is becoming more versatile as the activities of unconventional wars such as terrorist attacks and cyber-attacks have noticeably increased lately. The damage caused by such unconventional wars has also turned out to be serious, particularly if the targets are critical infrastructures constructed in support of banking and finance, transportation, power, information and communication, government, and so on. Critical infrastructures are usually interconnected to each other and thus are very vulnerable to attack. As such, ensuring the security of critical infrastructures is very important, and thus the concept of critical infrastructure protection (CIP) has emerged. The program to realize CIP at the national level takes the form of a statute in each country. On the other hand, it is also necessary to protect each individual critical infrastructure. The objective of this paper is to study an effort to do so, which can be called the CIP system (CIPS). There could be a variety of ways to design CIPS's. Instead of considering the design of each individual CIPS, a reference model-based approach is taken in this paper. The reference model represents the design of all the CIPS's that have many design elements in common. In addition, the development of the reference model is carried out using a variety of model diagrams. The modeling language used therein is the systems modeling language (SysML), which was developed and is managed by the Object Management Group (OMG) and is a de facto standard. Using SysML, the structure and operational concept of the reference model are designed to fulfil the goal of CIPS's, resulting in block definition and activity diagrams. As a case study, the operational scenario of a nuclear power plant while being attacked by terrorists is studied using the reference model. The effectiveness of the results is also analyzed using multiple analysis models. It is thus expected that the approach taken here has some merits

  4. Modeling the Benchmark Active Control Technology Wind-Tunnel Model for Active Control Design Applications

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1998-01-01

    This report describes the formulation of a model of the dynamic behavior of the Benchmark Active Controls Technology (BACT) wind tunnel model for active control design and analysis applications. The model is formed by combining the equations of motion for the BACT wind tunnel model with actuator models and a model of wind tunnel turbulence. The primary focus of this report is the development of the equations of motion from first principles by using Lagrange's equations and the principle of virtual work. A numerical form of the model is generated by making use of parameters obtained from both experiment and analysis. Comparisons between experimental and analytical data obtained from the numerical model show excellent agreement and suggest that simple coefficient-based aerodynamics are sufficient to accurately characterize the aeroelastic response of the BACT wind tunnel model. The equations of motion developed herein have been used to aid in the design and analysis of a number of flutter suppression controllers that have been successfully implemented.

  5. Exploring the Model Design Space for Battery Health Management

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar; Quach, Cuong Chi; Goebel, Kai Frank

    2011-01-01

    Battery Health Management (BHM) is a core enabling technology for the success and widespread adoption of the emerging electric vehicles of today. Although battery chemistries have been studied in detail in literature, an accurate run-time battery life prediction algorithm has eluded us. Current reliability-based techniques are insufficient to manage the use of such batteries when they are an active power source with frequently varying loads in uncertain environments. The amount of usable charge of a battery for a given discharge profile is not only dependent on the starting state-of-charge (SOC), but also other factors like battery health and the discharge or load profile imposed. This paper presents a Particle Filter (PF) based BHM framework with plug-and-play modules for battery models and uncertainty management. The batteries are modeled at three different levels of granularity with associated uncertainty distributions, encoding the basic electrochemical processes of a Lithium-polymer battery. The effects of different choices in the model design space are explored in the context of prediction performance in an electric unmanned aerial vehicle (UAV) application with emulated flight profiles.
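
    The particle-filter idea behind this BHM framework can be sketched as a bootstrap filter tracking state of charge from noisy voltage readings. The linear open-circuit-voltage curve, noise levels, and discharge profile below are illustrative stand-ins for the paper's electrochemical models and uncertainty distributions.

    ```python
    import numpy as np

    # Minimal bootstrap particle filter tracking battery state of charge (SOC)
    # under a hypothetical constant-current discharge; all parameters are illustrative.
    np.random.seed(0)
    n_particles, capacity_ah, dt_h = 500, 2.0, 0.01
    true_soc, current_a = 1.0, 1.5

    particles = np.random.normal(1.0, 0.02, n_particles)    # initial SOC belief
    weights = np.full(n_particles, 1.0 / n_particles)

    def voltage(soc):
        # Toy open-circuit-voltage curve standing in for an electrochemical model
        return 3.0 + 1.2 * soc

    for _ in range(50):
        # True system and noisy measurement
        true_soc -= current_a * dt_h / capacity_ah
        z = voltage(true_soc) + np.random.normal(0, 0.01)

        # Propagate particles with process noise, then weight by measurement likelihood
        particles -= current_a * dt_h / capacity_ah + np.random.normal(0, 0.005, n_particles)
        weights *= np.exp(-0.5 * ((z - voltage(particles)) / 0.01) ** 2)
        weights /= weights.sum()

        # Systematic resampling when the effective sample size degenerates
        if 1.0 / (weights ** 2).sum() < n_particles / 2:
            idx = np.random.choice(n_particles, n_particles, p=weights)
            particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)

    print("estimated SOC:", (particles * weights).sum(), "true SOC:", true_soc)
    ```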

  6. Freezable Radiator Model Correlation and Full Scale Design

    NASA Technical Reports Server (NTRS)

    Lillibridge, Sean T.; Navarro, Moses

    2010-01-01

    Freezable radiators offer an attractive solution to the issue of thermal control system scalability. As thermal environments change, a freezable radiator will effectively scale the total heat rejection it is capable of as a function of the thermal environment and the flow rate through the radiator. Scalable thermal control systems are a critical technology for spacecraft that will endure missions with widely varying thermal requirements. These changing requirements result from the spacecraft's surroundings and from different thermal loads during different mission phases. However, freezing and thawing (recovering) a radiator is a process that has historically proven very difficult to predict through modeling, resulting in highly inaccurate predictions of recovery time. This paper summarizes efforts made to correlate a Thermal Desktop (TM) model with empirical testing data from two test articles. A 50-50 mixture of DowFrost HD and water is used as the working fluid. Efforts to scale this model to a full scale design, as well as efforts to characterize various thermal control fluids at low temperatures, are also discussed.

  7. Planetary Boundary-Layer Modelling and Tall Building Design

    NASA Astrophysics Data System (ADS)

    Simiu, Emil; Shi, Liang; Yeo, DongHun

    2016-04-01

    Characteristics of flow in the planetary boundary layer (PBL) strongly affect the design of tall structures. PBL modelling in building codes, based as it is on empirical data from the 1960s and 1970s, differs significantly from contemporary PBL models, which account for both "neutral" flows and "conventionally neutral" flows. PBL heights estimated in these relatively sophisticated models are typically approximately half as large as those obtained using the classical asymptotic similarity approach, and are one order of magnitude larger than those specified in North American and Japanese building codes. A simple method is proposed for estimating the friction velocity and PBL height as functions of specified surface roughness and geostrophic wind speed. Based on published results, it is tentatively determined that, even at elevations as high as 800 m above the surface, the contribution to the resultant mean flow velocity of the component V normal to the surface stress is negligible and the veering angle is of the order of only 5°. This note aims to encourage dialogue between boundary-layer meteorologists and structural engineers.
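
    For orientation, one common way to obtain friction velocity and PBL height from surface roughness and geostrophic wind speed is the neutral geostrophic drag law combined with a height scale proportional to u*/f, as sketched below. The constants and the simple fixed-point iteration are textbook-style assumptions and are not necessarily the specific method proposed in the paper.

    ```python
    import math

    # Hedged sketch: estimate friction velocity u* from the neutral geostrophic drag law
    #   G = (u*/k) * sqrt( (ln(u*/(f*z0)) - A)^2 + B^2 ),
    # then take PBL height h ~ c*u*/f. A, B, c are typical neutral-flow values,
    # not necessarily those adopted in the paper above.
    KAPPA, A, B, C = 0.4, 1.8, 4.5, 0.2

    def friction_velocity(G, z0, lat_deg=45.0):
        f = 2 * 7.292e-5 * math.sin(math.radians(lat_deg))   # Coriolis parameter, 1/s
        u_star = 0.05 * G                                     # initial guess
        for _ in range(100):                                  # simple fixed-point iteration
            u_star = KAPPA * G / math.hypot(math.log(u_star / (f * z0)) - A, B)
        return u_star, f

    u_star, f = friction_velocity(G=30.0, z0=0.03)            # 30 m/s geostrophic wind, open terrain
    print(f"u* = {u_star:.2f} m/s, PBL height ~ {C * u_star / f / 1000:.1f} km")
    ```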

  8. Light driven microactuators: Design, fabrication, and mathematical modeling

    NASA Astrophysics Data System (ADS)

    Han, Li-Hsin

    This dissertation is concerned with the design, fabrication, and mathematical modeling of three different microactuators driven by light. Compared to electricity, electromagnetic waves are a wireless source of power. Light from a distant source can be delivered, absorbed, and converted to generate a driving force for a microactuator. The study of light-driven microsystems, still at its early stage, is already expanding the horizon of microsystems research. The microactuators of this dissertation include micro-cantilevers driven by pulsed laser, photo-deformable microshells coated with gold nanospheres, and a nanoparticle-coated micro-turbine driven by visible light. Experimental investigation and theoretical analysis of these microactuators showed interesting results. These microactuators function based on cross-linked, multiphysics phenomena, such as photo-heating, thermal expansion, photo-chemistry effects, plasmonics enhancement, and thermal convection in rarefied gas. These multiphysics effects dominate the function of a mechanical system when the system size becomes small. The modeling results for the microactuators suggest that, to simulate a microscale mechanical system accurately, one has to take into account the minimum dimension of the system and consider the validity of the theoretical model used. Examples of the building of different microstructures are shown to demonstrate the capacity of a digital-micromirror-device (DMD) based apparatus for three-dimensional, heterogeneous fabrication of polymeric microstructures.

  9. Unified control/structure design and modeling research

    NASA Technical Reports Server (NTRS)

    Mingori, D. L.; Gibson, J. S.; Blelloch, P. A.; Adamian, A.

    1986-01-01

    To demonstrate the applicability of control theory for distributed systems to large flexible space structures, research was focused on a model of a space antenna which consists of a rigid hub, flexible ribs, and a mesh reflecting surface. The space antenna model used is discussed along with the finite element approximation of the distributed model. The basic control problem is to design an optimal or near-optimal compensator to suppress the linear vibrations and rigid-body displacements of the structure. The application of infinite dimensional Linear Quadratic Gaussian (LQG) control theory to flexible structures is discussed. Two basic approaches for robustness enhancement were investigated: loop transfer recovery and sensitivity optimization. A third approach synthesized from elements of these two basic approaches is currently under development. The control-driven finite element approximation of flexible structures is discussed. Three sets of finite element basis vectors for computing functional control gains are compared. The possibility of constructing a finite element scheme to approximate the infinite dimensional Hamiltonian system directly, instead of indirectly, is discussed.

  10. Iso2k: A community-driven effort to develop a global database of paleo-water isotopes covering the past two millennia

    NASA Astrophysics Data System (ADS)

    Konecky, B. L.; Partin, J. W.

    2015-12-01

    Iso2k is a new, community-based effort within the Past Global Changes 2k (PAGES2k) project to investigate decadal- to centennial-scale variability in hydroclimate over the past 2,000 years. This PAGES2k Trans-Regional Project will create a global database, available for public use, of archives that record the stable isotopic composition of water (δ18O and δD). Stable water isotopes detect regional-scale circulation patterns, making them excellent tracers of the water cycle's response to changes in climate. Researchers will use the database to identify regional- and global-scale features in hydroclimate and atmospheric circulation during the past 2 kyr and their relationship with temperature reconstructions. Other key science questions to be addressed include: How do water isotope proxy records capture changes in the tropical water cycle? Where and how well do the modern-day temperature-hydrology relationships hold over the last 2 kyr? How do changes in atmospheric conditions relate to changes in oceanic conditions? The Iso2k database will also be a valuable tool for the wider community, including those researching such topics as isotope-enabled climate model simulations and proxy system modeling. To facilitate broad use of the database, experts in various proxy archive types developed metadata fields to encode the information needed to accurately and systematically interpret isotope ratios. Proxy records in the database are derived from many archives, including corals, ice cores, fossil groundwater, speleothems, tree ring cellulose, and marine and lacustrine sediments. Annually-banded records have a minimum duration of 30 years, and low-resolution records have a minimum duration of 200 years and at least 5 data points during the past 2 kyr. Chronological accuracy standards follow those of the PAGES2k temperature database. Datasets are publicly archived, although unpublished data are accepted in some circumstances. Ultimately, isotope records will be integrated into

  11. Applying Contamination Modelling to Spacecraft Propulsion Systems Designs and Operations

    NASA Technical Reports Server (NTRS)

    Chen, Philip T.; Thomson, Shaun; Woronowicz, Michael S.

    2000-01-01

    Molecular and particulate contaminants generated from the operation of a propulsion system may impinge on spacecraft critical surfaces. Plume depositions or clouds may hinder the spacecraft and instruments from performing normal operations. Firing thrusters will generate both molecular and particulate contaminants. How to minimize the contamination impact from the plume becomes very critical for a successful mission. The resulting effects from molecular and particulate contamination of thruster firings are quite distinct. This paper will discuss the interconnection between the functions of spacecraft contamination modeling and propulsion system implementation. The paper will address an innovative contamination engineering approach implemented from spacecraft concept design, manufacturing, integration and test (I&T), and launch, to on-orbit operations. This paper will also summarize the implementation on several successful missions. Although other contamination sources exist, only molecular contamination will be considered here.

  12. Adiabatic model and design of a translating field reversed configuration

    SciTech Connect

    Intrator, T. P.; Siemon, R. E.; Sieck, P. E.

    2008-04-15

    We apply an adiabatic evolution model to predict the behavior of a field reversed configuration (FRC) during decompression and translation, as well as during boundary compression. Semi-empirical scaling laws, which were developed and benchmarked primarily for collisionless FRCs, are expected to remain valid even for the collisional regime of FRX-L experiment. We use this approach to outline the design implications for FRX-L, the high density translated FRC experiment at Los Alamos National Laboratory. A conical theta coil is used to accelerate the FRC to the largest practical velocity so it can enter a mirror bounded compression region, where it must be a suitable target for a magnetized target fusion (MTF) implosion. FRX-L provides the physics basis for the integrated MTF plasma compression experiment at the Shiva-Star pulsed power facility at Kirtland Air Force Research Laboratory, where the FRC will be compressed inside a flux conserving cylindrical shell.

  13. A physiological model for extracorporeal oxygenation controller design.

    PubMed

    Walter, Marian; Weyer, Soren; Stollenwerk, Andre; Kopp, Rudger; Arens, Jutta; Leonhardt, Steffen

    2010-01-01

    Long-term extracorporeal membrane oxygenation can be used in cases of severe lung failure to maintain sufficient gas exchange without the need to apply the higher ventilation pressures that damage the lung further. The use of cardiopulmonary bypass devices is well established inside the operating room. The use of such devices for long-term support in the intensive care unit is still experimental and limited to a few cases, because neither the machine architecture nor the staffing situation provides for the long-term application scenario. In the joint research project "smart ECLA" we target an advanced ECMO device featuring an automation system capable of maintaining gas concentrations automatically. One key requirement for systematic controller design is the availability of a process model, which is presented in this article. PMID:21096765

  14. Modeling and design of an X-band rf photoinjector

    NASA Astrophysics Data System (ADS)

    Marsh, R. A.; Albert, F.; Anderson, S. G.; Beer, G.; Chu, T. S.; Cross, R. R.; Deis, G. A.; Ebbers, C. A.; Gibson, D. J.; Houck, T. L.; Hartemann, F. V.; Barty, C. P. J.; Candel, A.; Jongewaard, E. N.; Li, Z.; Limborg-Deprey, C.; Vlieks, A. E.; Wang, F.; Wang, J. W.; Zhou, F.; Adolphsen, C.; Raubenheimer, T. O.

    2012-10-01

    A design for an X-band rf photoinjector that was developed jointly by SLAC National Accelerator Laboratory (SLAC) and Lawrence Livermore National Laboratory (LLNL) is presented. The photoinjector is based around a 5.59 cell rf gun that has state-of-the-art features including: elliptical contoured irises; improved mode separation; an optimized initial half cell length; a racetrack input coupler; and coupling that balances pulsed heating with cavity fill time. Radio-frequency and beam dynamics modeling have been done using a combination of codes including PARMELA, HFSS, IMPACT-T, ASTRA, and the ACE3P suite of codes developed at SLAC. The impact of lower gradient operation, magnet misalignment, solenoid multipole errors, beam offset, mode beating, wakefields, and beam line symmetry have been analyzed and are described. Fabrication and testing plans at both LLNL and SLAC are discussed.

  15. Turning statistical physics models into materials design engines

    PubMed Central

    Miskin, Marc Z.; Khaira, Gurdaman; de Pablo, Juan J.; Jaeger, Heinrich M.

    2016-01-01

    Despite the success statistical physics has enjoyed at predicting the properties of materials for given parameters, the inverse problem, identifying which material parameters produce given, desired properties, is only beginning to be addressed. Recently, several methods have emerged across disciplines that draw upon optimization and simulation to create computer programs that tailor material responses to specified behaviors. However, so far the methods developed either involve black-box techniques, in which the optimizer operates without explicit knowledge of the material’s configuration space, or require carefully tuned algorithms with applicability limited to a narrow subclass of materials. Here we introduce a formalism that can generate optimizers automatically by extending statistical mechanics into the realm of design. The strength of this approach lies in its capability to transform statistical models that describe materials into optimizers to tailor them. By comparing against standard black-box optimization methods, we demonstrate how optimizers generated by this formalism can be faster and more effective, while remaining straightforward to implement. The scope of our approach includes possibilities for solving a variety of complex optimization and design problems concerning materials both in and out of equilibrium. PMID:26684770

  16. Design, modelling, implementation, and intelligent fuzzy control of a hovercraft

    NASA Astrophysics Data System (ADS)

    El-khatib, M. M.; Hussein, W. M.

    2011-05-01

    A hovercraft is an amphibious vehicle that hovers just above the ground or water on an air cushion. The concept of the air cushion vehicle can be traced back to 1719; however, the practical form of the hovercraft dates to 1955. The objective of the paper is to design, simulate and implement an autonomous model of a small hovercraft equipped with a mine detector that can travel over any terrain. A real-time layered fuzzy navigator for a hovercraft in a dynamic environment is proposed. The system consists of a Takagi-Sugeno-type fuzzy motion planner and a modified proportional-navigation-based fuzzy controller. The system philosophy is inspired by human routing when moving between obstacles based on visual information, including the right and left views, from which the next step towards the goal in free space is chosen. It intelligently combines two behaviours to cope with obstacle avoidance as well as approaching a goal, using a proportional navigation path that accounts for hovercraft kinematics. The MATLAB/Simulink software tool is used to design and verify the proposed algorithm.

  17. A 2D model to design MHD induction pumps

    NASA Astrophysics Data System (ADS)

    Stieglitz, R.; Zeininger, J.

    2006-09-01

    Technical liquid metal systems accompanied by a thermal transfer of energy, such as reactor systems, metallurgical processes, metal refinement, casting, etc., require a forced convection of the fluid. Because of the increased temperatures and, more often, the environmental conditions (e.g., in a nuclear environment), pumping principles are required in which rotating parts are absent. Additionally, in many applications a controlled atmosphere is indispensable in order to ensure the structural integrity of the duct walls. An interesting option to overcome the sealing problem of a mechanical pump towards the surroundings is offered by induction systems. Although their efficiency compared to that of turbo machines is quite low, they have several advantages that are attractive for the specific requirements of liquid metal applications, such as: low maintenance costs due to the absence of sealings, bearings and moving parts; low degradation rate of the structural material; simple replacement of the inductor without cutting the piping system; fine regulation of flow rate by different inductor connections; and change of pump characteristics without change of the mechanical set-up. Within the article, general design requirements of electromagnetic pumps (EMP) are elaborated. The designs of two annular linear induction pumps operating with sodium and lead-bismuth are presented, and the calculated pump characteristics and experimentally obtained data are compared. In this context, physical effects leading to deviations between the model and the real data are addressed. Finally, the main results are summarized.

  18. Smart health monitoring systems: an overview of design and modeling.

    PubMed

    Baig, Mirza Mansoor; Gholamhosseini, Hamid

    2013-04-01

    Health monitoring systems have rapidly evolved during the past two decades and have the potential to change the way health care is currently delivered. Although smart health monitoring systems automate patient monitoring tasks and thereby improve patient workflow management, their efficiency in clinical settings is still debatable. This paper presents a review of smart health monitoring systems and an overview of their design and modeling. Furthermore, a critical analysis of their efficiency and clinical acceptability, together with strategies and recommendations for improving current health monitoring systems, is presented. The main aim is to review the current state of the art in monitoring systems and to perform an extensive, in-depth analysis of the findings in the area of smart health monitoring systems. In order to achieve this, over fifty different monitoring systems have been selected, categorized, classified and compared. Finally, major advances at the system design level are discussed, and current issues facing health care providers, as well as the potential challenges to the health monitoring field, are identified and compared to other similar systems.

  19. Molecular modeling in the design of peptidomimetics and peptide surrogates.

    PubMed

    Perez, Juan J; Corcho, Francesc; Llorens, Oriol

    2002-12-01

    The most important natural sources of new leads are plant extracts, bacterial broths, animal venoms and peptides isolated from living organisms. However, only the first three have been used extensively in the development of new therapeutic agents. This is probably due to the poor pharmacological profile exhibited by peptides, which requires a lengthy transformation to make them suitable as new leads. In contrast, bioactive compounds isolated from the other sources are usually closer to being usable as lead compounds; nevertheless, the sources for compounds of this category are nowadays scarce, whereas new bioactive peptides are discovered quite often and reported as ligands for different receptors. Under these circumstances peptides appear as an attractive source of prospective new leads. In order to reduce the time involved in the design of a potential lead from a peptide, molecular modeling tools have been developed in the last few years. The purpose of the present work is to review the different techniques available and to report various successful examples of the design of new peptidomimetics published in the literature.

  20. Turning statistical physics models into materials design engines.

    PubMed

    Miskin, Marc Z; Khaira, Gurdaman; de Pablo, Juan J; Jaeger, Heinrich M

    2016-01-01

    Despite the success statistical physics has enjoyed at predicting the properties of materials for given parameters, the inverse problem, identifying which material parameters produce given, desired properties, is only beginning to be addressed. Recently, several methods have emerged across disciplines that draw upon optimization and simulation to create computer programs that tailor material responses to specified behaviors. However, so far the methods developed either involve black-box techniques, in which the optimizer operates without explicit knowledge of the material's configuration space, or require carefully tuned algorithms with applicability limited to a narrow subclass of materials. Here we introduce a formalism that can generate optimizers automatically by extending statistical mechanics into the realm of design. The strength of this approach lies in its capability to transform statistical models that describe materials into optimizers to tailor them. By comparing against standard black-box optimization methods, we demonstrate how optimizers generated by this formalism can be faster and more effective, while remaining straightforward to implement. The scope of our approach includes possibilities for solving a variety of complex optimization and design problems concerning materials both in and out of equilibrium.

  1. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    PubMed

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions in the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve the traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  2. A modular approach to addressing model design, scale, and parameter estimation issues in distributed hydrological modelling

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Restrepo, P.J.; Viger, R.J.

    2002-01-01

    A modular approach to model design and construction provides a flexible framework in which to focus the multidisciplinary research and operational efforts needed to facilitate the development, selection, and application of the most robust distributed modelling methods. A variety of modular approaches have been developed, but with little consideration for compatibility among systems and concepts. Several systems are proprietary, limiting any user interaction. The US Geological Survey modular modelling system (MMS) is a modular modelling framework that uses an open source software approach to enable all members of the scientific community to address collaboratively the many complex issues associated with the design, development, and application of distributed hydrological and environmental models. Implementation of a common modular concept is not a trivial task. However, it brings the resources of a larger community to bear on the problems of distributed modelling, provides a framework in which to compare alternative modelling approaches objectively, and provides a means of sharing the latest modelling advances. The concepts and components of the MMS are described and an example application of the MMS, in a decision-support system context, is presented to demonstrate current system capabilities. Copyright © 2002 John Wiley and Sons, Ltd.

  3. Linear Equating for the NEAT Design: Parameter Substitution Models and Chained Linear Relationship Models

    ERIC Educational Resources Information Center

    Kane, Michael T.; Mroch, Andrew A.; Suh, Youngsuk; Ripkey, Douglas R.

    2009-01-01

    This paper analyzes five linear equating models for the "nonequivalent groups with anchor test" (NEAT) design with internal anchors (i.e., the anchor test is part of the full test). The analysis employs a two-dimensional framework. The first dimension contrasts two general approaches to developing the equating relationship. Under a "parameter…

  4. Spatial distribution of bacterial communities driven by multiple environmental factors in a beach wetland of the largest freshwater lake in China

    PubMed Central

    Ding, Xia; Peng, Xiao-Jue; Jin, Bin-Song; Xiao, Ming; Chen, Jia-Kuan; Li, Bo; Fang, Chang-Ming; Nie, Ming

    2015-01-01

    The spatial distributions of bacterial communities may be driven by multiple environmental factors. Thus, understanding the relationships between bacterial distribution and environmental factors is critical for understanding wetland stability and the functioning of freshwater lakes. However, little research on the bacterial communities in deep sediment layers exists. In this study, thirty clone libraries of 16S rRNA were constructed from a beach wetland of the Poyang Lake along both horizontal (distance to the water-land junction) and vertical (sediment depth) gradients to assess the effects of sediment properties on bacterial community structure and diversity. Our results showed that bacterial diversity increased along the horizontal gradient and decreased along the vertical gradient. The heterogeneous sediment properties along gradients substantially affected the dominant bacterial groups at the phylum and species levels. For example, the NH4+ concentration decreased with increasing depth and was positively correlated with the relative abundance of Alphaproteobacteria. The changes in bacterial diversity and dominant bacterial groups showed that the top layer had a different bacterial community structure than the deeper layers. Principal component analysis revealed that both gradients, not each gradient independently, contributed to the shift in the bacterial community structure. A multiple linear regression model explained the changes in bacterial diversity and richness along the depth and distance gradients. Overall, our results suggest that spatial gradients associated with sediment properties shaped the bacterial communities in the Poyang Lake beach wetland. PMID:25767466

  5. Computational Human Performance Modeling For Alarm System Design

    SciTech Connect

    Jacques Hugo

    2012-07-01

    The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.
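
    A generic alarm-handling model of this kind can be prototyped with a discrete event simulation library such as SimPy, as sketched below: alarms arrive as a Poisson stream and queue for a single operator resource. The arrival and handling rates are illustrative assumptions, and the sketch does not reproduce the Idaho National Laboratory model itself.

    ```python
    import random
    import simpy

    # Toy discrete-event model of an operator handling alarms; rates are illustrative.
    ALARM_INTERARRIVAL_MIN = 2.0   # mean minutes between alarms
    HANDLING_TIME_MIN = 1.5        # mean minutes to acknowledge and diagnose one alarm

    def alarm_source(env, operator, stats):
        while True:
            yield env.timeout(random.expovariate(1.0 / ALARM_INTERARRIVAL_MIN))
            env.process(handle_alarm(env, operator, stats))

    def handle_alarm(env, operator, stats):
        raised = env.now
        with operator.request() as req:        # the operator is a single shared resource
            yield req
            yield env.timeout(random.expovariate(1.0 / HANDLING_TIME_MIN))
        stats.append(env.now - raised)         # time from annunciation to completion

    random.seed(1)
    env = simpy.Environment()
    operator = simpy.Resource(env, capacity=1)
    response_times = []
    env.process(alarm_source(env, operator, response_times))
    env.run(until=8 * 60)                      # one 8-hour shift, in minutes
    print(f"{len(response_times)} alarms handled, "
          f"mean response {sum(response_times) / len(response_times):.1f} min")
    ```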

  6. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  7. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1991-01-01

    Some of the many analytical models in human-computer interface design that are currently being developed are described. The usefulness of analytical models for human-computer interface design is evaluated. Can the use of analytical models be recommended to interface designers? The answer, based on the empirical research summarized here, is: not at this time. There are too many unanswered questions concerning the validity of models and their ability to meet the practical needs of design organizations.

  8. Vertical stratification of bacterial communities driven by multiple environmental factors in the waters (0-5000 m) off the Galician coast (NW Iberian margin)

    NASA Astrophysics Data System (ADS)

    Dobal-Amador, Vladimir; Nieto-Cid, Mar; Guerrero-Feijoo, Elisa; Hernando-Morales, Victor; Teira, Eva; Varela-Rozados, Marta M.

    2016-08-01

    The processes mediated by microbial planktonic communities occur along the entire water column, yet microbial activity and composition have been studied mainly in surface waters. This research examined the vertical variation in bacterial abundance, activity and community composition and structure from the surface down to 5000 m depth following a longitudinal transect off the Galician coast (NW Iberian margin, from 43°N, 9°W to 43°N, 15°W). Community activity and composition changed with depth. The leucine incorporation rates decreased from the euphotic layer to the bathypelagic waters by three orders of magnitude, whereas prokaryotic abundance decreased only by one order of magnitude. The relative abundance of SAR11 and Alteromonas, determined by catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH), decreased with depth. Meanwhile, the contribution of SAR202 and SAR324 was significantly higher in the deeper layers (i.e. NEADW, North East Atlantic Deep Water, and LDW, Lower Deep Water) than in the euphotic zone. Bacterial community structure, assessed by Automated Ribosomal Intergenic Spacer Analysis (ARISA), was depth-specific. A distance-based linear model (DistLM) revealed that the variability found in bacterial community structure was mainly explained by temperature, nitrate, phosphate, dissolved organic matter (DOM) fluorescence, prokaryotic abundance and leucine incorporation, and to a lesser extent by salinity, oxygen, CDOM absorbance and dissolved organic carbon concentration. Our results displayed a bacterial community structure shaped not only by depth-related physicochemical features but also by DOM quality, indicating that different prokaryotic taxa have the potential to metabolize particular DOM sources.

  9. Dynamic Model Tests of Models in the McDonnell Design of Project Mercury Capsule

    NASA Technical Reports Server (NTRS)

    1959-01-01

    Dynamic Model Tests of Models in the McDonnell Design of Project Mercury Capsule in the Langley 20-Foot Free-Spinning Tunnel. On 11 May 1959, 24 tests of the aerodynamic response of the McDonnell model Project Mercury capsule were conducted. The initial test demonstrated free-fall; a parachute was used in the remaining tests. Several tests included the addition of baffles. [Entire movie available on DVD from CASI as Doc ID 20070030952. Contact help@sti.nasa.gov]

  10. Satisfiers and Dissatisfiers: A Two-Factor Model for Website Design and Evaluation.

    ERIC Educational Resources Information Center

    Zhang, Ping; von Dran, Gisela M.

    2000-01-01

    Investigates Web site design factors and their impact from a theoretical perspective. Presents a two-factor model that can guide Web site design and evaluation. According to the model, there are two types of design factors: hygiene and motivator. Results showed that the two-factor model provides a means for Web-user interface studies. Provides…

  11. The Learner-Centered Instructional Design Model: A Modified Delphi Study

    ERIC Educational Resources Information Center

    Melsom, Duane Allan

    2010-01-01

    The learner-centered instructional design model redefines the standard linear instructional design model to form a circular model where the learner's needs are the first item considered in the development of instruction. The purpose of this modified Delphi study was to have a panel of experts in the instructional design field review the…

  12. Toward the design of sustainable biofuel landscapes: A modeling approach

    NASA Astrophysics Data System (ADS)

    Izaurralde, R. C.; Zhang, X.; Manowitz, D. H.; Sahajpal, R.

    2011-12-01

    Biofuel crops have emerged as promising feedstocks for advanced bioenergy production in the form of cellulosic ethanol and biodiesel. However, large-scale deployment of biofuel crops for energy production has the potential to conflict with food production and generate a myriad of environmental outcomes related to land and water resources (e.g., decreases in soil carbon storage, increased erosion, altered runoff, deterioration in water quality). In order to anticipate the possible impacts of biofuel crop production on food production systems and the environment and contribute to the design of sustainable biofuel landscapes, we developed a spatially-explicit integrated modeling framework (SEIMF) aimed at understanding, among other objectives, the complex interactions among land, water, and energy. The framework is a research effort of the DOE Great Lakes Bioenergy Research Center. The SEIMF has three components: (1) a GIS-based data analysis system, (2) the biogeochemical model EPIC (Environmental Policy Integrated Climate), and (3) an evolutionary multi-objective optimization algorithm for examining trade-offs between biofuel energy production and ecosystem responses. The SEIMF was applied at biorefinery scale to simulate biofuel production scenarios and the yield and environmental results were used to develop trade-offs, economic and life-cycle analyses. The SEIMF approach was also applied to test the hypothesis that growing perennial herbaceous species on marginal lands can satisfy a significant fraction of targeted demands while avoiding competition with food systems and maintaining ecosystem services.

  13. Bio-inspired design of dental multilayers: experiments and model.

    PubMed

    Niu, Xinrui; Rahbar, Nima; Farias, Stephen; Soboyejo, Wole

    2009-12-01

    This paper combines experiments, simulations and analytical modeling that are inspired by the stress reductions associated with the functionally graded structures of the dentin-enamel junctions (DEJs) in natural teeth. Unlike conventional crown structures, in which ceramic crowns are bonded to the bottom layer with an adhesive layer, real teeth do not have a distinct "adhesive layer" between the enamel and the dentin layers. Instead, there is a graded transition from enamel to dentin within an approximately 10 to 100 μm thick region that is called the dentin-enamel junction (DEJ). In this paper, a micro-scale, bio-inspired functionally graded structure is used to bond the top ceramic layer (zirconia) to a dentin-like ceramic-filled polymer substrate. The bio-inspired functionally graded material (FGM) is shown to exhibit higher critical loads over a wide range of loading rates. The measured critical loads are predicted using a rate dependent slow crack growth (RDEASCG) model. The implications of the results are then discussed for the design of bio-inspired dental multilayers.

  14. Model and design of dielectric elastomer minimum energy structures

    NASA Astrophysics Data System (ADS)

    Rosset, Samuel; Araromi, Oluwaseun A.; Shintake, Jun; Shea, Herbert R.

    2014-08-01

    Fixing a prestretched dielectric elastomer actuator (DEA) on a flexible frame allows transformation of the intrinsic in-plane area expansion of DEAs into complex three-dimensional (3D) structures whose shape is determined by a configuration that minimizes the elastic energy of the actuator and the bending energy of the frame. These structures can then unfold upon the application of a voltage. This article presents an analytical model of the dielectric elastomer minimum energy structure (DEMES) in the case of a simple rectangular geometry and studies the influence of the main design parameters on the actuator's behaviour. The initial shape of a DEMES, as well as its actuation range, depends on the elastic strain energy stored in the elastomeric membrane. This energy depends on two independent parameters: the volume of the membrane and its initial deformation. There exist therefore different combinations of membrane volume and prestretch which lead to the same initial shape, such as a highly prestretched thin membrane or a slightly prestretched thick membrane. Although they have the same initial shape, these different membrane states lead to different behaviour once the actuation voltage is applied. Our model allows one to predict which choice of parameters leads to the largest actuation range, while specifying the impact of the different membrane conditions on the spring constant of the device. We also explore the effects of non-ideal material behaviour, such as stress relaxation, on device performance.

  15. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.

    1997-01-01

    Ceramic matrix composites (CMC) and intermetallic materials (e.g., single crystal nickel aluminide) are high performance materials that exhibit attractive mechanical, thermal and chemical properties. These materials are critically important in advancing certain performance aspects of gas turbine engines. From an aerospace engineer's perspective the new generation of ceramic composites and intermetallics offers a significant potential for raising the thrust/weight ratio and reducing NO(x) emissions of gas turbine engines. These aspects have increased interest in utilizing these materials in the hot sections of turbine engines. However, as these materials evolve and their performance characteristics improve a persistent need exists for state-of-the-art analytical methods that predict the response of components fabricated from CMC and intermetallic material systems. This need provided the motivation for the technology developed under this research effort. Continuous ceramic fiber composites exhibit an increase in work of fracture, which allows for "graceful" rather than catastrophic failure. When loaded in the fiber direction, these composites retain substantial strength capacity beyond the initiation of transverse matrix cracking despite the fact that neither of its constituents would exhibit such behavior if tested alone. As additional load is applied beyond first matrix cracking, the matrix tends to break in a series of cracks bridged by the ceramic fibers. Any additional load is born increasingly by the fibers until the ultimate strength of the composite is reached. Thus modeling efforts supported under this research effort have focused on predicting this sort of behavior. For single crystal intermetallics the issues that motivated the technology development involved questions relating to material behavior and component design. Thus the research effort supported by this grant had to determine the statistical nature and source of fracture in a high strength, Ni

  17. A Geospatial Model for Remedial Design Optimization and Performance Evaluation

    SciTech Connect

    Madrid, V M; Demir, Z; Gregory, S; Valett, J; Halden, R U

    2002-02-19

    Soil and ground water remediation projects require collection and interpretation of large amounts of spatial data. Two-dimensional (2D) mapping techniques are often inadequate for characterizing complex subsurface conditions at contaminated sites. To interpret data from these sites, we developed a methodology that allows integration of multiple, three-dimensional (3D) data sets for spatial analysis. This methodology was applied to the Department of Energy (DOE) Building 834 Operable Unit at Lawrence Livermore National Laboratory Site 300, in central California. This site is contaminated with a non-aqueous phase liquid (NAPL) mixture consisting of trichloroethene (TCE) and tetrakis (2-ethylbutoxy) silane (TKEBS). In the 1960s and 1970s, releases of this heat-exchange fluid to the environment resulted in TCE concentrations up to 970 mg/kg in soil and dissolved-phase concentrations approaching the solubility limit in a shallow, perched water-bearing zone. A geospatial model was developed using site hydrogeological data and monitoring data for volatile organic compounds (VOCs) and biogeochemical parameters. The model was used to characterize the distribution of contamination in different geologic media, and to track changes in subsurface contaminant mass related to treatment facility operation and natural attenuation processes. Natural attenuation occurs mainly as microbial reductive dechlorination of TCE, which depends on the presence of TKEBS: fermentation of TKEBS provides the hydrogen required for the reductive dechlorination of VOCs. Output of the geospatial model shows that soil vapor extraction (SVE) is incompatible with anaerobic VOC transformation, presumably due to temporary microbial inhibition caused by oxygen influx into the subsurface. Geospatial analysis of monitoring data collected over a three-year period allowed for generation of representative monthly VOC plume maps and dissolved-phase mass estimates. The latter information proved to be
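
    As an illustration of how a gridded geospatial model supports dissolved-phase mass estimates, the Python sketch below integrates concentration over cells of known volume and porosity; the grid values, porosity and cell size are synthetic assumptions, not site data.

        import numpy as np

        def dissolved_mass_kg(conc_mg_per_l, porosity, cell_volume_m3):
            """Dissolved-phase contaminant mass (kg) from a 3-D gridded concentration field.
            conc is in mg/L; 1 m^3 of water = 1000 L; 1e6 mg = 1 kg."""
            conc = np.asarray(conc_mg_per_l, dtype=float)
            return float(np.sum(conc * porosity * cell_volume_m3 * 1000.0) / 1.0e6)

        # synthetic 20 x 20 x 5 concentration grid (mg/L), 10 m^3 cells, 30% porosity
        grid = np.random.default_rng(0).uniform(0.0, 5.0, size=(20, 20, 5))
        print(f"{dissolved_mass_kg(grid, porosity=0.3, cell_volume_m3=10.0):.1f} kg")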

  18. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions that differ from the "true" ones of the underlying problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM), together with the associated solution method, for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt that differs from previous modeling efforts, which focused on addressing uncertainty in physical parameters (e.g., soil porosity), whereas this work deals with uncertainty in the mathematical simulator (arising from model residuals). Compared to existing modeling approaches (in which only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering a confidence level of optimal remediation strategies to system designers, and reducing computational cost in optimization processes.

  19. Predictive Modeling in Plasma Reactor and Process Design

    NASA Technical Reports Server (NTRS)

    Hash, D. B.; Bose, D.; Govindan, T. R.; Meyyappan, M.; Arnold, James O. (Technical Monitor)

    1997-01-01

    Research continues toward the improvement and increased understanding of high-density plasma tools. Such reactor systems are lauded for their independent control of ion flux and energy, enabling high etch rates with low ion damage, and for their improved ion velocity anisotropy resulting from thin collisionless sheaths and low neutral pressures. Still, with the transition to 300 mm processing, achieving etch uniformity and high etch rates concurrently may be a formidable task for such large diameter wafers, for which computational modeling can play an important role in successful reactor and process design. The inductively coupled plasma (ICP) reactor is the focus of the present investigation. The present work attempts to understand the fundamental physical phenomena of such systems through computational modeling. Simulations will be presented using both computational fluid dynamics (CFD) techniques and the direct simulation Monte Carlo (DSMC) method for argon and chlorine discharges. ICP reactors generally operate at pressures on the order of 1 to 10 mTorr. At such low pressures, rarefaction can be significant to the degree that the constitutive relations used in typical CFD techniques become invalid and a particle simulation must be employed. This work will assess the extent to which CFD can be applied and evaluate the degree to which accuracy is lost in prediction of the phenomenon of interest, i.e., the etch rate. If the CFD approach is found reasonably accurate and benchmarked against DSMC and experimental results, it has the potential to serve as a design tool due to its much shorter run time relative to DSMC. The continuum CFD simulation solves the governing equations for plasma flow using a finite difference technique with an implicit Gauss-Seidel Line Relaxation method for time marching toward a converged solution. The equation set consists of mass conservation for each species, separate energy equations for the electrons and heavy species, and momentum equations for the gas
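
    Whether a continuum (CFD) description remains defensible at these pressures is usually judged from the Knudsen number; the Python sketch below computes Kn from the hard-sphere mean free path, with an argon-like molecular diameter, gas temperature and reactor length scale chosen as assumptions.

        import numpy as np

        def knudsen_number(pressure_pa, temperature_k, length_scale_m, molecule_diameter_m=3.6e-10):
            """Kn = mean free path / characteristic length; continuum models are
            usually trusted only for Kn well below ~0.1. Default diameter ~argon (assumed)."""
            k_b = 1.380649e-23  # Boltzmann constant, J/K
            mfp = k_b * temperature_k / (np.sqrt(2.0) * np.pi * molecule_diameter_m ** 2 * pressure_pa)
            return mfp / length_scale_m

        p_pa = 10e-3 * 133.322  # 10 mTorr expressed in Pa
        print(f"Kn = {knudsen_number(p_pa, temperature_k=500.0, length_scale_m=0.30):.3f}")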

  20. Cubical Mass-Spring Model design based on a tensile deformation test and nonlinear material model.

    PubMed

    San-Vicente, Gaizka; Aguinaga, Iker; Tomás Celigüeta, Juan

    2012-02-01

    Mass-Spring Models (MSMs) are used to simulate the mechanical behavior of deformable bodies such as soft tissues in medical applications. Although they are fast to compute, they lack accuracy and their design still remains a great challenge. The major difficulties in building realistic MSMs lie in spring stiffness estimation and topology identification. In this work, the mechanical behavior of MSMs under tensile loads is analyzed before studying spring stiffness estimation. In particular, the qualitative and quantitative analysis of the behavior of cubical MSMs shows that they have a nonlinear response similar to hyperelastic material models. According to this behavior, a new method for spring stiffness estimation valid for linear and nonlinear material models is proposed. This method adjusts the stress-strain and compressibility curves to a given reference behavior. The accuracy of the MSMs designed with this method is tested taking as reference soft-tissue simulations based on the nonlinear Finite Element Method (FEM). The obtained results show that MSMs can be designed to realistically model the behavior of hyperelastic materials such as soft tissues and can become an interesting alternative to other approaches such as nonlinear FEM. PMID:22156291
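
    The stiffness-estimation idea, adjusting lumped spring parameters so that the model reproduces a reference stress-strain curve, can be sketched as a least-squares fit; the reference data and the two-coefficient response below are synthetic stand-ins, not the authors' formulation.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical reference: nominal stress vs. stretch from an FEM tensile test.
        stretch = np.linspace(1.0, 1.3, 16)
        stress_ref = 0.5e5 * (stretch - 1.0) + 2.0e5 * (stretch - 1.0) ** 2  # synthetic, nonlinear

        def msm_stress(stretch, k_lin, k_quad):
            """Lumped stress response of the cube model; the two coefficients play
            the role of spring stiffness parameters to be estimated."""
            strain = stretch - 1.0
            return k_lin * strain + k_quad * strain ** 2

        (k_lin, k_quad), _ = curve_fit(msm_stress, stretch, stress_ref, p0=(1.0e5, 1.0e5))
        print(k_lin, k_quad)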

  1. Design, characterization and modeling of biobased acoustic foams

    NASA Astrophysics Data System (ADS)

    Ghaffari Mosanenzadeh, Shahrzad

    Polymeric open cell foams are widely used as sound absorbers in sectors such as the automobile, aerospace, transportation and building industries, yet there is a need to improve the sound absorption of these foams through understanding the relation between cell morphology and the acoustic properties of porous materials. Due to the complicated microscopic structure of open cell foams, investigating the relation between foam morphology and acoustic properties is rather intricate and still an open problem in the field. The focus of this research is to design and develop biobased open cell foams for acoustic applications to replace conventional petrochemical-based foams, as well as to investigate the link between cell morphology and the macroscopic properties of open cell porous structures. To achieve these objectives, two industrially produced biomaterials, polylactide (PLA) and polyhydroxyalkanoate (PHA), and their composites were examined, and highly porous biobased foams were fabricated by particulate leaching and compression molding. The acoustic absorption capability of these foams was enhanced by utilizing the effect of co-continuous blends to form a bimodal porous structure. To tailor the mechanical and acoustic properties of the biobased foams, blends of PLA and PHA were studied to reach the desired mechanical and viscoelastic properties. To obtain broadband absorption, the cell structure of the porous medium must be appropriately graded. Such porous structures with microstructural gradation are called Functionally Graded Materials (FGM). A novel graded foam structure was designed with superior sound absorption to demonstrate the effect of cell arrangement on the performance of acoustic fixtures. Acoustic measurements were performed in a two-microphone impedance tube, and the acoustic theory of Johnson-Champoux-Allard was applied to the fabricated foams to determine microcellular properties such as tortuosity and viscous and thermal lengths from sound absorption impedance tube
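
    For the impedance-tube measurements mentioned above, the normal-incidence absorption coefficient is commonly obtained with the two-microphone transfer-function method; the Python sketch below implements that standard relation, with the microphone spacing, sample distance and measured transfer function treated as assumed inputs.

        import numpy as np

        def absorption_coefficient(h12, freq_hz, mic_spacing_m, x1_m, c=343.0):
            """Normal-incidence absorption from the two-microphone transfer function
            (ISO 10534-2 style). h12: complex p2/p1, x1: distance from the sample
            face to the farther microphone. Geometry values are assumptions."""
            k = 2.0 * np.pi * freq_hz / c
            h_incident = np.exp(-1j * k * mic_spacing_m)
            h_reflected = np.exp(1j * k * mic_spacing_m)
            reflection = (h12 - h_incident) / (h_reflected - h12) * np.exp(2j * k * x1_m)
            return 1.0 - np.abs(reflection) ** 2

        # e.g. a hypothetical measured transfer function at 1 kHz
        print(absorption_coefficient(0.6 - 0.4j, 1000.0, mic_spacing_m=0.05, x1_m=0.15))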

  2. The Synthetic Biology Open Language (SBOL) provides a community standard for communicating designs in synthetic biology.

    PubMed

    Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M

    2014-06-01

    The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow. PMID:24911500

  4. Evaluating experimental design for soil-plant model selection with Bayesian model averaging

    NASA Astrophysics Data System (ADS)

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang; Gayler, Sebastian

    2013-04-01

    The objective selection of appropriate models for realistic simulations of coupled soil-plant processes is a challenging task since the processes are complex, not fully understood at larger scales, and highly non-linear. Also, comprehensive data sets are scarce, and measurements are uncertain. In the past decades, a variety of different models have been developed that exhibit a wide range of complexity regarding their approximation of processes in the coupled model compartments. We present a method for evaluating experimental design for maximum confidence in the model selection task. The method considers uncertainty in parameters, measurements and model structures. Advancing the ideas behind Bayesian Model Averaging (BMA), the model weights in BMA are perceived as uncertain quantities with assigned probability distributions that narrow down as more data are made available. This allows assessing the power of different data types, data densities and data locations in identifying the best model structure from among a suite of plausible models. The models considered in this study are the crop models CERES, SUCROS, GECROS and SPASS, which are coupled to identical routines for simulating soil processes within the modelling framework Expert-N. The four models considerably differ in the degree of detail at which crop growth and root water uptake are represented. Monte-Carlo simulations were conducted for each of these models considering their uncertainty in soil hydraulic properties and selected crop model parameters. The models were then conditioned on field measurements of soil moisture, leaf-area index (LAI), and evapotranspiration rates (from eddy-covariance measurements) during a vegetation period of winter wheat at the Nellingen site in Southwestern Germany. Following our new method, we derived the BMA model weights (and their distributions) when using all data or different subsets thereof. We discuss to which degree the posterior BMA mean outperformed the prior BMA
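
    The core BMA computation, turning model evidences into posterior model weights, can be written in a few lines; the log-evidence values below are hypothetical placeholders for the four crop models, not results from the study.

        import numpy as np

        def bma_weights(log_evidence, prior=None):
            """Posterior model weights w_k proportional to p(D|M_k) p(M_k), computed
            from log-evidences in a numerically stable way (uniform prior by default)."""
            log_evidence = np.asarray(log_evidence, dtype=float)
            if prior is None:
                prior = np.full(log_evidence.shape, 1.0 / log_evidence.size)
            log_post = log_evidence + np.log(prior)
            log_post -= log_post.max()            # guard against underflow
            w = np.exp(log_post)
            return w / w.sum()

        # hypothetical evidences for CERES, SUCROS, GECROS, SPASS
        print(bma_weights([-120.4, -118.9, -119.6, -125.0]))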

  5. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design: part II. Model application.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    A new stochastic optimization model under modeling uncertainty (SOMUM) and parameter certainty is applied to a practical site located in western Canada. Various groundwater remediation strategies under different significance levels are obtained from the SOMUM model. The impact of modeling uncertainty (proxy-simulator residuals) on optimal remediation strategies is compared to that of parameter uncertainty (arising from physical properties). The results show that the increased remediation cost for mitigating the impact of modeling uncertainty would be higher than that from models where the coefficient of variation of the input parameters approaches 40%. This provides new evidence that the modeling uncertainty in proxy-simulator residuals can hardly be ignored; there is thus a need to investigate and mitigate the impact of such uncertainties on groundwater remediation design. This work would be helpful for lowering the risk of system failure due to potential environmental-standard violation when determining optimal groundwater remediation strategies.

  6. MODeLeR: A Virtual Constructivist Learning Environment and Methodology for Object-Oriented Design

    ERIC Educational Resources Information Center

    Coffey, John W.; Koonce, Robert

    2008-01-01

    This article contains a description of the organization and method of use of an active learning environment named MODeLeR, (Multimedia Object Design Learning Resource), a tool designed to facilitate the learning of concepts pertaining to object modeling with the Unified Modeling Language (UML). MODeLeR was created to provide an authentic,…

  7. Interim Service ISDN Satellite (ISIS) network model for advanced satellite designs and experiments

    NASA Technical Reports Server (NTRS)

    Pepin, Gerard R.; Hager, E. Paul

    1991-01-01

    The Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) Network Model for Advanced Satellite Designs and Experiments describes a model suitable for discrete event simulations. A top-down model design uses the Advanced Communications Technology Satellite (ACTS) as its basis. The ISDN modeling abstractions are added to permit the determination of performance for the NASA Satellite Communications Research (SCAR) Program.

  8. Community Driven Introduction to Social Work Course

    ERIC Educational Resources Information Center

    Riffe, Holly A.; Olson, Carole J.

    2012-01-01

    Introductory courses in undergraduate social work and human services programs have received scant attention in the literature despite having considerable responsibility in their curricula. In addition to providing a foundation for the courses that follow, they introduce students to the diverse settings in which social work and human services…

  9. Role of Modeling When Designing for Absolute Energy Use Intensity Requirements in a Design-Build Framework: Preprint

    SciTech Connect

    Hirsch, A.; Pless, S.; Guglielmetti, R.; Torcellini, P. A.; Okada, D.; Antia, P.

    2011-03-01

    The Research Support Facility was designed to use half the energy of an equivalent minimally code-compliant building and to produce as much renewable energy as it consumes on an annual basis. These energy goals and their substantiation through simulation were explicitly included in the project's fixed firm price design-build contract. The energy model had to be continuously updated during the design process so that it matched the final as-built building to the greatest degree possible. Computer modeling played a key role throughout the design process and in verifying that the contractual energy goals would be met within the specified budget. The main tool was a whole building energy simulation program. Other models were used to provide more detail or to complement the whole building simulation tool. Results from these specialized models were fed back into the main whole building simulation tool to provide the most accurate possible inputs for annual simulations. This paper will detail the models used in the design process and how they informed important program and design decisions on the path from preliminary design to the completed building.

  10. Empirical fitness models for hepatitis C virus immunogen design

    NASA Astrophysics Data System (ADS)

    Hart, Gregory R.; Ferguson, Andrew L.

    2015-12-01

    Hepatitis C virus (HCV) afflicts 170 million people worldwide, 2%–3% of the global population, and kills 350 000 each year. Prophylactic vaccination offers the most realistic and cost-effective hope of controlling this epidemic in the developing world where expensive drug therapies are not available. Despite 20 years of research, the high mutability of the virus and lack of knowledge of what constitutes effective immune responses have impeded development of an effective vaccine. Coupling data mining of sequence databases with spin glass models from statistical physics, we have developed a computational approach to translate clinical sequence databases into empirical fitness landscapes quantifying the replicative capacity of the virus as a function of its amino acid sequence. These landscapes explicitly connect viral genotype to phenotypic fitness, and reveal vulnerable immunological targets within the viral proteome that can be exploited to rationally design vaccine immunogens. We have recovered the empirical fitness landscape for the HCV RNA-dependent RNA polymerase (protein NS5B) responsible for viral genome replication, and validated the predictions of our model by demonstrating excellent accord with experimental measurements and clinical observations. We have used our landscapes to perform exhaustive in silico screening of 16.8 million T-cell immunogen candidates to identify 86 optimal formulations. By reducing the search space of immunogen candidates by over five orders of magnitude, our approach can offer valuable savings in time, expense, and labor for experimental vaccine development and accelerate the search for an HCV vaccine. Abbreviations: HCV—hepatitis C virus, HLA—human leukocyte antigen, CTL—cytotoxic T lymphocyte, NS5B—nonstructural protein 5B, MSA—multiple sequence alignment, PEG-IFN—pegylated interferon.
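
    A fitness landscape of the kind described assigns each sequence an energy built from single-site fields and pairwise couplings, with lower energy corresponding to higher inferred replicative capacity; the Python sketch below evaluates such a Potts-style energy for a toy sequence, with randomly generated (purely illustrative) parameters.

        import numpy as np

        def sequence_energy(seq, h, J):
            """E(s) = -sum_i h_i(s_i) - sum_{i<j} J_ij(s_i, s_j) for a Potts-style model.
            h: (L, q) fields, J: (L, L, q, q) couplings; shapes and values are toy assumptions."""
            L = len(seq)
            energy = -sum(h[i, seq[i]] for i in range(L))
            for i in range(L):
                for j in range(i + 1, L):
                    energy -= J[i, j, seq[i], seq[j]]
            return energy

        rng = np.random.default_rng(1)
        L, q = 6, 4                                   # toy sequence length and alphabet size
        h = rng.normal(size=(L, q))
        J = rng.normal(scale=0.1, size=(L, L, q, q))
        print(sequence_energy(rng.integers(0, q, size=L), h, J))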

  12. Models of Change: The Future of Design Education

    ERIC Educational Resources Information Center

    Baynes, Ken; Baynes, Brochocka

    2010-01-01

    This paper discusses design and design education in the context of four major social and environmental concerns identified by Bruce Archer in 1973: overpopulation; pollution; depletion of natural resources; control. It argues for the social and economic importance of design education in primary and secondary schools. It identifies "designerly…

  13. A Bayesian Nonparametric Causal Model for Regression Discontinuity Designs

    ERIC Educational Resources Information Center

    Karabatsos, George; Walker, Stephen G.

    2013-01-01

    The regression discontinuity (RD) design (Thistlethwaite & Campbell, 1960; Cook, 2008) provides a framework to identify and estimate causal effects from a non-randomized design. Each subject of an RD design is assigned to the treatment (versus assignment to a non-treatment) whenever her/his observed value of the assignment variable equals or…

  14. A model based technique for the design of flight directors. [optimal control models

    NASA Technical Reports Server (NTRS)

    Levison, W. H.

    1973-01-01

    A new technique for designing flight directors is discussed. This technique uses the optimal-control pilot/vehicle model to determine the appropriate control strategy. The dynamics of this control strategy are then incorporated into the director control laws, thereby enabling the pilot to operate at a significantly lower workload. A preliminary design of a control director for maintaining a STOL vehicle on the approach path in the presence of random air turbulence is evaluated. By selecting model parameters in terms of allowable path deviations and pilot workload levels, a set of director laws is achieved which allows improved system performance at reduced workload levels. The pilot acts essentially as a proportional controller with regard to the director signals, and control motions are compatible with those appropriate to status-only displays.

  15. The design-by-treatment interaction model: a unifying framework for modelling loop inconsistency in network meta-analysis.

    PubMed

    Jackson, Dan; Boddington, Paul; White, Ian R

    2016-09-01

    In this note, we clarify and prove the claim made by Higgins et al. that the design-by-treatment interaction model contains all possible loop inconsistency models. This claim provides a strong argument for using the design-by-treatment interaction model to describe loop inconsistencies in network meta-analysis. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd.

  16. A Model of Research Paper Writing Instructional Materials for Academic Writing Course: "Needs & Documents Analysis and Model Design"

    ERIC Educational Resources Information Center

    Ghufron, M. Ali; Saleh, Mursid; Warsono; Sofwan, Ahmad

    2016-01-01

    This study aimed at designing a model of instructional materials for Academic Writing Course focusing on research paper writing. The model was designed based on the Curriculum at the English Education Study Program, Faculty of Language and Art Education of IKIP PGRI Bojonegoro, East Java, Indonesia. This model was developed in order to improve…

  17. Computationally-designed phenylephrine prodrugs - a model for enhancing bioavailability

    NASA Astrophysics Data System (ADS)

    Karaman, Rafik; Karaman, Donia; Zeiadeh, Isra'

    2013-11-01

    DFT calculations at the B3LYP/6-31G(d,p) level for intramolecular proton transfer in a number of Kirby's enzyme models demonstrated that the driving force for the proton transfer efficiency is the distance between the two reactive centres (rGM) and the attack angle (α); the rate of the reaction is linearly correlated with rGM² and sin(180° − α). Based on these results, three phenylephrine prodrugs were designed to provide phenylephrine with higher bioavailability than the parent drug. Using the experimental t1/2 (the time needed for the conversion of 50% of the reactants to products) and EM (effective molarity) values for these processes, the t1/2 values for the conversion of the three prodrugs to the parent drug, phenylephrine, were calculated. The calculated t1/2 values for ProD 1 and ProD 2 were very high (145 days and several years, respectively), whereas that of ProD 3 was found to be about 35 hours. Therefore, the intra-conversion rates of the phenylephrine prodrugs to phenylephrine can be programmed according to the nature of the prodrug linker.
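
    The reported linear correlation of the rate with rGM² and sin(180° − α) can be illustrated with an ordinary least-squares fit over a descriptor table; the distances, angles and log-rates below are hypothetical numbers, not the computed data of the study.

        import numpy as np

        # Hypothetical descriptors for a set of Kirby-type enzyme models:
        # r_GM in angstrom, attack angle alpha in degrees, log(rate) synthetic.
        r_gm = np.array([2.9, 3.1, 3.4, 3.7, 4.0])
        alpha = np.array([158.0, 150.0, 141.0, 132.0, 120.0])
        log_rate = np.array([1.8, 1.1, 0.2, -0.9, -2.1])

        # Linear model log(rate) ~ a*r_GM^2 + b*sin(180 - alpha) + c, solved by least squares.
        X = np.column_stack([r_gm ** 2, np.sin(np.radians(180.0 - alpha)), np.ones_like(r_gm)])
        coef, *_ = np.linalg.lstsq(X, log_rate, rcond=None)
        print(coef)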

  18. Design and modeling of a light powered biomimicry micropump

    NASA Astrophysics Data System (ADS)

    Sze, Tsun-kay Jackie; Liu, Jin; Dutta, Prashanta

    2015-06-01

    The design of compact micropumps to provide steady flow has been an ongoing challenge in the field of microfluidics. In this work, a novel micropump concept is introduced utilizing bacteriorhodopsin and sugar transporter proteins. The micropump utilizes light energy to activate the transporter proteins, which create an osmotic pressure gradient and drive the fluid flow. The capability of the bio-inspired micropump is demonstrated using a quasi-1D numerical model, where the contributions of bacteriorhodopsin and sugar transporter proteins are taken care of by appropriate flux boundary conditions in the flow channel. Proton flux created by the bacteriorhodopsin proteins is compared with experimental results to obtain the appropriate working conditions of the proteins. To identify the pumping capability, we also investigate the influences of several key parameters, such as the membrane fraction of transporter proteins, membrane proton permeability, and the presence of light. Our results show that there is a wide bacteriorhodopsin membrane fraction range (from 0.2 to 10%) over which fluid flow stays nearly at its maximum value. Numerical results also indicate that, for lipid membranes with low proton permeability, the light source can effectively be used to turn fluid flow on and off. This capability allows the micropump to be activated and shut off remotely without bulky support equipment. In comparison with existing micropumps, this pump generates higher pressures than mechanical pumps. It can produce peak fluid flow and shutoff head comparable to other non-mechanical pumps.
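
    The osmotic driving mechanism can be approximated with the van 't Hoff relation; the Python sketch below estimates the volumetric flow for an assumed membrane permeability, area and solute concentration gradient (none of these values come from the paper).

        def osmotic_flow_m3_per_s(delta_c_mol_m3, delta_p_pa, area_m2, lp_m_per_pa_s,
                                  sigma=1.0, temperature_k=298.0):
            """Q = Lp * A * (sigma * R * T * dc - dP): flow driven by an osmotic
            pressure difference (van 't Hoff estimate). All inputs are assumptions."""
            R = 8.314  # J/(mol K)
            delta_pi = R * temperature_k * delta_c_mol_m3
            return lp_m_per_pa_s * area_m2 * (sigma * delta_pi - delta_p_pa)

        print(osmotic_flow_m3_per_s(delta_c_mol_m3=10.0, delta_p_pa=0.0,
                                    area_m2=1.0e-8, lp_m_per_pa_s=1.0e-12))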

  19. Designs for life: protocell models in the laboratory.

    PubMed

    Dzieciol, Alicja J; Mann, Stephen

    2012-01-01

    Compartmentalization of primitive biochemical reactions within membrane-bound water micro-droplets is considered an essential step in the origin of life. In the absence of complex biochemical machinery, the hypothetical precursors to the first biological cells (protocells) would be dependent on the self-organization of their components and physicochemical conditions of the environment to attain a basic level of autonomy and evolutionary viability. Many researchers consider the self-organization of lipid and fatty acid molecules into bilayer vesicles as a simple form of membrane-based compartmentalization that can be developed for the experimental design and construction of plausible protocell models. In this tutorial review, we highlight some of the recent advances and issues concerning the construction of simple cell-like systems in the laboratory. Overcoming many of the current scientific challenges should lead to new types of chemical bio-reactors and artificial cell-like entities, and bring new insights concerning the possible pathways responsible for the origin of life. PMID:21952478

  20. Exascale Co-design for Modeling Materials in Extreme Environments

    SciTech Connect

    Germann, Timothy C.

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  1. Planar CMOS analog SiPMs: design, modeling, and characterization

    NASA Astrophysics Data System (ADS)

    Zou, Yu; Villa, Federica; Bronzi, Danilo; Tisa, Simone; Tosi, Alberto; Zappa, Franco

    2015-11-01

    Silicon photomultipliers (SiPMs) are large area detectors consisting of an array of single-photon-sensitive microcells, which makes SiPMs extremely attractive as substitutes for photomultiplier tubes in many applications. We present the design, fabrication, and characterization of analog SiPMs in a standard planar 0.35 μm CMOS technology, with about 1 mm × 1 mm total area and different kinds of microcells, based on single-photon avalanche diodes with 30 μm diameter reaching 21.0% fill-factor (FF), 50 μm diameter (FF = 58.3%), or 50 μm square active area with rounded corners of 5 μm radius (FF = 73.7%). We also developed an electrical SPICE model for CMOS SiPMs. Our CMOS SiPMs have a 25 V breakdown voltage, in line with most commercial SiPMs, and a higher gain (8.8 × 10⁶, 13.2 × 10⁶, and 15.0 × 10⁶, respectively). Although the dark count rate density is slightly higher than in state-of-the-art analog SiPMs, the proposed standard CMOS processing opens the feasibility of integration with active electronics, for switching hot pixels off, drastically reducing the overall dark count rate, or for further on-chip processing.
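
    The gain figures above are consistent with the usual single-microcell estimate, gain = microcell capacitance × overvoltage / electron charge; the sketch below applies that relation with an assumed 100 fF microcell and 4 V overvoltage (illustrative numbers, not device data).

        def sipm_gain(cell_capacitance_f, v_bias, v_breakdown):
            """Single-microcell gain estimate G = C * (V_bias - V_bd) / e."""
            e_charge = 1.602176634e-19  # C
            return cell_capacitance_f * (v_bias - v_breakdown) / e_charge

        # assumed ~100 fF microcell biased 4 V above a 25 V breakdown
        print(f"G = {sipm_gain(100e-15, 29.0, 25.0):.2e}")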

  2. Designing the optimal convolution kernel for modeling the motion blur

    NASA Astrophysics Data System (ADS)

    Jelinek, Jan

    2011-06-01

    Motion blur acts on an image like a two dimensional low pass filter, whose spatial frequency characteristic depends both on the trajectory of the relative motion between the scene and the camera and on the velocity vector variation along it. When motion during exposure is permitted, the conventional, static notions of both the image exposure and the scene-to-image mapping become unsuitable and must be revised to accommodate the image formation dynamics. This paper develops an exact image formation model for arbitrary object-camera relative motion with arbitrary velocity profiles. Moreover, for any motion the camera may operate in either continuous or flutter shutter exposure mode. Its result is a convolution kernel, which is optimally designed for both the given motion and sensor array geometry, and hence permits the most accurate computational undoing of the blurring effects for the given camera required in forensic and high security applications.
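
    One way to realize such a kernel numerically is to accumulate the dwell time of the relative-motion trajectory on a pixel grid and normalize the result; the Python sketch below does this for an assumed uniform horizontal motion (the trajectory, grid size and sampling density are illustrative choices, not the paper's optimal design).

        import numpy as np

        def motion_blur_kernel(trajectory_xy, kernel_size=15, n_samples=2000):
            """Accumulate dwell time of a trajectory (callable t in [0,1] -> (x, y) in
            pixels, relative to the kernel centre) on a grid and normalise to unit sum."""
            kernel = np.zeros((kernel_size, kernel_size))
            centre = kernel_size // 2
            for t in np.linspace(0.0, 1.0, n_samples):
                x, y = trajectory_xy(t)
                ix, iy = int(round(centre + x)), int(round(centre + y))
                if 0 <= ix < kernel_size and 0 <= iy < kernel_size:
                    kernel[iy, ix] += 1.0
            return kernel / kernel.sum()

        # e.g. uniform horizontal motion spanning 6 pixels during the exposure
        kernel = motion_blur_kernel(lambda t: (6.0 * t - 3.0, 0.0))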

  3. Learning to Be: The Modelling of Art and Design Practice in University Art and Design Teaching

    ERIC Educational Resources Information Center

    Budge, Kylie

    2016-01-01

    Learning to be an artist or designer is a complex process of becoming. Much of the early phase of "learning to be" occurs during the time emerging artists and designers are students in university art/design programmes, both undergraduate and postgraduate. Recent research reveals that a critical role in assisting students in their…

  4. Algorithms of D-optimal designs for Morgan Mercer Flodin (MMF) models with three parameters

    NASA Astrophysics Data System (ADS)

    Widiharih, Tatik; Haryatmi, Sri; Gunardi, Wilandari, Yuciana

    2016-02-01

    The Morgan Mercer Flodin (MMF) model is used in many areas including biological growth studies, animal husbandry, chemistry, finance, pharmacokinetics and pharmacodynamics. Locally D-optimal designs for Morgan Mercer Flodin (MMF) models with three parameters are investigated. We use the Generalized Equivalence Theorem of Kiefer and Wolfowitz to establish the D-optimality criterion. The number of roots of the standardized variance is determined using the Tchebysheff system concept and is used to show that the design is a minimally supported design. For these models, the designs are minimally supported designs with uniform weight on their support, and the upper bound of the design region is a support point.
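
    Local D-optimality amounts to maximizing the determinant of the Fisher information matrix M(ξ) = Σ w_i f(x_i, θ) f(x_i, θ)^T, where f is the parameter gradient of the mean response; the Python sketch below assembles M for a candidate three-point design, using a generic three-parameter growth curve as a stand-in (it is not the MMF response function).

        import numpy as np

        def information_matrix(design_points, weights, grad_f, theta):
            """M(xi) = sum_i w_i f(x_i, theta) f(x_i, theta)^T; a design is locally
            D-optimal if it maximises det M at the nominal parameter values."""
            M = np.zeros((len(theta), len(theta)))
            for x, w in zip(design_points, weights):
                g = grad_f(x, theta)
                M += w * np.outer(g, g)
            return M

        def grad_logistic(x, theta):
            """Parameter gradient of a stand-in three-parameter growth model
            y = a / (1 + b * exp(-c x)); purely illustrative, not the MMF model."""
            a, b, c = theta
            denom = 1.0 + b * np.exp(-c * x)
            return np.array([1.0 / denom,
                             -a * np.exp(-c * x) / denom ** 2,
                             a * b * x * np.exp(-c * x) / denom ** 2])

        M = information_matrix([0.5, 2.0, 6.0], [1/3, 1/3, 1/3], grad_logistic, (10.0, 5.0, 0.8))
        print(np.linalg.det(M))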

  5. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews, and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  6. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    ERIC Educational Resources Information Center

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  7. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1993-01-01

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
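
    A concrete example of such an analytical model is the Keystroke-Level Model, a simplified member of the GOMS family; the sketch below sums commonly cited operator times (assumed here, not taken from the paper) to estimate expert task time for a short interaction sequence.

        # Keystroke-Level Model estimate of expert, error-free task time.
        # Operator times are the commonly cited Card, Moran & Newell values,
        # used here as assumptions rather than figures from this paper.
        KLM_SECONDS = {"K": 0.2,   # keystroke
                       "P": 1.1,   # point with mouse
                       "B": 0.1,   # mouse button press or release
                       "H": 0.4,   # home hands between keyboard and mouse
                       "M": 1.35}  # mental preparation

        def klm_time(operators):
            return sum(KLM_SECONDS[op] for op in operators)

        # e.g. think, point and click a menu, think, type a 4-character field
        print(klm_time(["M", "P", "B", "B", "M", "K", "K", "K", "K"]))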

  8. Design of a component-based integrated environmental modeling framework

    EPA Science Inventory

    Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...

  9. Design Science Research toward Designing/Prototyping a Repeatable Model for Testing Location Management (LM) Algorithms for Wireless Networking

    ERIC Educational Resources Information Center

    Peacock, Christopher

    2012-01-01

    The purpose of this research effort was to develop a model that provides repeatable Location Management (LM) testing using a network simulation tool, QualNet version 5.1 (2011). The model will provide current and future protocol developers a framework to simulate stable protocol environments for development. This study used the Design Science…

  10. Designing a Model of a Digital Ecosystem for Healthcare and Wellness Using the Business Model Canvas.

    PubMed

    León, María Cosio; Nieto-Hipólito, Juan Ivan; Garibaldi-Beltrán, Julián; Amaya-Parra, Guillermo; Luque-Morales, Priscy; Magaña-Espinoza, Pedro; Aguilar-Velazco, José

    2016-06-01

    Wellness is a term often used to describe optimal health as a "dynamic balance of physical, emotional, social, spiritual, and intellectual health," while healthcare refers to the care offered to patients to improve their health. We use both terms, as well as the Business Model Canvas (BMC) methodology, to design a digital ecosystem model for healthcare and wellness called DE4HW; the model considers economic, technological, and legal asymmetries, which are present in e-services beyond geographical regions. The BMC methodology was embedded into the global project strategy called IBOT (Initiate, Build, Operate and Transfer), a methodology to establish a functional, integrated national telemedicine and virtual education network, whose phase rationale we adopted. The results in this work illustrate the design of the DE4HW model in the first phase of IBOT, enriched with the BMC, which enables us to define actors, their interactions, rules, and protocols in order to build DE4HW, while the IBOT strategy manages the project goal up to the transfer phase, where an integral healthcare and wellness service platform is turned over to stakeholders. PMID:27118010

  12. A Learner-Based Design Model for Interactive Multimedia Language Learning Packages.

    ERIC Educational Resources Information Center

    Watts, Noel

    1997-01-01

    Examines the design features of interactive multimedia packages for second language learning. Focuses on the possible components of a design model and highlights the implications for program design. Concludes that to realize the high potential for interactive language learning multimedia, designers must develop a more learner-based orientation.…

  13. Rapid E-learning Development Strategies and a Multimedia Project Design Model

    ERIC Educational Resources Information Center

    Sözcü, Ömer Faruk; Ipek, Ismail

    2014-01-01

    The purpose of the study is to discuss e-learning design strategies which can be used for multimedia projects as a design model. Recent advances in instructional technologies have been found to be very important in the design of training courses by using rapid instructional design (ID) approaches. The approaches were developed to use in training…

  14. The design and implementation of an operational model evaluation system

    SciTech Connect

    Foster, K.T.

    1995-06-01

    An evaluation of an atmospheric transport and diffusion model's operational performance typically involves the comparison of the model's calculations with measurements of an atmospheric pollutant's temporal and spatial distribution. These evaluations, however, often use data from a small number of experiments and may be limited to producing some of the commonly quoted statistics based on the differences between model calculations and the measurements. This paper presents efforts to develop a model evaluation system geared for both the objective statistical analysis and the more subjective visualization of the inter-relationships between a model's calculations and the appropriate field measurement data.
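
    Typical objective statistics for this kind of evaluation include the fractional bias, the normalized mean square error and the fraction of predictions within a factor of two of the observations; the Python sketch below computes this standard set for paired observed and predicted concentrations (the sample values are invented).

        import numpy as np

        def evaluation_stats(observed, predicted):
            """Fractional bias, normalised mean square error, and FAC2 for paired
            observed/predicted values; standard definitions, illustrative use only."""
            o = np.asarray(observed, dtype=float)
            p = np.asarray(predicted, dtype=float)
            fb = 2.0 * (o.mean() - p.mean()) / (o.mean() + p.mean())
            nmse = np.mean((o - p) ** 2) / (o.mean() * p.mean())
            ratio = p / o
            fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))
            return fb, nmse, fac2

        print(evaluation_stats([1.0, 2.0, 4.0, 8.0], [1.2, 1.5, 5.0, 3.0]))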

  15. Modeling Alveolar Epithelial Cell Behavior In Spatially Designed Hydrogel Microenvironments

    NASA Astrophysics Data System (ADS)

    Lewis, Katherine Jean Reeder

    The alveolar epithelium consists of two cell phenotypes, elongated alveolar type I cells (AT1) and rounded alveolar type II cells (ATII), and exists in a complex three-dimensional environment as a polarized cell layer attached to a thin basement membrane and enclosing a roughly spherical lumen. Closely surrounding the alveolar cysts are capillary endothelial cells as well as interstitial pulmonary fibroblasts. Many factors are thought to influence alveolar epithelial cell differentiation during lung development and wound repair, including physical and biochemical signals from the extracellular matrix (ECM), and paracrine signals from the surrounding mesenchyme. In particular, disrupted signaling between the alveolar epithelium and local fibroblasts has been implicated in the progression of several pulmonary diseases. However, given the complexity of alveolar tissue architecture and the multitude of signaling pathways involved, designing appropriate experimental platforms for this biological system has been difficult. In order to isolate key factors regulating cellular behavior, the researcher ideally should have control over biophysical properties of the ECM, as well as the ability to organize multiple cell types within the scaffold. This thesis aimed to develop a 3D synthetic hydrogel platform to control alveolar epithelial cyst formation, which could then be used to explore how extracellular cues influence cell behavior in a tissue-relevant cellular arrangement. To accomplish this, a poly(ethylene glycol) (PEG) hydrogel network containing enzymatically-degradable crosslinks and bioadhesive pendant peptides was employed as a base material for encapsulating primary alveolar epithelial cells. First, an array of microwells of various cross-sectional shapes was photopatterned into a PEG gel containing photo-labile crosslinks, and primary ATII cells were seeded into the wells to examine the role of geometric confinement on differentiation and multicellular arrangement

  16. Aerospace structural design process improvement using systematic evolutionary structural modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Robert Michael

    2000-10-01

    A multidisciplinary team tasked with an aircraft design problem must understand the problem requirements and metrics to produce a successful design. This understanding entails not only knowledge of what these requirements and metrics are, but also how they interact, which are most important (to the customer as well as to aircraft performance), and who in the organization can provide pertinent knowledge for each. In recent years, product development researchers and organizations have developed and successfully applied a variety of tools such as Quality Function Deployment (QFD) to coordinate multidisciplinary team members. The effectiveness of these methods, however, depends on the quality and fidelity of the information that team members can input. In conceptual aircraft design, structural information is of lower quality compared to aerodynamics or performance because it is based on experience rather than theory. This dissertation shows how advanced structural design tools can be used in a multidisciplinary team setting to improve structural information generation and communication through a systematic evolution of structural detail. When applied to conceptual design, finite element-based structural design tools elevate structural information to the same level as other computationally supported disciplines. This improved ability to generate and communicate structural information enables a design team to better identify and meet structural design requirements, consider producibility issues earlier, and evaluate structural concepts. A design process experiment of a wing structural layout in collaboration with an industrial partner illustrates and validates the approach.

  17. Advances in design and modeling of porous materials

    NASA Astrophysics Data System (ADS)

    Ayral, André; Calas-Etienne, Sylvie; Coasne, Benoit; Deratani, André; Evstratov, Alexis; Galarneau, Anne; Grande, Daniel; Hureau, Matthieu; Jobic, Hervé; Morlay, Catherine; Parmentier, Julien; Prelot, Bénédicte; Rossignol, Sylvie; Simon-Masseron, Angélique; Thibault-Starzyk, Frédéric

    2015-07-01

    This special issue of the European Physical Journal Special Topics is dedicated to selected papers from the symposium "High surface area porous and granular materials" organized in the frame of the conference "Matériaux 2014", held on November 24-28, 2014 in Montpellier, France. Porous materials and granular materials gather a wide variety of heterogeneous, isotropic or anisotropic media made of inorganic, organic or hybrid solid skeletons, with open or closed porosity, and pore sizes ranging from the centimeter scale to the sub-nanometer scale. Their technological and industrial applications cover numerous areas from building and civil engineering to microelectronics, including also metallurgy, chemistry, health, waste water and gas effluent treatment. Many emerging processes related to environmental protection and sustainable development also rely on this class of materials. Their functional properties are related to specific transfer mechanisms (matter, heat, radiation, electrical charge), to pore surface chemistry (exchange, adsorption, heterogeneous catalysis) and to retention inside confined volumes (storage, separation, exchange, controlled release). The development of innovative synthesis, shaping, characterization and modeling approaches enables the design of advanced materials with enhanced functional performance. The papers collected in this special issue offer a good overview of the state-of-the-art and science of these complex media. We would like to thank all the speakers and participants for their contribution to the success of the symposium. We also express our gratitude to the organization committee of "Matériaux 2014". We finally thank the reviewers and the staff of the European Physical Journal Special Topics who made the publication of this special issue possible.

  18. Macrocomposite mechanical design, modeling, and behavior of physical models of bioinspired fish armor

    NASA Astrophysics Data System (ADS)

    Browning, Ashley; Ortiz, Christine; Boyce, Mary C.

    2012-02-01

    The macrocomposite design of flexible biological exoskeletons, consisting of overlapping mineralized armor units embedded in a compliant tissue, is a key determinant of their mechanical function (e.g., penetration resistance and biomechanical flexibility). Here, we investigate the role of macrocomposite structure, composition, geometric orientation, and spatial distribution in a flexible model natural armor system present in the majority of teleost fish species. Physical multi-material composite models are fabricated using a combination of 3-D printing and molding methods. Mechanical experiments using digital image correlation enable measurement of both the macroscopic response and the underlying deformation mechanisms during various loading scenarios. Finite element-based mechanical models yield detailed insights into the roles and tradeoffs of the composite structure, which provides constraint, shear, and bending mechanisms to impart protection and flexibility.

  19. Aligning Goal and Value Models for Information System Design

    NASA Astrophysics Data System (ADS)

    Edirisuriya, Ananda; Zdravkovic, Jelena

    The success of process-aware information systems and web services depends heavily on their ability to work as catalysts for the business values that are exchanged in a business model. The motivation for a business model can be found in the goals of an enterprise, which are made explicit in a goal model. From the IT perspective, goal and business models form part of a chain of models ending with an information system model. Thereby, analyzing and establishing the alignment of business models with goal models is a starting task on the way to a business-aware information system. This paper discusses the alignment of value-based business models with system-oriented goal models. The result is a set of transformation rules between the two models. A case study from the health sector is used to illustrate how we ground and apply our contribution.

  20. Teaching Improvement Model Designed with DEA Method and Management Matrix

    ERIC Educational Resources Information Center

    Montoneri, Bernard

    2014-01-01

    This study uses student evaluation of teachers to design a teaching improvement matrix based on teaching efficiency and performance by combining management matrix and data envelopment analysis. This matrix is designed to formulate suggestions to improve teaching. The research sample consists of 42 classes of freshmen following a course of English…

  1. A Model for Peer Review in Instructional Design.

    ERIC Educational Resources Information Center

    Casey, Carl; And Others

    1996-01-01

    Describes an instructional design review process in which peers offer feedback through "structured walkthroughs." Discusses types of reviews, guidelines, and success factors, and summarizes a formal peer review structure developed and tested at Hewlett-Packard Company. Two tables present advantages and disadvantages of types of design reviews and…

  2. Validation of mission critical software design and implementation using model checking

    NASA Technical Reports Server (NTRS)

    Pingree, P. J.; Mikk, E.; Holzmann, G.; Smith, M.; Dams, D.

    2002-01-01

    Model Checking conducts an exhaustive exploration of all possible behaviors of a software system design and as such can be used to detect defects in designs that are typically difficult to discover with conventional testing approaches.
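    As a rough illustration of the exhaustive exploration described above, the sketch below enumerates every reachable state of a toy two-process protocol by breadth-first search and reports a counterexample trace if a safety property is violated; the toy system, the property, and all names are illustrative assumptions, not the tools or flight-software designs used in the cited work.

    from collections import deque

    # Each process is 'idle', 'waiting', or 'critical'; a shared turn flag
    # arbitrates entry into the critical section.
    def successors(state):
        turn = state[2]
        for i in (0, 1):
            if state[i] == 'idle':
                yield _set(state, i, 'waiting', turn)
            elif state[i] == 'waiting' and turn == i:
                yield _set(state, i, 'critical', turn)
            elif state[i] == 'critical':
                yield _set(state, i, 'idle', 1 - i)   # hand the turn over on exit

    def _set(state, i, value, turn):
        procs = list(state[:2])
        procs[i] = value
        return (procs[0], procs[1], turn)

    def check_safety(initial, is_bad):
        """Breadth-first exploration of all reachable states; returns a
        counterexample trace if some reachable state violates the property."""
        frontier = deque([(initial, [initial])])
        visited = {initial}
        while frontier:
            state, trace = frontier.popleft()
            if is_bad(state):
                return trace
            for nxt in successors(state):
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append((nxt, trace + [nxt]))
        return None   # the property holds in every reachable state

    both_critical = lambda s: s[0] == 'critical' and s[1] == 'critical'
    print(check_safety(('idle', 'idle', 0), both_critical))   # None: mutual exclusion holds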

  3. A new model for broadband waveguide to microstrip transition design

    NASA Technical Reports Server (NTRS)

    Ponchak, George E.; Downey, Alan N.

    1986-01-01

    A new model is presented which permits the prediction of the resonant frequencies created by antipodal finline waveguide to microstrip transitions. The transition is modeled as a tapered transmission line in series with an infinite set of coupled resonant circuits. The resonant circuits are modeled as simple microwave resonant cavities of which the resonant frequencies are easily determined. The model is developed and the resonant frequencies determined for several different transitions. Experimental results are given to confirm the models.
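    For reference, the resonant frequencies of a simple air-filled rectangular cavity, which is the kind of easily determined cavity calculation the abstract alludes to, take the standard form below (in LaTeX notation); the symbols are generic assumptions (a, b, d are the cavity dimensions, c the speed of light, and m, n, p the mode indices) and are not taken from the paper itself.

    f_{mnp} = \frac{c}{2}\sqrt{\left(\frac{m}{a}\right)^{2} + \left(\frac{n}{b}\right)^{2} + \left(\frac{p}{d}\right)^{2}},
    \qquad m, n, p = 0, 1, 2, \dots \ (\text{at most one index equal to zero}).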

  4. A Living-Systems Design Model for Web-based Knowledge Management Systems.

    ERIC Educational Resources Information Center

    Plass, Jan L.; Salisbury, Mark W.

    2002-01-01

    Reviews currently available instructional systems design models and describes a new design model for Web-based knowledge management (KM) systems, based on a living-systems approach, and the mechanisms it contains for accommodating change and growth. Illustrates the application of the phases of the model in the development of a KM system with…

  5. SSC dipole long magnet model cryostat design and initial production experience

    SciTech Connect

    Niemann, R.C.; Carson, J.A.; Engler, N.H.; Gonczy, J.D.; Nicol, T.H.

    1986-06-01

    The SSC dipole magnet development program includes the design and construction of full length magnet models for heat leak and magnetic measurements and for the evaluation of the performance of strings of magnets. The design of the model magnet cryostat is presented and the production experiences for the initial long magnet model, a heat leak measurement device, are related.

  6. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    ERIC Educational Resources Information Center

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  7. Distance Education for the Gifted and Talented: An Interactive Design Model.

    ERIC Educational Resources Information Center

    McKinnon, David H.; Nolan, C. J. Patrick

    1999-01-01

    Discusses development of an Australian distance-education cosmology course that employs an interactive design model and an extensive communication system. The way the model is used to organize, sequence, and deliver the course is explained. Discussion addresses how the model might be used to design other courses. (Author/CR)

  8. Design-Comparable Effect Sizes in Multiple Baseline Designs: A General Modeling Framework

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Hedges, Larry V.; Shadish, William R.

    2014-01-01

    In single-case research, the multiple baseline design is a widely used approach for evaluating the effects of interventions on individuals. Multiple baseline designs involve repeated measurement of outcomes over time and the controlled introduction of a treatment at different times for different individuals. This article outlines a general…

  9. Design, experimentation, and modeling of a novel continuous biodrying process

    NASA Astrophysics Data System (ADS)

    Navaee-Ardeh, Shahram

    Massive production of sludge in the pulp and paper industry has made effective sludge management an increasingly critical issue for the industry due to high landfill and transportation costs, and complex regulatory frameworks for options such as sludge landspreading and composting. Sludge dewatering challenges are exacerbated at many mills due to improved in-plant fiber recovery coupled with increased production of secondary sludge, leading to a mixed sludge with a high proportion of biological matter which is difficult to dewater. In this thesis, a novel continuous biodrying reactor was designed and developed for drying pulp and paper mixed sludge to an economic dry solids level so that the dried sludge can be economically and safely combusted in a biomass boiler for energy recovery. In all experimental runs the economic dry solids level was achieved, proving the process successful. In the biodrying process, in addition to the forced aeration, the drying rates are enhanced by biological heat generated through the microbial activity of mesophilic and thermophilic microorganisms naturally present in the porous matrix of mixed sludge. This makes the biodrying process more attractive compared to conventional drying techniques because the reactor is a self-heating process. The reactor is divided into four nominal compartments and the mixed sludge dries as it moves downward in the reactor. The residence times were 4-8 days, which are 2-3 times shorter than the residence times achieved in a batch biodrying reactor previously studied by our research group for mixed sludge drying. A process variable analysis was performed to determine the key variable(s) in the continuous biodrying reactor. Several variables were investigated, namely: type of biomass feed, pH of biomass, nutrition level (C/N ratio), residence times, recycle ratio of biodried sludge, and outlet relative humidity profile along the reactor height. The key variables that were identified in the continuous

  11. Structural modelling and control design under incomplete parameter information: The maximum-entropy approach

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.

    1983-01-01

    A stochastic structural control model is described. In contrast to the customary deterministic model, the stochastic minimum data/maximum entropy model directly incorporates the least possible a priori parameter information. The approach is to adopt this model as the basic design model, thus incorporating the effects of parameter uncertainty at a fundamental level, and design mean-square optimal controls (that is, choose the control law to minimize the average of a quadratic performance index over the parameter ensemble).
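    A generic statement of the mean-square optimal design problem described above might read, in LaTeX notation, as follows; the notation is an illustrative assumption rather than the report's own, with \alpha denoting the uncertain structural parameters and \mathbb{E}_{\alpha} the average over the parameter ensemble.

    \min_{u}\; J(u) = \mathbb{E}_{\alpha}\!\left[\lim_{T\to\infty}\frac{1}{T}\int_{0}^{T}\bigl(x^{\mathsf T}R_{1}\,x + u^{\mathsf T}R_{2}\,u\bigr)\,dt\right],
    \qquad \dot{x} = A(\alpha)\,x + B(\alpha)\,u .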

  12. High Altitude Venus Operations Concept Trajectory Design, Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Lugo, Rafael A.; Ozoroski, Thomas A.; Van Norman, John W.; Arney, Dale C.; Dec, John A.; Jones, Christopher A.; Zumwalt, Carlie H.

    2015-01-01

    A trajectory design and analysis that describes aerocapture, entry, descent, and inflation of manned and unmanned High Altitude Venus Operation Concept (HAVOC) lighter-than-air missions is presented. Mission motivation, concept of operations, and notional entry vehicle designs are presented. The initial trajectory design space is analyzed and discussed before investigating specific trajectories that are deemed representative of a feasible Venus mission. Under the project assumptions, while the high-mass crewed mission will require further research into aerodynamic decelerator technology, it was determined that the unmanned robotic mission is feasible using current technology.

  13. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model and control design are all optimized simultaneously. Parallel research done by other researchers is reviewed. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization.

  14. Dynamic modeling, property investigation, and adaptive controller design of serial robotic manipulators modeled with structural compliance

    NASA Technical Reports Server (NTRS)

    Tesar, Delbert; Tosunoglu, Sabri; Lin, Shyng-Her

    1990-01-01

    Research results on general serial robotic manipulators modeled with structural compliances are presented. Two compliant manipulator modeling approaches, distributed and lumped parameter models, are used in this study. System dynamic equations for both compliant models are derived by using the first and second order influence coefficients. Also, the properties of compliant manipulator system dynamics are investigated. One of the properties, which is defined as inaccessibility of vibratory modes, is shown to display a distinct character associated with compliant manipulators. This property indicates the impact of robot geometry on the control of structural oscillations. Example studies are provided to illustrate the physical interpretation of inaccessibility of vibratory modes. Two types of controllers are designed for compliant manipulators modeled by either lumped or distributed parameter techniques. In order to maintain the generality of the results, no linearization is introduced. Example simulations are given to demonstrate the controller performance. The second type of controller is also built for general serial robot arms and is adaptive in nature: it can estimate uncertain payload parameters on-line while simultaneously maintaining trajectory tracking. The relation between manipulator motion tracking capability and convergence of parameter estimation is discussed through example case studies. The effect of control input update delays on adaptive controller performance is also studied.

  15. Modeling Applications to Inform Hydromodification Management Design Decisions

    NASA Astrophysics Data System (ADS)

    Goodman, J.

    2013-12-01

    Hydromodification is defined as changes in runoff characteristics and in-stream processes caused by altered land use. The impact of hydromodification can manifest itself through adjustment of stream morphology via channel incision, widening, planform alteration, or coarsening of the bed material. The state of the practice for hydromodification management in California and Western Washington for new and re-development has been to mimic pre-development site hydrology. The theory is that if the pre-development distribution of in-stream flows is maintained, then the baseline capacity to transport sediment, a proxy for the geomorphic condition, will be maintained as well. A popular method of mimicking the pre-development flow regime is to maintain the pre-development frequency distribution of runoff, known as flow duration control. This can be done by routing post-development runoff through structural stormwater facilities (BMPs) such that runoff is stored and slowly released to match pre-development flow duration characteristics. As it turns out, storage requirements for hydromodification control tend to be much larger than those for surface water treatment requirements (see nomograph). As regulatory requirements for hydromodification evolve and begin to spread to other parts of the country, it is necessary that scientists, water resources professionals, and policy makers understand the practical challenges of implementing hydromodification controls, including the sizing and cost constraints, and know about innovations which could make hydromodification controls more feasible to implement. In an effort to provide the audience with this better understanding, this presentation will share a step-by-step approach for predicting long-term hydromodification impacts; demonstrate options for mitigating these impacts within the context of the modeling approach; and discuss sizing sensitivities of LID-type hydromodification control structural BMPs as a function of performance
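    As a rough sketch of the flow-duration comparison implied by this approach, the snippet below builds exceedance curves for pre- and post-development runoff series; flow duration control aims to make the mitigated post-development curve track the pre-development curve over the managed flow range. The series and numbers are made-up illustrative values, not the presentation's data or model.

    import numpy as np

    def flow_duration_curve(flows):
        """Return (exceedance probability, flows sorted high to low)."""
        flows = np.sort(np.asarray(flows, float))[::-1]
        exceedance = np.arange(1, len(flows) + 1) / (len(flows) + 1.0)
        return exceedance, flows

    rng = np.random.default_rng(0)
    pre_dev = rng.lognormal(mean=0.0, sigma=0.8, size=1000)   # baseline runoff series
    post_dev = 1.6 * pre_dev                                  # flashier, unmitigated runoff

    p_pre, q_pre = flow_duration_curve(pre_dev)
    p_post, q_post = flow_duration_curve(post_dev)
    # flow duration control aims to drive this mismatch toward zero
    print(np.max(np.abs(q_post - q_pre) / q_pre))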

  16. Inverse problems in the design, modeling and testing of engineering systems

    NASA Technical Reports Server (NTRS)

    Alifanov, Oleg M.

    1991-01-01

    Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.

  17. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Barbak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent engineering design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED team; in addition to a point design, the team develops a model of the local trade space. The process balances the power of model-developing tools against the creativity of human experts, enabling the development of a variety of trade models for any space mission.

  18. Science Yield Modeling with EXOSIMS

    NASA Astrophysics Data System (ADS)

    Garrett, Daniel; Savransky, Dmitry

    2016-01-01

    Accurately modeling the science yield of an exoplanet direct imaging mission to build confidence in the achievement of science goals can be almost as complicated as designing the mission itself. It is challenging to compare science simulation results and systematically test the effects of changing instrument or mission designs. EXOSIMS (Exoplanet Open-Source Imaging Mission Simulator) addresses this by generating ensembles of mission simulations for exoplanet direct imaging missions to estimate distributions of science yield. EXOSIMS consists of stand-alone modules written in Python which may be individually modified without requiring modifications to the code elsewhere. This structure allows for user-driven, systematic exploration of the effects of changing designs on the estimated science yield. The modules of EXOSIMS are classified as either input or simulation modules. Input modules contain specific mission design parameters and functions. These include Planet Population, Star Catalog, Optical System, Zodiacal Light, Planet Physical Model, Observatory, Time Keeping, and Post-Processing. Simulation modules perform tasks requiring input from one or more input modules as well as calling functions from other simulation modules. These include Completeness, Target List, Simulated Universe, Survey Simulation, and Survey Ensemble. The required parameters and functionality of each of these modules are defined in the documentation for EXOSIMS. EXOSIMS is available to the public at https://github.com/dsavransky/EXOSIMS. Included in the documentation is an interface control document which defines the required inputs and outputs of each input and simulation module. Future development of EXOSIMS is intended to be community-driven. Mission planners and instrument designers may quickly write their own modules, following the guidelines in the interface control document, and drop them directly into the code without making additional modifications elsewhere, a plug-in pattern sketched below. It is expected that EXOSIMS
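    The drop-in module structure described above can be illustrated by the following sketch, in which a simulation step depends only on an abstract interface so that alternative input modules can be substituted without touching the rest of the code; the class and method names are hypothetical illustrations, not the actual EXOSIMS interface control document.

    from abc import ABC, abstractmethod

    class OpticalSystem(ABC):
        """Input module: every implementation must expose the same interface."""
        @abstractmethod
        def contrast(self, separation_arcsec: float) -> float:
            ...

    class BaselineCoronagraph(OpticalSystem):
        def contrast(self, separation_arcsec: float) -> float:
            # flat design contrast outside a 0.1-arcsec inner working angle
            return 1e-10 if separation_arcsec > 0.1 else 1.0

    class DegradedCoronagraph(OpticalSystem):
        def contrast(self, separation_arcsec: float) -> float:
            return 5e-10 if separation_arcsec > 0.15 else 1.0

    def survey_simulation(optical_system: OpticalSystem, separations):
        """Simulation module: depends only on the abstract interface, so a new
        OpticalSystem can be dropped in without modifying this code."""
        return [s for s in separations if optical_system.contrast(s) < 1e-9]

    targets = [0.05, 0.12, 0.2, 0.4]          # planet-star separations in arcsec
    print(survey_simulation(BaselineCoronagraph(), targets))   # [0.12, 0.2, 0.4]
    print(survey_simulation(DegradedCoronagraph(), targets))   # [0.2, 0.4]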

  19. Design and Test of Fan/Nacelle Models Quiet High-Speed Fan Design

    NASA Technical Reports Server (NTRS)

    Miller, Christopher J. (Technical Monitor); Repp, Russ; Gentile, David; Hanson, David; Chunduru, Srinivas

    2003-01-01

    The primary objective of the Quiet High-Speed Fan (QHSF) program was to develop an advanced high-speed fan design that will achieve a 6 dB reduction in overall fan noise over a baseline configuration while maintaining similar performance. The program applies and validates acoustic, aerodynamic, aeroelastic, and mechanical design tools developed by NASA, US industry, and academia. The successful fan design will be used in an AlliedSignal Engines (AE) advanced regional engine to be marketed in the year 2000 and beyond. This technology is needed to maintain US industry leadership in the regional turbofan engine market.

  20. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  1. Design protocols and analytical strategies that incorporate structural reliability models

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.

    1995-01-01

    In spite of great improvements in accuracy through the use of computers, design methods, which can be equally critical in establishing the commercial success of a material, have been treated as afterthoughts. Early investment in design and development technologies can easily reduce manufacturing costs later in the product cycle. To avoid lengthy product development times for ceramic composites, funding agencies for materials research must commit resources to support design and development technologies early in the material life cycle. These technologies need not focus on designing the material; rather, the technology must focus on designing with the material, i.e., developing methods to design components fabricated from the new material. Thus a basic tenet that motivated this research effort is that a persistent need exists for improvements in the analysis of components fabricated from CMC material systems. From an aerospace design engineer's perspective, the new generation of ceramic composites offers a significant potential for raising the thrust/weight ratio and reducing NOx emissions of gas turbine engines. Continuous ceramic fiber composites exhibit an increase in work of fracture, which allows for 'graceful' rather than catastrophic failure. When loaded in the fiber direction, these composites retain substantial strength capacity beyond the initiation of transverse matrix cracking despite the fact that neither of their constituents would exhibit such behavior if tested alone. As additional load is applied beyond first matrix cracking, the matrix tends to break in a series of cracks bridged by the ceramic fibers. Thus any additional load is borne increasingly by the fibers until the ultimate strength of the composite is reached. Establishing design protocols that enable the engineer to analyze and predict this type of behavior in ceramic composites was the general goal of this project.

  2. Design of a Model of Knee Joint for Educational Purposes

    ERIC Educational Resources Information Center

    Jastaniah, Saddig; Alganmi, Ohud

    2016-01-01

    Models play an important role by simulating bone, obviating the need to experiment on humans or animals. The aim of the present study was to test whether local materials such as gypsum and wax can be used to produce a knee model matching bone density, and to explore how students can come to understand function through a model-based…

  3. Design and Development of a Microscopic Model for Polarization

    ERIC Educational Resources Information Center

    Petridou, E.; Psillos, D.; Hatzikraniotis, E.; Viiri, J.

    2009-01-01

    As research shows that the knowledge and use of models and modelling by teachers is limited, particularly for predicting phenomena, we developed and applied a sequence of three representations of a simulated model focusing on polarization and specifically showing the behaviour of an atom, and forces exerted on a dipole and an insulator, when a…

  4. Challenges in Educational Modelling: Expressiveness of IMS Learning Design

    ERIC Educational Resources Information Center

    Caeiro-Rodriguez, Manuel; Anido-Rifon, Luis; Llamas-Nistal, Martin

    2010-01-01

    Educational Modelling Languages (EMLs) have been proposed to enable the authoring of models of "learning units" (e.g., courses, lessons, lab practices, seminars) covering the broad variety of pedagogical approaches. In addition, some EMLs have been proposed as computational languages that support the processing of learning unit models by…

  5. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.

  6. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    NASA Technical Reports Server (NTRS)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments which were run this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.

  7. Analytical models for use in fan inflow control structure design. Inflow distortion and acoustic transmission models

    NASA Technical Reports Server (NTRS)

    Gedge, M. R.

    1979-01-01

    Analytical models were developed to study the effect of flow contraction and screening on inflow distortions to identify qualitative design criteria. Results of the study are that: (1) static testing distortions are due to atmospheric turbulence, nacelle boundary layer, exhaust flow reingestion, flow over the stand, ground plane, and engine casing; (2) flow contraction initially suppresses turbulent axial velocity distortions and magnifies turbulent transverse velocity distortions; (3) perforated plate and gauze screens suppress axial components of velocity distortions to a degree determined by the screen pressure loss coefficient; (4) honeycomb screens suppress transverse components of velocity distortions to a degree determined by the length-to-diameter ratio of the honeycomb; (5) acoustic transmission loss of perforated plate is controlled by the reactance of its acoustic impedance; (6) acoustic transmission loss of honeycomb screens is negligible; and (7) a model for the direction change due to a corner between honeycomb panels compares favorably with measured data.

  8. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work sponsored by JPL and other organizations to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model and control design are all optimized simultaneously. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization. Recommendations are also presented for near-term research activities at JPL. The key recommendation is to continue the development of integrated dynamic modeling/control design techniques, with special attention given to the development of structural models specially tailored to support design.

  9. Improving variable-fidelity modelling by exploring global design space and radial basis function networks for aerofoil design

    NASA Astrophysics Data System (ADS)

    Tyan, Maxim; Van Nguyen, Nhu; Lee, Jae-Woo

    2015-07-01

    The global variable-fidelity modelling (GVFM) method presented in this article extends the original variable-complexity modelling (VCM) algorithm that uses a low-fidelity and scaling function to approximate a high-fidelity function for efficiently solving design-optimization problems. GVFM uses the design of experiments to sample values of high- and low-fidelity functions to explore global design space and to initialize a scaling function using the radial basis function (RBF) network. This approach makes it possible to remove high-fidelity-gradient evaluation from the process, which makes GVFM more efficient than VCM for high-dimensional design problems. The proposed algorithm converges with 65% fewer high-fidelity function calls for a one-dimensional problem than VCM and approximately 80% fewer for a two-dimensional numerical problem. The GVFM method is applied for the design optimization of transonic and subsonic aerofoils. Both aerofoil design problems show design improvement with a reasonable number of high- and low-fidelity function evaluations.
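    A minimal sketch of the idea, assuming a Gaussian radial basis function correction added to the low-fidelity function and fitted to sampled high/low-fidelity differences, is shown below; the test functions, sample counts, and kernel width are illustrative stand-ins, not the published GVFM algorithm or its aerofoil cases.

    import numpy as np

    def f_high(x):   # expensive high-fidelity model (stand-in)
        return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

    def f_low(x):    # cheap low-fidelity model (stand-in)
        return 0.5 * f_high(x) + 10 * (x - 0.5) - 5

    def fit_rbf(x_train, y_train, eps=2.0):
        """Fit Gaussian-RBF weights by solving the interpolation system."""
        r = np.abs(x_train[:, None] - x_train[None, :])
        phi = np.exp(-(eps * r) ** 2)
        w = np.linalg.solve(phi, y_train)
        return lambda x: np.exp(-(eps * np.abs(x[:, None] - x_train[None, :])) ** 2) @ w

    # design of experiments: a handful of expensive samples define the scaling function
    x_doe = np.linspace(0.0, 1.0, 6)
    scaling = fit_rbf(x_doe, f_high(x_doe) - f_low(x_doe))

    def f_vfm(x):
        """Variable-fidelity surrogate: low-fidelity model plus RBF correction."""
        return f_low(x) + scaling(x)

    x_test = np.linspace(0.0, 1.0, 5)
    print(np.max(np.abs(f_vfm(x_test) - f_high(x_test))))   # surrogate error at test points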

  10. Design for an efficient dynamic climate model with realistic geography

    NASA Technical Reports Server (NTRS)

    Suarez, M. J.; Abeles, J.

    1984-01-01

    Long-term climate sensitivity studies that include realistic atmospheric dynamics are severely restricted by the expense of integrating atmospheric general circulation models. An alternative, exemplified by models used at GSFC, is a dynamic model of much lower horizontal or vertical resolution. The model of Held and Suarez uses only two levels in the vertical and, although it has conventional grid resolution in the meridional direction, horizontal resolution is reduced by keeping only a few degrees of freedom in the zonal wavenumber spectrum. Without zonally asymmetric forcing this model simulates a day in roughly 1/2 second on a CRAY. The model under discussion is a fully finite-differenced, zonally asymmetric version of the Held-Suarez model. It is anticipated that speeds of a few seconds per simulated day can be obtained, roughly 50 times faster than moderate-resolution, multilayer GCMs.

  11. Sarnoff JND Vision Model for Flat-Panel Design

    NASA Technical Reports Server (NTRS)

    Brill, Michael H.; Lubin, Jeffrey

    1998-01-01

    This document describes adaptation of the basic Sarnoff JND Vision Model created in response to the NASA/ARPA need for a general-purpose model to predict the perceived image quality attained by flat-panel displays. The JND model predicts the perceptual ratings that humans will assign to a degraded color-image sequence relative to its nondegraded counterpart. Substantial flexibility is incorporated into this version of the model so it may be used to model displays at the sub-pixel and sub-frame level. To model a display (e.g., an LCD), the input-image data can be sampled at many times the pixel resolution and at many times the digital frame rate. The first stage of the model downsamples each sequence in time and in space to physiologically reasonable rates, but with minimum interpolative artifacts and aliasing. Luma and chroma parts of the model generate (through multi-resolution pyramid representation) a map of differences between test and reference, called the JND map, from which a summary rating predictor is derived. The latest model extensions have done well in calibration against psychophysical data and against image-rating data given a CRT-based front end. The software was delivered to NASA Ames and is being integrated with LCD display models at that facility.

  12. Cognitive behavioral game design: a unified model for designing serious games

    PubMed Central

    Starks, Katryna

    2014-01-01

    Video games have a unique ability to engage, challenge, and motivate, which has led teachers, psychology specialists, political activists and health educators to find ways of using them to help people learn, grow and change. Serious games, as they are called, are defined as games that have a primary purpose other than entertainment. However, it is challenging to create games that both educate and entertain. While game designers have embraced some psychological concepts such as flow and mastery, understanding how these concepts work together within established psychological theory would assist them in creating effective serious games. Similarly, game design professionals have understood the propensity of video games to teach while lamenting that educators do not understand how to incorporate educational principles into game play in a way that preserves the entertainment. Bandura's (2006) social cognitive theory (SCT) has been used successfully to create video games that create positive behavior outcomes, and teachers have successfully used Gardner’s (1983) theory of multiple intelligences (MIs) to create engaging, immersive learning experiences. Cognitive behavioral game design is a new framework that incorporates SCT and MI with game design principles to create a game design blueprint for serious games. PMID:24550858

  13. Cognitive behavioral game design: a unified model for designing serious games.

    PubMed

    Starks, Katryna

    2014-01-01

    Video games have a unique ability to engage, challenge, and motivate, which has led teachers, psychology specialists, political activists and health educators to find ways of using them to help people learn, grow and change. Serious games, as they are called, are defined as games that have a primary purpose other than entertainment. However, it is challenging to create games that both educate and entertain. While game designers have embraced some psychological concepts such as flow and mastery, understanding how these concepts work together within established psychological theory would assist them in creating effective serious games. Similarly, game design professionals have understood the propensity of video games to teach while lamenting that educators do not understand how to incorporate educational principles into game play in a way that preserves the entertainment. Bandura's (2006) social cognitive theory (SCT) has been used successfully to create video games that create positive behavior outcomes, and teachers have successfully used Gardner's (1983) theory of multiple intelligences (MIs) to create engaging, immersive learning experiences. Cognitive behavioral game design is a new framework that incorporates SCT and MI with game design principles to create a game design blueprint for serious games.

  15. Design and Modeling of Pulsed Power Accelerators Via Circuit Analysis

    1996-12-05

    SCREAMER simulates electrical circuits which may contain elements of variable resistance, capacitance and inductance. The user may add variable circuit elements in a simulation by choosing from a library of models or by writing a subroutine describing the element. Transmission lines, magnetically insulated transmission lines (MITLs) and arbitrary voltage and current sources may also be included. Transmission lines are modeled using pi-sections connected in series. Many models of switches and loads are included.
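    The pi-section representation of transmission lines mentioned above can be illustrated with the usual lumped-line identities: a lossless line of characteristic impedance Z0 and one-way delay tau split into N sections has L = Z0*tau/N of series inductance and C = tau/(Z0*N) of shunt capacitance per section, with the capacitance split between the two legs of each pi. The helper below is an illustrative sketch, not code from SCREAMER, and the example values are made up.

    def pi_section_values(z0_ohms: float, tau_seconds: float, n_sections: int):
        """Return (series L per section, shunt C per leg) for a lossless line."""
        L = z0_ohms * tau_seconds / n_sections        # series inductance [H]
        C = tau_seconds / (z0_ohms * n_sections)      # total shunt capacitance per section [F]
        return L, C / 2.0                             # a pi-section splits C between its two legs

    # 10-ohm, 50-ns line approximated with 20 pi-sections in series
    L, C_half = pi_section_values(10.0, 50e-9, 20)
    print(f"L = {L * 1e9:.1f} nH per section, C/2 = {C_half * 1e12:.1f} pF per leg")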

  16. An Ecological Perspective and Model for Campus Design

    ERIC Educational Resources Information Center

    Banning, James H.; Kaiser, Leland

    1974-01-01

    The authors introduce the concept of "ecosystems." An ecosystem is one in which there is a true transaction between mutually dependent partners, with the assumption on college campuses that either may change so that mutual benefit may result. A model for bringing about change is presented, and methodology for using the model is described.…

  17. Designing Sensor Networks by a Generalized Highly Optimized Tolerance Model

    NASA Astrophysics Data System (ADS)

    Miyano, Takaya; Yamakoshi, Miyuki; Higashino, Sadanori; Tsutsui, Takako

    A variant of the highly optimized tolerance model is applied to a toy problem of bioterrorism to determine the optimal arrangement of hypothetical bio-sensors to avert epidemic outbreak. A nonlinear loss function is utilized in searching for the optimal structure of the sensor network. The proposed method successfully averts disastrously large events, which cannot be achieved by the original highly optimized tolerance model.

  18. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to insure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross section frame members having design details characteristic of the baseline ATCAS crown design.

  19. Comparison between Kemp, Smith & Ragan, Dick & Carey's Instructional Design Models

    ERIC Educational Resources Information Center

    Birgili, Bengi

    2013-01-01

    Instructional design (ID) is systematic way of suggesting a structure and giving meaning to an instructional problem by helping to visualize the problem and breaking into discrete and manageable units. In addition, ID is a systematic reflective process of applying instructional principles into plans by material, activity, resources and evaluation…

  20. Teaching Construction: A Design-Based Course Model

    ERIC Educational Resources Information Center

    Love, Tyler S.; Salgado, Carlos A.

    2016-01-01

    The focus on construction in T&E education has drastically changed. This article presents a series of topics and design-based labs that can be taught at various grade levels to integrate STEM concepts while also increasing students' overall awareness of construction and structural technologies.

  1. Designing Online Workshops: Using an Experiential Learning Model

    ERIC Educational Resources Information Center

    Lynch, Sherry K.; Kogan, Lori R.

    2004-01-01

    This article describes 4 online workshops designed to assist college students with improving their time management, textbook reading, memory and concentration, and overall academic performance. These workshops were created to work equally well with imaginative, analytic, common-sense, and dynamic learners. Positive student feedback indicated that…

  2. Applying Learning Theories and Instructional Design Models for Effective Instruction

    ERIC Educational Resources Information Center

    Khalil, Mohammed K.; Elkhider, Ihsan A.

    2016-01-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning…

  3. A Model for Designing Library Instruction for Distance Learning

    ERIC Educational Resources Information Center

    Rand, Angela Doucet

    2013-01-01

    Providing library instruction in distance learning environments presents a unique set of challenges for instructional librarians. Innovations in computer-mediated communication and advances in cognitive science research provide the opportunity for designing library instruction that meets a variety of student information seeking needs. Using a…

  4. Exploring an Appropriate Instructional Design Model for Continuing Medical Education

    ERIC Educational Resources Information Center

    Omrani, Soghra; Fardanesh, Hashem; Hemmati, Nima; Hemmati, Naser

    2012-01-01

    Instruction, even when designed and based on sound instructional principles, oftentimes does not stimulate learners' motivation to learn. The result may be that learners may not be motivated to pursue lifelong learning and use the knowledge and skills learned to deliver patient care. The purpose of this study was to identify an appropriate…

  5. High School Student Modeling in the Engineering Design Process

    ERIC Educational Resources Information Center

    Mentzer, Nathan; Huffman, Tanner; Thayer, Hilde

    2014-01-01

    A diverse group of 20 high school students from four states in the US were individually provided with an engineering design challenge. Students chosen were in capstone engineering courses and had taken multiple engineering courses. As students considered the problem and developed a solution, observational data were recorded and artifacts…

  6. Wisconsin's Model Academic Standards for Art and Design Education.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    This Wisconsin academic standards guide for art and design explains what is meant by academic standards. The guide declares that academic standards specify what students should know and be able to do; what students might be asked to do to give evidence of standards; how well students must perform; and that content, performance, and proficiency…

  7. Design Optimization of Coronary Stent Based on Finite Element Models

    PubMed Central

    Qiu, Tianshuang; Zhu, Bao; Wu, Jinying

    2013-01-01

    This paper presents an effective optimization method that combines the Kriging surrogate model with modified rectangular grid sampling to reduce the stent dogboning effect in the expansion process. An infill sampling criterion named expected improvement (EI) is used to balance local and global searches in the optimization iteration. Four commonly used finite element models of stent dilation were used to investigate the stent dogboning rate. Thrombosis models of three typical shapes are built to test the effectiveness of the optimization results. Numerical results show that two finite element models dilated by pressure applied inside the balloon are suitable: the one that includes the artery and plaque gives an optimal stent with better expansion behavior, while the model without the artery and plaque is more efficient and requires less computation. PMID:24222743
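    The expected improvement infill criterion named in the abstract has a standard closed form for minimization, sketched below for a Kriging prediction mu(x) with standard error sigma(x); this is a generic illustration rather than the authors' implementation, and the example numbers are made up.

    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, f_best):
        """EI = (f_best - mu) * Phi(z) + sigma * phi(z), with z = (f_best - mu) / sigma."""
        mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
        ei = np.zeros_like(mu)
        mask = sigma > 0                      # EI is zero where the surrogate is certain
        z = (f_best - mu[mask]) / sigma[mask]
        ei[mask] = (f_best - mu[mask]) * norm.cdf(z) + sigma[mask] * norm.pdf(z)
        return ei

    # candidate points: predicted dogboning rate and its uncertainty from the surrogate;
    # the point with the largest EI would be evaluated with the finite element model next
    print(expected_improvement(mu=[0.12, 0.10, 0.15], sigma=[0.02, 0.0, 0.05], f_best=0.11))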

  8. A Bayesian Chance-Constrained Method for Hydraulic Barrier Design Under Model Structure Uncertainty

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Pham, H. V.; Tsai, F. T. C.

    2014-12-01

    The groundwater community has widely recognized model structure uncertainty as the major source of model uncertainty in groundwater modeling. Previous studies in aquifer remediation design, however, rarely discuss the impact of model structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) in a BMA-CC framework to assess the effect of model structure uncertainty in remediation design. To investigate the impact of model structure uncertainty on the remediation design, we compare the BMA-CC method with traditional CC programming that considers only the model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from saltwater intrusion in the "1,500-foot" sand and the "1,700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address model structure uncertainty, we develop three conceptual groundwater models based on three different hydrostratigraphy structures. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve a design reliability level of more than 90%. The total amount of injected water from connector wells is higher than the total pumpage of the protected public supply wells. While a reduced injection rate can be achieved by lowering the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station is not economically attractive.
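    In generic form, a BMA-weighted chance constraint of the kind compared here can be written, in LaTeX notation, as below; the symbols are illustrative assumptions rather than the paper's notation, with K conceptual models M_k weighted by their posterior probabilities, q the vector of connector-well injection rates, and \alpha the required design reliability level.

    \sum_{k=1}^{K} p(M_k \mid D)\,\Pr\!\bigl[\, h(\mathbf{q}; M_k) \ge h_{\min} \,\bigr] \;\ge\; \alpha .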

  9. Preparing e-Learning Designers Using Kolb's Model of Experiential Learning

    ERIC Educational Resources Information Center

    Dunlap, Joanna; Dobrovolny, Jackie; Young, Dave

    2008-01-01

    In this article, Joanna Dunlap, Jackie Dobrovolny, and David Young describe their approach to the design of a real-world learning experience that prepares online graduate students to work as e-learning designers and specialists. Using Kolb's model of experiential learning to support their instructional design decisions, Dunlap, Dobrovolny, and…

  10. Enhancing an Instructional Design Model for Virtual Reality-Based Learning

    ERIC Educational Resources Information Center

    Chen, Chwen Jen; Teh, Chee Siong

    2013-01-01

    In order to effectively utilize the capabilities of virtual reality (VR) in supporting the desired learning outcomes, careful consideration in the design of instruction for VR learning is crucial. In line with this concern, previous work proposed an instructional design model that prescribes instructional methods to guide the design of VR-based…

  11. Development Design Model of Academic Quality Assurance at Private Islamic University Jakarta Indonesia

    ERIC Educational Resources Information Center

    Suprihatin, Krebet; Bin Mohamad Yusof, Hj. Abdul Raheem

    2015-01-01

    This study aims to evaluate the practice of academic quality assurance in a design model based on seven aspects of quality: curriculum design, teaching and learning, student assessment, student selection, support services, learning resources, and continuous improvement. The study was conducted in two stages. The first stage is to obtain…

  12. Conversations around Design Sketches: Use of Communication Channels for Sharing Mental Models during Concept Generation

    ERIC Educational Resources Information Center

    Ariff, Nik Shahman Nik Ahmad; Badke-Schaub, Petra; Eris, Ozgur

    2012-01-01

    In this paper, we present an exploratory protocol study on the use of different communication channels during design sketching. We focus on how individual designers share their mental models with other designers in a group, and analyse their use of graphical, textual, and verbal communications during concept generation. Our findings suggest that…

  13. Designing Collaborative E-Learning Environments Based upon Semantic Wiki: From Design Models to Application Scenarios

    ERIC Educational Resources Information Center

    Li, Yanyan; Dong, Mingkai; Huang, Ronghuai

    2011-01-01

    The knowledge society requires life-long learning and flexible learning environment that enables fast, just-in-time and relevant learning, aiding the development of communities of knowledge, linking learners and practitioners with experts. Based upon semantic wiki, a combination of wiki and Semantic Web technology, this paper designs and develops…

  14. Time domain and frequency domain design techniques for model reference adaptive control systems

    NASA Technical Reports Server (NTRS)

    Boland, J. S., III

    1971-01-01

    Some problems associated with the design of model-reference adaptive control systems are considered and solutions to these problems are advanced. The stability of the adapted system is a primary consideration in the development of both the time-domain and the frequency-domain design techniques. Consequently, the use of Liapunov's direct method forms an integral part of the derivation of the design procedures. The application of sensitivity coefficients to the design of model-reference adaptive control systems is considered. An application of the design techniques is also presented.
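    A textbook first-order example of the Lyapunov-based route described above, shown here in LaTeX notation as an illustration and not taken from the report (it assumes a known positive input gain b): for the plant \dot{y} = a\,y + b\,u, reference model \dot{y}_m = a_m y_m + b_m r with a_m < 0, control u = \theta_1 r + \theta_2 y, and tracking error e = y - y_m, the candidate function

    V(e, \tilde{\theta}) = \tfrac{1}{2} e^{2} + \tfrac{b}{2\gamma}\bigl(\tilde{\theta}_1^{2} + \tilde{\theta}_2^{2}\bigr)

    together with the adaptation laws

    \dot{\theta}_1 = -\gamma\, e\, r, \qquad \dot{\theta}_2 = -\gamma\, e\, y

    yields \dot{V} = a_m e^{2} \le 0, so the tracking error is driven to zero while the parameter errors remain bounded.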

  15. Computer-aided design of curved surfaces with automatic model generation

    NASA Technical Reports Server (NTRS)

    Staley, S. M.; Jerard, R. B.; White, P. R.

    1980-01-01

    The design and visualization of three-dimensional objects with curved surfaces have always been difficult. The paper given below describes a computer system which facilitates both the design and visualization of such surfaces. The system enhances the design of these surfaces by virtue of various interactive techniques coupled with the application of B-Spline theory. Visualization is facilitated by including a specially built model-making machine which produces three-dimensional foam models. Thus, the system permits the designer to produce an inexpensive model of the object which is suitable for evaluation and presentation.
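    As a small illustration of the B-Spline machinery the abstract refers to, the sketch below evaluates points on a cubic B-spline curve defined by a set of control points using SciPy; the knot vector and control points are arbitrary example values, not data from the system described.

    import numpy as np
    from scipy.interpolate import BSpline

    degree = 3
    control_points = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 3.0],
                               [5.0, 1.0], [6.0, -1.0], [8.0, 0.0]])
    n = len(control_points)
    # clamped (open-uniform) knot vector so the curve starts and ends at the end control points
    knots = np.concatenate(([0.0] * degree,
                            np.linspace(0.0, 1.0, n - degree + 1),
                            [1.0] * degree))

    curve = BSpline(knots, control_points, degree)     # vector-valued spline in the plane
    samples = curve(np.linspace(0.0, 1.0, 5))          # points along the designed curve
    print(samples)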

  16. Design-oriented analytic model of phase and frequency modulated optical links

    NASA Astrophysics Data System (ADS)

    Monsurrò, Pietro; Saitto, Antonio; Tommasino, Pasquale; Trifiletti, Alessandro; Vannucci, Antonello; Cimmino, Rosario F.

    2016-07-01

    An analytic, design-oriented model of phase- and frequency-modulated microwave optical links has been developed. The models are suitable for the design of broadband, high-dynamic-range optical links for antenna remoting and optical beamforming, where noise and linearity of the subsystems are a concern. Digital filter design techniques have been applied to the design of optical filters working as frequency discriminators, which are the bottleneck in terms of linearity for these systems. The models of the frequency modulated, phase modulated, and coherent I/Q links have been used to compare the performance of the different architectures in terms of linearity and SFDR.

  17. Life Modeling and Design Analysis for Ceramic Matrix Composite Materials

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The primary research efforts focused on characterizing and modeling static failure, environmental durability, and creep-rupture behavior of two classes of ceramic matrix composites (CMC), silicon carbide fibers in a silicon carbide matrix (SiC/SiC) and carbon fibers in a silicon carbide matrix (C/SiC). An engineering life prediction model (Probabilistic Residual Strength model) has been developed specifically for CMCs. The model uses residual strength as the damage metric for evaluating remaining life and is posed probabilistically in order to account for the stochastic nature of the material's response. In support of the modeling effort, extensive testing of C/SiC in partial pressures of oxygen has been performed. This includes creep testing, tensile testing, half-life and residual tensile strength testing. C/SiC is proposed for airframe and propulsion applications in advanced reusable launch vehicles. Figures 1 and 2 illustrate the model's predictive capabilities as well as the manner in which experimental tests are being selected so as to ensure sufficient data are available to aid in model validation.

  18. Designing a Model of Vocational Training Programs for Disables through ODL

    ERIC Educational Resources Information Center

    Majid, Shaista; Razzak, Adeela

    2015-01-01

    This study was conducted to design a model of vocational training programs for persons with disabilities. For this purpose a desk review was carried out and the vocational training models/programs of Israel, the U.K., Vietnam, Japan, and Thailand were analyzed to form a conceptual framework for the model. Keeping in view the local conditions/requirements, a model of…

  19. Communications, Navigation, and Surveillance Models in ACES: Design Implementation and Capabilities

    NASA Technical Reports Server (NTRS)

    Kubat, Greg; Vandrei, Don; Satapathy, Goutam; Kumar, Anil; Khanna, Manu

    2006-01-01

    Presentation objectives include: a) Overview of the ACES/CNS System Models Design and Integration; b) Configuration Capabilities available for Models and Simulations using ACES with CNS Modeling; c) Descriptions of recently added, Enhanced CNS Simulation Capabilities; and d) General Concepts Ideas that Utilize CNS Modeling to Enhance Concept Evaluations.

  20. A Proposed Model of Retransformed Qualitative Data within a Mixed Methods Research Design

    ERIC Educational Resources Information Center

    Palladino, John M.

    2009-01-01

    Most models of mixed methods research design provide equal emphasis of qualitative and quantitative data analyses and interpretation. Other models stress one method more than the other. The present article is a discourse about the investigator's decision to employ a mixed method design to examine special education teachers' advocacy and…

  1. Using the Constructivist Tridimensional Design Model for Online Continuing Education for Health Care Clinical Faculty

    ERIC Educational Resources Information Center

    Seo, Kay Kyeong-Ju; Engelhard, Chalee

    2014-01-01

    This article presents a new paradigm for continuing education of Clinical Instructors (CIs): the Constructivist Tridimensional (CTD) model for the design of an online curriculum. Based on problem-based learning, self-regulated learning, and adult learning theory, the CTD model was designed to facilitate interactive, collaborative, and authentic…

  2. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  3. The $7,376 "Ivies": Value-Designed Models of Undergraduate Education

    ERIC Educational Resources Information Center

    Fried, Vance H.

    2008-01-01

    Is it possible to get an "Ivy" education for $7,376 a year? Can a college provide high-quality undergraduate education at a reasonable cost? In this paper, the author explores if cost can be reduced and quality improved through the use of new "value-designed" models of undergraduate education. A value-designed model allows one to appeal to…

  4. Rubber airplane: Constraint-based component-modeling for knowledge representation in computer-aided conceptual design

    NASA Technical Reports Server (NTRS)

    Kolb, Mark A.

    1990-01-01

    Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.

  5. Integrating O/S models during conceptual design, part 2

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    This report documents the procedures for utilizing and maintaining the Reliability & Maintainability Model (RAM) developed by the University of Dayton for the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) under NASA research grant NAG-1-1327. The purpose of the grant is to provide support to NASA in establishing operational and support parameters and costs of proposed space systems. As part of this research objective, the model described here was developed. Additional documentation concerning the development of this model may be found in Part 1 of this report. This is the second part of a three-part technical report.

  6. Structural modeling for control design (articulated multibody component representation)

    NASA Technical Reports Server (NTRS)

    Haugse, E. D.; Jones, R. E.; Salus, W. L.

    1989-01-01

    High-gain, high-frequency flexible responses in gimbaled multibody systems are discussed. Their origin and physical significance are described in terms of detailed mass and stiffness modeling at actuator/sensor interfaces. Guyan Reduction, Generalized Dynamic Reduction, inadequate mass modeling detail, as well as system mode truncation, are shown to suppress the high-gain, high-frequency response and thereby lose system flexibility important for stability and performance predictions. Model validation by modal survey testing is shown to risk a similar loss of accuracy. Difficulties caused by high-frequency responses in component-mode simulations, such as DISCOS, and also in linearized system-mode simulations, are described, and approaches for handling these difficulties are discussed.
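
    For reference, Guyan (static) reduction, one of the condensation schemes named above, eliminates "slave" degrees of freedom by assuming they carry no applied load; the sketch below condenses an arbitrary 4-DOF spring-mass chain (the matrices are illustrative and not taken from the paper).

      import numpy as np

      def guyan_reduce(K, M, master):
          """Guyan (static) condensation onto the 'master' DOFs.

          Partitions K and M into master (m) and slave (s) sets, builds the
          static transformation T = [I; -Kss^-1 Ksm], and returns T'KT and T'MT.
          """
          ndof = K.shape[0]
          slave = [i for i in range(ndof) if i not in master]
          mi = list(range(len(master)))
          Kss = K[np.ix_(slave, slave)]
          Ksm = K[np.ix_(slave, master)]
          T = np.zeros((ndof, len(master)))
          T[master, mi] = 1.0
          T[np.ix_(slave, mi)] = -np.linalg.solve(Kss, Ksm)
          return T.T @ K @ T, T.T @ M @ T

      # Illustrative 4-DOF spring-mass chain (values are arbitrary)
      k = 1.0e5
      K = k * np.array([[ 2, -1,  0,  0],
                        [-1,  2, -1,  0],
                        [ 0, -1,  2, -1],
                        [ 0,  0, -1,  1]], dtype=float)
      M = np.diag([1.0, 1.0, 1.0, 1.0])

      Kr, Mr = guyan_reduce(K, M, master=[0, 3])
      print("reduced K:\n", Kr)
      print("reduced M:\n", Mr)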

  7. Communications network design and costing model technical manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    This computer model provides the capability for analyzing long-haul trunking networks comprising a set of user-defined cities, traffic conditions, and tariff rates. Networks may consist of all terrestrial connectivity, all satellite connectivity, or a combination of terrestrial and satellite connectivity. Network solutions provide the least-cost routes between all cities, the least-cost network routing configuration, and terrestrial and satellite service cost totals. The CNDC model allows analyses involving three specific FCC-approved tariffs, which are uniquely structured and representative of most existing service connectivity and pricing philosophies. User-defined tariffs that can be variations of these three tariffs are accepted as input to the model and allow considerable flexibility in network problem specification. The resulting model extends the domain of network analysis from traditional fixed link cost (distance-sensitive) problems to more complex problems involving combinations of distance and traffic-sensitive tariffs.
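
    At its core, the least-cost routing step described above is a shortest-path computation over a link-cost graph. A minimal sketch using Dijkstra's algorithm follows; the city names and link costs are made up, and the real CNDC tariff structures are far richer than a single per-link cost.

      import heapq

      def least_cost_route(links, src, dst):
          """Dijkstra's algorithm over a dict {city: {neighbor: link cost}}."""
          best = {src: (0.0, None)}
          heap = [(0.0, src)]
          while heap:
              cost, city = heapq.heappop(heap)
              if city == dst:
                  break
              if cost > best[city][0]:
                  continue                      # stale heap entry
              for nxt, link_cost in links.get(city, {}).items():
                  new_cost = cost + link_cost
                  if nxt not in best or new_cost < best[nxt][0]:
                      best[nxt] = (new_cost, city)
                      heapq.heappush(heap, (new_cost, nxt))
          # unwind the predecessor chain
          path, node = [], dst
          while node is not None:
              path.append(node)
              node = best[node][1]
          return best[dst][0], list(reversed(path))

      # Hypothetical link costs (e.g. $/month for a given traffic level)
      links = {
          "NYC": {"CHI": 700.0, "DC": 250.0},
          "DC":  {"CHI": 600.0, "ATL": 400.0},
          "CHI": {"LAX": 1500.0},
          "ATL": {"LAX": 1700.0},
      }
      print(least_cost_route(links, "NYC", "LAX"))   # -> (2200.0, ['NYC', 'CHI', 'LAX'])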

  8. A simple model for the design of vertical tube absorbers

    SciTech Connect

    Patnaik, V.; Perez-Blanco, H.; Ryan, W.A.

    1993-08-01

    The absorption of water vapor in aqueous solutions of lithium bromide is modelled for a falling-film, vertical-tube absorber. The model is based on the solution of three ordinary differential equations to calculate solution bulk and interface concentration and temperature distributions and the coolant temperature distribution. The heat and mass transfer coefficients employed in the equations are extracted from the literature. In this way, the model incorporates recent information on wavy-laminar flows. Under certain conditions, the solution exhibits instabilities in the entrance region of the absorber tube, which are corrected by the introduction of a dampening factor incorporating relevant thermophysical properties. The usefulness of the model for generating absorber performance charts is demonstrated.

  9. IT-Supported Modeling, Analysis and Design of Supply Chains

    NASA Astrophysics Data System (ADS)

    Nienhaus, Jörg; Alard, Robert; Sennheiser, Andreas

    A common language is a prerequisite for analyzing and optimizing supply chains. Based on experiences with three case studies, this paper identifies the aspects of a supply chain that have to be mapped to make informed decisions about its operations. Current integrated modeling approaches for supply chains, such as the SCOR and GSCM models, will be analyzed and an advanced approach will be defined. The resulting approach takes advantage of IT support.

  10. Design model of computerized personal decision aid for youth: An expert review

    NASA Astrophysics Data System (ADS)

    Sarif, Siti Mahfuzah; Ibrahim, Norfiza; Shiratuddin, Norshuhada

    2016-08-01

    This paper provides a structured review of a design model of a computerized personal decision aid intended for youth, named the YouthPDA Design Model. The proposed design model was examined by experts in related areas to ensure the appropriateness of the proposed components and elements, the relevancy of the terminologies used, the logic of the flow, and the usability and practicality of the design model for the development of the YouthPDA application. Seven experts from related areas were involved in the evaluation. Discussions of the findings obtained from the expert review are included in this paper. Finally, a revised design model of YouthPDA is proposed as the main guidance for developing the YouthPDA application.

  11. An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design

    NASA Technical Reports Server (NTRS)

    Lin, Risheng; Afjeh, Abdollah A.

    2003-01-01

    Crucial to an efficient aircraft simulation-based design is a robust data modeling methodology for both recording the information and providing data transfer readily and reliably. To meet this goal, data modeling issues involved in multidisciplinary aircraft design are first analyzed in this study. Next, an XML-based, extensible data object model for multidisciplinary aircraft design is constructed and implemented. The implementation of the model through aircraft databinding allows the design applications to access and manipulate any disciplinary data with a lightweight and easy-to-use API. In addition, language-independent representation of aircraft disciplinary data in the model fosters interoperability amongst heterogeneous systems, thereby facilitating data sharing and exchange between various design tools and systems.
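
    As a rough illustration of the idea, and not the paper's actual schema or API, disciplinary data can be carried in XML and exposed to design codes through a thin accessor layer; the element names below are hypothetical.

      import xml.etree.ElementTree as ET

      # Hypothetical disciplinary record; the schema in the paper differs.
      AIRCRAFT_XML = """
      <aircraft name="concept-1">
        <aerodynamics>
          <wingArea unit="m2">124.6</wingArea>
          <aspectRatio>9.4</aspectRatio>
        </aerodynamics>
        <propulsion>
          <thrustPerEngine unit="kN">121.0</thrustPerEngine>
          <engineCount>2</engineCount>
        </propulsion>
      </aircraft>
      """

      class AircraftModel:
          """Lightweight accessor ("databinding") over the XML document."""
          def __init__(self, xml_text):
              self._root = ET.fromstring(xml_text)

          def get(self, discipline, field):
              node = self._root.find(f"./{discipline}/{field}")
              return float(node.text), node.get("unit")

          def set(self, discipline, field, value):
              self._root.find(f"./{discipline}/{field}").text = str(value)

      model = AircraftModel(AIRCRAFT_XML)
      area, unit = model.get("aerodynamics", "wingArea")
      print(area, unit)                      # 124.6 m2
      model.set("aerodynamics", "wingArea", 130.0)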

  12. Re-designing the PhEDEx Security Model

    SciTech Connect

    Huang, C.-H.; Wildish, T.; Zhang, X.

    2014-01-01

    PhEDEx, the data-placement tool used by the CMS experiment at the LHC, was conceived in a more trusting time. The security model provided a safe environment for site agents and operators, but offered little more protection than that. Data was not sufficiently protected against loss caused by operator error, software bugs, or deliberate manipulation of the database. Operators were given high levels of access to the database, beyond what was actually needed to accomplish their tasks. This exposed them to the risk of suspicion should an incident occur. Multiple implementations of the security model led to difficulties maintaining code, which can lead to degradation of security over time. In order to meet the simultaneous goals of protecting CMS data, protecting the operators from undue exposure to risk, increasing monitoring capabilities and improving maintainability of the security model, the PhEDEx security model was redesigned and re-implemented. Security was moved from the application layer into the database itself, fine-grained access roles were established, and tools and procedures were created to control the evolution of the security model over time. In this paper we describe this work, we describe the deployment of the new security model, and we show how these enhancements improve security on several fronts simultaneously.

  13. Unstable-unit tensegrity plate: modeling and design

    NASA Astrophysics Data System (ADS)

    Zaslavsky, Ron; de Oliveira, Mauricio C.; Skelton, Robert E.

    2003-08-01

    A new topology for a prestressed tensegrity plate, the unstable-unit tensegrity plate (UUTP), is introduced, together with a detailed algorithm for its design. The plate is a truss made of strings (flexible elements) and bars (rigid elements), which are loaded in tension and compression, respectively, and in which the bars do not touch each other. Given the outline dimensions of the desired plate and the number of bars along the plate's width and length, the algorithm solves for the node positions and the prestress forces that put the plate in equilibrium. This is done by solving, via Newton's method, a non-linear matrix equation that expresses the static equilibrium conditions. We have designed several such plates, demonstrating the feasibility of the proposed topology and the effectiveness of its design algorithm. Two such plates are characterized in detail, both statically and dynamically (via simulation). The proposed algorithm may be extended to solve for other tensegrity structures having different topologies and/or different shapes. The UUTP may be used as a building block of many types of structures, both uncontrolled and controlled, at either large or miniature scale.
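
    The equilibrium solve described above amounts to finding a root of a nonlinear residual. A generic Newton iteration with a finite-difference Jacobian is sketched below on a toy two-equation residual; it is not the paper's tensegrity equilibrium equation.

      import numpy as np

      def newton_solve(residual, x0, tol=1e-10, max_iter=50):
          """Solve residual(x) = 0 by Newton's method with a finite-difference Jacobian."""
          x = np.asarray(x0, dtype=float)
          for _ in range(max_iter):
              r = residual(x)
              if np.linalg.norm(r) < tol:
                  return x
              # forward-difference Jacobian
              J = np.zeros((r.size, x.size))
              h = 1e-7
              for j in range(x.size):
                  xp = x.copy()
                  xp[j] += h
                  J[:, j] = (residual(xp) - r) / h
              x = x - np.linalg.solve(J, r)
          raise RuntimeError("Newton iteration did not converge")

      # Toy 2-equation "equilibrium" residual (illustrative only)
      def residual(x):
          return np.array([x[0]**2 + x[1]**2 - 1.0,   # node stays on a unit circle
                           x[0] - x[1]])              # equal force components

      print(newton_solve(residual, [1.0, 0.2]))       # -> approx [0.7071, 0.7071]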

  14. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.

    1995-01-01

    The general goal of this project is to establish design protocols that enable the engineer to analyze and predict certain types of behavior in ceramic composites. Sections of the final report address the following: Description of the Problem that Motivated the Technology Development; Description of the New Technology that was Developed; Unique and Novel Features of the Technology and Results/Benefits of Application (year-by-year accomplishments); and Utilization of New Technology in Non-Aerospace Applications. Activities for this reporting period included the development of a design analysis as part of a cooperative agreement with General Electric Aircraft Engines. The effort focused on modifying the Toughened Ceramics Analysis and Reliability Evaluation of Structures (TCARES) algorithm for use in the design of engine components fabricated from NiAl. Other activities related to the development of an ASTM standard practice for estimating Weibull parameters. The standard focuses on the evaluation and reporting of uniaxial strength data, and the estimation of probability distribution parameters for ceramics that fail in a brittle fashion.
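
    For orientation only, the two-parameter Weibull modulus and characteristic strength are often estimated from uniaxial strength data by median-rank regression, as sketched below; the data are hypothetical and this is not the maximum-likelihood procedure used in the ASTM standard practice.

      import numpy as np

      def weibull_fit(strengths):
          """Estimate Weibull modulus m and characteristic strength s0 by linear
          regression on median-rank plotting positions (illustrative fit only).
          """
          s = np.sort(np.asarray(strengths, dtype=float))
          n = s.size
          ranks = np.arange(1, n + 1)
          F = (ranks - 0.3) / (n + 0.4)          # median-rank failure probabilities
          x = np.log(s)
          y = np.log(-np.log(1.0 - F))
          m, c = np.polyfit(x, y, 1)             # y = m*x - m*ln(s0)
          s0 = np.exp(-c / m)
          return m, s0

      # Hypothetical flexural strength data (MPa)
      data = [312, 287, 334, 295, 350, 301, 322, 278, 341, 308]
      m, s0 = weibull_fit(data)
      print(f"Weibull modulus m = {m:.1f}, characteristic strength s0 = {s0:.0f} MPa")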

  15. Design of Low Complexity Model Reference Adaptive Controllers

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Schaefer, Jacob; Johnson, Marcus; Nguyen, Nhan

    2012-01-01

    Flight research experiments have demonstrated that adaptive flight controls can be an effective technology for improving aircraft safety in the event of failures or damage. However, the nonlinear, time-varying nature of adaptive algorithms continues to challenge traditional methods for the verification and validation testing of safety-critical flight control systems. Increasingly complex adaptive control theories and designs are emerging, which only makes the testing challenge more difficult. A potential first step toward the acceptance of adaptive flight controllers by aircraft manufacturers, operators, and certification authorities is a very simple design that operates as an augmentation to a non-adaptive baseline controller. Three such controllers were developed as part of a National Aeronautics and Space Administration flight research experiment to determine the appropriate level of complexity required to restore acceptable handling qualities to an aircraft that has suffered failures or damage. The controllers consist of the same basic design, but incorporate incrementally increasing levels of complexity. Derivations of the controllers and their adaptive parameter update laws are presented, along with details of the controllers' implementations.
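
    To make the "simple augmentation" idea concrete, the sketch below shows the most elementary adaptive element: a single MIT-rule gain driving a first-order plant toward a reference model. It illustrates the general model-reference adaptive control structure only; the plant, gains, and update law are not those of the flight-test controllers in the paper.

      # First-order plant  y' = -a*y + b*u   with gain b unknown to the controller,
      # reference model    ym' = -am*ym + bm*r,
      # control            u = theta * r,  MIT-rule update  theta' = -gamma * e * ym.
      a, b = 1.0, 2.0          # true plant (illustrative values)
      am, bm = 1.0, 1.0        # reference model
      gamma = 0.5              # adaptation gain (tuning choice)

      dt, T = 0.001, 30.0
      y = ym = theta = 0.0
      for k in range(int(T / dt)):
          t = k * dt
          r = 1.0 if int(t / 5) % 2 == 0 else -1.0   # square-wave reference
          u = theta * r
          e = y - ym
          # Euler integration of plant, reference model, and adaptation law
          y += dt * (-a * y + b * u)
          ym += dt * (-am * ym + bm * r)
          theta += dt * (-gamma * e * ym)

      print(f"adapted gain theta = {theta:.3f}  (ideal value bm/b = {bm / b:.3f})")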

  16. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
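
    A minimal sketch of the pattern of divorcing the execution process from the model is shown below; it illustrates the idea only and is not the actual ROSE API or object structure. The Nozzle model and its thrust relation are hypothetical.

      from itertools import product

      class Model:
          """Anything with named inputs and an execute() returning named outputs."""
          inputs: dict
          def execute(self) -> dict:
              raise NotImplementedError

      class Nozzle(Model):                       # hypothetical example model
          def __init__(self):
              self.inputs = {"pressure_ratio": 2.0, "area": 0.1}
          def execute(self):
              pr, A = self.inputs["pressure_ratio"], self.inputs["area"]
              return {"thrust": 1.0e5 * A * (pr - 1.0)}   # toy relation

      def parameter_study(model, sweeps):
          """A reusable 'process': run any model over a cartesian sweep of inputs."""
          names = list(sweeps)
          results = []
          for values in product(*(sweeps[n] for n in names)):
              model.inputs.update(dict(zip(names, values)))
              results.append((dict(zip(names, values)), model.execute()))
          return results

      for point, out in parameter_study(Nozzle(), {"pressure_ratio": [1.5, 2.0, 3.0],
                                                   "area": [0.05, 0.10]}):
          print(point, out)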

  17. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a 'next-generation CED': in addition to a point design, the team develops a model of the local trade space. The process balances the power of model-development tools against the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefit of this approach.

  18. Coupling of digital elevation model and rainfall-runoff model in storm drainage network design

    NASA Astrophysics Data System (ADS)

    Gumbo, Bekithemba; Munyamba, Nelson; Sithole, George; Savenije, Hubert H. G.

    Often planners and engineers are faced with various options and questions in storm drainage network design, e.g. flow pattern, flow direction, runoff quantity and therefore drain size, or the scenario after a road, airfield or building has been constructed. In most instances, planning without drainage in mind has caused failure or extensive damage to property, including the storm-water drains which channel the water away. With the advent of various modelling and geographic information systems (GIS) tools, this problem can be averted. The University of Zimbabwe’s (UZ) main campus had its storm drainage network reconstructed at a cost of about US$100 000 because of persistent flooding. This paper describes a method of assessing the effectiveness of storm drainage networks by combining a digital elevation model (DEM) with a rainfall-runoff model based on the Soil Conservation Service South African manual (SCS-SA). The UZ campus was used as the test site. The DEM was generated from aerial photographs and the data imported into ArcView. The 3.0 km² basin was then delineated into sub-catchments using ArcView Hydro extension tools. The land-use, watershed and soil maps of the UZ were merged in ArcView and initial curve numbers (CN) assigned. Using three years of daily rainfall data, runoff and peak flows were calculated for each sub-catchment. By overlaying the natural flow lines derived from the DEM with the reconstructed physical drains, a comparison of the flow direction and the orientation of the drains was achieved. Peak flows were calculated for each delineated watershed and the results used to check the adequacy of the trapezoidal concrete-lined drains. A combination of a DEM and a rainfall-runoff model within a GIS platform proves to be useful in estimating runoff on partly urbanised watersheds and in determining the size and orientation of storm drains. It is particularly useful for new areas where development is being contemplated.
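
    The per-sub-catchment runoff step rests on the standard SCS curve-number relation. The sketch below computes per-storm runoff depth and volume; the catchment areas, curve numbers, and storm depth are illustrative placeholders, not the UZ campus data.

      def scs_runoff_mm(rain_mm, cn, ia_ratio=0.2):
          """SCS curve-number runoff depth (mm) for one storm.

          S  = potential maximum retention, S = 25400/CN - 254  (mm)
          Ia = initial abstraction, Ia = ia_ratio * S
          Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else 0
          """
          s = 25400.0 / cn - 254.0
          ia = ia_ratio * s
          if rain_mm <= ia:
              return 0.0
          return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

      # Illustrative sub-catchments: (area in km^2, curve number)
      subcatchments = [(0.8, 85), (1.2, 74), (1.0, 92)]
      storm_mm = 60.0

      for i, (area_km2, cn) in enumerate(subcatchments, 1):
          q_mm = scs_runoff_mm(storm_mm, cn)
          volume_m3 = q_mm / 1000.0 * area_km2 * 1.0e6     # depth (m) * area (m^2)
          print(f"sub-catchment {i}: CN={cn}  runoff={q_mm:.1f} mm  volume={volume_m3:,.0f} m^3")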

  19. Design readiness: An exploratory model of object-oriented design performance

    NASA Astrophysics Data System (ADS)

    Lewis, Tracy L.

    The available literature supports the fact that some students experience difficulty learning object-oriented design (OOD) principles. Previously explored predictors of OOD learning difficulties include student characteristics (cognitive activities, self-efficacy), teaching methodologies (teacher-centered, course complexity), and student experiences (prior programming experience). Yet, within an extensive body of literature devoted to OOD, two explanations of student difficulty remain largely unexplored: (1) varying conceptualizations of the underlying principles/strategies of OOD, and (2) preparedness or readiness to learn OOD. This research also investigated the extent to which individual differences impacted DRAS and OOD performance. The individual difference measures of interest in this study included college grade point average, prior programming experience, cognitive abilities (spatial orientation, visualization, logical reasoning, flexibility, perceptual style), and design readiness. In addition, OOD performance was measured using two constructs: course grade (exams, labs, programs, overall) and a specially constructed design task. Participants were selected from the CS2 course at two southeastern state universities, resulting in a sample size of 161 (School A, n = 76; School B, n = 85). School A is a mid-sized comprehensive university and School B is a large research-intensive university. It was found that the schools differed significantly on all measures of prior computer science experience and cognitive abilities. Path analysis was conducted to determine which individual differences were related to design readiness and OOD performance. In summary, this research identified that instructors cannot ignore individual differences when teaching OOD. It was found that the cognitive ability of visualization, prior OO experience, and overall college grade point average should be considered when teaching OOD. As it stands, without

  20. Materials modeling by design: applications to amorphous solids.

    PubMed

    Biswas, Parthapratim; Tafen, D N; Inam, F; Cai, Bin; Drabold, D A

    2009-02-25

    In this paper, we review a host of methods used to model amorphous materials. We particularly describe methods which impose constraints on the models to ensure that the final model meets a priori requirements (on structure, topology, chemical order, etc). In particular, we review work based on quench-from-the-melt simulations and the 'decorate and relax' method, which is shown to be a reliable scheme for forming models of certain binary glasses. A 'building block' approach is also suggested and yields a pleasing model for GeSe(1.5). We also report on the nature of vulcanization in an Se network cross-linked by As, and indicate how introducing H into an a-Si network develops into a-Si:H. We also discuss explicitly constrained methods, including reverse Monte Carlo (RMC) and a novel method called 'Experimentally Constrained Molecular Relaxation'. The latter merges the power of ab initio simulation with the ability to impose external information associated with RMC. PMID:21817359
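
    A stripped-down sketch of the reverse Monte Carlo (RMC) acceptance step mentioned above is given below: a model pair-distance histogram is driven toward a target histogram by Metropolis-style moves on the data misfit. The "experimental" target, box size, move size, and tolerance are synthetic placeholders, and no interatomic energy term is included.

      import numpy as np

      rng = np.random.default_rng(1)
      L, n_atoms = 20.0, 200                         # box edge and atom count (illustrative)
      pos = rng.uniform(0.0, L, size=(n_atoms, 3))
      bins = np.linspace(0.5, 8.0, 40)

      def pair_histogram(p):
          """Normalized histogram of pair distances with minimum-image periodic boundaries."""
          d = p[:, None, :] - p[None, :, :]
          d -= L * np.round(d / L)
          r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(len(p), k=1)]
          h, _ = np.histogram(r, bins=bins)
          return h / h.sum()

      # Synthetic "experimental" target: a histogram from another random configuration
      target = pair_histogram(rng.uniform(0.0, L, size=(n_atoms, 3)))

      def chi2(h):
          return np.sum((h - target) ** 2)

      cost = chi2(pair_histogram(pos))
      sigma = 0.01                                   # controls acceptance of cost-raising moves
      for step in range(3000):
          i = rng.integers(n_atoms)
          trial = pos.copy()
          trial[i] = (trial[i] + rng.normal(0.0, 0.3, 3)) % L   # random single-atom move
          new_cost = chi2(pair_histogram(trial))
          # Metropolis-style RMC acceptance on the data misfit (no energy term here)
          if new_cost < cost or rng.random() < np.exp((cost - new_cost) / sigma):
              pos, cost = trial, new_cost

      print(f"final misfit chi^2 = {cost:.3e}")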