Sample records for existing design tools

  1. Rapid SAW Sensor Development Tools

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools are designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling and Finite Element Analysis (FEA) capabilities. This paper presents the SAW design, modeling, analysis, and automated layout generation tools.

  2. Quasi-experimental study designs series-paper 6: risk of bias assessment.

    PubMed

    Waddington, Hugh; Aloe, Ariel M; Becker, Betsy Jane; Djimeu, Eric W; Hombrados, Jorge Garcia; Tugwell, Peter; Wells, George; Reeves, Barney

    2017-09-01

    Rigorous and transparent bias assessment is a core component of high-quality systematic reviews. We assess modifications to existing risk of bias approaches to incorporate rigorous quasi-experimental approaches with selection on unobservables. These are nonrandomized studies using design-based approaches to control for unobservable sources of confounding such as difference studies, instrumental variables, interrupted time series, natural experiments, and regression-discontinuity designs. We review existing risk of bias tools. Drawing on these tools, we present domains of bias and suggest directions for evaluation questions. The review suggests that existing risk of bias tools provide, to different degrees, incomplete transparent criteria to assess the validity of these designs. The paper then presents an approach to evaluating the internal validity of quasi-experiments with selection on unobservables. We conclude that tools for nonrandomized studies of interventions need to be further developed to incorporate evaluation questions for quasi-experiments with selection on unobservables. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Design Automation in Synthetic Biology.

    PubMed

    Appleton, Evan; Madsen, Curtis; Roehner, Nicholas; Densmore, Douglas

    2017-04-03

    Design automation refers to a category of software tools for designing systems that work together in a workflow for designing, building, testing, and analyzing systems with a target behavior. In synthetic biology, these tools are called bio-design automation (BDA) tools. In this review, we discuss the BDA tool areas (specify, design, build, test, and learn) and introduce the existing software tools designed to solve problems in these areas. We then detail the functionality of some of these tools and show how they can be used together to create the desired behavior of two types of modern synthetic genetic regulatory networks. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  4. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In Accelerator systems such as the CERN PS complex, process equipment has a life time which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing data-base-dependent development chain, the lack of real-time simulation tools and of Object-Oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  5. Replacement Windows for Existing Homes | Efficient Windows

    Science.gov Websites

    The Window Selection Tool will take you through a series of design conditions pertaining to your design and location.

  6. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    ERIC Educational Resources Information Center

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-01-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school…

  7. Engineering Effort Needed to Design Spacecraft with Radiation Constraints

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.

    2005-01-01

    A roadmap is articulated that describes what is needed to allow designers, including researchers, managers, and engineers, to investigate, design, build, test, and fly spacecraft that meet mission requirements at the lowest possible cost. This roadmap describes seven levels of tool fidelity and application: 1) Mission Speculation, 2) Management Overview, 3) Mission Design, 4) Detailed Design, 5) Simulation and Training, 6) Operations, and 7) Research. The interfaces and output are described in top-level detail along with the transport engines needed, and deficiencies are noted. This roadmap, if implemented, will allow Multidisciplinary Optimization (MDO) ideas to incorporate radiation concerns. Also, as NASA moves towards Simulation Based Acquisition (SBA), these tools will facilitate the appropriate spending of government money. Most of the tools needed to serve these levels do not exist or exist only in pieces and need to be integrated into a complete tool set.

  8. Visual Analytics Tools for Sustainable Lifecycle Design: Current Status, Challenges, and Future Opportunities.

    PubMed

    Ramanujan, Devarajan; Bernstein, William Z; Chandrasegaran, Senthil K; Ramani, Karthik

    2017-01-01

    The rapid rise in technologies for data collection has created an unmatched opportunity to advance the use of data-rich tools for lifecycle decision-making. However, the usefulness of these technologies is limited by the ability to translate lifecycle data into actionable insights for human decision-makers. This is especially true in the case of sustainable lifecycle design (SLD), as the assessment of environmental impacts, and the feasibility of making corresponding design changes, often relies on human expertise and intuition. Supporting human sense-making in SLD requires the use of both data-driven and user-driven methods while exploring lifecycle data. A promising approach for combining the two is the use of visual analytics (VA) tools. Such tools can leverage the ability of computer-based tools to gather, process, and summarize data along with the ability of human experts to guide analyses through domain knowledge or data-driven insight. In this paper, we review previous research that has created VA tools in SLD. We also highlight existing challenges and future opportunities for such tools in different lifecycle stages (design, manufacturing, distribution and supply chain, use phase, and end-of-life) as well as in life cycle assessment. Our review shows that while the number of VA tools in SLD is relatively small, researchers are increasingly focusing on the subject matter. Our review also suggests that VA tools can address existing challenges in SLD and that significant future opportunities exist.

  9. DataRocket: Interactive Visualisation of Data Structures

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; Ramsay, Craig

    2010-08-01

    CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation of and reasoning about program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project, the press of a button suffices to produce the detailed design document. Existing legacy code can be easily imported into CodeRocket and DataRocket to reverse-engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.

  10. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  11. CaveCAD: a tool for architectural design in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Schulze, Jürgen P.; Hughes, Cathleen E.; Zhang, Lelin; Edelstein, Eve; Macagno, Eduardo

    2014-02-01

    Existing 3D modeling tools were designed to run on desktop computers with monitor, keyboard, and mouse. To make 3D modeling possible with mouse and keyboard, many 3D interactions, such as point placement or translations of geometry, had to be mapped to the 2D parameter space of the mouse, possibly supported by mouse buttons or keyboard keys. We hypothesize that, had the designers of these existing systems been able to assume immersive virtual reality systems as their target platforms, they would have designed the 3D interactions much more intuitively. In collaboration with professional architects, we created a simple but complete 3D modeling tool for virtual environments from the ground up, using direct 3D interaction wherever possible and adequate. In this publication, we present our approaches to interactions for typical 3D modeling functions, such as geometry creation, modification of existing geometry, and assignment of surface materials. We also discuss preliminary user experiences with this system.

  12. Parallels in Computer-Aided Design Framework and Software Development Environment Efforts.

    DTIC Science & Technology

    1992-05-01

    design kits, and tool and design management frameworks. Also, books about software engineering environments [Long 91] and electronic design...tool integration [Zarrella 90], and agreement upon a universal design automation framework, such as the CAD Framework Initiative (CFI) [Malasky 91...ments: identification, control, status accounting, and audit and review. The paper by Dart extracts 15 CM concepts from existing SDEs and tools

  13. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  14. The Design of Technology-Rich Learning Environments as Metacognitive Tools in History Education

    ERIC Educational Resources Information Center

    Poitras, Eric; Lajoie, Susanne; Hong, Yuan-Jin

    2012-01-01

    Research has shown that learners do not always engage in appropriate metacognitive and self-regulatory processes while learning complex historical topics. However, little research exists to guide the design of technology-rich learning environments as metacognitive tools in history education. In order to address this issue, we designed a…

  15. Genetic Design Automation: engineering fantasy or scientific renewal?

    PubMed Central

    Lux, Matthew W.; Bramlett, Brian W.; Ball, David A.; Peccoud, Jean

    2013-01-01

    Synthetic biology aims to make genetic systems more amenable to engineering, which has naturally led to the development of Computer-Aided Design (CAD) tools. Experimentalists still primarily rely on project-specific ad-hoc workflows instead of domain-specific tools, suggesting that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. PMID:22001068

  16. Tools for Supporting Distributed Agile Project Planning

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Maurer, Frank; Morgan, Robert; Oliveira, Josyleuda

    Agile project planning plays an important part in agile software development. In distributed settings, project planning is severely impacted by the lack of face-to-face communication and the inability to share paper index cards amongst all meeting participants. To address these issues, several distributed agile planning tools were developed. The tools vary in features, functions, and running platforms. In this chapter, we first summarize the requirements for distributed agile planning. Then we give an overview of existing agile planning tools. We also evaluate the existing tools against these requirements. Finally, we present some practical advice for both designers and users of distributed agile planning tools.

  17. Supporting Scientific Modeling Practices in Atmospheric Sciences: Intended and Actual Affordances of a Computer-Based Modeling Tool

    ERIC Educational Resources Information Center

    Wu, Pai-Hsing; Wu, Hsin-Kai; Kuo, Che-Yu; Hsu, Ying-Shao

    2015-01-01

    Computer-based learning tools include design features to enhance learning, but learners may not always perceive the existence of these features and use them in desirable ways. There might be a gap between what the tool features are designed to offer (intended affordance) and how they are actually used (actual affordance). This study thus aims at…

  18. Genetic design automation: engineering fantasy or scientific renewal?

    PubMed

    Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean

    2012-02-01

    The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Testing framework for embedded languages

    NASA Astrophysics Data System (ADS)

    Leskó, Dániel; Tejfel, Máté

    2012-09-01

    Embedding a new programming language into an existing one is a widely used technique, because it speeds up the development process and provides part of the language infrastructure for free (e.g., lexical and syntactical analyzers). In this paper we present a further advantage of this development approach: the ease of adding testing support for these new languages. Tool support for testing is a crucial point for a newly designed programming language. It could be done the hard way, by creating a testing tool from scratch, or we could try to reuse existing testing tools by extending them with an interface to our new language. The second approach requires less work, and it also fits very well with the embedded approach. The problem is that the creation of such interfaces is not straightforward at all, because the existing testing tools were mostly not designed to be extendable and to deal with new languages. This paper presents an extendable and modular model of a testing framework, in which the most basic design decision was to keep the previously mentioned interface creation simple and straightforward. Other important aspects of our model are test data generation, the oracle problem, and the customizability of the whole testing phase.
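    The interplay of test-data generation and the oracle problem mentioned above can be illustrated for a toy embedded language. Everything below is invented for illustration (not from the paper): a mini expression language embedded in Python as nested tuples, a random program generator, and an oracle obtained by compiling to host-language expressions.

```python
import random

# A tiny embedded expression language: nested tuples like ("add", 1, ("mul", 2, 3)).
def evaluate(expr):
    """Interpreter for the embedded language under test."""
    if isinstance(expr, (int, float)):
        return expr
    op, a, b = expr
    a, b = evaluate(a), evaluate(b)
    return a + b if op == "add" else a * b

def gen_expr(depth=3):
    """Random test-data generator for the embedded language."""
    if depth == 0 or random.random() < 0.3:
        return random.randint(-5, 5)
    return (random.choice(["add", "mul"]), gen_expr(depth - 1), gen_expr(depth - 1))

def oracle(expr):
    """Reference semantics: compile to a host-language (Python) expression."""
    if isinstance(expr, (int, float)):
        return str(expr)
    op, a, b = expr
    sym = "+" if op == "add" else "*"
    return f"({oracle(a)} {sym} {oracle(b)})"

# Property test: the DSL interpreter must agree with the host-language oracle.
for _ in range(100):
    e = gen_expr()
    assert evaluate(e) == eval(oracle(e))
print("100 random programs checked")
```

    Because the embedded language lives inside the host, the host's own evaluation machinery doubles as the oracle, which is exactly the reuse advantage the abstract argues for.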

  20. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of in vivo drug discovery studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure responsive user-software interaction through a rich graphical user interface while at the same time achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed, and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast, and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
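    The evaluation scheme described in the abstract, scoring each candidate design by a function of the Fisher Information Matrix and searching a discrete design space, can be illustrated with a minimal sketch. The model, parameter values, and grid below are illustrative assumptions, not taken from PopED lite: a single-exponential concentration model C(t) = A*exp(-k*t) with additive noise, D-optimality as the log-determinant of the FIM, and an exhaustive search over pairs of sampling times.

```python
import itertools
import numpy as np

def fim(times, A=10.0, k=0.3, sigma=0.5):
    """Fisher Information Matrix for C(t) = A*exp(-k*t) with additive Gaussian noise."""
    t = np.asarray(times, dtype=float)
    J = np.column_stack([
        np.exp(-k * t),           # dC/dA
        -A * t * np.exp(-k * t),  # dC/dk
    ])
    return J.T @ J / sigma**2

def d_criterion(times):
    """D-optimality: log-determinant of the FIM (larger is better)."""
    sign, logdet = np.linalg.slogdet(fim(times))
    return logdet if sign > 0 else -np.inf

# Discrete optimization: pick the best pair of sampling times from a grid,
# mimicking a constrained search over a discrete design space.
grid = np.linspace(0.5, 12, 24)
best = max(itertools.combinations(grid, 2), key=d_criterion)
print(best, d_criterion(best))
```

    Richer criteria (ED-optimality, constraints on sample volume or animal numbers) fit the same pattern: only the scoring function and the feasible set change.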

  1. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the Emeraude environment over the project time frame is summarized, and several related areas for future research are identified.

  2. A Revised Interface for the ARL Topodef Mobility Design Tool

    DTIC Science & Technology

    2012-04-01

    designed paths as though moving down a conveyor belt. Giving paths an existence independent of the nodes that travel along them not only makes their...A Revised Interface for the ARL Topodef Mobility Design Tool by Andrew J. Toth and Michael Christensen ARL-TR-5980 April 2012...Disclaimers The findings in this report are not to be construed as an official Department of the Army position unless so designated by other

  3. INTEGRATION OF POLLUTION PREVENTION TOOLS

    EPA Science Inventory

    A prototype computer-based decision support system was designed to provide small businesses with an integrated pollution prevention methodology. Preliminary research involved compilation of an inventory of existing pollution prevention tools (i.e., methodologies, software, etc.),...

  4. Designing a Humane Multimedia Interface for the Visually Impaired.

    ERIC Educational Resources Information Center

    Ghaoui, Claude; Mann, M.; Ng, Eng Huat

    2001-01-01

    Promotes the provision of interfaces that allow users to access most of the functionality of existing graphical user interfaces (GUI) using speech. Uses the design of a speech control tool that incorporates speech recognition and synthesis into existing packaged software such as Teletext, the Internet, or a word processor. (Contains 22…

  5. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
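    The DSM-plus-GA clustering idea described above can be sketched in a few lines. The chromosome encoding and fitness function below are common choices for DSM clustering (penalize dependencies that cross cluster boundaries, plus a cluster-size penalty), not necessarily those of the Excel-macro tool described in the record:

```python
import random
import numpy as np

random.seed(1)

def fitness(labels, dsm, size_penalty=0.1):
    """Lower is better: inter-cluster dependencies plus a cluster-size penalty."""
    labels = np.asarray(labels)
    cross = dsm[labels[:, None] != labels[None, :]].sum()
    sizes = np.bincount(labels)
    return cross + size_penalty * (sizes ** 2).sum()

def evolve(dsm, n_clusters=3, pop_size=40, generations=200, mut_rate=0.1):
    """Simple GA: chromosome = cluster label per DSM element."""
    n = len(dsm)
    pop = [[random.randrange(n_clusters) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, dsm))      # elitist selection
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)             # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n):                       # per-gene mutation
                if random.random() < mut_rate:
                    child[i] = random.randrange(n_clusters)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda c: fitness(c, dsm))

# Toy DSM: two obvious blocks of coupled elements.
dsm = np.zeros((6, 6))
dsm[:3, :3] = 1; dsm[3:, 3:] = 1; np.fill_diagonal(dsm, 0)
best = evolve(dsm)
print(best, fitness(best, dsm))
```

    On this toy matrix the GA should group elements 0-2 and 3-5 into separate clusters; a spreadsheet-backed version would read the DSM from cells rather than a NumPy array.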

  6. Centerline latch tool for contingency orbiter door closure

    NASA Technical Reports Server (NTRS)

    Trevino, R. C.

    1982-01-01

    The centerline latch tool was designed and developed as an EVA manual backup device for latching the Space Shuttle Orbiter's payload bay doors for reentry in case of a failure of the existing centerline latches to operate properly. The tool was designed to satisfy a wide variety of structural, mechanical, and EVA requirements. It provides a load path for forces on the payload bay doors during reentry. Since the tool would be used by an EVA crewmember, its controls, handgrips, operating forces, and procedures must be within the capabilities of a partially restrained, suited crewmember in a zero-gravity environment. The centerline latch tool described was designed, developed, and tested to meet these requirements.

  7. Incorporation of Electrical Systems Models Into an Existing Thermodynamic Cycle Code

    NASA Technical Reports Server (NTRS)

    Freeh, Josh

    2003-01-01

    Integration of the entire system includes: fuel cells, motors, propulsors, thermal/power management, compressors, etc. Use of existing, pre-developed NPSS capabilities includes: 1) Optimization tools; 2) Gas turbine models for hybrid systems; 3) Increased interplay between subsystems; 4) Off-design modeling capabilities; 5) Altitude effects; and 6) Existing transient modeling architecture. Other factors include: 1) Easier transfer between users and groups of users; 2) General aerospace industry acceptance and familiarity; and 3) A flexible analysis tool that can also be used for ground power applications.

  8. A Tool for Assessing the Text Legibility of Digital Human Machine Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roger Lew; Ronald L. Boring; Thomas A. Ulrich

    2015-08-01

    A tool intended to aid qualified professionals in the assessment of the legibility of text presented on a digital display is described. The assessment of legibility is primarily for the purposes of designing and analyzing human machine interfaces in accordance with NUREG-0700 and MIL-STD 1472G. The tool addresses shortcomings of existing guidelines by providing more accurate metrics of text legibility with greater sensitivity to design alternatives.
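    Legibility guidance in standards such as MIL-STD-1472 and NUREG-0700 is typically expressed as the visual angle subtended by character height at the design viewing distance. A minimal sketch of such a metric follows; the 16-arcminute threshold is illustrative only, as the actual limits depend on the standard, viewing conditions, and criticality of the text:

```python
import math

def visual_angle_arcmin(char_height_mm, viewing_distance_mm):
    """Visual angle subtended by a character, in minutes of arc."""
    angle_rad = 2 * math.atan(char_height_mm / (2 * viewing_distance_mm))
    return math.degrees(angle_rad) * 60

# Illustrative check: 3 mm tall text viewed at 700 mm.
angle = visual_angle_arcmin(3.0, 700.0)
print(f"{angle:.1f} arcmin, meets 16-arcmin example threshold: {angle >= 16}")
```

    A more sensitive tool would extend this with font, contrast, and display-resolution factors, which is the kind of refinement over bare guideline tables the abstract describes.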

  9. Designing Real-time Decision Support for Trauma Resuscitations

    PubMed Central

    Yadav, Kabir; Chamberlain, James M.; Lewis, Vicki R.; Abts, Natalie; Chawla, Shawn; Hernandez, Angie; Johnson, Justin; Tuveson, Genevieve; Burd, Randall S.

    2016-01-01

    Background Use of electronic clinical decision support (eCDS) has been recommended to improve implementation of clinical decision rules. Many eCDS tools, however, are designed and implemented without taking into account the context in which clinical work is performed. Implementation of the pediatric traumatic brain injury (TBI) clinical decision rule at one Level I pediatric emergency department includes an electronic questionnaire triggered when ordering a head computed tomography using computerized physician order entry (CPOE). Providers use this CPOE tool in less than 20% of trauma resuscitation cases. A human factors engineering approach could identify the implementation barriers that are limiting the use of this tool. Objectives The objective was to design a pediatric TBI eCDS tool for trauma resuscitation using a human factors approach. The hypothesis was that clinical experts will rate a usability-enhanced eCDS tool better than the existing CPOE tool for user interface design and suitability for clinical use. Methods This mixed-methods study followed usability evaluation principles. Pediatric emergency physicians were surveyed to identify barriers to using the existing eCDS tool. Using standard trauma resuscitation protocols, a hierarchical task analysis of pediatric TBI evaluation was developed. Five clinical experts, all board-certified pediatric emergency medicine faculty members, then iteratively modified the hierarchical task analysis until reaching consensus. The software team developed a prototype eCDS display using the hierarchical task analysis. Three human factors engineers provided feedback on the prototype through a heuristic evaluation, and the software team refined the eCDS tool using a rapid prototyping process. The eCDS tool then underwent iterative usability evaluations by the five clinical experts using video review of 50 trauma resuscitation cases. 
A final eCDS tool was created based on their feedback, with content analysis of the evaluations performed to ensure all concerns were identified and addressed. Results Among 26 EPs (76% response rate), the main barriers to using the existing tool were that the information displayed is redundant and does not fit clinical workflow. After the prototype eCDS tool was developed based on the trauma resuscitation hierarchical task analysis, the human factors engineers rated it to be better than the CPOE tool for nine of 10 standard user interface design heuristics on a three-point scale. The eCDS tool was also rated better for clinical use on the same scale, in 84% of 50 expert–video pairs, and was rated equivalent in the remainder. Clinical experts also rated barriers to use of the eCDS tool as being low. Conclusions An eCDS tool for diagnostic imaging designed using human factors engineering methods has improved perceived usability among pediatric emergency physicians. PMID:26300010

  10. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
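    The annunciator mapping described above (alarm sounds triggered by vital-sign data from a patient monitor) can be sketched as follows. The rules, thresholds, and file names are hypothetical illustrations of the general pattern, not PT-SAFE's actual design:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Annunciator:
    """Maps a vital-sign condition to an audible alarm (here, a wave file name)."""
    name: str
    condition: Callable[[dict], bool]
    wave_file: str

# Hypothetical alarm set; thresholds are illustrative, not clinical guidance.
annunciators = [
    Annunciator("SpO2 low", lambda v: v["spo2"] < 90, "spo2_low.wav"),
    Annunciator("HR high", lambda v: v["hr"] > 120, "hr_high.wav"),
    Annunciator("Apnea", lambda v: v["resp_rate"] == 0, "apnea.wav"),
]

def annunciate(vitals):
    """Return the wave files that should be played for the current vitals."""
    return [a.wave_file for a in annunciators if a.condition(vitals)]

print(annunciate({"spo2": 88, "hr": 130, "resp_rate": 12}))
```

    In a real-time tool, `annunciate` would run on each monitor update and hand the selected files to an audio engine; swapping in user-defined condition functions corresponds to the MATLAB-programmable custom alarm algorithms the abstract mentions.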

  11. PAT: an intelligent authoring tool for facilitating clinical trial design.

    PubMed

    Tagaris, Anastasios; Andronikou, Vassiliki; Karanastasis, Efstathios; Chondrogiannis, Efthymios; Tsirmpas, Charalambos; Varvarigou, Theodora; Koutsouris, Dimitris

    2014-01-01

Although great investments are made by both private and public funds and a wealth of research findings is published, the research and development pipeline faces quite low productivity and tremendous delays. In this paper, we present a novel authoring tool which has been designed and developed for facilitating study design. Its underlying models are based on a thorough analysis by domain experts of existing clinical trial protocols (CTPs) and eligibility criteria (EC) published in clinicaltrials.gov. Moreover, its integration with intelligent decision support services and mechanisms linking the study design process with healthcare patient data, as well as its direct access to literature, designates it as a powerful tool offering great support to researchers during clinical trial design.

  12. IDSE Version 1 User's Manual

    NASA Technical Reports Server (NTRS)

    Mayer, Richard

    1988-01-01

The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modeling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Also, four appendices exist to describe hardware and software requirements, installation procedures, and basic hardware usage.

  13. An Evaluation of Open Source Learning Management Systems According to Administration Tools and Curriculum Design

    ERIC Educational Resources Information Center

    Ozdamli, Fezile

    2007-01-01

Distance education is becoming more important in universities and schools. The aim of this research is to evaluate existing Open Source Learning Management Systems according to administration tools and curriculum design. For this purpose, seventy-two Open Source Learning Management Systems have been subjected to a general evaluation. After…

  14. Social Software and Academic Practice: Postgraduate Students as Co-Designers of Web 2.0 Tools

    ERIC Educational Resources Information Center

    Carmichael, Patrick; Burchmore, Helen

    2010-01-01

    In order to develop potentially transformative Web 2.0 tools in higher education, the complexity of existing academic practices, including current patterns of technology use, must be recognised. This paper describes how a series of participatory design activities allowed postgraduate students in education, social sciences and computer sciences to…

  15. [Application of microelectronics CAD tools to synthetic biology].

    PubMed

    Madec, Morgan; Haiech, Jacques; Rosati, Élise; Rezgui, Abir; Gendrault, Yves; Lallement, Christophe

    2017-02-01

    Synthetic biology is an emerging science that aims to create new biological functions that do not exist in nature, based on the knowledge acquired in life science over the last century. Since the beginning of this century, several projects in synthetic biology have emerged. The complexity of the developed artificial bio-functions is relatively low so that empirical design methods could be used for the design process. Nevertheless, with the increasing complexity of biological circuits, this is no longer the case and a large number of computer aided design softwares have been developed in the past few years. These tools include languages for the behavioral description and the mathematical modelling of biological systems, simulators at different levels of abstraction, libraries of biological devices and circuit design automation algorithms. All of these tools already exist in other fields of engineering sciences, particularly in microelectronics. This is the approach that is put forward in this paper. © 2017 médecine/sciences – Inserm.

  16. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  17. Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley

    2009-01-01

Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world applications. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.

  18. The Challenge Course Experience Questionnaire: A Facilitator's Assessment Tool

    ERIC Educational Resources Information Center

    Schary, David P.; Waldron, Alexis L.

    2017-01-01

    Challenge course programs influence a variety of psychological, social, and educational outcomes. Yet, many challenges exist when measuring challenge course outcomes like logistical constraints and a lack of specific assessment tools. This study piloted and tested an assessment tool designed for facilitators to measure participant outcomes in…

  19. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  20. The potential of genetic algorithms for conceptual design of rotor systems

    NASA Technical Reports Server (NTRS)

    Crossley, William A.; Wells, Valana L.; Laananen, David H.

    1993-01-01

    The capabilities of genetic algorithms as a non-calculus based, global search method make them potentially useful in the conceptual design of rotor systems. Coupling reasonably simple analysis tools to the genetic algorithm was accomplished, and the resulting program was used to generate designs for rotor systems to match requirements similar to those of both an existing helicopter and a proposed helicopter design. This provides a comparison with the existing design and also provides insight into the potential of genetic algorithms in design of new rotors.
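A genetic algorithm of the kind the abstract couples to simple analysis tools can be sketched as follows. The two design variables and the quadratic stand-in objective below are hypothetical illustrations, not the paper's rotor analysis:

```python
import random

def fitness(x):
    # Toy stand-in objective: penalize deviation from a hypothetical target
    # rotor radius (5.0 m) and solidity (0.1). A real conceptual-design tool
    # would couple this to aerodynamic and weight analyses.
    radius, solidity = x
    return (radius - 5.0) ** 2 + 100.0 * (solidity - 0.1) ** 2

def genetic_search(pop_size=30, generations=100, mutation=0.1, seed=0):
    rng = random.Random(seed)
    # Random initial population over plausible variable ranges.
    pop = [(rng.uniform(1, 10), rng.uniform(0.02, 0.2)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]  # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # Averaging crossover plus Gaussian mutation.
            child = tuple((ai + bi) / 2 + rng.gauss(0, mutation)
                          for ai, bi in zip(a, b))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_search()
```

Because the search needs only objective evaluations, no gradients, it illustrates why the authors call genetic algorithms a non-calculus-based, global method.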

  1. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
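Parametric estimates of the kind surveyed above typically combine a power-law cost estimating relationship (CER) with a learning-curve credit for recurring constellation units. A minimal sketch, with placeholder coefficients that are not drawn from any calibrated model:

```python
import math

def unit_cost(mass_kg, a=1.0, b=0.8):
    # Illustrative power-law CER: cost = a * mass^b. The coefficients a and b
    # are placeholders; real models calibrate them against historical missions.
    return a * mass_kg ** b

def constellation_cost(mass_kg, n_units, learning=0.9):
    # Learning-curve adjustment: the i-th unit costs first_unit * i**log2(learning),
    # a common way to credit recurring-production savings across many satellites.
    first = unit_cost(mass_kg)
    slope = math.log2(learning)
    return sum(first * i ** slope for i in range(1, n_units + 1))
```

Note that this single-spacecraft-plus-learning structure captures none of the constellation systems-level drivers the paper flags (design iteration, integration and test, mission operations), which is precisely the gap the authors identify.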

  2. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  3. A method for the design and development of medical or health care information websites to optimize search engine results page rankings on Google.

    PubMed

    Dunne, Suzanne; Cummins, Niamh Maria; Hannigan, Ailish; Shannon, Bill; Dunne, Colum; Cullen, Walter

    2013-08-27

    The Internet is a widely used source of information for patients searching for medical/health care information. While many studies have assessed existing medical/health care information on the Internet, relatively few have examined methods for design and delivery of such websites, particularly those aimed at the general public. This study describes a method of evaluating material for new medical/health care websites, or for assessing those already in existence, which is correlated with higher rankings on Google's Search Engine Results Pages (SERPs). A website quality assessment (WQA) tool was developed using criteria related to the quality of the information to be contained in the website in addition to an assessment of the readability of the text. This was retrospectively applied to assess existing websites that provide information about generic medicines. The reproducibility of the WQA tool and its predictive validity were assessed in this study. The WQA tool demonstrated very high reproducibility (intraclass correlation coefficient=0.95) between 2 independent users. A moderate to strong correlation was found between WQA scores and rankings on Google SERPs. Analogous correlations were seen between rankings and readability of websites as determined by Flesch Reading Ease and Flesch-Kincaid Grade Level scores. The use of the WQA tool developed in this study is recommended as part of the design phase of a medical or health care information provision website, along with assessment of readability of the material to be used. This may ensure that the website performs better on Google searches. The tool can also be used retrospectively to make improvements to existing websites, thus, potentially enabling better Google search result positions without incurring the costs associated with Search Engine Optimization (SEO) professionals or paid promotion.
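The two readability scores the study pairs with the WQA tool follow standard published formulas; a sketch using a crude vowel-group syllable counter (real implementations use better syllable estimation):

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels. Adequate for a sketch only.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # words per sentence
    spw = syllables / len(words)   # syllables per word
    # Standard published coefficients for both formulas.
    flesch_ease = 206.835 - 1.015 * wps - 84.6 * spw
    fk_grade = 0.39 * wps + 11.8 * spw - 15.59
    return flesch_ease, fk_grade
```

Higher Flesch Reading Ease and lower Flesch-Kincaid Grade Level both indicate easier text, which is the direction the study correlates with better Google SERP positions.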

  4. Anthropometric characteristics of female smallholder farmers of Uganda--Toward design of labor-saving tools.

    PubMed

    Mugisa, Dana J; Katimbo, Abia; Sempiira, John E; Kisaalita, William S

    2016-05-01

Sub-Saharan African women on small-acreage farms carry a disproportionately higher labor burden, which is one of the main reasons they are unable to produce for both home and the market and realize higher incomes. Labor-saving interventions such as hand-tools are needed to save time and/or increase productivity in, for example, land preparation for crop and animal agriculture, post-harvest processing, and meeting daily energy and water needs. Development of such tools requires comprehensive and context-specific anthropometric data, or body dimensions, and existing databases based on Western women may be less relevant. We conducted measurements on 89 women to provide preliminary results toward answering two questions. First, how applicable existing databases are to the design of hand-tools for sub-Saharan African women. Second, how universal body-dimension predictive models are across ethnic groups. Our results show that body dimensions differ between the Bantu and Nilotic ethnolinguistic groups, and both differ from those of American women. These results strongly support the need for establishing anthropometric databases for sub-Saharan African women, toward hand-tool design. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. The Changing Role of Instructors in Distance Education: Impact on Tool Support.

    ERIC Educational Resources Information Center

    Biedebach, Anke; Bomsdorf, Birgit; Schlageter, Gunter

At the University of Hagen, a great deal of experience exists in delivering Web-based teaching and in implementing tools that support e-learning. To share this knowledge, (inexperienced) instructors increasingly ask for tool-based assistance in designing and administrating e-learning courses. Considering experience from other universities, it becomes…

  6. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  7. Modeling the Environmental Impact of Air Traffic Operations

    NASA Technical Reports Server (NTRS)

    Chen, Neil

    2011-01-01

    There is increased interest to understand and mitigate the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse impacts on the climate. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions and contrails. To ensure that these new models are beneficial to the larger climate research community, the outputs of these new models are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.

  8. Strategic Investment Plan Fiscal Year 1993.

    DTIC Science & Technology

    1993-09-01

…technologies and tools to achieve a design for reconfiguring existing PEP production facilities into agile factories which will reduce total life cycle wastes…facilities. When use of existing facilities is not practical, a special demonstration testbed may be built. The factory design will then be developed

  9. Urban Space Innovation - “10+” Principles through Designing the New Image of the Existing Shopping Mall in Csepel, Hungary

    NASA Astrophysics Data System (ADS)

    Gyergyak, Janos

    2017-10-01

The first part of the paper introduces the principles of “placemaking” as an innovative and important tool for cities in the 21st century. The process helps designers transform the spaces of “nobody” into community-based spaces that support connections among humans. The second part of the paper shows how the author applied these principles in designing the new image of the existing shopping mall in Csepel, Hungary. This work was selected as one of the best design ideas for renewing the existing underutilized space.

  10. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

For conventional fixed wing and rotary wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  11. THE COATINGS GUIDE: AN INTEGRATED TOOL FOR COATINGS DECISIONS

    EPA Science Inventory

    The Coatings Guide, formerly known as the Coatings Alternative Guide (CAGE), is a free Internet pollution prevention tool designed to help small-business coaters of metal and plastic substrates identify alternatives as potential drop-in replacements for existing operations. As sh...

  12. APRON: A Cellular Processor Array Simulation and Hardware Design Tool

    NASA Astrophysics Data System (ADS)

    Barr, David R. W.; Dudek, Piotr

    2009-12-01

    We present a software environment for the efficient simulation of cellular processor arrays (CPAs). This software (APRON) is used to explore algorithms that are designed for massively parallel fine-grained processor arrays, topographic multilayer neural networks, vision chips with SIMD processor arrays, and related architectures. The software uses a highly optimised core combined with a flexible compiler to provide the user with tools for the design of new processor array hardware architectures and the emulation of existing devices. We present performance benchmarks for the software processor array implemented on standard commodity microprocessors. APRON can be configured to use additional processing hardware if necessary and can be used as a complete graphical user interface and development environment for new or existing CPA systems, allowing more users to develop algorithms for CPA systems.
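The execution model being simulated, every cell of the array applying the same instruction to its local neighbourhood, can be sketched in a few lines. This toy grid update illustrates the SIMD principle only; it is not APRON's optimised core:

```python
def cpa_step(grid, op):
    # Every cell applies the same local operation (SIMD-style) to its
    # 4-neighbourhood; boundary cells reuse their own row/column (clamped edges).
    h, w = len(grid), len(grid[0])
    def at(r, c):
        return grid[min(max(r, 0), h - 1)][min(max(c, 0), w - 1)]
    return [[op(at(r, c), at(r - 1, c), at(r + 1, c), at(r, c - 1), at(r, c + 1))
             for c in range(w)] for r in range(h)]

# Example instruction: a 5-point average (a simple blur), broadcast to all cells.
blur = lambda centre, n, s, west, e: (centre + n + s + west + e) / 5.0
out = cpa_step([[0, 0, 0], [0, 5, 0], [0, 0, 0]], blur)
```

In a real CPA the `op` would be a per-cell processing element executing a common instruction stream, and a compiler (as in APRON) would map such kernels onto hardware.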

  13. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.

  14. Elements of Designing for Cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1992-01-01

During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity-based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.

  15. Elements of designing for cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1992-01-01

    During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity-based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.

  16. "PowerUp"!: A Tool for Calculating Minimum Detectable Effect Sizes and Minimum Required Sample Sizes for Experimental and Quasi-Experimental Design Studies

    ERIC Educational Resources Information Center

    Dong, Nianbo; Maynard, Rebecca

    2013-01-01

This paper and the accompanying tool are intended to complement existing supports for conducting power analysis by offering a tool based on the framework of Minimum Detectable Effect Sizes (MDES) formulae that can be used in determining sample size requirements and in estimating minimum detectable effect sizes for a range of individual- and…
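For reference, the core MDES formula for the simplest case, an individually randomized design with covariates, takes the form MDES = M * sqrt((1 - R^2) / (P(1 - P)n)). The sketch below uses a normal approximation for the multiplier M; the tool itself uses t distributions and covers many more designs:

```python
from statistics import NormalDist

def mdes_individual_rct(n, p=0.5, r2=0.0, alpha=0.05, power=0.80):
    # Minimum detectable effect size (in standard deviation units) for an
    # individually randomized trial: MDES = M * sqrt((1 - R^2) / (P(1-P) n)),
    # where n is total sample size, p the treatment fraction, and r2 the
    # variance explained by covariates. M is approximated here with normal
    # quantiles (two-tailed test), which is close to the t-based multiplier
    # for moderate n.
    z = NormalDist()
    multiplier = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)
    return multiplier * ((1 - r2) / (p * (1 - p) * n)) ** 0.5
```

For example, 400 participants split evenly with no covariates yields an MDES of about 0.28 standard deviations, illustrating why covariate adjustment (r2 > 0) is so valuable for boosting sensitivity.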

  17. Greased Lightning (GL-10) Performance Flight Research: Flight Data Report

    NASA Technical Reports Server (NTRS)

    McSwain, Robert G.; Glaab, Louis J.; Theodore, Colin R.; Rhew, Ray D. (Editor); North, David D. (Editor)

    2017-01-01

Modern aircraft design methods have produced acceptable designs for large conventional aircraft performance. With revolutionary electric propulsion technologies fueled by the growth in the small UAS (Unmanned Aerial Systems) industry, these same prediction models are being applied to new, smaller, and experimental design concepts requiring a VTOL (Vertical Take Off and Landing) capability for ODM (On Demand Mobility). A 50% sub-scale GL-10 flight model was built and tested to demonstrate the transition from hover to forward flight utilizing DEP (Distributed Electric Propulsion) [1][2]. In 2016 plans were put in place to conduct performance flight testing on the 50% sub-scale GL-10 flight model to support a NASA project called DELIVER (Design Environment for Novel Vertical Lift Vehicles). DELIVER was investigating the feasibility of including smaller and more experimental aircraft configurations in a NASA design tool called NDARC (NASA Design and Analysis of Rotorcraft) [3]. This report covers the performance flight data collected during flight testing of the GL-10 50% sub-scale flight model conducted at Beaver Dam Airpark, VA. Overall, the flight test data provide great insight into how well our existing conceptual design tools predict the performance of small-scale experimental DEP concepts. Low fidelity conceptual design tools estimated the (L/D)max of the GL-10 50% sub-scale flight model to be 16. The experimentally measured (L/D)max for the GL-10 50% scale flight model was 7.2. The gap between predicted and measured aerodynamic performance highlights the complexity of wing and nacelle interactions, which is not currently accounted for in existing low fidelity tools.

  18. PD-atricians: Leveraging Physicians and Participatory Design to Develop Novel Clinical Information Tools

    PubMed Central

    Pollack, Ari H; Miller, Andrew; Mishra, Sonali R.; Pratt, Wanda

    2016-01-01

    Participatory design, a method by which system users and stakeholders meaningfully contribute to the development of a new process or technology, has great potential to revolutionize healthcare technology, yet has seen limited adoption. We conducted a design session with eleven physicians working to create a novel clinical information tool utilizing participatory design methods. During the two-hour session, the physicians quickly engaged in the process and generated a large quantity of information, informing the design of a future tool. By utilizing facilitators experienced in design methodology, with detailed domain expertise, and well integrated into the healthcare organization, the participatory design session engaged a group of users who are often disenfranchised with existing processes as well as health information technology in general. We provide insight into why participatory design works with clinicians and provide guiding principles for how to implement these methods in healthcare organizations interested in advancing health information technology. PMID:28269900

  19. MEMS product engineering: methodology and tools

    NASA Astrophysics Data System (ADS)

    Ortloff, Dirk; Popp, Jens; Schmidt, Thilo; Hahn, Kai; Mielke, Matthias; Brück, Rainer

    2011-03-01

The development of MEMS comprises the structural design as well as the definition of an appropriate manufacturing process. Technology constraints have a considerable impact on the device design and vice-versa. Product design and technology development are therefore concurrent tasks. Based on a comprehensive methodology, the authors introduce a software environment that links commercial design tools from both areas into a common design flow. In this paper, emphasis is put on automatic low-threshold data acquisition. The intention is to collect and categorize development data for further developments with minimum overhead and minimum disturbance of established business processes. As a first step, software tools that automatically extract data from spreadsheets or file-systems and put them in context with existing information are presented. The developments are currently carried out in a European research project.

  20. Solar water heater design package

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Package describes commercial domestic-hot-water heater with roof or rack mounted solar collectors. System is adjustable to pre-existing gas or electric hot-water house units. Design package includes drawings, description of automatic control logic, evaluation measurements, possible design variations, list of materials and installation tools, and trouble-shooting guide and manual.

  1. The Design of a Multi-Agent NDE Inspection Qualification System

    NASA Astrophysics Data System (ADS)

    McLean, N.; McKenna, J. P.; Gachagan, A.; McArthur, S.; Hayward, G.

    2007-03-01

    A novel Multi-Agent system (MAS) for NDE inspection qualification is being developed to facilitate a scalable environment allowing integration and automation of new and existing inspection qualification tools. This paper discusses the advantages of using a MAS approach to integrate the large number of disparate NDE software tools. The design and implementation of the system architecture is described, including the development of an ontology to describe the NDE domain.

  2. BEopt-CA (Ex) -- A Tool for Optimal Integration of EE/DR/ES+PV in Existing California Homes. Cooperative Research and Development Final Report, CRADA Number CRD-11-429

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Craig

    Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.
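    The least-cost-combination search described above can be illustrated at toy scale by exhaustive enumeration; the measure names, costs, and savings below are invented for illustration and this is not BEopt's actual algorithm or data:

    ```python
    from itertools import combinations

    def least_cost_package(measures, target_savings):
        """Exhaustive search for the cheapest set of retrofit measures that
        meets an annual-savings target.
        measures: name -> (cost_usd, kwh_saved_per_year).
        Returns (total_cost, set_of_names), or None if the target is unreachable."""
        names = list(measures)
        best = None
        for r in range(len(names) + 1):
            for combo in combinations(names, r):
                cost = sum(measures[n][0] for n in combo)
                saved = sum(measures[n][1] for n in combo)
                if saved >= target_savings and (best is None or cost < best[0]):
                    best = (cost, set(combo))
        return best
    ```

    Real tools replace the exponential enumeration with sequential-search or genetic optimization, but the objective, least cost subject to a savings target, is the same.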

  3. A Method for the Design and Development of Medical or Health Care Information Websites to Optimize Search Engine Results Page Rankings on Google

    PubMed Central

    Cummins, Niamh Maria; Hannigan, Ailish; Shannon, Bill; Dunne, Colum; Cullen, Walter

    2013-01-01

    Background The Internet is a widely used source of information for patients searching for medical/health care information. While many studies have assessed existing medical/health care information on the Internet, relatively few have examined methods for design and delivery of such websites, particularly those aimed at the general public. Objective This study describes a method of evaluating material for new medical/health care websites, or for assessing those already in existence, which is correlated with higher rankings on Google's Search Engine Results Pages (SERPs). Methods A website quality assessment (WQA) tool was developed using criteria related to the quality of the information to be contained in the website in addition to an assessment of the readability of the text. This was retrospectively applied to assess existing websites that provide information about generic medicines. The reproducibility of the WQA tool and its predictive validity were assessed in this study. Results The WQA tool demonstrated very high reproducibility (intraclass correlation coefficient=0.95) between 2 independent users. A moderate to strong correlation was found between WQA scores and rankings on Google SERPs. Analogous correlations were seen between rankings and readability of websites as determined by Flesch Reading Ease and Flesch-Kincaid Grade Level scores. Conclusions The use of the WQA tool developed in this study is recommended as part of the design phase of a medical or health care information provision website, along with assessment of readability of the material to be used. This may ensure that the website performs better on Google searches. The tool can also be used retrospectively to make improvements to existing websites, thus, potentially enabling better Google search result positions without incurring the costs associated with Search Engine Optimization (SEO) professionals or paid promotion. PMID:23981848
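    The two readability measures named above are closed-form formulas over word, sentence, and syllable counts; a minimal sketch of both (the counts are passed in, since syllable counting is itself a heuristic step):

    ```python
    def flesch_reading_ease(words, sentences, syllables):
        """Flesch Reading Ease: roughly 60-70 is plain English; higher is easier."""
        return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

    def flesch_kincaid_grade(words, sentences, syllables):
        """Flesch-Kincaid Grade Level: approximate US school grade of the text."""
        return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    ```

    For a 100-word sample with 5 sentences and 130 syllables, these give a reading ease near 76.6 and a grade level near 7.6.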

  4. Multidisciplinary Optimization for Aerospace Using Genetic Optimization

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Hahn, Edward E.; Herrera, Claudia Y.

    2007-01-01

    In support of the ARMD guidelines, NASA's Dryden Flight Research Center is developing a multidisciplinary design and optimization tool. This tool will leverage existing tools and practices and allow the easy integration and adoption of new state-of-the-art software. Optimization has made its way into many mainstream applications. For example, NASTRAN has its solution sequence 200 for design optimization, and MATLAB has an Optimization Toolbox. Other packages, such as the ZAERO aeroelastic panel code and the CFL3D Navier-Stokes solver, have no built-in optimizer. The goal of the tool development is to generate a central executive capable of using disparate software packages in a cross-platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. A provided figure (Figure 1) shows a typical set of tools and their relation to the central executive. Optimization can take place within each individual tool, in a loop between the executive and the tool, or both.

  5. The nature and evaluation of commercial expert system building tools, revision 1

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1987-01-01

    This memorandum reviews the factors that constitute an Expert System Building Tool (ESBT) and evaluates current tools in terms of these factors. Evaluation of these tools is based on their structure and their alternative forms of knowledge representation, inference mechanisms, and developer and end-user interfaces. Next, functional capabilities, such as diagnosis and design, are related to alternative forms of mechanization. The characteristics and capabilities of existing commercial tools are then reviewed in terms of these criteria.

  6. Overview of Vertical Axis Wind Turbine (VAWT)

    NASA Technical Reports Server (NTRS)

    Sullivan, W. N.

    1979-01-01

    A survey is presented of the practices which were applied for designing VAWT blades. An attempt is made to discuss strengths and weaknesses of the existing procedures. Discussion is provided on planned or suggested future work in developing improved design tools.

  7. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package named the Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...
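    IAQ simulation of this kind typically rests on the well-mixed single-zone mass balance dC/dt = ach*(C_out - C) + S/V. The sketch below is a generic Euler integration of that textbook model, not IAQX's actual engine, and the parameter values in the note are hypothetical:

    ```python
    def indoor_concentration(t_hours, ach, c_out, source_mg_h, volume_m3,
                             c0=0.0, dt=0.001):
        """Euler integration of the well-mixed single-zone mass balance
        dC/dt = ach*(c_out - C) + source/volume
        (C in mg/m^3, ach = air changes per hour, dt in hours)."""
        c = c0
        for _ in range(round(t_hours / dt)):
            c += dt * (ach * (c_out - c) + source_mg_h / volume_m3)
        return c
    ```

    At long times this settles to the steady state C_ss = C_out + S/(ach*V); for example, a 10 mg/h source in a 50 m^3 room at 1.0 air changes per hour and clean outdoor air settles near 0.2 mg/m^3.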

  8. Multidisciplinary Design, Analysis, and Optimization Tool Development using a Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley

    2008-01-01

    Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center to automate analysis and design process by leveraging existing tools such as NASTRAN, ZAERO and CFD codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but faces many challenges in large-scale, real-world application. This paper describes current approaches, recent results, and challenges for MDAO as demonstrated by our experience with the Ikhana fire pod design.
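    A minimal elitist genetic algorithm of the kind used as the MDAO optimizer can be sketched in a few lines; this is a generic illustration in which a toy objective stands in for the external analysis codes (NASTRAN, ZAERO, CFD) that the real tool wraps:

    ```python
    import random

    def genetic_minimize(f, dim, bounds=(-5.0, 5.0), pop_size=30, gens=100,
                         sigma=0.3, seed=0):
        """Elitist genetic minimizer: tournament selection, one-point
        crossover, Gaussian mutation, bounds clamping."""
        rng = random.Random(seed)
        lo, hi = bounds
        pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
        best = min(pop, key=f)
        for _ in range(gens):
            new_pop = [best[:]]                      # elitism: keep the champion
            while len(new_pop) < pop_size:
                p1 = min(rng.sample(pop, 3), key=f)  # tournament selection
                p2 = min(rng.sample(pop, 3), key=f)
                cut = rng.randrange(1, dim) if dim > 1 else 0
                child = p1[:cut] + p2[cut:]          # one-point crossover
                child = [min(hi, max(lo, g + rng.gauss(0, sigma))) for g in child]
                new_pop.append(child)
            pop = new_pop
            best = min(pop + [best], key=f)
        return best, f(best)
    ```

    Minimizing the 2-D sphere function sum(x_i^2) quickly drives the best fitness toward zero; in a real MDAO loop each fitness evaluation would instead launch an external analysis.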

  9. Total System Design (TSD) Methodology Assessment.

    DTIC Science & Technology

    1983-01-01

    hardware implementation. Author: Martin Marietta Aerospace Title: Total System Design Methodology Source: Martin Marietta Technical Report MCR-79-646...systematic, rational approach to computer systems design is needed. Martin Marietta has produced a Total System Design Methodology to support such design...gathering and ordering. The purpose of the paper is to document the existing TSD methodology at Martin Marietta, describe the supporting tools, and

  10. Pyviko: an automated Python tool to design gene knockouts in complex viruses with overlapping genes.

    PubMed

    Taylor, Louis J; Strebel, Klaus

    2017-01-07

    Gene knockouts are a common tool used to study gene function in various organisms. However, designing gene knockouts is complicated in viruses, which frequently contain sequences that code for multiple overlapping genes. Designing mutants that can be traced by the creation of new or elimination of existing restriction sites further compounds the difficulty in experimental design of knockouts of overlapping genes. While software is available to rapidly identify restriction sites in a given nucleotide sequence, no existing software addresses experimental design of mutations involving multiple overlapping amino acid sequences in generating gene knockouts. Pyviko performed well on a test set of over 240,000 gene pairs collected from viral genomes deposited in the National Center for Biotechnology Information Nucleotide database, identifying a point mutation which added a premature stop codon within the first 20 codons of the target gene in 93.2% of all tested gene-overprinted gene pairs. This shows that Pyviko can be used successfully in a wide variety of contexts to facilitate the molecular cloning and study of viral overprinted genes. Pyviko is an extensible and intuitive Python tool for designing knockouts of overlapping genes. Freely available as both a Python package and a web-based interface ( http://louiejtaylor.github.io/pyViKO/ ), Pyviko simplifies the experimental design of gene knockouts in complex viruses with overlapping genes.
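    The core operation, finding a single-base substitution that truncates the target gene while leaving the overlapping gene's protein intact, can be sketched as a brute-force scan over substitutions; this is a generic illustration of the idea, not Pyviko's actual code:

    ```python
    # Standard genetic code, bases in the conventional TCAG order ('*' = stop).
    BASES = "TCAG"
    AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
    CODON = {a + b + c: AMINO[16 * i + 4 * j + k]
             for i, a in enumerate(BASES)
             for j, b in enumerate(BASES)
             for k, c in enumerate(BASES)}

    def translate(seq, offset=0):
        """Translate the complete codons of seq starting at offset."""
        return "".join(CODON[seq[p:p + 3]] for p in range(offset, len(seq) - 2, 3))

    def find_knockouts(seq, target=0, other=1, max_codons=20):
        """Single-base substitutions that introduce a premature stop codon in
        the `target` reading frame (within the first `max_codons` codons)
        while leaving the `other` frame's protein sequence unchanged."""
        orig_target = translate(seq, target)
        cutoff = orig_target.index("*") if "*" in orig_target else len(orig_target)
        other_protein = translate(seq, other)
        hits = []
        for pos in range(target, min(len(seq), target + 3 * max_codons)):
            for base in "ACGT":
                if base == seq[pos]:
                    continue
                mut = seq[:pos] + base + seq[pos + 1:]
                t = translate(mut, target)
                if ("*" in t and t.index("*") < min(cutoff, max_codons)
                        and translate(mut, other) == other_protein):
                    hits.append((pos, base))
        return hits
    ```

    For the toy sequence AGGAAACCCGGG with frames offset by one base, the only valid knockout is A to T at position 3, which turns AAA (Lys) into the stop codon TAA while the overlapping GGA codon becomes the synonymous GGT.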

  11. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  12. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.
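    Requirements (3) and (4), parametric geometry with sensitivity derivatives, can be sketched with a trivial parametric model and a central finite difference; the wing-area function below is a stand-in for a real CAD evaluation:

    ```python
    def wing_area(span, chord):
        """Toy parametric geometry: rectangular-planform wing area (m^2)."""
        return span * chord

    def sensitivity(f, params, name, h=1e-6):
        """Central finite-difference derivative of f with respect to one
        named design parameter, holding the others fixed."""
        up, dn = dict(params), dict(params)
        up[name] += h
        dn[name] -= h
        return (f(**up) - f(**dn)) / (2 * h)
    ```

    A gradient-based optimizer can then treat any parameterized geometry as a black box, at the cost of two model evaluations per design parameter.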

  13. A thermal sensation prediction tool for use by the profession

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fountain, M.E.; Huizenga, C.

    1997-12-31

    As part of a recent ASHRAE research project (781-RP), a thermal sensation prediction tool has been developed. This paper introduces the tool, describes the component thermal sensation models, and presents examples of how the tool can be used in practice. Since the main end product of the HVAC industry is the comfort of occupants indoors, tools for predicting occupant thermal response can be an important asset to designers of indoor climate control systems. The software tool presented in this paper incorporates several existing models for predicting occupant comfort.

  14. pgRNAFinder: a web-based tool to design distance independent paired-gRNA.

    PubMed

    Xiong, Yuanyan; Xie, Xiaowei; Wang, Yanzhi; Ma, Wenbing; Liang, Puping; Songyang, Zhou; Dai, Zhiming

    2017-11-15

    The CRISPR/Cas system has been shown to be an efficient and accurate genome-editing technique. A number of tools exist to design guide RNA (gRNA) sequences and predict potential off-target sites. However, most existing computational tools for gRNA design are restricted to small deletions. To address this issue, we present pgRNAFinder, with an easy-to-use web interface, which enables researchers to design single or distance-free paired-gRNA sequences. The web interface of pgRNAFinder contains both a gRNA search and a scoring system. After users input query sequences, it searches for gRNAs by the 3' protospacer-adjacent motif (PAM) and possible off-targets, and rapidly scores the conservation of the deleted sequences. Filters can be applied to identify high-quality CRISPR sites. PgRNAFinder offers gRNA design functionality for 8 vertebrate genomes. Furthermore, to keep pgRNAFinder open and extensible to any organism, we provide the source package for local use. The pgRNAFinder is freely available at http://songyanglab.sysu.edu.cn/wangwebs/pgRNAFinder/, and the source code and user manual can be obtained from https://github.com/xiexiaowei/pgRNAFinder. songyang@bcm.edu or daizhim@mail.sysu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
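    The PAM-based search step is simple to sketch: a Cas9 target is a 20-nt protospacer immediately followed by an NGG motif. The forward-strand-only scan below is a generic illustration (pgRNAFinder itself also handles the reverse strand, off-target prediction, and scoring):

    ```python
    def find_grna_sites(seq, protospacer_len=20):
        """Forward-strand Cas9 target sites: a protospacer followed by an
        NGG PAM. Returns (start, protospacer, pam) tuples."""
        seq = seq.upper()
        hits = []
        for i in range(len(seq) - protospacer_len - 2):
            pam = seq[i + protospacer_len: i + protospacer_len + 3]
            if pam[1:] == "GG":          # N can be any base
                hits.append((i, seq[i:i + protospacer_len], pam))
        return hits
    ```

    For paired-gRNA design, two such sites flanking the region to delete would be selected and scored.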

  15. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools, and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and a database verifier are presented.

  16. Malicious Activity Simulation Tool (MAST) and Trust

    DTIC Science & Technology

    2015-06-01

    application through discovery and remediation of flaws. B. DESIGN AND DEVELOPMENT CONSIDERATIONS Design and development focuses on the actual...protection of the backup and restoration of the application. COBR -1 X V-16846 The IAO will ensure a disaster recovery plan exists in accordance

  17. Proceedings - International Conference on Wheel/Rail Load and Displacement Measurement Techniques : January 19-20, 1981

    DOT National Transportation Integrated Search

    1981-09-01

    Measurement of wheel/rail characteristics generates information for improvement of design tools such as model validation, establishment of load spectra and vehicle/track system interaction. Existing and new designs are assessed from evaluation of veh...

  18. ESAS Deliverable PS 1.1.2.3: Customer Survey on Code Generations in Safety-Critical Applications

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Denney, Ewen

    2006-01-01

    Automated code generators (ACG) are tools that convert a (higher-level) model of a software (sub-)system into executable code without the necessity for a developer to actually implement the code. Although both commercially supported and in-house tools have been used in many industrial applications, little data exists on how these tools are used in safety-critical domains (e.g., spacecraft, aircraft, automotive, nuclear). The aims of the survey, therefore, were threefold: 1) to determine if code generation is primarily used as a tool for prototyping, including design exploration and simulation, or for flight/production code; 2) to determine the verification issues with code generators relating, in particular, to qualification and certification in safety-critical domains; and 3) to determine perceived gaps in functionality of existing tools.

  19. Realizing the Potential of Patient Engagement: Designing IT to Support Health in Everyday Life

    PubMed Central

    Novak, Laurie L.; Unertl, Kim M.; Holden, Richard J.

    2017-01-01

    Maintaining health or managing a chronic condition involves performing and coordinating potentially new and complex tasks in the context of everyday life. Tools such as reminder apps and online health communities are being created to support patients in carrying out these tasks. Research has documented mixed effectiveness and problems with continued use of these tools, and suggests that more widespread adoption may be aided by design approaches that facilitate integration of eHealth technologies into patients’ and family members’ daily routines. Given the need to augment existing methods of design and implementation of eHealth tools, this contribution discusses frameworks and associated methods that engage patients and explore contexts of use in ways that can produce insights for eHealth designers. PMID:27198106

  20. Designing Philadelphia Land Science as a Game to Promote Identity Exploration

    ERIC Educational Resources Information Center

    Barany, Amanda; Shah, Mamta; Cellitti, Jessica; Duka, Migela; Swiecki, Zachari; Evenstone, Amanda; Kinley, Hannah; Quigley, Peter; Shaffer, David Williamson; Foster, Aroutis

    2017-01-01

    Few digital tools are designed to support identity exploration around careers in science, technology, engineering, and mathematics (STEM) that may help close existing representation gaps in STEM fields. The aim of this project is to inform the design of games that facilitate learning as identity change as defined by the Projective Reflection…

  1. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

    The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial-scale process, combined with multivariate data analysis (MVDA) of process and PAT data to increase process knowledge; (2) execution of scaled-down designed experiments at pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) definition of a process Design Space (DS) linking CPPs to Critical to Quality Attributes (CQAs), within which product quality is ensured by design, and which after scale-up can be used at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Topological Mechanics of Origami and Kirigami

    NASA Astrophysics Data System (ADS)

    Chen, Bryan Gin-ge; Liu, Bin; Evans, Arthur A.; Paulose, Jayson; Cohen, Itai; Vitelli, Vincenzo; Santangelo, C. D.

    2016-04-01

    Origami and kirigami have emerged as potential tools for the design of mechanical metamaterials whose properties such as curvature, Poisson ratio, and existence of metastable states can be tuned using purely geometric criteria. A major obstacle to exploiting this property is the scarcity of tools to identify and program the flexibility of fold patterns. We exploit a recent connection between spring networks and quantum topological states to design origami with localized folding motions at boundaries and study them both experimentally and theoretically. These folding motions exist due to an underlying topological invariant rather than a local imbalance between constraints and degrees of freedom. We give a simple example of a quasi-1D folding pattern that realizes such topological states. We also demonstrate how to generalize these topological design principles to two dimensions. A striking consequence is that a domain wall between two topologically distinct, mechanically rigid structures is deformable even when constraints locally match the degrees of freedom.

  3. Does Copying Idioms Promote Their Recall?

    ERIC Educational Resources Information Center

    Stengers, Hélène; Deconinck, Julie; Boers, Frank; Eyckmans, June

    2016-01-01

    This paper reports an experiment designed to evaluate an attempt to improve the effectiveness of an existing L2 idiom-learning tool. In this tool, learners are helped to associate the abstract, idiomatic meaning of expressions such as "jump the gun" (act too soon) with their original, concrete meaning (e.g. associating "jump the…

  4. Practical Tools for Designing and Weighting Survey Samples

    ERIC Educational Resources Information Center

    Valliant, Richard; Dever, Jill A.; Kreuter, Frauke

    2013-01-01

    Survey sampling is fundamentally an applied field. The goal in this book is to put an array of tools at the fingertips of practitioners by explaining approaches long used by survey statisticians, illustrating how existing software can be used to solve survey problems, and developing some specialized software where needed. This book serves at least…

  5. VideoANT: Extending Online Video Annotation beyond Content Delivery

    ERIC Educational Resources Information Center

    Hosack, Bradford

    2010-01-01

    This paper expands the boundaries of video annotation in education by outlining the need for extended interaction in online video use, identifying the challenges faced by existing video annotation tools, and introducing Video-ANT, a tool designed to create text-based annotations integrated within the time line of a video hosted online. Several…

  6. A historical perspective of VR water management for improved crop production

    USDA-ARS?s Scientific Manuscript database

    Variable-rate water management, or the combination of precision agriculture technology and irrigation, has been enabled by many of the same technologies as other precision agriculture tools. However, adding variable-rate capability to existing irrigation equipment design, or designing new equipment ...

  7. Learning Design Rashomon I--Supporting the Design of One Lesson through Different Approaches

    ERIC Educational Resources Information Center

    Persico, Donatella; Pozzi, Francesca; Anastopoulou, Stamatina; Conole, Grainne; Craft, Brock; Dimitriadis, Yannis; Hernandez-Leo, Davinia; Kali, Yael; Mor, Yishay; Perez-Sanagustin, Mar; Walmsley, Helen

    2013-01-01

    This paper presents and compares a variety of approaches that have been developed to guide the decision-making process in learning design. Together with the companion Learning Design Rashomon II (Prieto "et al.," 2013), devoted to existing tools to support the same process, it aims to provide a view on relevant research results in this…

  8. Research by Design: Design-Based Research and the Higher Degree Research student

    ERIC Educational Resources Information Center

    Kennedy-Clark, Shannon

    2013-01-01

    Design-based research lends itself to educational research as the aim of this approach is to develop and refine the design of artefacts, tools and curriculum and to advance existing theory or develop new theories that can support and lead to a deepened understanding of learning. This paper provides an overview of the potential benefits of using a…

  9. Guidance, navigation, and control subsystem equipment selection algorithm using expert system methods

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1991-01-01

    Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than is currently achievable through the use of a standard database system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.
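    The selection pattern described, hard constraints computed by the existing code plus an expert ranking, can be sketched as filter-then-score; the candidate star trackers, attribute values, and weights below are hypothetical:

    ```python
    def select_equipment(candidates, constraints, weights):
        """Filter candidates by hard constraints, then rank survivors by a
        weighted score (negative weights penalize an attribute, e.g. mass).
        candidates: name -> attribute dict.
        constraints: list of (attribute, "<=" or ">=", bound).
        Returns the best candidate name, or None if nothing is feasible."""
        def feasible(attrs):
            return all(attrs[a] <= v if op == "<=" else attrs[a] >= v
                       for a, op, v in constraints)
        def score(attrs):
            return sum(w * attrs[k] for k, w in weights.items())
        names = [n for n, a in candidates.items() if feasible(a)]
        return max(names, key=lambda n: score(candidates[n])) if names else None
    ```

    A rule-based system adds value over a plain database query mainly in how the constraints and weights are derived; the final selection step still reduces to a search like this one.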

  10. Parameterizable Library Components for SAW Devices

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2006-01-01

    To facilitate quick fabrication of Surface Acoustic Wave (SAW) sensors we have found it necessary to develop a library of parameterizable components. This library is the first module in our strategy towards a design tool that is integrated into existing Electronic Design Automation (EDA) tools. This library is similar to the standard cell libraries found in digital design packages. The library cells allow the user to input the design parameters which automatically generate a detailed layout of the SAW component. This paper presents the results of our development of parameterizable cells for an InterDigitated Transducer (IDT), reflector, SAW delay line, and both one and two port resonators.
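    A parameterizable IDT cell is essentially a function from design parameters to geometry via wavelength = v_SAW / f0. A minimal sketch, assuming uniform quarter-wavelength electrodes and a nominal SAW velocity for YZ-cut lithium niobate (both are illustrative assumptions, not values from the paper):

    ```python
    def idt_layout(freq_hz, n_pairs, v_saw=3488.0):
        """Geometry of a uniform interdigitated transducer.
        v_saw: SAW velocity in m/s (3488 m/s is typical for YZ lithium niobate).
        Electrode width = wavelength/4; adjacent fingers alternate bus bars.
        Returns (wavelength, finger_width, [(x_center, bus), ...]) in meters."""
        wavelength = v_saw / freq_hz
        pitch = wavelength / 2.0      # center-to-center spacing of adjacent fingers
        width = wavelength / 4.0
        fingers = [(i * pitch, "top" if i % 2 else "bottom")
                   for i in range(2 * n_pairs)]
        return wavelength, width, fingers
    ```

    At 87.2 MHz this yields a 40 micron wavelength: 10 micron electrodes on a 20 micron center-to-center pitch. A library cell would emit these coordinates as layout polygons in the EDA tool's format.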

  11. FluoRender: joint freehand segmentation and visualization for many-channel fluorescence data analysis.

    PubMed

    Wan, Yong; Otsuna, Hideo; Holman, Holly A; Bagley, Brig; Ito, Masayoshi; Lewis, A Kelsey; Colasanto, Mary; Kardon, Gabrielle; Ito, Kei; Hansen, Charles

    2017-05-26

    Image segmentation and registration techniques have enabled biologists to place large amounts of volume data from fluorescence microscopy, morphed three-dimensionally, onto a common spatial frame. Existing tools built on volume visualization pipelines for single channel or red-green-blue (RGB) channels have become inadequate for the new challenges of fluorescence microscopy. For a three-dimensional atlas of the insect nervous system, hundreds of volume channels are rendered simultaneously, whereas fluorescence intensity values from each channel need to be preserved for versatile adjustment and analysis. Although several existing tools have incorporated support of multichannel data using various strategies, the lack of a flexible design has made true many-channel visualization and analysis unavailable. The most common practice for many-channel volume data presentation is still converting and rendering pseudosurfaces, which are inaccurate for both qualitative and quantitative evaluations. Here, we present an alternative design strategy that accommodates the visualization and analysis of about 100 volume channels, each of which can be interactively adjusted, selected, and segmented using freehand tools. Our multichannel visualization includes a multilevel streaming pipeline plus a triple-buffer compositing technique. Our method also preserves original fluorescence intensity values on graphics hardware, a crucial feature that allows graphics-processing-unit (GPU)-based processing for interactive data analysis, such as freehand segmentation. We have implemented the design strategies as a thorough restructuring of our original tool, FluoRender. The redesign of FluoRender not only maintains the existing multichannel capabilities for a greatly extended number of volume channels, but also enables new analysis functions for many-channel data from emerging biomedical-imaging techniques.
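    One way to composite on the order of 100 channels while keeping the raw intensities available for analysis, in the spirit described here though not FluoRender's actual GPU pipeline, is additive blending of per-channel colorized intensities at each pixel:

    ```python
    def composite_pixel(channels):
        """Additive compositing of many fluorescence channels at one pixel.
        channels: list of (intensity in [0,1], (r,g,b) color, opacity).
        The original intensities are never modified; only the displayed RGB
        value is derived from them, then clamped to [0,1]."""
        out = [0.0, 0.0, 0.0]
        for intensity, color, opacity in channels:
            for k in range(3):
                out[k] += intensity * opacity * color[k]
        return tuple(min(1.0, v) for v in out)
    ```

    Because the blend is a pure function of the stored intensities, per-channel adjustments (color, opacity, selection) only change the display mapping, which is what allows quantitative analysis to coexist with interactive visualization.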

  12. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    NASA Technical Reports Server (NTRS)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.
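    The scalar textbook form of the Bouguer reduction combines a free-air correction with the infinite-slab term 2*pi*G*rho*h. The sketch below uses the conventional 2670 kg/m^3 crustal density and is a simplification of the tensor-component processing described, not the software's actual pipeline:

    ```python
    import math

    G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    MGAL = 1e-5      # 1 mGal in m/s^2

    def bouguer_anomaly(observed_mgal, normal_mgal, height_m, density=2670.0):
        """Simple Bouguer anomaly (mGal): observed minus normal gravity,
        plus the free-air correction (0.3086 mGal/m), minus the
        infinite-slab correction 2*pi*G*rho*h."""
        free_air = 0.3086 * height_m                        # mGal
        slab = 2 * math.pi * G * density * height_m / MGAL  # mGal
        return observed_mgal - normal_mgal + free_air - slab
    ```

    At 1000 m elevation the free-air term is 308.6 mGal and the slab term about 112 mGal, so the two corrections together add roughly 197 mGal to the raw difference.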

  13. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenwood, Michael S; Cetiner, Mustafa S; Fugate, David L

    Existing development tools for early-stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary functionality to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The TRANSFORM tool, based on the Modelica programming language, (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
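
    As a loose illustration of assembling interconnected component modules (TRANSFORM itself is written in Modelica; the Python below is only an analogue with invented names and a deliberately tiny physics model):

```python
# Toy analogue of connecting two component models through a standardized
# interface: two lumped thermal masses joined by a conductor component,
# advanced with one explicit-Euler step. Names and values are invented.

def step_coupled_masses(T1, T2, c1, c2, conductance, dt):
    """One explicit-Euler step of two lumped thermal masses whose ports are
    joined by a conductor (the 'connect' of a Modelica-style model).

    T1, T2: temperatures; c1, c2: heat capacities;
    conductance: of the connecting component; dt: time step.
    """
    q = conductance * (T1 - T2)          # heat flow through the connection
    return T1 - q * dt / c1, T2 + q * dt / c2
```

    The point of the convention is that any component exposing the same port variables (temperature and heat flow here) can be swapped in without touching the rest of the plant model.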

  14. Architectural evaluation of dynamic and partial reconfigurable systems designed with DREAMS tool

    NASA Astrophysics Data System (ADS)

    Otero, Andrés; Gallego, Ángel; de la Torre, Eduardo; Riesgo, Teresa

    2013-05-01

    The benefits of dynamic and partial reconfigurable systems are increasingly accepted by industry. For this reason, SRAM-based FPGA manufacturers have improved, or in some cases introduced for the first time, the support they offer for designing this kind of system. However, commercial tools still offer poor flexibility, which leads to limited efficiency. This is witnessed by the overhead introduced by the communication primitives, as well as by the inability to relocate reconfigurable modules, among other limitations. For this reason, the authors have proposed an academic design tool called DREAMS, which targets the design of dynamically reconfigurable systems. In this paper, the main features offered by DREAMS are described and compared with those of existing commercial and academic tools. Moreover, a graphical user interface (GUI) is described for the first time in this work, with the aim of simplifying the design process and hiding the low-level, device-dependent details from the system designer. The overall goal is to increase designer productivity. Using the graphical interface, different reconfigurable architectures are provided as design examples, including both conventional slot-based architectures and mesh-type designs.

  15. ORAC-DR: Pipelining With Other People's Code

    NASA Astrophysics Data System (ADS)

    Economou, Frossie; Bridger, Alan; Wright, Gillian S.; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy

    As part of the UKIRT ORAC project, we have developed a pipeline (orac-dr) for driving on-line data reduction using existing astronomical packages as algorithm engines and display tools. The design is modular and extensible on several levels, allowing it to be easily adapted to a wide variety of instruments. Here we briefly review the design, discuss the robustness and speed-of-execution issues inherent in such pipelines, and address what constitutes a desirable (in terms of "buy-in" effort) engine or tool.
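
    The engine-dispatch pattern the abstract describes can be sketched as follows (step names and placeholder arithmetic are invented for illustration; this is not ORAC-DR's actual API, which drives external astronomical packages):

```python
# Minimal sketch of a recipe-driven reduction pipeline: each named step is
# delegated to a registered "algorithm engine" behind a common interface,
# so engines can be swapped per instrument without changing the driver.

ENGINES = {}

def engine(name):
    """Register a callable as the algorithm engine for one reduction step."""
    def register(fn):
        ENGINES[name] = fn
        return fn
    return register

@engine("subtract_dark")
def subtract_dark(frame):
    return [pixel - 1 for pixel in frame]   # placeholder dark level

@engine("flat_field")
def flat_field(frame):
    return [pixel / 2 for pixel in frame]   # placeholder flat response

def run_recipe(recipe, frame):
    """Drive a frame through each step of a data-reduction recipe."""
    for step in recipe:
        frame = ENGINES[step](frame)
    return frame
```

    Adapting the pipeline to a new instrument then amounts to registering a different set of engines behind the same step names.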

  16. Aeroservoelastic Modeling of Body Freedom Flutter for Control System Design

    NASA Technical Reports Server (NTRS)

    Ouellette, Jeffrey

    2017-01-01

    One of the most severe forms of coupling between aeroelasticity and flight dynamics is an instability called body freedom flutter. Existing tools often assume relatively weak coupling and are therefore unable to accurately model body freedom flutter. Because the existing tools were developed from traditional flutter analysis models, the final models contain inconsistencies and are not compatible with control system design tools. To resolve these issues, a number of small but significant changes have been made to the existing approaches. A frequency domain transformation is used with the unsteady aerodynamics to ensure a more physically consistent stability-axis rational function approximation of the unsteady aerodynamic model. The aerodynamic model is augmented with additional terms to account for limitations of the baseline unsteady aerodynamic model and for the gravity forces. An assumed modes method is used for the structural model to ensure a consistent definition of the aircraft states across the flight envelope. The X-56A stiff-wing flight-test data were used to validate the current modeling approach. The flight-test data do not show body freedom flutter, but do show coupling between the flight dynamics and the aeroelastic dynamics, as well as the effects of the fuel weight.
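
    A classic example of the kind of rational function approximation applied to unsteady aerodynamics is R. T. Jones' two-lag approximation of Theodorsen's function, sketched below. This is standard literature material used here purely as an illustration, not the X-56A model:

```python
# R. T. Jones' classic rational-function approximation of Theodorsen's
# function C(k), using two aerodynamic lag states. It recovers the known
# limits C -> 1 as k -> 0 (quasi-steady) and C -> 0.5 as k -> infinity.

def theodorsen_jones(k):
    """Approximate C(k) = F + iG at reduced frequency k."""
    s = 1j * k   # reduced Laplace variable evaluated on the imaginary axis
    return 1 - 0.165 * s / (s + 0.0455) - 0.335 * s / (s + 0.30)
```

    Fitting such pole-residue forms to tabulated frequency-domain aerodynamics is what converts them into the finite-dimensional state-space models that control system design tools require.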

  17. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  18. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating-to-mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backward in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. Finally, the software is managed in accordance with the CMMI (Capability Maturity Model Integration), where it has been appraised at maturity level 3.
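
    The mean-element idea can be illustrated with a toy Keplerian propagator: in the purely two-body picture, only the mean anomaly changes, advancing at the mean motion n = sqrt(mu / a^3). This is a textbook sketch, not Monte's propagator:

```python
import math

# Toy Keplerian "mean element" propagation: the mean anomaly advances
# linearly at the mean motion while the other elements stay fixed.
# mu is Earth's gravitational parameter GM in km^3/s^2.

MU_EARTH = 398600.4418  # km^3/s^2

def propagate_mean_anomaly(a_km, mean_anomaly0_rad, dt_s):
    """Advance the mean anomaly of a Keplerian orbit by dt seconds."""
    n = math.sqrt(MU_EARTH / a_km**3)          # mean motion, rad/s
    return (mean_anomaly0_rad + n * dt_s) % (2 * math.pi)
```

    A practical mean-element propagator adds secular and long-period perturbation rates to the other elements as well; the value for stability analysis is that these averaged rates can be integrated with very large time steps.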

  19. A Standalone Vision Impairments Simulator for Java Swing Applications

    NASA Astrophysics Data System (ADS)

    Oikonomou, Theofanis; Votis, Konstantinos; Korn, Peter; Tzovaras, Dimitrios; Likothanasis, Spriridon

    A lot of work has been done lately in an attempt to assess accessibility. For the case of web rich-client applications, several tools exist that simulate how a vision-impaired or colour-blind person would perceive this content. In this work we propose a simulation tool for non-web Java™ Swing applications. Developers and designers face a real challenge when creating software that has to cope with many interaction situations, as well as specific directives for ensuring accessible interaction. The proposed standalone tool will assist them in exploring user-centered design and important accessibility issues for their Java™ Swing implementations.

  20. An exertional heat illness triage tool for a jungle training environment.

    PubMed

    Smith, Mike; Withnall, R; Boulter, M

    2017-09-06

    This article introduces a practical triage tool designed to assist commanders, jungle training instructors (JTIs) and medical personnel to identify Defence Personnel (DP) with suspected exertional heat illness (EHI). The challenges of managing suspected EHI in a jungle training environment and the potential advantages to stratifying the urgency of evacuation are discussed. This tool has been designed to be an adjunct to the existing MOD mandated heat illness recognition and first aid training. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. Conceptual design of single turbofan engine powered light aircraft

    NASA Technical Reports Server (NTRS)

    Snyder, F. S.; Voorhees, C. G.; Heinrich, A. M.; Baisden, D. N.

    1977-01-01

    The conceptual design of a four-place, single turbofan engine powered light aircraft was accomplished utilizing contemporary light aircraft conventional design techniques as a means of evaluating the NASA-Ames General Aviation Synthesis Program (GASP) as a preliminary design tool. In certain areas, disagreements or omissions were found between the results of the conventional design process and GASP. A detailed discussion of these points, along with the associated contemporary design methodology, is presented.

  2. District Self-Assessment Tool: For Modification of Collective Bargaining Agreements to Achieve Flexible Conditions for School Turnaround

    ERIC Educational Resources Information Center

    Mass Insight Education (NJ1), 2011

    2011-01-01

    The District Self-Assessment Tool is designed to support districts, unions and Lead Partners, when analyzing their existing collective bargaining agreement (CBA) with the intention of making targeted modifications to support the implementation of dramatic reform in the district's lowest-performing schools. By outlining objectives and suggested…

  3. A System for Assessing Vulnerability of Species (SAVS) to Climate Change

    Treesearch

    Karen E. Bagne; Megan M. Friggens; Deborah M. Finch

    2011-01-01

    Sustained conservation of species requires integration of future climate change effects, but few tools exist to assist managers. The System for Assessing Vulnerability of Species (SAVS) identifies the relative vulnerability or resilience of vertebrate species to climate change. Designed for managers, the SAVS is an easily applied tool that uses a questionnaire of 22...

  4. Innovated Conceptual Design of Loading Unloading Tool for Livestock at the Port

    NASA Astrophysics Data System (ADS)

    Mustakim, Achmad; Hadi, Firmanto

    2018-03-01

    The condition of the loading and unloading process for livestock in a number of Indonesian ports does not meet the principles of animal welfare, causing cattle to lose weight and suffer injuries when unloaded. Livestock loading and unloading is done by throwing cattle into the sea one by one, by hoisting cattle with a sling strap, or by pushing the cattle directly to the berth. This practice violates Articles 47 and 55 of Government Regulation (PP) No. 82 of 2000 on animal welfare. The innovation offered is a loading and unloading design based on a garbarata (boarding bridge), which applies the concept of a semi-horizontal hydraulic ladder connecting the ship and the truck directly. This design innovation combines a fire-truck ladder design with a bridge equipped with weight-lifting equipment. Over a 10-year planning horizon, the garbarata requires a total cost of IDR 321,142,921, yields benefits of IDR 923,352,333, and has a BCR (Benefit-Cost Ratio) of 2.88. A BCR value > 1 means the tool is feasible to apply. The designed loading and unloading tool is estimated to be up to 1 hour faster than the existing method, and it can also minimize risks such as injury and significant weight loss in livestock.
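
    The quoted benefit-cost ratio can be reproduced directly from the IDR figures reported in the abstract (a simple arithmetic check, not the paper's cost model):

```python
# Benefit-cost ratio check using the totals reported in the abstract:
# benefits of IDR 923,352,333 against costs of IDR 321,142,921.

def benefit_cost_ratio(total_benefit, total_cost):
    return total_benefit / total_cost

bcr = benefit_cost_ratio(923_352_333, 321_142_921)
```

    The ratio comes out at about 2.88, matching the reported value, and being greater than 1 it supports the feasibility claim.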

  5. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London

    PubMed Central

    Lennox, Laura; Doyle, Cathal; Reed, Julie E

    2017-01-01

    Objectives Although improvement initiatives show benefits to patient care, they often fail to be sustained. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Design Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. Setting National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). Participants CLAHRC NWL improvement initiative teams and staff. Results The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements in reflecting on 12 sustainability factors, identifying risks and increasing the chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. Conclusion The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. 
Further research is required to study the use and effectiveness of the tool in practice and assess engagement with the method over time. PMID:28947436

  6. TAxonomy of Self-reported Sedentary behaviour Tools (TASST) framework for development, comparison and evaluation of self-report tools: content analysis and systematic review

    PubMed Central

    Dall, PM; Coulter, EH; Fitzsimons, CF; Skelton, DA; Chastin, SFM

    2017-01-01

    Objective Sedentary behaviour (SB) has distinct deleterious health outcomes, yet there is no consensus on best practice for measurement. This study aimed to identify the optimal self-report tool for population surveillance of SB, using a systematic framework. Design A framework, TAxonomy of Self-reported Sedentary behaviour Tools (TASST), consisting of four domains (type of assessment, recall period, temporal unit and assessment period), was developed based on a systematic inventory of existing tools. The inventory was achieved through a systematic review of studies reporting SB and tracing back to the original description. A systematic review of the accuracy and sensitivity to change of these tools was then mapped against TASST domains. Data sources Systematic searches were conducted via EBSCO, reference lists and expert opinion. Eligibility criteria for selecting studies The inventory included tools measuring SB in adults that could be self-completed at one sitting, and excluded tools measuring SB in specific populations or contexts. The systematic review included studies reporting on the accuracy against an objective measure of SB and/or sensitivity to change of a tool in the inventory. Results The systematic review initially identified 32 distinct tools (141 questions), which were used to develop the TASST framework. Twenty-two studies evaluated accuracy and/or sensitivity to change representing only eight taxa. Assessing SB as a sum of behaviours and using a previous day recall were the most promising features of existing tools. Accuracy was poor for all existing tools, with underestimation and overestimation of SB. There was a lack of evidence about sensitivity to change. 
Conclusions Despite the limited evidence, mapping existing SB tools onto the TASST framework has enabled informed recommendations to be made about the most promising features for a surveillance tool and has identified aspects on which future research and development of SB surveillance tools should focus. Trial registration number International prospective register of systematic reviews (PROSPERO)/CRD42014009851. PMID:28391233

  7. ART/Ada and CLIPS/Ada

    NASA Technical Reports Server (NTRS)

    Culbert, Chris

    1990-01-01

    Although they have reached a point of commercial viability, expert systems were originally developed in artificial intelligence (AI) research environments. Many of the available tools still work best in such environments. These environments typically utilize special hardware such as LISP machines and relatively unfamiliar languages such as LISP or Prolog. Space Station applications will require deep integration of expert system technology with applications developed in conventional languages, specifically Ada. The ability to apply automation to Space Station functions could be greatly enhanced by widespread availability of state-of-the-art expert system tools based on Ada. Although there have been some efforts to examine the use of Ada for AI applications, there are few, if any, existing products which provide state-of-the-art AI capabilities in an Ada tool. The goal of the ART/Ada Design Project is to conduct research into the implementation in Ada of state-of-the-art hybrid expert systems building tools (ESBT's). This project takes the following approach: using the existing design of the ART-IM ESBT as a starting point, analyze the impact of the Ada language and Ada development methodologies on that design; redesign the system in Ada; and analyze its performance. The research project will attempt to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future Space Station Freedom projects. During Phase 1 of the project, initial requirements analysis, design, and implementation of the kernel subset of ART-IM functionality was completed. During Phase 2, the effort has been focused on the implementation and performance analysis of several versions with increasing functionality. 
Since production-quality ART/Ada tools will not be available for a considerable time, an additional subtask of this project will be the completion of an Ada version of the CLIPS expert system shell developed by NASA. This tool will provide full syntactic compatibility with any eventual products of the ART/Ada design while allowing SSFP developers early access to this technology.
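
    The forward-chaining inference at the core of CLIPS-style production systems can be sketched in a few lines (an illustrative toy with invented facts and rules, not ART-IM's or CLIPS's implementation, which adds pattern matching, variables, and the Rete algorithm):

```python
# Toy forward-chaining production system: rules fire when all of their
# condition facts are present in working memory, asserting new facts
# until no rule can add anything (a fixed point).

def forward_chain(facts, rules):
    """facts: set of fact strings; rules: list of (conditions, conclusion)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)      # assert the derived fact
                changed = True
    return facts

# Invented example rules: chained firing derives an alarm from sensor facts.
RULES = [
    (("sensor-hot", "pump-on"), "overheat-risk"),
    (("overheat-risk",), "raise-alarm"),
]
```

    Real shells add conflict resolution and efficient incremental matching, but the derive-until-fixed-point cycle is the same.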

  8. Computer modeling in the practice of acoustical consulting: An evolving variety of uses from marketing and diagnosis through design to eventually research

    NASA Astrophysics Data System (ADS)

    Madaras, Gary S.

    2002-05-01

    The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.

  9. MACHETE: Environment for Space Networking Evaluation

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John S.; Woo, Simon

    2010-01-01

    Space exploration missions require the design and implementation of space networking that differs from terrestrial networks. In a space networking architecture, interplanetary communication protocols need to be designed, validated, and evaluated carefully to support different mission requirements. As actual systems are expensive to build, it is essential to have a low-cost method to validate and verify mission/system designs and operations. This can be accomplished through simulation. Simulation can aid design decisions where alternative solutions are being considered, support trade studies, and enable fast study of what-if scenarios. It can be used to identify risks, verify system performance against requirements, and serve as an initial test environment as one moves toward emulation and actual hardware implementation of the systems. We describe the development of the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE), its use cases in supporting architecture trade studies and protocol performance evaluation, and its role in hybrid simulation/emulation. The MACHETE environment contains various tools and interfaces such that users may select the set of tools tailored for the specific simulation end goal. The use cases illustrate tool combinations for simulating space networking in different mission scenarios. This simulation environment is useful in supporting space networking design for planned and future missions as well as evaluating performance of existing networks where non-determinism exists in data traffic and/or link conditions.
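
    The event-driven core of such a network simulator can be sketched with a simple event queue (assumed structure and names, not MACHETE's design): packets sent over a deep-space link arrive after a one-way light-time delay and are processed in time order.

```python
import heapq

# Minimal discrete-event sketch of a space link: each send is turned into
# a future arrival event, and a priority queue delivers events in time
# order regardless of the order in which sends were scheduled.

def simulate_link(sends, delay_s):
    """sends: list of (send_time_s, packet_id) pairs.
    Returns (arrival_time_s, packet_id) pairs sorted by arrival time."""
    events = []
    for t, pid in sends:
        heapq.heappush(events, (t + delay_s, pid))   # schedule the arrival
    arrivals = []
    while events:
        arrivals.append(heapq.heappop(events))       # process in time order
    return arrivals
```

    A full simulator layers protocol state machines, queuing, and stochastic link models on top of exactly this kind of time-ordered event loop.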

  10. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    NASA Technical Reports Server (NTRS)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to use easily. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST) that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from an MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is consistent with both the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information.
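
    The stakeholder-filtering idea can be illustrated with a hypothetical sketch (the element names, types, and concern tags below are invented, not InVEST's real data model): model elements are tagged with the concerns they address, and a view keeps only the matching ones.

```python
# Hypothetical illustration of stakeholder-filtered views over a system
# model: every element carries tags, and a view is just a filter, so it
# stays consistent with the single underlying model by construction.

MODEL_ELEMENTS = [
    {"name": "ThrusterValve", "type": "block",       "concerns": {"propulsion"}},
    {"name": "OpenValve",     "type": "requirement", "concerns": {"propulsion", "safety"}},
    {"name": "TelemetryBus",  "type": "block",       "concerns": {"avionics"}},
]

def view_for(concern, elements=MODEL_ELEMENTS):
    """Return only the model elements relevant to one stakeholder concern."""
    return [e["name"] for e in elements if concern in e["concerns"]]
```

    Because every view is derived from the same element list, no view can drift out of sync with the model, which is the consistency property the abstract emphasizes.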

  11. Analysis of a mammography teaching program based on an affordance design model.

    PubMed

    Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei

    2006-12-01

    The wide use of computer technology in education, particularly in mammogram reading, calls for evaluation of e-learning. The existing media-comparative studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection. Some descriptive statistics were also calculated. The examination of PBE identified that this educational software provides a purpose-designed set of tools. The learner can use these tools in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides some resources for the learner to construct one's knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. In addition, users found it easy to navigate and carry out tasks. The users also reacted positively toward PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with cognitive tools, supporting their perceptual problem-solving processes and extending their capabilities. Learners can internalize the mental models in mammogram reading through multiple perceptual triangulations, sensitization of related features, semantic description of mammogram findings, and expert-guided semantic report construction. 
The design of these cognitive tools and the software interface matches the findings and principles in human learning and instructional design. Working with PBE's case-based simulations and categorized gallery, learners can enrich and transfer their experience to their jobs.

  12. A concept ideation framework for medical device design.

    PubMed

    Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar

    2015-06-01

    Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.
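
    Function-based searching can be illustrated with a hypothetical index (the records and function tags below are invented; CIFMeDD's actual ontology and reasoning are far richer): inventions are indexed by the engineering functions they perform, so a single functional query can surface candidates from multiple domains.

```python
# Hypothetical illustration of function-based search across domains:
# records are tagged with engineering functions, and a query matches on
# function rather than on domain-specific keywords.

RECORDS = [
    {"title": "Surgical stapler",    "domain": "medical",    "functions": {"join", "position"}},
    {"title": "Centrifuge",          "domain": "laboratory", "functions": {"separate"}},
    {"title": "Cyclone dust filter", "domain": "industrial", "functions": {"separate"}},
]

def search_by_function(function, records=RECORDS):
    """Return titles of all records performing the given function."""
    return sorted(r["title"] for r in records if function in r["functions"])
```

    A semantic framework goes further by reasoning over sub- and super-functions in the ontology, but the payoff is the same: functionally similar devices are found even when their vocabularies never overlap.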

  13. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
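
    The pattern of composing objective and constraint functions from registered performance indices can be sketched as follows (the index names and formulas are invented for illustration; the actual CEM is written in FORTRAN and invokes external discipline codes rather than inline lambdas):

```python
# Schematic sketch of a central-executive pattern: performance indices are
# registered callables over a design point, and objective/constraint
# evaluations are composed from them for an optimizer to consume.

PERFORMANCE_INDICES = {
    "weight": lambda design: 100.0 + 2.0 * design["skin_thickness"],
    "stress": lambda design: 500.0 / design["skin_thickness"],
}

def evaluate(design, objective="weight", constraints=(("stress", 400.0),)):
    """Return (objective value, feasibility) for one design point, where
    each constraint is an (index_name, upper_limit) pair."""
    obj = PERFORMANCE_INDICES[objective](design)
    feasible = all(PERFORMANCE_INDICES[name](design) <= limit
                   for name, limit in constraints)
    return obj, feasible
```

    Keeping each discipline behind a named index is what lets the executive swap analysis codes per discipline without touching the optimization loop.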

  14. Development of the Aboriginal Communication Assessment After Brain Injury (ACAABI): A screening tool for identifying acquired communication disorders in Aboriginal Australians.

    PubMed

    Armstrong, Elizabeth M; Ciccone, Natalie; Hersh, Deborah; Katzenellebogen, Judith; Coffin, Juli; Thompson, Sandra; Flicker, Leon; Hayward, Colleen; Woods, Deborah; McAllister, Meaghan

    2017-06-01

    Acquired communication disorders (ACD) following stroke and traumatic brain injury may not be correctly identified in Aboriginal Australians due to a lack of linguistically and culturally appropriate assessment tools. Within this paper we explore key issues that were considered in the development of the Aboriginal Communication Assessment After Brain Injury (ACAABI), a screening tool designed to assess the presence of ACD in Aboriginal populations. A literature review and consultation with key stakeholders were undertaken to explore the directions needed to develop a new tool, based on existing tools and recommendations for future developments. The literature searches revealed no existing screening tool for ACD in these populations, but identified tools in the areas of cognition and social-emotional wellbeing. The articles retrieved described details of the content and style of these tools, with recommendations for the development and administration of a new tool. The findings from the interviews and focus groups were consistent with the approach recommended in the literature. There is a need for a screening tool for ACD to be developed, but any tool must be informed by knowledge of Aboriginal language, culture and community input in order to be acceptable and valid.

  15. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling them: (1) designing a 'Model Development Toolbox' that includes a basic set of model-constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models, and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.

  16. Vaccination adherence: Review and proposed model.

    PubMed

    Abahussin, Asma A; Albarrak, Ahmed I

    The prevalence of childhood vaccine-preventable diseases can be significantly reduced through adherence to confirmed vaccination schedules. However, many barriers to vaccination compliance exist, including a lack of awareness regarding the importance of vaccines, missed due dates, and fear of complications from vaccinations. The aim of this study is to review the existing tools and publications regarding vaccination adherence and to propose a design for a vaccination adherence application (app) for smartphones. Android and iOS apps designed for vaccination reminders were reviewed to examine six elements: educational factor; customizing features; reminder tools; peer education facilitations; feedback; and the language of the apps' interface and content. The literature in PubMed was reviewed for studies addressing reminder systems or tools, including apps. The review identified only six technology-based interventions for increasing childhood vaccination rates by reminding parents, a small number given the rapid growth of technology; of these, two publications discussed mobile apps. Ten apps were found in app stores; only one was designed for the Saudi vaccination schedule in the Arabic language, and it had some weaknesses. The study proposed a design for a vaccination reminder app that includes a number of features intended to overcome the limitations observed in the studied reminders, apps, and systems. The design supports the Arabic language and the Saudi vaccination schedule; parental education, including peer education; a variety of reminder methods; and the capability to track vaccinations and refer to the app as a personal health record. The study discussed a design for a vaccination reminder app that satisfies the specific requirements for better compliance with children's immunization schedules, based on a review of the existing apps and publications. The proposed design includes elements to educate parents and answer their concerns about vaccines. It involves their peers and can encourage the exchange of experiences and help overcome vaccine fears. In addition, it could form a convenient child personal health record. Copyright © 2016. Published by Elsevier Ltd.

  17. A tool to convert CAD models for importation into Geant4

    NASA Astrophysics Data System (ADS)

    Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration

    2017-10-01

    The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.
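    The size difference the abstract attributes to tessellated conversions can be illustrated with a toy generator. This is a hedged sketch, not SW2GDML output: the element shapes follow the public GDML schema (`<box>` primitives versus `<tessellated>` facet lists), but the names, dimensions, and vertex references are hypothetical.

```python
# Compact GDML: one standard shape describes the whole solid.
def box_solid(name, x, y, z, lunit="mm"):
    return f'<box name="{name}" x="{x}" y="{y}" z="{z}" lunit="{lunit}"/>'

# Mesh-style GDML: a STEP-derived conversion emits one facet element per triangle,
# often thousands of them for a single part (vertex names are placeholders).
def tessellated_solid(name, n_facets):
    facets = "\n".join(
        f'  <triangular vertex1="v{3*i}" vertex2="v{3*i+1}" '
        f'vertex3="v{3*i+2}" type="ABSOLUTE"/>'
        for i in range(n_facets))
    return f'<tessellated name="{name}">\n{facets}\n</tessellated>'

compact = box_solid("vessel_wall", 500, 500, 10)
mesh = tessellated_solid("vessel_wall_mesh", n_facets=4)
print(compact)
print(f"tessellated form is already {len(mesh) // len(compact)}x longer at 4 facets")
```

Besides file size, Geant4 tracks particles through a box analytically, while a tessellated solid requires facet-by-facet intersection tests, which is the performance cost the abstract refers to.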

  18. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    PubMed

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

    The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. Unlike other tools, GeNeDA is open-source online software based on existing microelectronics tools that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be achieved with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. Then, this framework is used for the design of a sequential circuit including a biological state machine.

  19. FireProt: web server for automated design of thermostable proteins

    PubMed Central

    Musil, Milos; Stourac, Jan; Brezovsky, Jan; Prokop, Zbynek; Zendulka, Jaroslav; Martinek, Tomas

    2017-01-01

    There is a continuous interest in increasing protein stability to enhance usability in numerous biomedical and biotechnological applications. A number of in silico tools for predicting the effect of mutations on protein stability have been developed recently. However, the existing tools typically predict only single-point mutations with a small effect on protein stability, and predictions have to be followed by laborious protein expression, purification, and characterization. Here, we present FireProt, a web server for the automated design of multiple-point thermostable mutant proteins that combines structural and evolutionary information in its calculation core. FireProt utilizes sixteen tools and three protein engineering strategies for making reliable protein designs. The server is complemented with an interactive, easy-to-use interface that allows users to directly analyze and optionally modify designed thermostable mutants. FireProt is freely available at http://loschmidt.chemi.muni.cz/fireprot. PMID:28449074

  20. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

    Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, their high working temperatures enable efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort, and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs destined for a wide application area, ranging from automotive to marine and airplane APUs.

  1. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.
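    The probabilistic analysis described above estimates how likely a design is to violate a limit state under uncertain inputs. A minimal Monte Carlo sketch of that idea follows, with hypothetical Gaussian capacity and demand standing in for the study's actual launch vehicle and missile models.

```python
import random

random.seed(0)  # reproducible sampling

def limit_state(capacity, demand):
    # g = capacity - demand; the design fails when g < 0
    return capacity - demand

N = 100_000
failures = sum(
    1 for _ in range(N)
    if limit_state(random.gauss(100.0, 10.0),   # e.g. structural strength
                   random.gauss(70.0, 12.0))    # e.g. applied load
    < 0
)
pf = failures / N
print(f"estimated probability of failure: {pf:.4f}")
```

A probabilistic optimizer of the kind the abstract describes would wrap a loop like this (or a cheaper reliability approximation) inside the design iteration, which is exactly why the decoupling strategies mentioned above matter: the nested sampling is computationally expensive.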

  2. Virtual Manufacturing Techniques Designed and Applied to Manufacturing Activities in the Manufacturing Integration and Technology Branch

    NASA Technical Reports Server (NTRS)

    Shearrow, Charles A.

    1999-01-01

    One of the identified goals of EM3 is to implement virtual manufacturing by the end of the year 2000. To realize this goal of a true virtual manufacturing enterprise, the initial development of a machinability database and the infrastructure must be completed. This will consist of containing the existing EM-NET problems and developing machine, tooling, and common materials databases. To integrate the virtual manufacturing enterprise with normal day-to-day operations, a parallel virtual manufacturing machinability database, virtual manufacturing database, virtual manufacturing paradigm, implementation/integration procedure, and testable verification models must be constructed. The common and virtual machinability databases will include four distinct areas: machine tools, available tooling, common machine tool loads, and materials. The machine tools database will include the machine envelope, special machine attachments, tooling capacity, location within NASA-JSC or with a contractor, and availability/scheduling. The tooling database will include available standard tooling, custom in-house tooling, tool properties, and availability. The common materials database will include material thickness ranges, strengths, types, and their availability. The virtual manufacturing databases will consist of virtual machines and virtual tooling directly related to the common and machinability databases. The items to be completed are the design and construction of the machinability databases, a virtual manufacturing paradigm for NASA-JSC, an implementation timeline, a VNC model of one bridge mill, and troubleshooting of existing software and hardware problems with EN4NET. The final step of this virtual manufacturing project will be to integrate other production sites into the databases, bringing JSC's EM3 into position to become a clearinghouse for NASA's digital manufacturing needs and creating a true virtual manufacturing enterprise.

  3. hydropower biological evaluation tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software is a set of analytical tools to evaluate the physical and biological performance of existing, refurbished, or newly installed conventional hydro-turbines nationwide where fish passage is a regulatory concern. The current version is based on information collected by the Sensor Fish. Future versions will include other technologies. The tool set includes data acquisition, data processing, and biological response tools with applications to various turbine designs and other passage alternatives. The associated database is centralized and can be accessed remotely. We have demonstrated its use for various applications, including both turbines and spillways.

  4. Coordinator's Guide for Indoor Air Quality

    EPA Pesticide Factsheets

    IAQ Tools for Schools Action Kit - IAQ Coordinator's Guide. This guidance is designed to present practical and often low-cost actions you can take to identify and address existing or potential air quality problems.

  5. The development and validation of a meta-tool for quality appraisal of public health evidence: Meta Quality Appraisal Tool (MetaQAT).

    PubMed

    Rosella, L; Bowman, C; Pach, B; Morgan, S; Fitzpatrick, T; Goel, V

    2016-07-01

    Most quality appraisal tools were developed for clinical medicine and tend to be study-specific with a strong emphasis on risk of bias. In order to be more relevant to public health, an appropriate quality appraisal tool needs to be less reliant on the evidence hierarchy and consider practice applicability. Given the broad range of study designs used in public health, the objective of this study was to develop and validate a meta-tool that combines public health-focused principles of appraisal coupled with a set of design-specific companion tools. Several design methods were used to develop and validate the tool including literature review, synthesis, and validation with a reference standard. A search of critical appraisal tools relevant to public health was conducted; core concepts were collated. The resulting framework was piloted during three feedback sessions with public health practitioners. Following subsequent revisions, the final meta-tool, the Meta Quality Appraisal Tool (MetaQAT), was then validated through a content analysis of appraisals conducted by two groups of experienced public health researchers (MetaQAT vs generic appraisal form). The MetaQAT framework consists of four domains: relevancy, reliability, validity, and applicability. In addition, a companion tool was assembled from existing critical appraisal tools to provide study design-specific guidance on validity appraisal. Content analysis showed similar methodological and generalizability concerns were raised by both groups; however, the MetaQAT appraisers commented more extensively on applicability to public health practice. Critical appraisal tools designed for clinical medicine have limitations for use in the context of public health. The meta-tool structure of the MetaQAT allows for rigorous appraisal, while allowing users to simultaneously appraise the multitude of study designs relevant to public health research and assess non-standard domains, such as applicability. 
Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. A Dynamic Social Feedback System to Support Learning and Social Interaction in Higher Education

    ERIC Educational Resources Information Center

    Thoms, Brian

    2011-01-01

    In this research, we examine the design, construction, and implementation of a dynamic, easy to use, feedback mechanism for social software. The tool was integrated into an existing university's online learning community (OLC). In line with constructivist learning models and practical information systems (IS) design, the feedback system provides…

  7. Elements, Principles, and Critical Inquiry for Identity-Centered Design of Online Environments

    ERIC Educational Resources Information Center

    Dudek, Jaclyn; Heiser, Rebecca

    2017-01-01

    Within higher education, a need exists for learning designs that facilitate education and support students in sharing, examining, and refining their critical identities as learners and professionals. In the past, technology-mediated identity work has focused on individual tool use or a learning setting. However, we as professional learning…

  8. Design Criteria for Visual Cues Used in Disruptive Learning Interventions within Sustainability Education

    ERIC Educational Resources Information Center

    Tillmanns, Tanja; Holland, Charlotte; Filho, Alfredo Salomão

    2017-01-01

    This paper presents the design criteria for Visual Cues--visual stimuli that are used in combination with other pedagogical processes and tools in Disruptive Learning interventions in sustainability education--to disrupt learners' existing frames of mind and help re-orient learners' mind-sets towards sustainability. The theory of Disruptive…

  9. A Digital Library for Education: The PEN-DOR Project.

    ERIC Educational Resources Information Center

    Fullerton, Karen; Greenberg, Jane; McClure, Maureen; Rasmussen, Edie; Stewart, Darin

    1999-01-01

    Describes Pen-DOR (Pennsylvania Education Network Digital Object Repository), a digital library designed to provide K-12 educators with access to multimedia resources and tools to create new lesson plans and modify existing ones via the World Wide Web. Discusses design problems of a distributed, object-oriented database architecture and describes…

  10. Video Modeling for Children and Adolescents with Autism Spectrum Disorder: A Meta-Analysis

    ERIC Educational Resources Information Center

    Thompson, Teresa Lynn

    2014-01-01

    The objective of this research was to conduct a meta-analysis to examine existing research studies on video modeling as an effective teaching tool for children and adolescents diagnosed with Autism Spectrum Disorder (ASD). Study eligibility criteria included (a) single case research design using multiple baselines, alternating treatment designs,…

  11. Next-Generation NATO Reference Mobility Model (NG-NRMM)

    DTIC Science & Technology

    2016-05-11

    facilitate comparisons between vehicle design candidates and to assess the mobility of existing vehicles under specific scenarios. Although NRMM has...of different deployed platforms in different areas of operation and routes  Improved flexibility as a design and procurement support tool through...Element Method DEM Digital Elevation Model DIL Driver in the Loop DP Drawbar Pull Force DOE Design of Experiments DTED Digital Terrain Elevation Data

  12. Graphical Requirements for Force Level Planning. Volume 2

    DTIC Science & Technology

    1991-09-01

    technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. The technology can either exist today or...level graphics language. 7.4 User Interface Design Tools As user interfaces have become more sophisticated, they have become harder to develop. Xl...Stephen M. Pizer, editors. Proceedings 1986 Workshop on Interactive 3D Graphics, October 1986. 18 J. S. Dumas. Designing User Interface Software. Prentice

  13. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  14. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  15. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  16. Reference Guide for Indoor Air Quality in Schools

    EPA Pesticide Factsheets

    IAQ Tools for Schools Action Kit - IAQ Reference Guide. This guidance is designed to present practical and often low-cost actions you can take to identify and address existing or potential air quality problems.

  17. SketchBio: a scientist's 3D interface for molecular modeling and animation.

    PubMed

    Waldon, Shawn M; Thompson, Peter M; Hahn, Patrick J; Taylor, Russell M

    2014-10-30

    Because of the difficulties involved in learning and using 3D modeling and rendering software, many scientists hire programmers or animators to create models and animations. This both slows the discovery process and provides opportunities for miscommunication. Working with multiple collaborators, the authors developed a tool, guided by a set of design goals, that enables scientists to directly construct models and animations. SketchBio is presented, a tool that incorporates state-of-the-art bimanual interaction and drop shadows to enable rapid construction of molecular structures and animations. It includes three novel features: crystal-by-example, pose-mode physics, and spring-based layout, which accelerate operations common in the formation of molecular models. Design decisions and their consequences are presented, including cases where iterative design was required to produce effective approaches. The design decisions, novel features, and inclusion of state-of-the-art techniques enabled SketchBio to meet all of its design goals. These features and decisions can be incorporated into existing and new tools to improve their effectiveness.

  18. Cost of enlarged operating zone for an existing Francis runner

    NASA Astrophysics Data System (ADS)

    Monette, Christine; Marmont, Hugues; Chamberland-Lauzon, Joël; Skagerstrand, Anders; Coutu, André; Carlevi, Jens

    2016-11-01

    Traditionally, hydro power plants have been operated close to the best efficiency point, the most stable operating condition, for which they were designed. However, because of changes in the electricity market, many hydro power plant operators wish to operate their machines differently to fulfil new market needs. New operating conditions can include whole-range operation, many starts/stops, extensive low-load operation, synchronous condenser mode, and power/frequency regulation. Many of these new operating conditions may impose more severe fatigue damage than traditional base-load operation close to the best efficiency point. Under these conditions, the fatigue life of the runner may be significantly reduced, and repair or replacement costs might be incurred sooner than expected. In order to design reliable Francis runners for these challenging new operating scenarios, Andritz Hydro has developed various proprietary tools and design rules. These are used within Andritz Hydro to design mechanically robust Francis runners for operating scenarios fulfilling customers' specifications. To estimate the residual life under different operating scenarios of an existing runner designed years ago for best-efficiency base-load operation, Andritz Hydro's design rules and tools would necessarily lead to conservative results. While the geometry of a new runner can be modified to fulfil all conservative mechanical design rules, the predicted fatigue life of an existing runner under off-design operating conditions may appear rather short because of the conservative safety factors included in the calculations. The most precise and reliable way to calculate the residual life of an existing runner under different operating scenarios is to perform a strain gauge measurement campaign on the runner.
This paper presents the runner strain gauge measurement campaign of a mid-head Francis turbine over all the operating conditions available during the test, the analysis of the measurement signals, and the runner residual life assessment under different operating scenarios. With these results, the maintenance cost of the change in operating mode can then be calculated and foreseen by the power plant owner.

  19. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multidisciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and to leverage existing commercial as well as in-house codes, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as optimization problems, have been successfully integrated with the MDAO tool. Closer synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  20. Experiences in Digital Circuit Design Courses: A Self-Study Platform for Learning Support

    ERIC Educational Resources Information Center

    Bañeres, David; Clarisó, Robert; Jorba, Josep; Serra, Montse

    2014-01-01

    The synthesis of digital circuits is a basic skill in all the bachelor programmes around the ICT area of knowledge, such as Computer Science, Telecommunication Engineering or Electrical Engineering. An important hindrance in the learning process of this skill is that the existing educational tools for the design of circuits do not allow the…

  1. Evolution of a Model for Socio-Scientific Issue Teaching and Learning

    ERIC Educational Resources Information Center

    Sadler, Troy D.; Foulk, Jaimie A.; Friedrichsen, Patricia J.

    2017-01-01

    Socio-scientific teaching and learning (SSI-TL) has been suggested as an effective approach for supporting meaningful learning in school contexts; however, limited tools exist to support the work of designing and implementing this approach. In this paper, we draw from a series of four design based research projects that have produced SSI…

  2. The Effect of Prior Knowledge and Feedback Type Design on Student Achievement and Satisfaction in Introductory Accounting

    ERIC Educational Resources Information Center

    Campbell, Donald P.

    2013-01-01

    This study investigated the effect of student prior knowledge and feedback type on student achievement and satisfaction in an introductory managerial accounting course using computer-based formative assessment tools. The study involved a redesign of the existing Job Order Costing unit using the ADDIE model of instructional design. The…

  3. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
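    The blackboard control cycle described above, independent knowledge sources opportunistically posting results to a shared data store until quiescence, can be sketched as follows. This toy Python version is sequential and entirely invented for illustration; the thesis's systems are concurrent and object-oriented.

```python
# Shared blackboard: knowledge sources read and write only through this store.
blackboard = {"raw": [3, 1, 2], "sorted": None, "summary": None}

def ks_sort(bb):
    # Knowledge source: activates when raw data exists but is unsorted.
    if bb["raw"] is not None and bb["sorted"] is None:
        bb["sorted"] = sorted(bb["raw"])
        return True
    return False

def ks_summarize(bb):
    # Knowledge source: activates once sorted data is available.
    if bb["sorted"] is not None and bb["summary"] is None:
        bb["summary"] = {"min": bb["sorted"][0], "max": bb["sorted"][-1]}
        return True
    return False

# Registration order does not matter: each source fires only when its
# precondition holds, which is what makes the sources independent.
knowledge_sources = [ks_summarize, ks_sort]

# Control loop: keep firing applicable knowledge sources until none apply.
while any(ks(blackboard) for ks in knowledge_sources):
    pass

print(blackboard["summary"])  # -> {'min': 1, 'max': 3}
```

The independence the thesis highlights shows up directly: each knowledge source guards itself with a precondition on the blackboard state, so sources can be developed and tested separately and, in a concurrent system, hosted on different hardware.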

  4. Rotorcraft Performance Model (RPM) for use in AEDT.

    DOT National Transportation Integrated Search

    2015-11-01

This report documents a rotorcraft performance model for use in the FAA's Aviation Environmental Design Tool (AEDT). The new rotorcraft performance model is physics-based. This new model replaces the existing helicopter trajectory modeling methods in the ...

  5. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, Sharmista

    1993-01-01

Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control and Life Support System (ECLSS). This entailed developing a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulation tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. We use the second law of thermodynamics to determine the irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components that generate the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
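The second-law bookkeeping described in this abstract reduces to a short calculation per component. The sketch below is a hedged illustration with made-up numbers, not values from the thesis: for a steady-flow component, entropy generation is the entropy carried out, minus the entropy carried in, minus the entropy received with heat (Q/T), and the Gouy-Stodola relation converts that to lost work.

```python
# Hedged sketch of per-component second-law analysis (illustrative
# numbers, not from the thesis). Units: kg/s, kJ/(kg K), kW, K.

def entropy_generation(m_dot, s_in, s_out, q_dot, t_boundary):
    """Sgen [kW/K] for one steady-flow component:
    entropy out - entropy in - entropy received with heat."""
    return m_dot * (s_out - s_in) - q_dot / t_boundary

def exergy_destroyed(sgen, t0=298.15):
    """Gouy-Stodola: lost work = T0 * Sgen [kW]."""
    return t0 * sgen

# Example component: 0.1 kg/s flow whose specific entropy rises by
# 0.05 kJ/(kg K) while it rejects 2 kW of heat at a 320 K boundary.
sgen = entropy_generation(0.1, 1.00, 1.05, -2.0, 320.0)
print(round(sgen, 5), round(exergy_destroyed(sgen), 3))  # 0.01125 3.354
```

Ranking components by `exergy_destroyed` is one way to identify which ECLSS component to redesign first, which is the selection criterion the abstract describes.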

  6. A portal to validated websites on cosmetic surgery: the design of an archetype.

    PubMed

    Parikh, A R; Kok, K; Redfern, B; Clarke, A; Withey, S; Butler, P E M

    2006-09-01

There has recently been an increase in the usage of the Internet as a source of patient information. It is very difficult for laypersons to establish the accuracy and validity of these medical websites. Although many website assessment tools exist, most of these are not practical. A combination of consumer- and clinician-based website assessment tools was applied to 200 websites on cosmetic surgery. The top-scoring websites were used as links from a portal website that was designed using Microsoft Macromedia Suite. Seventy-one (35.5%) websites were excluded. One hundred fifteen websites (89%) failed to reach an acceptable standard. The provision of new websites has proceeded without quality controls. Patients need to be better educated on the limitations of the Internet. This paper suggests an archetypal model, which makes efficient use of existing resources, validates them, and is easily transferable to different health settings.

  7. ORAC: 21st Century Observing at UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, A.; Wright, G. S.; Tan, M.; Pickup, D. A.; Economou, F.; Currie, M. J.; Adamson, A. J.; Rees, N. P.; Purves, M. H.

    The Observatory Reduction and Acquisition Control system replaces all of the existing software which interacts with the observers at UKIRT. The aim is to improve observing efficiency with a set of integrated tools that take the user from pre-observing preparation, through the acquisition of observations to the reduction using a data-driven pipeline. ORAC is designed to be flexible and extensible, and is intended for use with all future UKIRT instruments, as well as existing telescope hardware and ``legacy'' instruments. It is also designed to allow integration with phase-1 and queue-scheduled observing tools in anticipation of possible future requirements. A brief overview of the project and its relationship to other systems is given. ORAC also re-uses much code from other systems and we discuss issues relating to the trade-off between reuse and the generation of new software specific to our requirements.

  8. Integration of tools for the Design and Assessment of High-Performance, Highly Reliable Computing Systems (DAHPHRS), phase 1

    NASA Technical Reports Server (NTRS)

    Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.

    1992-01-01

    Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.

  9. Designing Contestability: Interaction Design, Machine Learning, and Mental Health

    PubMed Central

    Hirsch, Tad; Merced, Kritzia; Narayanan, Shrikanth; Imel, Zac E.; Atkins, David C.

    2017-01-01

    We describe the design of an automated assessment and training tool for psychotherapists to illustrate challenges with creating interactive machine learning (ML) systems, particularly in contexts where human life, livelihood, and wellbeing are at stake. We explore how existing theories of interaction design and machine learning apply to the psychotherapy context, and identify “contestability” as a new principle for designing systems that evaluate human behavior. Finally, we offer several strategies for making ML systems more accountable to human actors. PMID:28890949

  10. A system approach to aircraft optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1991-01-01

    Mutual couplings among the mathematical models of physical phenomena and parts of a system such as an aircraft complicate the design process because each contemplated design change may have a far reaching consequence throughout the system. Techniques are outlined for computing these influences as system design derivatives useful for both judgemental and formal optimization purposes. The techniques facilitate decomposition of the design process into smaller, more manageable tasks and they form a methodology that can easily fit into existing engineering organizations and incorporate their design tools.
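The system design derivatives mentioned above can be made concrete with a small example. This is a hedged sketch with invented linear subsystems, not equations from the paper: two coupled outputs y1 = 2x + 0.5·y2 and y2 = x + 0.25·y1, whose total derivatives with respect to x follow from solving (I - A)·d = b, where A holds the cross-coupling partials and b the direct partials.

```python
# Illustrative coupled-sensitivity calculation (made-up subsystems):
#   y1 = 2*x + 0.5*y2     partials: dy1/dx = 2, dy1/dy2 = 0.5
#   y2 = 1*x + 0.25*y1    partials: dy2/dx = 1, dy2/dy1 = 0.25
# Total derivatives d satisfy (I - A) d = b; for 2x2 we solve by hand.
from fractions import Fraction as F

a12, a21 = F(1, 2), F(1, 4)   # cross-coupling partials
b1, b2 = F(2), F(1)           # direct partials w.r.t. x

det = 1 - a12 * a21           # determinant of (I - A), here 7/8
d1 = (b1 + a12 * b2) / det    # total dy1/dx
d2 = (b2 + a21 * b1) / det    # total dy2/dx
print(d1, d2)                 # 20/7 12/7
```

The point of the system derivative is visible in the numbers: the direct partial dy1/dx is 2, but the coupling inflates the total derivative to 20/7, a consequence a subsystem-local analysis would miss.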

  11. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models to accurately predict the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  12. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  13. Dependency visualization for complex system understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smart, J. Allison Cory

    1994-09-01

With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed-graph-based concepts in software tool design has been demonstrated in literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
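One standard graph-theoretic move for taming a "spaghetti" dependency network, in the spirit of the directed-graph foundation the abstract invokes, is to collapse strongly connected components (clusters of mutually dependent modules) into single nodes, leaving an acyclic overview graph. The sketch below is an illustrative Tarjan-style SCC computation on a hypothetical dependency map, not code from the dissertation.

```python
# Hedged sketch: find strongly connected components of a dependency
# graph (Tarjan's algorithm). Mutual-dependency clusters such as
# {"core", "db"} below can then be drawn as one condensed node.
def sccs(graph):
    index, low, on_stack, stack, out = {}, {}, set(), [], []
    counter = [0]

    def visit(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC
            comp = set()
            while True:
                w = stack.pop(); on_stack.discard(w); comp.add(w)
                if w == v:
                    break
            out.append(comp)

    for v in graph:
        if v not in index:
            visit(v)
    return out

# Hypothetical module dependencies: core and db depend on each other.
deps = {"ui": ["core"], "core": ["db", "log"], "db": ["core"], "log": []}
print(sorted(map(sorted, sccs(deps))))  # [['core', 'db'], ['log'], ['ui']]
```

Condensing each component to a node yields a directed acyclic graph, which can be layered top-down for exactly the kind of large-system overview the dissertation argues existing visualizers lack.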

  14. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoringprograms

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g. survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency and interoperability of information across geopolitical boundaries or organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest Region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  15. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London.

    PubMed

    Lennox, Laura; Doyle, Cathal; Reed, Julie E; Bell, Derek

    2017-09-24

    Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). CLAHRC NWL improvement initiative teams and staff. The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements to reflect on 12 sustainability factors to identify risks to increase chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. 
Further research is required to study the use and effectiveness of the tool in practice and assess engagement with the method over time. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. An introduction to human systems integration (HSI) in the U.S. railroad industry.

    DOT National Transportation Integrated Search

    2007-04-01

    Human systems integration (HSI) is a systematic, organization-wide approach to : implementing new technologies and modernizing existing systems. It is a combination of : managerial philosophy, methods, techniques, and tools designed to emphasize, dur...

  17. Trajectory-Based Takeoff Time Predictions Applied to Tactical Departure Scheduling: Concept Description, System Design, and Initial Observations

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn A.; Capps, Alan

    2011-01-01

Current aircraft departure release times are based on manual estimates of aircraft takeoff times. Uncertainty in takeoff time estimates may result in missed opportunities to merge into constrained en route streams and lead to lost throughput. However, technology exists to improve takeoff time estimates by using the aircraft surface trajectory predictions that enable air traffic control tower (ATCT) decision support tools. NASA's Precision Departure Release Capability (PDRC) is designed to use automated surface trajectory-based takeoff time estimates to improve en route tactical departure scheduling. This is accomplished by integrating an ATCT decision support tool with an en route tactical departure scheduling decision support tool. The PDRC concept and prototype software have been developed, and an initial test was completed at air traffic control facilities in Dallas/Fort Worth. This paper describes the PDRC operational concept, system design, and initial observations.

  18. Auto-Calibration and Fault Detection and Isolation of Skewed Redundant Accelerometers in Measurement While Drilling Systems.

    PubMed

    Seyed Moosavi, Seyed Mohsen; Moaveni, Bijan; Moshiri, Behzad; Arvan, Mohammad Reza

    2018-02-27

The present study designed skewed redundant accelerometers for a Measurement While Drilling (MWD) tool and executed auto-calibration, fault diagnosis and isolation of the accelerometers in this tool. The optimal structure, which includes four accelerometers, was selected and designed precisely in accordance with the physical shape of the existing MWD tool. A new four-accelerometer structure was designed, implemented and installed on the current system, replacing the conventional orthogonal structure. Auto-calibration was performed for the skewed redundant accelerometers and for all combinations of three accelerometers. Consequently, biases, scale factors, and misalignment factors of the accelerometers have been successfully estimated. By injecting faults into sensors in the new optimal skewed redundant structure, the fault was detected using the proposed FDI method and the faulty sensor was diagnosed and isolated. The results indicate that the system can continue to operate with at least three correct sensors.
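The detection half of an FDI scheme for four skewed accelerometers can be sketched with a single parity relation. Everything below is an assumed, illustrative setup (a tetrahedral axis geometry and invented readings, not the paper's actual layout or method): the four sensing axes sum to the zero vector, so for a calibrated, fault-free unit the scaled sum of readings is near zero, and a large parity residual flags a fault. Isolating which sensor failed requires further steps, e.g. the leave-one-out three-sensor combinations the abstract mentions.

```python
# Hedged parity-space fault *detection* sketch for four skewed
# accelerometers (assumed tetrahedral geometry, illustrative numbers).
from math import sqrt

K = 1 / sqrt(3)
AXES = [(K, K, K), (K, -K, -K), (-K, K, -K), (-K, -K, K)]  # rows sum to 0

def readings(accel, faults=(0, 0, 0, 0)):
    """Each sensor measures the projection of acceleration on its axis."""
    dot = lambda u, a: sum(ui * ai for ui, ai in zip(u, a))
    return [dot(u, accel) + f for u, f in zip(AXES, faults)]

def parity(z):
    # Because the axes sum to zero, any true acceleration cancels here;
    # only sensor errors survive in the residual.
    return sum(z) / 2

z_ok = readings((1.0, 2.0, 3.0))
z_bad = readings((1.0, 2.0, 3.0), faults=(0, 0.5, 0, 0))
print(abs(parity(z_ok)) < 1e-9, round(parity(z_bad), 3))  # True 0.25
```

With only one redundant sensor there is a single parity equation, enough to detect a fault but not to isolate it from one sample, which is why the calibration over all three-sensor combinations matters in the study's design.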

  19. Auto-Calibration and Fault Detection and Isolation of Skewed Redundant Accelerometers in Measurement While Drilling Systems

    PubMed Central

    Seyed Moosavi, Seyed Mohsen; Moshiri, Behzad; Arvan, Mohammad Reza

    2018-01-01

The present study designed skewed redundant accelerometers for a Measurement While Drilling (MWD) tool and executed auto-calibration, fault diagnosis and isolation of the accelerometers in this tool. The optimal structure, which includes four accelerometers, was selected and designed precisely in accordance with the physical shape of the existing MWD tool. A new four-accelerometer structure was designed, implemented and installed on the current system, replacing the conventional orthogonal structure. Auto-calibration was performed for the skewed redundant accelerometers and for all combinations of three accelerometers. Consequently, biases, scale factors, and misalignment factors of the accelerometers have been successfully estimated. By injecting faults into sensors in the new optimal skewed redundant structure, the fault was detected using the proposed FDI method and the faulty sensor was diagnosed and isolated. The results indicate that the system can continue to operate with at least three correct sensors. PMID:29495434

  20. Impact resistant boron/aluminum composites for large fan blades

    NASA Technical Reports Server (NTRS)

    Oller, T. L.; Salemme, C. T.; Bowden, J. H.; Doble, G. S.; Melnyk, P.

    1977-01-01

    Blade-like specimens were subjected to static ballistic impact testing to determine their relative FOD impact resistance levels. It was determined that a plus or minus 15 deg layup exhibited good impact resistance. The design of a large solid boron/aluminum fan blade was conducted based on the FOD test results. The CF6 fan blade was used as a baseline for these design studies. The solid boron/aluminum fan blade design was used to fabricate two blades. This effort enabled the assessment of the scale up of existing blade manufacturing details for the fabrication of a large B/Al fan blade. Existing CF6 fan blade tooling was modified for use in fabricating these blades.

  1. DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology

    PubMed Central

    Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng

    2015-01-01

    Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437

  2. Expanding the Design Space: Forging the Transition from 3D Printing to Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Amend, Matthew

    The synergy of Additive Manufacturing and Computational Geometry has the potential to radically expand the "design space" of solutions available to designers. Additive Manufacturing (AM) is capable of fabricating objects that are highly complex both in geometry and material properties. However, the introduction of any new technology can have a disruptive effect on established design practices and organizations. Before "Design for Additive Manufacturing" (DFAM) is a commonplace means of producing objects employed in "real world" products, appropriate design knowledge must be sufficiently integrated within industry. First, materials suited to additive manufacturing methods must be developed to satisfy existing industry standards and specifications, or new standards must be developed. Second, a new class of design representation (CAD) tools will need to be developed. Third, designers and design organizations will need to develop strategies for employing such tools. This thesis describes three DFAM exercises intended to demonstrate the potential for innovative design when using advanced additive materials, tools, and printers. These design exercises included 1) a light-weight composite layup mold developed with topology optimization, 2) a low-pressure fluid duct enhanced with an external lattice structure, and 3) an airline seat tray designed using a non-uniform lattice structure optimized with topology optimization.

  3. WASP: a Web-based Allele-Specific PCR assay designing tool for detecting SNPs and mutations

    PubMed Central

    Wangkumhang, Pongsakorn; Chaichoompu, Kridsadakorn; Ngamphiw, Chumpol; Ruangrit, Uttapong; Chanprasert, Juntima; Assawamakin, Anunchai; Tongsima, Sissades

    2007-01-01

Background Allele-specific (AS) Polymerase Chain Reaction is a convenient and inexpensive method for genotyping Single Nucleotide Polymorphisms (SNPs) and mutations. It is applied in many recent studies including population genetics, molecular genetics and pharmacogenomics. Using existing AS primer design tools is a cumbersome process for inexperienced users, since information about the SNP/mutation must be acquired from public databases prior to the design. Furthermore, most of these tools do not offer mismatch enhancement of the designed primers. The available web applications do not provide a user-friendly graphical input interface or intuitive visualization of their primer results. Results This work presents a web-based AS primer design application called WASP. This tool can efficiently design AS primers for human SNPs as well as mutations. To assist scientists with collecting the necessary information about target polymorphisms, this tool provides a local SNP database containing over 10 million SNPs of various populations from public domain databases, namely NCBI dbSNP, HapMap and JSNP. This database is tightly integrated with the tool so that users can perform designs for existing SNPs without leaving the site. To guarantee the specificity of AS primers, the proposed system incorporates a primer specificity enhancement technique widely used in experimental protocols. In particular, WASP exploits a destabilizing effect by introducing one deliberate 'mismatch' at the penultimate (second to last of the 3'-end) base of AS primers to improve the resulting primers. Furthermore, WASP offers a graphical user interface through scalable vector graphics (SVG) drawing that allows users to select SNPs and graphically visualize the designed primers and their conditions. Conclusion WASP offers a tool for designing AS primers for both SNPs and mutations.
By integrating a database of known SNPs (searchable by gene ID or rs number), this tool streamlines the awkward process of obtaining flanking sequences and other related information from public SNP databases. It takes the underlying destabilizing effect into account to ensure the effectiveness of the designed primers. With its user-friendly SVG interface, WASP intuitively presents the resulting designed primers, allowing users to export the design or make further adjustments. This software can be freely accessed at . PMID:17697334
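The penultimate-mismatch idea can be illustrated with a few lines of string manipulation. This is a deliberately simplified, hypothetical sketch: WASP's real mismatch selection follows specific destabilizing-effect rules, whereas here we merely swap the second-to-last base to any base that differs from the template.

```python
# Hedged sketch of an AS primer with a deliberate penultimate mismatch.
# Simplified rule (not WASP's actual selection logic): place the 3' end
# on the SNP allele, then change the second-to-last base so it mismatches
# the template, destabilizing extension from the wrong allele.
def as_primer(template, snp_pos, allele, length=20):
    """Forward AS primer whose 3' end sits on the SNP (0-based snp_pos)."""
    primer = list(template[snp_pos - length + 1: snp_pos] + allele)
    penult = primer[-2]
    # deliberate destabilizing mismatch at the penultimate (3'-1) base
    primer[-2] = next(b for b in "ACGT" if b != penult)
    return "".join(primer)

# Hypothetical template and SNP position, for illustration only.
template = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
p = as_primer(template, snp_pos=25, allele="T", length=12)
print(p, len(p))  # GGGCCGCTGACT 12
```

A production design tool would additionally check melting temperature, GC content, and self-complementarity before accepting such a primer.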

  4. Quantification and propagation of disciplinary uncertainty via Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Mantis, George Constantine

    2002-08-01

    Several needs exist in the military, commercial, and civil sectors for new hypersonic systems. These needs remain unfulfilled, due in part to the uncertainty encountered in designing these systems. This uncertainty takes a number of forms, including disciplinary uncertainty, that which is inherent in the analytical tools utilized during the design process. Yet, few efforts to date empower the designer with the means to account for this uncertainty within the disciplinary analyses. In the current state-of-the-art in design, the effects of this unquantifiable uncertainty significantly increase the risks associated with new design efforts. Typically, the risk proves too great to allow a given design to proceed beyond the conceptual stage. To that end, the research encompasses the formulation and validation of a new design method, a systematic process for probabilistically assessing the impact of disciplinary uncertainty. The method implements Bayesian Statistics theory to quantify this source of uncertainty, and propagate its effects to the vehicle system level. Comparison of analytical and physical data for existing systems, modeled a priori in the given analysis tools, leads to quantification of uncertainty in those tools' calculation of discipline-level metrics. Then, after exploration of the new vehicle's design space, the quantified uncertainty is propagated probabilistically through the design space. This ultimately results in the assessment of the impact of disciplinary uncertainty on the confidence in the design solution: the final shape and variability of the probability functions defining the vehicle's system-level metrics. Although motivated by the hypersonic regime, the proposed treatment of uncertainty applies to any class of aerospace vehicle, just as the problem itself affects the design process of any vehicle. A number of computer programs comprise the environment constructed for the implementation of this work. 
Application to a single-stage-to-orbit (SSTO) reusable launch vehicle concept, developed by the NASA Langley Research Center under the Space Launch Initiative, provides the validation case for this work, with the focus placed on economics, aerothermodynamics, propulsion, and structures metrics. (Abstract shortened by UMI.)
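The quantify-then-propagate workflow described above can be caricatured in a few lines. This is a hedged, plug-in approximation with invented numbers, not the thesis's full Bayesian treatment: compare a tool's predictions against physical data for existing systems, model the discrepancy as a normal distribution, then sample that discrepancy to put an interval on a new prediction.

```python
# Hedged sketch of discipline-level uncertainty quantification and
# propagation (illustrative data; the thesis uses a fuller Bayesian
# treatment rather than this plug-in normal estimate).
import random, statistics

measured  = [102.0, 98.5, 101.2, 99.3]   # physical data, existing systems
predicted = [100.0, 97.0, 100.5, 98.0]   # same cases run through the tool
disc = [m - p for m, p in zip(measured, predicted)]
mu, sigma = statistics.mean(disc), statistics.stdev(disc)
print(round(mu, 3))  # 1.375  (tool underpredicts on average)

# Propagate: sample the discrepancy onto a new-vehicle prediction.
random.seed(0)
new_prediction = 110.0                    # tool output for the new design
samples = sorted(new_prediction + random.gauss(mu, sigma)
                 for _ in range(10000))
lo, hi = samples[250], samples[9750]      # ~95% interval
print(round(lo, 1), round(hi, 1))
```

The resulting interval, rather than the raw point prediction, is what supports the risk assessment at the system level: a design whose margin survives the interval's lower bound is robust to the tool's quantified error.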

  5. An Assessment of IMPAC - Integrated Methodology for Propulsion and Airframe Controls

    NASA Technical Reports Server (NTRS)

    Walker, G. P.; Wagner, E. A.; Bodden, D. S.

    1996-01-01

    This report documents the work done under a NASA sponsored contract to transition to industry technologies developed under the NASA Lewis Research Center IMPAC (Integrated Methodology for Propulsion and Airframe Control) program. The critical steps in IMPAC are exercised on an example integrated flight/propulsion control design for linear airframe/engine models of a conceptual STOVL (Short Take-Off and Vertical Landing) aircraft, and MATRIXX (TM) executive files to implement each step are developed. The results from the example study are analyzed and lessons learned are listed along with recommendations that will improve the application of each design step. The end product of this research is a set of software requirements for developing a user-friendly control design tool which will automate the steps in the IMPAC methodology. Prototypes for a graphical user interface (GUI) are sketched to specify how the tool will interact with the user, and it is recommended to build the tool around existing computer aided control design software packages.

  6. Energy Conservation: A Management Report for State and Local Governments and A Technical Guide for State and Local Governments.

    ERIC Educational Resources Information Center

    Public Technology, Inc., Washington, DC.

    This technical guide is part of a packet of tools designed to assist state or local government practitioners in organizing and managing an energy conservation program. It gives information on adapting energy conservation methods to existing public buildings and on designing new public buildings with energy conservation in mind. It also discusses…

  7. Advanced Computational Techniques for Power Tube Design.

    DTIC Science & Technology

    1986-07-01

fixturing applications, in addition to the existing computer-aided engineering capabilities. o Helix TWT Manufacturing has implemented a tooling and fixturing...illustrates the major features of this computer network. The backbone of our system is a Sytek Broadband Network (LAN) which interconnects terminals and...automatic network analyzer (FANA) which electrically characterizes the slow-wave helices of traveling-wave tubes (TWTs) -- both for engineering design

  8. Toward a mathematical formalism of performance, task difficulty, and activation

    NASA Technical Reports Server (NTRS)

    Samaras, George M.

    1988-01-01

    The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.

  9. Iterative user centered design for development of a patient-centered fall prevention toolkit.

    PubMed

    Katsulis, Zachary; Ergai, Awatef; Leung, Wai Yin; Schenkel, Laura; Rai, Amisha; Adelman, Jason; Benneyan, James; Bates, David W; Dykes, Patricia C

    2016-09-01

Due to the large number of falls that occur in hospital settings, inpatient fall prevention is a topic of great interest to patients and health care providers. The use of electronic decision support that tailors fall prevention strategy to patient-specific risk factors, known as Fall T.I.P.S (Tailoring Interventions for Patient Safety), has proven to be an effective approach for decreasing hospital falls. A paper version of the Fall T.I.P.S toolkit was developed primarily for hospitals that do not have the resources to implement the electronic solution; however, more work is needed to optimize the effectiveness of the paper version of this tool. We examined the use of human factors techniques in the redesign of the existing paper fall prevention tool with the goal of increasing ease of use and decreasing inpatient falls. The inclusion of patients and clinical staff in the redesign of the existing tool was done to increase adoption of the tool and fall prevention best practices. The redesigned paper Fall T.I.P.S toolkit showcased a built-in clinical decision support system and increased ease of use over the existing version. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Engagement and Empowerment Through Self-Service.

    PubMed

    Endriss, Jason

    2016-01-01

Self-service tools represent the next frontier for leave and disability. This article discusses several critical components of a successful leave and disability self-service tool. If given the proper investment and thoughtfully designed, self-service tools have the potential to augment an organization's existing interaction channels, improving the employee experience while delivering efficiencies for an administrative model. In an operating environment in which cost savings sometimes come at the expense of employee experience, such a win-win solution should not be taken lightly and, more importantly, should not be missed.

  11. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  12. Design and implementation of a general main axis controller for the ESO telescopes

    NASA Astrophysics Data System (ADS)

    Sandrock, Stefan; Di Lieto, Nicola; Pettazzi, Lorenzo; Erm, Toomas

    2012-09-01

Most of the real-time control systems at the existing ESO telescopes were developed with "traditional" methods, using general-purpose VMEbus electronics and running applications that were coded by hand, mostly in the C programming language under VxWorks. As we move toward more modern design methods, we have explored a model-based design approach for real-time applications in the telescope area, using the control algorithm of a standard telescope main axis as a first example. We wanted a clear work-flow that follows the "correct-by-construction" paradigm, where the implementation is testable in simulation on the development host and the time spent debugging on target is minimized. It should respect the domains of control, electronics, and software engineers in the choice of tools, and it should be a target-independent approach so that the result can be deployed on various platforms. We selected the MathWorks tools Simulink, Stateflow, and Embedded Coder for design and implementation, and LabVIEW with NI hardware for hardware-in-the-loop testing, all of which are widely used in industry. We describe how these tools have been used in order to model, simulate, and test the application. We also evaluate the benefits of this approach compared to the traditional method with respect to testing effort and maintainability. For a specific axis-controller application we have successfully integrated the result into the legacy platform of the existing VLT software, as well as demonstrated how to use the same design for a new development with a completely different environment.

  13. SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output

    PubMed Central

    Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.

    2011-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
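The kind of stochastic readout SCOUT simulates can be illustrated with a toy photon-count model; the PMT grid, the inverse-cube solid-angle weighting, and the event position below are hypothetical simplifications for illustration, not SCOUT's actual physics:

```python
import random

def simulate_event(x0, y0, z0, n_photons, pmt_centers, rng=random.Random(42)):
    """Toy scintillation readout: each optical photon is assigned to one PMT
    with probability proportional to a crude solid-angle weight of that PMT
    as seen from the interaction point (x0, y0) at depth z0."""
    weights = [z0 / (((px - x0) ** 2 + (py - y0) ** 2 + z0 ** 2) ** 1.5)
               for (px, py) in pmt_centers]
    total = sum(weights)
    probs = [w / total for w in weights]
    counts = [0] * len(pmt_centers)
    for _ in range(n_photons):
        r, acc = rng.random(), 0.0
        for i, p in enumerate(probs):
            acc += p
            if r <= acc:
                counts[i] += 1
                break
        else:                      # guard against floating-point round-off
            counts[-1] += 1
    return counts

# Hypothetical 4x4 PMT grid (mm), 10 mm below the interaction plane
pmts = [(x, y) for x in (-15, -5, 5, 15) for y in (-15, -5, 5, 15)]
signals = simulate_event(0.0, 0.0, 10.0, n_photons=5000, pmt_centers=pmts)
# Anger-style centroid estimate of the interaction position
x_hat = sum(c * p[0] for c, p in zip(signals, pmts)) / sum(signals)
```

With the event at the center of a symmetric grid, the centroid estimate lands near zero; a full simulator like SCOUT additionally models scintillation statistics, optical transport, and electronic noise.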

  14. SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†

    PubMed Central

    Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.

    2013-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136

  15. Refining Measures for Assessing Problematic/Addictive Digital Gaming Use in Clinical and Research Settings.

    PubMed

    Faust, Kyle; Faust, David

    2015-08-12

    Problematic or addictive digital gaming (including all types of electronic devices) can and has had extremely adverse impacts on the lives of many individuals across the world. The understanding of this phenomenon, and the effectiveness of treatment design and monitoring, can be improved considerably by continuing refinement of assessment tools. The present article briefly overviews tools designed to measure problematic or addictive use of digital gaming, the vast majority of which are founded on the Diagnostic and Statistical Manual of Mental Disorders (DSM) criteria for other addictive disorders, such as pathological gambling. Although adapting DSM content and strategies for measuring problematic digital gaming has proven valuable, there are some potential issues with this approach. We discuss the strengths and limitations of current methods for measuring problematic or addictive gaming and provide various recommendations that might help in enhancing or supplementing existing tools, or in developing new and even more effective tools.

  16. Refining Measures for Assessing Problematic/Addictive Digital Gaming Use in Clinical and Research Settings

    PubMed Central

    Faust, Kyle; Faust, David

    2015-01-01

    Problematic or addictive digital gaming (including all types of electronic devices) can and has had extremely adverse impacts on the lives of many individuals across the world. The understanding of this phenomenon, and the effectiveness of treatment design and monitoring, can be improved considerably by continuing refinement of assessment tools. The present article briefly overviews tools designed to measure problematic or addictive use of digital gaming, the vast majority of which are founded on the Diagnostic and Statistical Manual of Mental Disorders (DSM) criteria for other addictive disorders, such as pathological gambling. Although adapting DSM content and strategies for measuring problematic digital gaming has proven valuable, there are some potential issues with this approach. We discuss the strengths and limitations of current methods for measuring problematic or addictive gaming and provide various recommendations that might help in enhancing or supplementing existing tools, or in developing new and even more effective tools. PMID:26274977

  17. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
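The gap analysis the abstract describes can be sketched as a set difference between the data sources a tactic requires and those a team actually collects; the tactic names follow ATT&CK only loosely, and the data-source lists are invented for illustration:

```python
# Hypothetical mapping from threat-model tactics to required data sources
required = {
    "Initial Access":   {"email gateway logs", "web proxy logs"},
    "Lateral Movement": {"authentication logs", "netflow"},
    "Exfiltration":     {"netflow", "DNS logs"},
}
# Data sources this (hypothetical) team currently collects
available = {"authentication logs", "netflow", "web proxy logs"}

# Tactics that are under-supported by the data currently collected
gaps = {tactic: needed - available
        for tactic, needed in required.items()
        if needed - available}
```

Each entry of `gaps` is a candidate design space: either new collection is needed, or a tool must compensate for the missing source.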

  18. Silicon compilation: From the circuit to the system

    NASA Astrophysics Data System (ADS)

    Obrien, Keven

The methodology used for the compilation of silicon from a behavioral level to a system level is presented. The aim was to link the heretofore unrelated areas of high-level synthesis and system-level design. This link will play an important role in the development of future design automation tools, as it will allow hardware/software co-designs to be synthesized. A design methodology is presented that, through the use of an intermediate representation, SOLAR, allows a System level Design Language (SDL) to be combined with a Hardware Description Language (VHDL). Two main steps are required in order to transform this specification into a synthesizable one. Firstly, a system-level synthesis step including partitioning and communication synthesis is required in order to split the model into a set of interconnected subsystems, each of which will be processed by a high-level synthesis tool. For this latter step AMICAL is used; it accepts very abstract descriptions of control-flow-dominated circuits as input, applies powerful scheduling techniques, and generates interconnected RTL blocks that may feed existing logic-level synthesis tools.

  19. Numerical continuation and bifurcation analysis in aircraft design: an industrial perspective.

    PubMed

    Sharma, Sanjiv; Coetzee, Etienne B; Lowenberg, Mark H; Neild, Simon A; Krauskopf, Bernd

    2015-09-28

Bifurcation analysis is a powerful method for studying the steady-state nonlinear dynamics of systems. Software tools exist for the numerical continuation of steady-state solutions as parameters of the system are varied. These tools make it possible to generate 'maps of solutions' in an efficient way that provide valuable insight into the overall dynamic behaviour of a system and potentially to influence the design process. While this approach has been employed in the military aircraft control community to understand the effectiveness of controllers, the use of bifurcation analysis in the wider aircraft industry remains limited. This paper reports progress on how bifurcation analysis can play a role as part of the design process for passenger aircraft. © 2015 The Author(s).
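The core of numerical continuation can be illustrated in a few lines: step a parameter and re-converge the equilibrium with Newton's method, warm-started from the previous solution. The scalar model below (dx/dt = mu - x², a fold at mu = 0) is a textbook stand-in, not an aircraft model, and this natural-parameter scheme cannot round fold points the way the pseudo-arclength methods in dedicated continuation tools do:

```python
def continue_branch(f, dfdx, x0, mus, tol=1e-10):
    """Natural-parameter continuation: for each parameter value, Newton-iterate
    to the steady state f(x, mu) = 0, starting from the previous solution."""
    branch, x = [], x0
    for mu in mus:
        for _ in range(50):
            fx = f(x, mu)
            if abs(fx) < tol:
                break
            x -= fx / dfdx(x, mu)   # Newton step
        branch.append((mu, x))
    return branch

# dx/dt = mu - x**2 has a stable equilibrium x = sqrt(mu) for mu > 0
f = lambda x, mu: mu - x * x
dfdx = lambda x, mu: -2.0 * x
branch = continue_branch(f, dfdx, x0=1.0, mus=[1.0, 0.8, 0.6, 0.4, 0.2])
```

The resulting list of (mu, x) pairs is a one-parameter 'map of solutions' in miniature; industrial tools track many states and parameters and also detect bifurcations along the branch.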

  20. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections: azimuthal, cylindrical, Mercator, Lambert, and sinusoidal. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view, and export maps to other platforms.
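Two of the projection families listed reduce to closed-form expressions on a spherical earth. The sketch below shows forward Mercator and sinusoidal mappings (unit sphere, central meridian at 0°); this is the general shape of what a tool like MAPPER computes, though its actual formulas and datums are not documented in the record:

```python
import math

def mercator(lat_deg, lon_deg, lon0=0.0, R=1.0):
    """Forward spherical Mercator projection; undefined at the poles."""
    lam = math.radians(lon_deg - lon0)
    phi = math.radians(lat_deg)
    return R * lam, R * math.log(math.tan(math.pi / 4 + phi / 2))

def sinusoidal(lat_deg, lon_deg, lon0=0.0, R=1.0):
    """Forward sinusoidal (equal-area) projection."""
    lam = math.radians(lon_deg - lon0)
    phi = math.radians(lat_deg)
    return R * lam * math.cos(phi), R * phi

x, y = mercator(45.0, 0.0)   # y = ln(tan(67.5 deg)), about 0.8814
```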

  1. Object-oriented design tools for supramolecular devices and biomedical nanotechnology.

    PubMed

    Lee, Stephen C; Bhalerao, Khaustaub; Ferrari, Mauro

    2004-05-01

Nanotechnology provides multifunctional agents for in vivo use that increasingly blur the distinction between pharmaceuticals and medical devices. Realization of such therapeutic nanodevices requires multidisciplinary effort that is difficult for individual device developers to sustain, and identification of appropriate collaborations outside one's own field can itself be challenging. Further, as in vivo nanodevices become increasingly complex, their design will increasingly demand systems-level thinking. Systems engineering tools such as object-oriented analysis, object-oriented design (OOA/D) and the unified modeling language (UML) are applicable to nanodevices built from biological components, help logically manage the knowledge needed to design them, and help identify useful collaborative relationships for device designers. We demonstrate the utility of these systems engineering tools by using them to reverse engineer an existing molecular device (the bacmid molecular cloning system), and illustrate how object-oriented approaches identify fungible components (objects) in nanodevices in a way that facilitates design of families of related devices, rather than single inventions. We also explore the utility of object-oriented approaches for the design of another class of therapeutic nanodevices, vaccines. While they are useful for the design of current nanodevices, the power of systems design tools for biomedical nanotechnology will become increasingly apparent as the complexity and sophistication of in vivo nanosystems increase. The nested, hierarchical nature of object-oriented approaches allows treatment of devices as objects in higher-order structures, and so will facilitate concatenation of multiple devices into higher-order, higher-function nanosystems.
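The notion of fungible components can be sketched directly in object-oriented code: each device part is a swappable object and a device is a composition of parts, so a family of related devices differs only in which concrete parts are plugged in. The class names below are invented for illustration and do not come from the paper:

```python
class Component:
    """A fungible part of a device; concrete subclasses can be swapped."""
    def describe(self):
        return self.__class__.__name__

class Carrier(Component): pass
class TargetingLigand(Component): pass
class Payload(Component): pass

class NanoDevice:
    """A device is a composition of components, and could itself serve as a
    component object inside a higher-order system (the nesting the abstract
    describes)."""
    def __init__(self, *parts):
        self.parts = list(parts)

    def bill_of_materials(self):
        return [p.describe() for p in self.parts]

device = NanoDevice(Carrier(), TargetingLigand(), Payload())
```

Swapping `Payload` for a different subclass yields a sibling device in the same family without touching the rest of the design.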

  2. ART/Ada design project, phase 1. Task 2 report: Detailed design

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

Various issues are studied in the context of the design of an Ada-based expert system building tool. Using an existing successful design as a starting point, the impact of the Ada language and Ada development methodologies on that design is analyzed, the system is redesigned in Ada, and its performance is analyzed using both complexity-theoretic and empirical techniques. The algorithms specified in the overall design are refined, resolving and documenting any open design issues, identifying each system module, documenting the internal architecture and control logic, and describing the primary data structures involved in each module.

  3. The effect of ergonomic laparoscopic tool handle design on performance and efficiency.

    PubMed

    Tung, Kryztopher D; Shorti, Rami M; Downey, Earl C; Bloswick, Donald S; Merryweather, Andrew S

    2015-09-01

Many factors can affect a surgeon's performance in the operating room; these may include surgeon comfort, ergonomics of tool handle design, and fatigue. A laparoscopic tool handle designed with ergonomic considerations (pistol grip) was tested against a current market tool with a traditional pinch grip handle. The goal of this study is to quantify the impact that ergonomic design considerations have on surgeon performance. We hypothesized that there would be measurable differences in efficiency while performing FLS surgical trainer tasks with the two tool handle designs in three categories: time to completion, technical skill, and subjective user ratings. The pistol grip incorporates an ergonomic interface intended to reduce contact stress points on the hand and fingers, promote a more neutral operating wrist posture, and reduce hand tremor and fatigue. The traditional pinch grip is a laparoscopic tool developed by Stryker Inc. widely used during minimally invasive surgery. Twenty-three participants (13 M, 10 F) with no existing upper extremity musculoskeletal disorders or experience performing laparoscopic procedures were selected to participate in this study. During a training session prior to testing, participants performed practice trials in a SAGES FLS trainer with both tools. During data collection, participants performed three evaluation tasks using both handle designs (order was randomized, and each trial was completed three times). The tasks consisted of FLS peg transfer, cutting, and suturing tasks. Feedback from test participants indicated that they significantly preferred the ergonomic pistol grip in every category (p < 0.05); most notably, participants experienced greater degrees of discomfort in their hands after using the pinch grip tool.
Furthermore, participants completed cutting and peg transfer tasks in a shorter time duration (p < 0.05) with the pistol grip than with the pinch grip design; there was no significant difference between completion times for the suturing task. Finally, there was no significant interaction between tool type and errors made during trials. There was a significant preference for as well as lower pain experienced during use of the pistol grip tool as seen from the survey feedback. Both evaluation tasks (cutting and peg transfer) were also completed significantly faster with the pistol grip tool. Finally, due to the high degree of variability in the error data, it was not possible to draw any meaningful conclusions about the effect of tool design on the number or degree of errors made.

  4. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    NASA Technical Reports Server (NTRS)

    Burns, K. Lee; Altino, Karen

    2008-01-01

The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the existing tool.
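The core computation the abstract attributes to APRA, the climatological probability of exceeding a constraint limit, reduces to an empirical exceedance fraction over historical samples; the wind values and limit below are invented for illustration:

```python
def exceedance_probability(samples, limit):
    """Fraction of climatological samples that violate a constraint limit."""
    return sum(1 for s in samples if s > limit) / len(samples)

# Hypothetical peak ground winds (knots) for one month/hour bin
winds = [8, 12, 15, 22, 9, 30, 18, 11, 25, 14]
p_violate = exceedance_probability(winds, limit=20.0)   # 3 of 10 exceed
availability = 1.0 - p_violate                          # single-attempt availability
```

Binning the samples by month and hour of day would recover the seasonal and diurnal variability the abstract mentions.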

  5. DeMAID/GA USER'S GUIDE Design Manager's Aid for Intelligent Decomposition with a Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1996-01-01

Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One tool that is available to aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial release of DeMAID in 1989, numerous enhancements have been added to aid the design manager in saving both cost and time in a design cycle. The key enhancement is a genetic algorithm (GA), and the enhanced version is called DeMAID/GA. The GA orders the sequence of design processes to minimize the cost and time to converge to a solution. These enhancements, as well as the existing features of the original version of DeMAID, are described. Two sample problems are used to show how these enhancements can be applied to improve the design cycle. This report serves as a user's guide for DeMAID/GA.
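The GA's role, ordering design processes so that as few couplings as possible point backward in the sequence, can be sketched with a toy permutation GA (binary tournament selection plus swap mutation, retaining the best-so-far). The four-task coupling structure is invented, and DeMAID/GA's actual operators and cost model are certainly richer:

```python
import random

def feedback_count(order, deps):
    """Couplings that point backward in the sequence, given deps[j] = the set
    of tasks whose output task j consumes."""
    pos = {t: i for i, t in enumerate(order)}
    return sum(1 for j, srcs in deps.items() for s in srcs if pos[s] > pos[j])

def evolve(tasks, deps, generations=300, pop_size=30, seed=1):
    """Minimal permutation GA: binary tournament selection + swap mutation."""
    rng = random.Random(seed)
    cost = lambda o: feedback_count(o, deps)
    pop = [rng.sample(tasks, len(tasks)) for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)
            child = list(min(a, b, key=cost))           # tournament winner
            i, j = rng.sample(range(len(tasks)), 2)     # swap mutation
            child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
        best = min(pop + [best], key=cost)              # keep best-so-far
    return best

# Hypothetical couplings: C consumes A and B; D consumes C
deps = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"C"}}
best = evolve(["A", "B", "C", "D"], deps)
```

For this toy structure the GA settles on an ordering with no feedback couplings, i.e. A and B ahead of C, and C ahead of D.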

  6. Friction Stir Welding of Large Scale Cryogenic Tanks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Russell, Carolyn; Ding, R. Jeffrey

    1998-01-01

    The Marshall Space Flight Center (MSFC) has established a facility for the joining of large-scale aluminum cryogenic propellant tanks using the friction stir welding process. Longitudinal welds, approximately five meters in length, have been made by retrofitting an existing vertical fusion weld system, designed to fabricate tank barrel sections ranging from two to ten meters in diameter. The structural design requirements of the tooling, clamping and travel system will be described in this presentation along with process controls and real-time data acquisition developed for this application. The approach to retrofitting other large welding tools at MSFC with the friction stir welding process will also be discussed.

  7. BiOSS: A system for biomedical ontology selection.

    PubMed

    Martínez-Romero, Marcos; Vázquez-Naya, José M; Pereira, Javier; Pazos, Alejandro

    2014-04-01

    In biomedical informatics, ontologies are considered a key technology for annotating, retrieving and sharing the huge volume of publicly available data. Due to the increasing amount, complexity and variety of existing biomedical ontologies, choosing the ones to be used in a semantic annotation problem or to design a specific application is a difficult task. As a consequence, the design of approaches and tools addressed to facilitate the selection of biomedical ontologies is becoming a priority. In this paper we present BiOSS, a novel system for the selection of biomedical ontologies. BiOSS evaluates the adequacy of an ontology to a given domain according to three different criteria: (1) the extent to which the ontology covers the domain; (2) the semantic richness of the ontology in the domain; (3) the popularity of the ontology in the biomedical community. BiOSS has been applied to 5 representative problems of ontology selection. It also has been compared to existing methods and tools. Results are promising and show the usefulness of BiOSS to solve real-world ontology selection problems. BiOSS is openly available both as a web tool and a web service. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
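An aggregate of the three criteria can be sketched as a weighted score per candidate ontology; the weights and the per-criterion scores below are invented, since the abstract does not state how BiOSS combines its criteria:

```python
def score(ontology, weights=(0.5, 0.3, 0.2)):
    """Weighted aggregate of coverage, semantic richness, and popularity,
    each normalized to [0, 1]. The weights here are illustrative only."""
    w_cov, w_rich, w_pop = weights
    return (w_cov * ontology["coverage"]
            + w_rich * ontology["richness"]
            + w_pop * ontology["popularity"])

# Hypothetical candidate ontologies with pre-normalized criterion scores
candidates = {
    "OntoA": {"coverage": 0.9, "richness": 0.4, "popularity": 0.7},
    "OntoB": {"coverage": 0.6, "richness": 0.8, "popularity": 0.9},
}
ranked = sorted(candidates, key=lambda n: score(candidates[n]), reverse=True)
```

Changing the weights changes the ranking, which is why a selection system must make its criteria, and their trade-offs, explicit to the user.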

  8. Techniques for designing rotorcraft control systems

    NASA Technical Reports Server (NTRS)

    Levine, William S.; Barlow, Jewel

    1993-01-01

This report summarizes the work that was done on the project from 1 Apr. 1992 to 31 Mar. 1993. The main goal of this research is to develop a practical tool for rotorcraft control system design based on interactive optimization tools (CONSOL-OPTCAD) and classical rotorcraft design considerations (ADOCS). This approach enables the designer to combine engineering intuition and experience with parametric optimization. The combination should make it possible to produce a better design faster than would be possible using either pure optimization or pure intuition and experience. We emphasize that the goal of this project is not to develop an algorithm; it is to develop a tool. We want to keep the human designer in the design process to take advantage of his or her experience and creativity. The role of the computer is to perform the calculations necessary to improve the nominal design and to display its performance. Briefly, during the first year we connected CONSOL-OPTCAD, an existing software package for optimizing parameters with respect to multiple performance criteria, to a simplified nonlinear simulation of the UH-60 rotorcraft. We also created mathematical approximations to the Mil-specs for rotorcraft handling qualities and input them into CONSOL-OPTCAD. Finally, we developed the additional software necessary to use CONSOL-OPTCAD for the design of rotorcraft controllers.

  9. On-line Naval Engineering Skills Supplemental Training Program

    DTIC Science & Technology

    2010-01-01

Defense Technical University (DTU), the technical content for courses would have to be provided by the Naval technical authorities...of technological knowledge related to design engineering such as the DTU, or expanded within the mission scope of an existing organization such as...management program as a training tool for naval design engineers such as the DTU, or a technical extension of the DAU program for acquisition training

  10. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. 
PMID:24368902

  11. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.
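The hierarchically organized configuration files mentioned in both copies of this abstract follow a common pattern: an experiment-specific file overlays a tree of defaults. A minimal recursive merge, with invented parameter names (Mozaik's real schema is not reproduced here), looks like:

```python
def merge(base, override):
    """Recursively overlay one nested configuration dict on another; leaves in
    the override win, while untouched branches of the base survive."""
    out = dict(base)
    for key, val in override.items():
        if isinstance(val, dict) and isinstance(out.get(key), dict):
            out[key] = merge(out[key], val)
        else:
            out[key] = val
    return out

defaults = {"sheet": {"neurons": 1000, "model": "IF_cond_exp"},
            "recording": {"variables": ["spikes"]}}
experiment = {"sheet": {"neurons": 2500}}   # override just one leaf
cfg = merge(defaults, experiment)
```

Keeping the full merged tree alongside the recorded data is one way a workflow system can guarantee that all metadata about the experimental context travel with the results.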

  12. The acoustic performance of double-skin facades: A design support tool for architects

    NASA Astrophysics Data System (ADS)

    Batungbakal, Aireen

This study assesses and validates the influence of urban sound measurement and of glass facade components on reducing sound transmission to the indoor environment. Noise is among the most commonly reported issues affecting workspaces, and increased awareness of the need to minimize it has led building designers to reconsider the design of building envelopes and their site environment. Outdoor sound conditions, such as traffic noise, challenge designers to accurately estimate the capability of glass facades to achieve an appropriate indoor sound quality. To characterize the density of the urban environment, field tests acquired existing sound levels in areas of high commercial development, employment, and traffic activity, establishing a baseline for sound levels common in urban work areas. The direct sound transmission loss of glass facades, simulated with INSUL, a sound insulation software, is then used as an informative tool correlating the response of glass facade components to the existing outdoor sound levels of a project site, in order to achieve desired indoor sound levels. This study aims to close the gap in validating the acoustic performance of glass facades early in a project's design, from conditioned settings such as field testing and simulation through project completion. Results from the study's facade simulations and facade comparison support the conclusion that acoustic comfort is not limited to a single solution; multiple design options responsive to the environment exist.
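A first-order screening estimate of what a single glazing layer does to outdoor noise is the field-incidence mass law, TL ≈ 20·log10(m·f) − 47 dB for surface density m (kg/m²) and frequency f (Hz); the pane density and traffic level below are illustrative, and the mass law ignores the coincidence dips and double-skin cavity effects that a tool like INSUL models:

```python
import math

def mass_law_tl(surface_density_kg_m2, freq_hz):
    """Field-incidence mass-law estimate of transmission loss in dB; a crude
    single-panel screening number, not a substitute for simulation."""
    return 20.0 * math.log10(surface_density_kg_m2 * freq_hz) - 47.0

# ~6 mm glass: 2500 kg/m^3 * 0.006 m = 15 kg/m^2
tl_500 = mass_law_tl(15.0, 500.0)    # roughly 30.5 dB at 500 Hz
indoor = 75.0 - tl_500               # hypothetical 75 dB outdoor level at 500 Hz
```

Doubling the pane mass buys about 6 dB under this law, which is why heavy or double-skin constructions are the usual answer to loud urban sites.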

  13. Appraisal of comparative single-case experimental designs for instructional interventions with non-reversible target behaviors: Introducing the CSCEDARS ("Cedars").

    PubMed

    Schlosser, Ralf W; Belfiore, Phillip J; Sigafoos, Jeff; Briesch, Amy M; Wendt, Oliver

    2018-05-28

    Evidence-based practice as a process requires the appraisal of research as a critical step. In the field of developmental disabilities, single-case experimental designs (SCEDs) figure prominently as a means for evaluating the effectiveness of non-reversible instructional interventions. Comparative SCEDs contrast two or more instructional interventions to document their relative effectiveness and efficiency. As such, these designs have great potential to inform evidence-based decision-making. To harness this potential, however, interventionists and authors of systematic reviews need tools to appraise the evidence generated by these designs. Our literature review revealed that existing tools do not adequately address the specific methodological considerations of comparative SCEDs that aim to compare instructional interventions of non-reversible target behaviors. The purpose of this paper is to introduce the Comparative Single-Case Experimental Design Rating System (CSCEDARS, "cedars") as a tool for appraising the internal validity of comparative SCEDs of two or more non-reversible instructional interventions. Pertinent literature will be reviewed to establish the need for this tool and to underpin the rationales for individual rating items. Initial reliability information will be provided as well. Finally, directions for instrument validation will be proposed. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Science opportunity analyzer - a multi-mission tool for planning

    NASA Technical Reports Server (NTRS)

    Streiffert, B. A.; Polanskey, C. A.; O'Reilly, T.; Colwell, J.

    2002-01-01

    For many years, the diverse scientific community that supports JPL's wide variety of interplanetary space missions has needed a tool for planning and developing its experiments. The tool needs to be easily adapted to various mission types and portable to the user community. The Science Opportunity Analyzer (SOA), now in its third year of development, is intended to meet this need. SOA is a Java-based application designed to enable scientists to identify and analyze opportunities for science observations from spacecraft. It differs from other planning tools in that it does not require in-depth knowledge of the spacecraft command system or operational modes to begin high-level planning; users can, however, develop increasingly detailed levels of design. SOA consists of six major functions: Opportunity Search, Visualization, Observation Design, Constraint Checking, Data Output, and Communications. Opportunity Search is a GUI-driven interface to existing search engines that can be used to identify times when a spacecraft is in a specific geometric relationship with other bodies in the solar system. This function can be used for advanced mission planning as well as for making last-minute adjustments to mission sequences in response to trajectory modifications. Visualization is a key aspect of SOA: the user can view observation opportunities in either a 3D representation or a 2D map projection, with extensive flexibility to customize what is displayed. Observation Design allows the user to orient the spacecraft and visualize the projection of the instrument field of view for that orientation, using the same views as Opportunity Search. Constraint Checking is provided to validate various geometric and physical aspects of an observation design. The user can easily create custom rules or use official project-generated flight rules.
This capability may also allow scientists to assess the cost to science when flight rules change. Data Output generates information based on the spacecraft's trajectory, on opportunity search results, or on a created observation; the data can be viewed either in tabular format or as a graph. Finally, SOA is unique in that it is designed to communicate with a variety of existing planning and sequencing tools. From the very beginning, SOA was designed with the user in mind: extensive surveys of the potential user community were conducted to develop the software requirements, and throughout the development period close ties have been maintained with the science community to ensure that the tool maintains its user focus. Although development is still in its early stages, SOA is already building a user community on the Cassini project, which depends on this tool for its science planning. Other tools at JPL do various pieces of what SOA can do, but no other tool combines all these functions and presents them to the user in such a convenient, cohesive, and easy-to-use fashion.

  15. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project.

    PubMed

    Portalés, Cristina; Casas, Sergio; Gimeno, Jesús; Fernández, Marcos; Poza, Montse

    2018-04-19

    Energy-efficient Buildings (EeB) are demanded in today’s construction, fulfilling the requirements for green cities. Pre-fab buildings, which are fully built in modules in factories, are a good example of this. Although this kind of building is quite new, the in situ inspection is documented using traditional tools, mainly based on paper annotations; the inspection process is thus not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid the regular processes of pre-fab building inspection to detect and eliminate possible quality and energy-efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data gathering, allowing inspectors to use it quickly without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for large-scale inspection processes.

  16. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project

    PubMed Central

    Fernández, Marcos; Poza, Montse

    2018-01-01

    Energy-efficient Buildings (EeB) are demanded in today’s construction, fulfilling the requirements for green cities. Pre-fab buildings, which are fully built in modules in factories, are a good example of this. Although this kind of building is quite new, the in situ inspection is documented using traditional tools, mainly based on paper annotations; the inspection process is thus not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid the regular processes of pre-fab building inspection to detect and eliminate possible quality and energy-efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data gathering, allowing inspectors to use it quickly without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for large-scale inspection processes. PMID:29671799

  17. Application of in vitro biopharmaceutical methods in development of immediate release oral dosage forms intended for paediatric patients.

    PubMed

    Batchelor, Hannah K; Kendall, Richard; Desset-Brethes, Sabine; Alex, Rainer; Ernest, Terry B

    2013-11-01

    Biopharmaceutics is routinely used in the design and development of medicines to generate science based evidence to predict in vivo performance; the application of this knowledge specifically to paediatric medicines development is yet to be explored. The aim of this review is to present the current status of available biopharmaceutical tools and tests including solubility, permeability and dissolution that may be appropriate for use in the development of immediate release oral paediatric medicines. The existing tools used in adults are discussed together with any limitations for their use within paediatric populations. The results of this review highlight several knowledge gaps in current methodologies in paediatric biopharmaceutics. The authors provide recommendations based on existing knowledge to adapt tests to better represent paediatric patient populations and also provide suggestions for future research that may lead to better tools to evaluate paediatric medicines. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation

    PubMed Central

    Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee

    2018-01-01

    This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact, as well as the state of the tool deformation. The input-output relationship of the model was represented by a radial basis functions network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of recording and rendering devices and can be easily deployed to an existing simulator. The model also supports complex interactions, such as self and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the perceptual soundness of the proposed model, meanwhile the user study revealed the force feedback of the proposed simulator to be realistic. PMID:29342964
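    The paper's radial basis function network takes a six-dimensional input; as a hedged one-dimensional sketch of the same idea (Gaussian basis functions fitted by exact interpolation; the data below are toy values, not the authors' recordings), a "penetration depth to force" model can be built like this:

```python
import math

def gaussian_rbf(r, eps=1.0):
    """Gaussian radial basis function of distance r."""
    return math.exp(-(eps * r) ** 2)

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_rbf(centers, values):
    """Solve Phi * w = y so the network interpolates the training data."""
    A = [[gaussian_rbf(abs(ci - cj)) for cj in centers] for ci in centers]
    return solve(A, values)

def predict(centers, weights, x):
    return sum(w * gaussian_rbf(abs(x - c)) for w, c in zip(weights, centers))

# Toy 1-D training data: tool penetration depth (mm) -> contact force (N).
depths = [0.0, 0.5, 1.0, 1.5]
forces = [0.0, 0.8, 2.1, 4.0]
w = fit_rbf(depths, forces)
```

    Exact interpolation reproduces the training forces at the training depths while smoothly blending between them, which is the property that makes such networks usable for haptic rendering between recorded contact states.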

  19. Unlocking the Value of Literature in Health Co-Design: Transforming Patient Experience Publications into a Creative and Accessible Card Tool.

    PubMed

    Villalba, Clare; Jaiprakash, Anjali; Donovan, Jared; Roberts, Jonathan; Crawford, Ross

    2018-05-26

    A wealth of peer-reviewed data exists regarding people's health experience, yet practical ways of using the data to understand patients' experiences and to inform health co-design are needed. This study aims to develop an applied and pragmatic method for using patient experience literature in co-design by transforming it into an accessible and creative co-design tool. A scoping literature review of the CINAHL, MEDLINE, PsycINFO and PubMed electronic databases was conducted from January 2011 through August 2016. Qualitative publications regarding the experience of living with diabetes in Australia were selected. The Results section of each paper was extracted and affinity analysis was applied to identify insights into the health experience. These insights were developed into a card tool for use in health co-design activities. Thirteen relevant papers were identified from the review, and affinity analysis of the Results sections of these papers led to the identification of 85 insights, from 'Shock of diagnosis' (Insight 1), to 'Delay seeking care' (Insight 9), to 'Assess the quality of care' (Insight 28), to 'Avoid or adapt habits' (Insight 78). Each insight was developed into an individual card, which included a high-level theme, insight, quote and a link back to the literature, together making up the Health Experience Insight Cards, Living with Diabetes Edition. This was the first study to develop a method for transforming existing patient experience literature into a creative tool for health improvement. The Health Experience Insight Cards collate the diverse experiences of over 300 people living with diabetes in Australia, from 13 studies. Health improvement teams can use the 'Living with Diabetes Edition' cards or they can follow this pragmatic method to create their own cards focused on other health experiences to facilitate person-focused health improvements.

  20. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.

  1. The dynamic analysis of drum roll lathe for machining of rollers

    NASA Astrophysics Data System (ADS)

    Qiao, Zheng; Wu, Dongxu; Wang, Bo; Li, Guo; Wang, Huiming; Ding, Fei

    2014-08-01

    An ultra-precision machine tool for roller machining has been designed and assembled. Because the dynamic characteristics of the machine tool have an obvious impact on the quality of the microstructures on the roller surface, this paper analyzes the dynamic characteristics of the existing machine tool, including the influence of mounting a large, slender roller in the machine. First, a finite element model of the machine tool is built and simplified; on that basis, a finite element modal analysis is carried out to obtain the natural frequencies and mode shapes of the first four modes of the machine tool. From these modal analysis results, the weak-stiffness subsystems of the machine tool can be further improved and a reasonable bandwidth for the machine tool's control system can be designed. Finally, considering the shock imparted to the feeding system and cutting tool by frequent fast positioning of the Z axis, a transient analysis is conducted in ANSYS. Based on the transient analysis results, the vibration behavior of the key components of the machine tool and its impact on the cutting process are explored.
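    The modal analysis itself requires the full finite element model, but the quantity it extracts, an undamped natural frequency, reduces for a single degree of freedom to f = sqrt(k/m) / (2*pi); a minimal sketch with hypothetical stiffness and mass values (not the machine's actual parameters):

```python
import math

def natural_freq_hz(stiffness_n_per_m, mass_kg):
    """Undamped single-DOF natural frequency: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# Hypothetical effective stiffness and moving mass of a machine carriage.
f = natural_freq_hz(4.0e8, 1000.0)
```

    This single-DOF view also explains the paper's design conclusion: the lowest structural natural frequency bounds the usable bandwidth of the axis control system, so stiffening the weakest subsystem raises both.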

  2. Intrinsic movement variability at work. How long is the path from motor control to design engineering?

    PubMed

    Gaudez, C; Gilles, M A; Savin, J

    2016-03-01

    For several years, increasing numbers of studies have highlighted the existence of movement variability. It was previously neglected in movement analysis, and it is still almost completely ignored in workstation design. This article reviews motor control theories and factors influencing movement execution, and indicates how intrinsic movement variability is part of task completion. These background clarifications should help ergonomists and workstation designers gain a better understanding of these concepts, which can then be used to improve design tools. We also consider which techniques--kinematics, kinetics or muscular activity--and descriptors are most appropriate for describing intrinsic movement variability and for integration into design tools. In this way, the simulations generated by designers for workstation design should be closer to the real movements performed by workers. This review emphasises the complexity of identifying, describing and processing intrinsic movement variability in occupational activities. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  3. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    NASA Astrophysics Data System (ADS)

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-10-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school students. To evaluate usability, we analyzed students' task performance and task completion time as they worked on an activity with the repurposed modeling technology. In stage 1, we conducted remote testing of an early modeling prototype with urban middle school students (n = 84). In stages 2 and 3, we used screencasting software to record students' mouse and keyboard movements during collaborative think-alouds (n = 22) and conducted a qualitative analysis of their peer discussions. Taken together, the study findings revealed two kinds of usability issues that interfered with students' productive use of the tool: issues related to the use of data and information, and issues related to the use of the modeling technology. The study findings resulted in design improvements that led to stronger usability outcomes and higher task performance among students. In this paper, we describe our methods for usability testing, our research findings, and our design solutions for supporting students' use of the modeling technology and use of data. The paper concludes with implications for the design and study of modeling technologies for science learning.

  4. Streamlining the Design Tradespace for Earth Imaging Constellations

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; Hughes, Steven P.; Le Moigne, Jacqueline J.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CERs) from respected sources, cost-estimating best practices, and data from existing and proposed satellite designs. Cost estimation in this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach used within the TAT-C CR module is intended to address prevailing concerns regarding early design-stage cost estimates, and to offer increased transparency and fidelity by providing two preliminary perspectives on mission cost. This work outlines the existing cost model, details the assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in exploring the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed; as such, quantitative assessment of multiple-spacecraft missions has many remaining areas of uncertainty. By combining well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.
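    As an illustration of the kind of parametric relationship such cost models aggregate (this is a standard learning-curve adjustment from cost-estimating practice, not TAT-C's actual model), the total production cost of N identical spacecraft can be sketched as:

```python
import math

def constellation_cost(first_unit_cost, n_units, learning=0.95):
    """Cumulative learning-curve cost of N identical units:
    total = T1 * N**b, with b = 1 + log2(learning slope)."""
    b = 1 + math.log(learning, 2)
    return first_unit_cost * n_units ** b

# Eight identical spacecraft with a hypothetical 90% learning slope:
total = constellation_cost(10.0, 8, learning=0.9)
```

    With a 90% slope, eight units cost well under eight times the first unit, which is exactly why constellation costing cannot simply multiply a single-spacecraft CER by the number of spacecraft.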

  5. PyHLA: tests for the association between HLA alleles and diseases.

    PubMed

    Fan, Yanhui; Song, You-Qiang

    2017-02-06

    Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types. Therefore, tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provided a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill the gap. PyHLA is a tailor-made, easy to use, and flexible tool designed specifically for the association analysis of the HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple testing corrections have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis. Existing methods have been integrated and desired methods have been added in PyHLA. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer with different platforms. PyHLA is implemented in Python. PyHLA is a free, open source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
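    PyHLA's own implementation is available at the linked repository; the core idea of one of its methods, a Monte Carlo permutation test for an allele-disease association, can be sketched as follows (toy data and function names, not PyHLA's API):

```python
import random

def permutation_p(carrier, case, n_perm=2000, seed=1):
    """One-sided Monte Carlo permutation p-value: is the allele
    carried more often by cases than by controls?"""
    rng = random.Random(seed)

    def freq_diff(labels):
        n_case = sum(labels)
        n_ctrl = len(labels) - n_case
        case_c = sum(a for a, d in zip(carrier, labels) if d)
        ctrl_c = sum(a for a, d in zip(carrier, labels) if not d)
        return case_c / n_case - ctrl_c / n_ctrl

    observed = freq_diff(case)
    labels = list(case)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(labels)  # break the allele-disease link
        if freq_diff(labels) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction

# Toy cohort: 10 cases then 10 controls; 9 of 10 cases carry the allele.
carrier = [1] * 9 + [0] + [1] + [0] * 9
case = [1] * 10 + [0] * 10
p = permutation_p(carrier, case)
```

    Shuffling the case/control labels simulates the null hypothesis of no association, so the p-value is simply the fraction of shuffles that match or exceed the observed carrier-frequency difference.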

  6. MOD Tool (Microwave Optics Design Tool)

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed-up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is in the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. 
The MOD Tool client is being developed using Tcl/Tk, which allows the user to work on a choice of platforms (PC, Mac, or Unix) after downloading the Tcl/Tk binary, which is readily available on the web. The MOD Tool server is written using Expect and resides on a Sun workstation. Client/server communication is performed over a socket; upon a connection from a client, the server spawns a child process dedicated to communicating with that client. The server communicates with other machines, such as supercomputers, using Expect, with the username and password provided by the user through the client.
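    The actual MOD Tool server is written in Expect; the worker-per-connection pattern it describes can be sketched in Python (illustrative only, using a thread rather than a spawned child process):

```python
import socket
import threading

def handle_client(conn):
    """Worker dedicated to a single client: echo an acknowledgement."""
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"ack: " + data)

def serve_once(server_sock):
    """Accept one connection and hand it to a dedicated worker."""
    conn, _addr = server_sock.accept()
    threading.Thread(target=handle_client, args=(conn,)).start()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # ephemeral port on localhost
server.listen()
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"status")
reply = client.recv(1024)
client.close()
```

    Dedicating one worker per connection keeps the accept loop free, so a slow client (or a long-running supercomputer job proxied on its behalf) never blocks the other clients.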

  7. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials in existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating the discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements, and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  8. Use of computers in dysmorphology.

    PubMed Central

    Diliberti, J H

    1988-01-01

    As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly makes the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092

  9. Design tool for estimating chemical hydrogen storage system characteristics for light-duty fuel cell vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.

    The U.S. Department of Energy (DOE) has developed a vehicle framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to DOE’s Technical Targets using four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework model for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be easily estimated. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, this design tool can be used as a standalone executable to estimate the storage system mass and volume outside of the framework model and compare them to the DOE Technical Targets. These models will be explained and exercised with existing hydrogen storage materials.

  10. Maritime Spatial Planning supported by systematic site selection: Applying Marxan for offshore wind power in the western Baltic Sea

    PubMed Central

    Dahl, Karsten; Mohn, Christian

    2018-01-01

    The development of offshore wind energy and other competing interests in sea space are a major incentive for designating marine and coastal areas for specific human activities. Maritime Spatial Planning (MSP) considers human activities at sea in a more integrated way by analysing and designating spatial and temporal distributions of human activities based on ecological, economic and social targets. However, specific tools supporting spatial decisions at sea incorporating all relevant sectors are rarely adopted. The decision support tool Marxan is traditionally used for systematic selection and designation of nature protection and conservation areas. In this study, Marxan was applied as a support tool to identify suitable sites for offshore wind power in the pilot area Pomeranian Bight / Arkona Basin in the western Baltic Sea. The software was successfully tested and scenarios were developed that support the sites indicated in existing national plans, but also show options for alternative developments of offshore wind power in the Pomeranian Bight / Arkona Basin area. PMID:29543878
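    Marxan itself optimizes a richer objective with simulated annealing; a hedged sketch of the simpler greedy variant of the same task, picking cost-efficient sites until every feature target is met (toy site and target data, not the study's inputs), looks like this:

```python
def greedy_select(sites, targets):
    """Greedily pick sites until every feature target is met,
    preferring the best (target contribution / cost) ratio each step."""
    remaining = dict(targets)
    chosen = []
    available = dict(sites)
    while any(v > 0 for v in remaining.values()):
        def score(item):
            _name, (cost, features) = item
            gain = sum(min(a, remaining.get(f, 0)) for f, a in features.items())
            return gain / cost if gain else 0.0
        name, (cost, features) = max(available.items(), key=score)
        if score((name, (cost, features))) == 0:
            break  # targets unreachable with the remaining sites
        chosen.append(name)
        del available[name]
        for f, a in features.items():
            if f in remaining:
                remaining[f] = max(0, remaining[f] - a)
    return chosen

# Toy planning units: name -> (cost, {feature: amount contributed}).
sites = {
    "A": (2.0, {"wind": 5, "seabird": 0}),
    "B": (1.0, {"wind": 2, "seabird": 3}),
    "C": (4.0, {"wind": 6, "seabird": 1}),
}
picked = greedy_select(sites, {"wind": 6, "seabird": 2})
```

    The greedy pass illustrates the trade-off such tools formalize: each selected site must buy the most outstanding target coverage per unit cost, whether the "cost" is money, conflict with shipping lanes, or conservation value.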

  11. Knowledge modeling tool for evidence-based design.

    PubMed

    Durmisevic, Sanja; Ciftcioglu, Ozer

    2010-01-01

    The aim of this study is to take evidence-based design (EBD) to the next level by activating available knowledge, integrating new knowledge, and combining them for more efficient use by the planning and design community. This article outlines a framework for a performance-based measurement tool that can provide the necessary decision support during the design or evaluation of a healthcare environment by estimating the overall design performance of multiple variables. New knowledge in EBD adds continuously to complexity (the "information explosion"), and it becomes impossible to consider all aspects (design features) at the same time, much less their impact on final building performance. How can existing knowledge and the information explosion in healthcare-specifically the domain of EBD-be rendered manageable? Is it feasible to create a computational model that considers many design features and deals with them in an integrated way, rather than one at a time? The evidence found is structured and readied for computation through a "fuzzification" process. The weights are calculated using an analytical hierarchy process. Actual knowledge modeling is accomplished through a fuzzy neural tree structure. The impact of all inputs on the outcome (in this case, patient recovery) is calculated using sensitivity analysis. Finally, the added value of the model is discussed using a hypothetical case study of a patient room. The proposed model can deal with the complexities of various aspects and the relationships among variables in a coordinated way, allowing existing and new pieces of evidence to be integrated in a knowledge tree structure that facilitates understanding of the effects of various design interventions on overall design performance.
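    The paper's fuzzy neural tree is more elaborate, but its two named ingredients, fuzzification of design features and weighted aggregation into a performance score, can be sketched minimally (all features, ranges, and weights below are hypothetical, not the study's):

```python
def fuzzify(value, low, high):
    """Linear membership: 0 at or below `low`, 1 at or above `high`."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def performance(features, weights):
    """Weighted average of fuzzified feature memberships."""
    total = sum(weights.values())
    return sum(weights[f] * features[f] for f in features) / total

# Hypothetical patient-room features, each fuzzified to [0, 1].
features = {
    "daylight": fuzzify(420, 100, 500),   # lux at the bed
    "noise": 1.0 - fuzzify(48, 35, 60),   # dB(A); quieter is better
    "view": fuzzify(0.7, 0.0, 1.0),       # fraction of window with nature view
}
weights = {"daylight": 0.5, "noise": 0.3, "view": 0.2}
score = performance(features, weights)
```

    Fuzzification puts incommensurable evidence (lux, decibels, view fractions) on a common [0, 1] scale, after which weights, derived in the paper via the analytic hierarchy process, control each feature's contribution to the overall design performance estimate.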

  12. ART-Ada design project, phase 2

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    Interest in deploying expert systems in Ada has increased. ART-Ada, an Ada-based expert system tool, is described; it was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code, which is compiled and linked with an Ada-based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  13. NASA Simulation Capabilities

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon R.

    2017-01-01

    This presentation provides a high-level overview of NASA's Future ATM Concepts Evaluation Tool (FACET) with a high-level description of the system's inputs and outputs. This presentation is designed to support the joint simulations that NASA and the Chinese Aeronautical Establishment (CAE) will conduct under an existing Memorandum of Understanding.

  14. Conjoint Analysis: A Tool for Designing Degree Programs.

    ERIC Educational Resources Information Center

    Martin, John; Moore, Thomas E.

    1993-01-01

    Conjoint analysis, commonly used in product development, was used to determine the graduate education needs and program preferences of business administration graduates. Results suggest an accelerated and abbreviated Master's in Business Administration would be preferred to a master's degree, without detracting from existing programs or being…

  15. Designing Cyberbullying Prevention and Mitigation Tools

    ERIC Educational Resources Information Center

    Ashktorab, Zahra

    2017-01-01

    While cyberbullying is prevalent among adolescents, attempts by researchers to evaluate mechanisms for its prevention and mitigation have been largely non-existent. In this dissertation, I argue that the complex nature of cyberbullying, made more challenging by the affordances of diverse social media, cannot be solved through strictly algorithmic…

  16. Intelligent Tutoring Systems for Literacy: Existing Technologies and Continuing Challenges

    ERIC Educational Resources Information Center

    Jacovina, Matthew E.; McNamara, Danielle S.

    2017-01-01

    In this chapter, we describe several intelligent tutoring systems (ITSs) designed to support student literacy through reading comprehension and writing instruction and practice. Although adaptive instruction can be a powerful tool in the literacy domain, developing these technologies poses significant challenges. For example, evaluating the…

  17. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages.
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
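    The operator-overloading approach the study benchmarks can be illustrated with a few lines of dual-number arithmetic. This shows forward mode for brevity (an adjoint code applies the chain rule in reverse, but the overloading mechanism is the same), and the tools named above work on Fortran, so this Python sketch is purely illustrative:

```python
import math

class Dual:
    """Forward-mode AD value: carries f(x) and df/dx together, so the
    derivative propagates through overloaded arithmetic operators."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def dsin(x):
    # chain rule for an elementary function
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# differentiate f(x) = x*sin(x) + 2x at x = 1, seeding dx/dx = 1
x = Dual(1.0, 1.0)
y = x * dsin(x) + 2 * x   # y.dot holds f'(1) = sin(1) + cos(1) + 2
```

    Every arithmetic operation carries its derivative along at runtime, which is exactly the overhead that makes overloading tools lag source transformation, where the derivative statements are generated once as ordinary code.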

  18. The Development of a Communication Tool to Facilitate the Cancer Trial Recruitment Process and Increase Research Literacy among Underrepresented Populations.

    PubMed

    Torres, Samantha; de la Riva, Erika E; Tom, Laura S; Clayman, Marla L; Taylor, Chirisse; Dong, Xinqi; Simon, Melissa A

    2015-12-01

    Despite increasing need to boost the recruitment of underrepresented populations into cancer trials and biobanking research, few tools exist for facilitating dialogue between researchers and potential research participants during the recruitment process. In this paper, we describe the initial processes of a user-centered design cycle to develop a standardized research communication tool prototype for enhancing research literacy among individuals from underrepresented populations considering enrollment in cancer research and biobanking studies. We present qualitative feedback and recommendations on the prototype's design and content from potential end users: five clinical trial recruiters and ten potential research participants recruited from an academic medical center. Participants were given the prototype (a set of laminated cards) and were asked to provide feedback about the tool's content, design elements, and word choices during semi-structured, in-person interviews. Results suggest that the prototype was well received by recruiters and patients alike. They favored the simplicity, lay language, and layout of the cards. They also noted areas for improvement, leading to card refinements that included the following: addressing additional topic areas, clarifying research processes, increasing the number of diverse images, and using alternative word choices. Our process for refining user interfaces and iterating content in early phases of design may inform future efforts to develop tools for use in clinical research or biobanking studies to increase research literacy.

  19. The design of instructional tools affects secondary school students' learning of cardiopulmonary resuscitation (CPR) in reciprocal peer learning: a randomized controlled trial.

    PubMed

    Iserbyt, Peter; Byra, Mark

    2013-11-01

    Research investigating design effects of instructional tools for learning Basic Life Support (BLS) is almost non-existent. To demonstrate that the design of instructional tools matters, the effect of spatial contiguity, a design principle stating that people learn more deeply when words and corresponding pictures are placed close to (i.e., integrated with) rather than far from each other on a page, was investigated on task cards for learning Cardiopulmonary Resuscitation (CPR) during reciprocal peer learning. A randomized controlled trial. A total of 111 students (mean age: 13 years) constituting six intact classes learned BLS through reciprocal learning with task cards. Task cards combine a picture of the skill with written instructions about how to perform it. In each class, students were randomly assigned to the experimental group or the control group. In the control group, written instructions were placed under the picture on the task cards. In the experimental group, written instructions were placed close to the corresponding part of the picture on the task cards, reflecting application of the spatial contiguity principle. One-way analysis of variance found significantly better performances in the experimental group for ventilation volumes (P=.03, ηp²=.10) and flow rates (P=.02, ηp²=.10). For chest compression depth, compression frequency, compressions with correct hand placement, and duty cycles, no significant differences were found. This study shows that the design of instructional tools (i.e., task cards) affects student learning. Research-based design of learning tools can enhance BLS and CPR education. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  20. The servicing aid tool: A teleoperated robotics system for space applications

    NASA Technical Reports Server (NTRS)

    Dorman, Keith W.; Pullen, John L.; Keksz, William O.; Eismann, Paul H.; Kowalski, Keith A.; Karlen, James P.

    1994-01-01

    The Servicing Aid Tool (SAT) is a teleoperated, force-reflecting manipulation system designed for use on the Space Shuttle. The system will assist Extravehicular Activity (EVA) servicing of spacecraft such as the Hubble Space Telescope. The SAT stands out from other robotics development programs in that special attention was given to provide a low-cost, space-qualified design which can easily and inexpensively be reconfigured and/or enhanced through the addition of existing NASA funded technology as that technology matures. SAT components are spaceflight adaptations of existing ground-based designs from Robotics Research Corporation (RRC), the leading supplier of robotics systems to the NASA and university research community in the United States. Fairchild Space is the prime contractor and provides the control electronics, safety system, system integration, and qualification testing. The manipulator consists of a 6-DOF Slave Arm mounted on a 1-DOF Positioning Link in the shuttle payload bay. The Slave Arm is controlled via a highly similar, 6-DOF, force-reflecting Master Arm from Schilling Development, Inc. This work is being performed under contract to the Goddard Space Flight Center, Code 442, Hubble Space Telescope Flight Systems and Servicing Project.

  1. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations during 3 different hydrological periods in the Mires basin in Crete, Greece, are used in the proposed framework, where Regression Kriging develops the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool determines a cost-effective observation well network that contributes significant information to water managers and authorities. The elimination of observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be achieved using estimation uncertainty and statistical error metrics without affecting the assessment of groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.
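    The well-elimination idea can be sketched as a greedy loop that repeatedly drops the observation well whose removal least degrades leave-one-out interpolation error. In this simplified sketch, plain inverse-distance weighting stands in for Regression Kriging and the nitrate field is synthetic:

```python
import random

# Greedy network reduction (toy version): drop wells one at a time,
# always removing the well whose absence least increases leave-one-out
# prediction error. All well locations and nitrate values are synthetic.
random.seed(1)

wells = [(random.random(), random.random()) for _ in range(15)]
nitrate = [20.0 + 30.0 * x + 10.0 * y for (x, y) in wells]  # synthetic field

def idw(pt, pts, vals):
    """Inverse-distance-weighted estimate at pt from known points."""
    num = den = 0.0
    for p, v in zip(pts, vals):
        d2 = (pt[0] - p[0]) ** 2 + (pt[1] - p[1]) ** 2 + 1e-12
        num += v / d2
        den += 1.0 / d2
    return num / den

def loo_error(idx_keep):
    """Mean abs. error predicting each kept well from the others."""
    errs = []
    for i in idx_keep:
        others = [j for j in idx_keep if j != i]
        pred = idw(wells[i], [wells[j] for j in others],
                   [nitrate[j] for j in others])
        errs.append(abs(pred - nitrate[i]))
    return sum(errs) / len(errs)

keep = list(range(len(wells)))
while len(keep) > 8:   # hypothetical target network size
    drop = min(keep, key=lambda i: loo_error([j for j in keep if j != i]))
    keep.remove(drop)
```

    In the paper's framework the scoring would instead come from kriging estimation variance, and the target size from a cost budget rather than a fixed count.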

  2. Anatomical information in radiation treatment planning.

    PubMed

    Kalet, I J; Wu, J; Lease, M; Austin-Seymour, M M; Brinkley, J F; Rosse, C

    1999-01-01

    We report on experience and insights gained from prototyping, for clinical radiation oncologists, a new access tool for the University of Washington Digital Anatomist information resources. This access tool is designed to integrate with a radiation therapy planning (RTP) system in use in a clinical setting. We hypothesize that the needs of practitioners in a clinical setting are different from the needs of students, the original targeted users of the Digital Anatomist system, but that a common knowledge resource can serve both. Our prototype was designed to help define those differences and study the feasibility of a full anatomic reference system that will support both clinical radiation therapy and all the existing educational applications.

  3. Sandia Advanced MEMS Design Tools v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yarberry, Victor R.; Allen, James J.; Lantz, Jeffrey W.

    This is a major revision to the Sandia Advanced MEMS Design Tools. It replaces all previous versions. New features in this version: revised to support AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) provide enabling educational information (including pictures, videos, technical information); c) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) facilitate the process of having MEMS fabricated at Sandia National Laboratories; e) facilitate the process of having post-fabrication services performed. While some files on the CD are used in conjunction with the AutoCAD software package, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  4. A Web-Based Monitoring System for Multidisciplinary Design Projects

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Salas, Andrea O.; Weston, Robert P.

    1998-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes; and monitors the project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.

  5. A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles

    NASA Technical Reports Server (NTRS)

    Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.

    2015-01-01

    Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
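    The load-profile computation described above can be sketched directly: a power equipment list maps each component's modes to wattages, a scenario assigns modes over time intervals, and the profile sums the active loads. The component names, wattages, and intervals below are hypothetical, and SPLAT's Maple constraint solving is simplified away entirely:

```python
# Toy power equipment list (PEL): component -> mode -> watts (hypothetical)
PEL = {
    "radio":  {"off": 0.0, "standby": 2.0, "transmit": 35.0},
    "camera": {"off": 0.0, "imaging": 12.0},
    "heater": {"off": 0.0, "on": 20.0},
}

# Scenario: (start_s, end_s, component, mode) intervals for a notional flyby
scenario = [
    (0,   600, "radio",  "standby"),
    (600, 900, "radio",  "transmit"),
    (300, 900, "camera", "imaging"),
    (0,   300, "heater", "on"),
]

def load_profile(scenario, t_end, dt=60):
    """Total power draw sampled every dt seconds over [0, t_end)."""
    profile = []
    for t in range(0, t_end, dt):
        p = sum(PEL[comp][mode]
                for (t0, t1, comp, mode) in scenario if t0 <= t < t1)
        profile.append(p)
    return profile

profile = load_profile(scenario, 900)
```

    The peak of this profile (radio transmitting while the camera images) is the kind of number a power engineer reads off the SPLAT output to size batteries and solar arrays.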

  6. Strategy Planning Visualization Tool (SPVT) for the Air Operations Center (AOC). Volume 2: Information Operations (IO) Planning Enhancements

    DTIC Science & Technology

    2009-12-31

    Status and Assessment data interfaces leverage the TBONE Services and data model. The services and supporting Java 2 Platform Enterprise Edition (J2EE) ... existing Java and .Net developed "Fat Clients." The IOPC-X design includes an Open Services Gateway Initiative (OSGi) compliant plug-in ...

  7. Increasingly mobile: How new technologies can enhance qualitative research

    PubMed Central

    Moylan, Carrie Ann; Derr, Amelia Seraphia; Lindhorst, Taryn

    2015-01-01

    Advances in technology, such as the growth of smart phones, tablet computing, and improved access to the internet have resulted in many new tools and applications designed to increase efficiency and improve workflow. Some of these tools will assist scholars using qualitative methods with their research processes. We describe emerging technologies for use in data collection, analysis, and dissemination that each offer enhancements to existing research processes. Suggestions for keeping pace with the ever-evolving technological landscape are also offered. PMID:25798072

  8. Optimal chroma-like channel design for passive color image splicing detection

    NASA Astrophysics Data System (ADS)

    Zhao, Xudong; Li, Shenghong; Wang, Shilin; Li, Jianhua; Yang, Kongjin

    2012-12-01

    Image splicing is one of the most common image forgeries in our daily life and due to the powerful image manipulation tools, image splicing is becoming easier and easier. Several methods have been proposed for image splicing detection and all of them worked on certain existing color channels. However, the splicing artifacts vary in different color channels and the selection of color model is important for image splicing detection. In this article, instead of finding an existing color model, we propose a color channel design method to find the most discriminative channel which is referred to as optimal chroma-like channel for a given feature extraction method. Experimental results show that both spatial and frequency features extracted from the designed channel achieve higher detection rate than those extracted from traditional color channels.
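    The channel-design idea can be sketched as a search over linear combinations w·(R, G, B), keeping the combination whose Fisher score best separates authentic from spliced samples. The per-image color statistics below are synthetic, and the paper's actual feature extraction is far more involved:

```python
import random

# Toy "designed channel" search: find weights (wr, wg, wb) so the scalar
# wr*R + wg*G + wb*B best separates two classes by the Fisher criterion.
# Class statistics are synthetic stand-ins for real image features.
random.seed(2)

def sample(spliced):
    base = (0.50, 0.45, 0.38) if spliced else (0.50, 0.40, 0.30)
    return tuple(b + random.gauss(0.0, 0.02) for b in base)

authentic = [sample(False) for _ in range(200)]
spliced   = [sample(True)  for _ in range(200)]

def fisher(w):
    """Between-class separation over within-class spread of w.(R,G,B)."""
    def project(xs):
        return [sum(wi * xi for wi, xi in zip(w, x)) for x in xs]
    a, s = project(authentic), project(spliced)
    ma, ms = sum(a) / len(a), sum(s) / len(s)
    va = sum((x - ma) ** 2 for x in a) / len(a)
    vs = sum((x - ms) ** 2 for x in s) / len(s)
    return (ma - ms) ** 2 / (va + vs + 1e-12)

# coarse grid search over channel weights in [-1, 1]
grid = [x / 4.0 - 1.0 for x in range(9)]
best_w = max(((r, g, b) for r in grid for g in grid for b in grid
              if (r, g, b) != (0, 0, 0)), key=fisher)
```

    Because the grid contains the pure R, G, and B axes, the designed channel is guaranteed to score at least as well as any single traditional channel on the training samples, which is the paper's motivating observation.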

  9. FASTER - A tool for DSN forecasting and scheduling

    NASA Technical Reports Server (NTRS)

    Werntz, David; Loyola, Steven; Zendejas, Silvino

    1993-01-01

    FASTER (Forecasting And Scheduling Tool for Earth-based Resources) is a suite of tools designed for forecasting and scheduling JPL's Deep Space Network (DSN). The DSN is a set of antennas and other associated resources that must be scheduled for satellite communications, astronomy, maintenance, and testing. FASTER consists of MS-Windows based programs that replace two existing programs (RALPH and PC4CAST). FASTER was designed to be more flexible, maintainable, and user-friendly. FASTER makes heavy use of commercial software to allow for customization by users. FASTER implements scheduling as a two-pass process: the first pass calculates a predictive profile of resource utilization; the second pass uses this information to calculate a cost function used in a dynamic programming optimization step. This information allows the scheduler to 'look ahead' at activities that are not as yet scheduled. FASTER has succeeded in allowing wider access to data and tools, reducing the amount of effort expended and increasing the quality of analysis.
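    The two-pass idea can be sketched with toy data: pass 1 builds a predictive demand profile over time slots, and pass 2 places each request using that profile as a look-ahead cost. FASTER's second pass uses dynamic programming; the greedy placement below is a simplification, and all request names and slots are hypothetical:

```python
# Toy two-pass scheduler in the spirit described above (all data hypothetical):
# request -> time slots it could use (e.g. antenna view periods)
requests = {
    "voyager":  [0, 1],
    "magellan": [1, 2],
    "galileo":  [0, 1, 2],
    "testing":  [2, 3],
}
N_SLOTS = 4

# Pass 1: predictive profile of how contested each slot is
demand = [0] * N_SLOTS
for slots in requests.values():
    for s in slots:
        demand[s] += 1

# Pass 2: place requests (least flexible first), preferring the allowed
# slot with the lowest current load, breaking ties toward slots the
# profile says will be least contested later — the "look ahead".
load = [0] * N_SLOTS
assignment = {}
for name, slots in sorted(requests.items(), key=lambda kv: len(kv[1])):
    slot = min(slots, key=lambda s: (load[s], demand[s]))
    assignment[name] = slot
    load[slot] += 1
```

    Without pass 1 the "testing" request could land in the contested slot 2 and force a conflict; the demand profile steers it to the idle slot 3 instead.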

  10. Verbal and Nonverbal Classroom Communication: The Development of an Observational Instrument.

    ERIC Educational Resources Information Center

    Heger, Herbert K.

    This paper reports the development of a classroom observation instrument designed to broaden and extend the power of existing tools to provide a balanced, reciprocal perspective of both verbal and nonverbal communication. An introductory section discusses developments in communication analysis. The Miniaturized Total Interaction Analysis System…

  11. Using Toolkits to Achieve STEM Enterprise Learning Outcomes

    ERIC Educational Resources Information Center

    Watts, Carys A.; Wray, Katie

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of using several commercial tools in science, technology, engineering and maths (STEM) subjects for enterprise education at Newcastle University, UK. Design/methodology/approach: The paper provides an overview of existing toolkit use in higher education, before reviewing where and…

  12. Visualization of Learning Scenarios with UML4LD

    ERIC Educational Resources Information Center

    Laforcade, Pierre

    2007-01-01

    Present Educational Modelling Languages are used to formally specify abstract learning scenarios in a machine-interpretable format. Current tooling does not provide teachers/designers with graphical facilities to help them reuse existing scenarios. They need human-readable representations. This paper discusses the UML4LD experimental…

  13. Choosing Training?

    ERIC Educational Resources Information Center

    Stephen, Jennifer

    This guide is designed to help the user enter the job market by making the most of their existing skills and finding additional training. Section 1, Vocations, Occupations, Careers, looks at the assessment tools used by employers and trainers to prepare people for today's job market. It describes how to develop a personal inventory of skills…

  14. Toward the Development of Expert Assessment Systems.

    ERIC Educational Resources Information Center

    Hasselbring, Ted S.

    1986-01-01

    The potential application of "expert systems" to the diagnosis and assessment of special-needs children is examined and existing prototype systems are reviewed. The future of this artificial intelligence technology is discussed in relation to emerging development tools designed for the creation of expert systems by the lay public. (Author)

  15. My child at mealtime: A visually enhanced self-assessment of feeding styles for low-income parents of preschoolers.

    PubMed

    Ontai, Lenna L; Sitnick, Stephanie L; Shilts, Mical K; Townsend, Marilyn S

    2016-04-01

    The influence of caregiver feeding styles on children's dietary outcomes is well documented. However, the instruments used to assess feeding style are limited by high literacy demands, making self-assessment with low-income audiences challenging. The purpose of the current study is to report on the development of My Child at Mealtime (MCMT), a self-assessment tool with reduced literacy demands, designed to measure feeding styles with parents of preschool-aged children. Cognitive interviews were conducted with 44 Head Start parents of 2-5 year old children to develop question wording and identify appropriate visuals. The resulting tool was administered to 119 ethnically diverse, low-income parents of 2-5 year old children. Factor analysis resulted in a two-factor structure that reflects responsiveness and demandingness in a manner consistent with existing assessment tools. Results indicate the final visually enhanced MCMT self-assessment tool provides a measure of parenting style consistent with existing measures, while reducing the literacy demand. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Analysis instruments for the performance of Advanced Practice Nursing.

    PubMed

    Sevilla-Guerra, Sonia; Zabalegui, Adelaida

    2017-11-29

    Advanced Practice Nursing has been a reality in the international context for several decades, and recently new nursing profiles that follow this model have been developed in Spain as well. The consolidation of these advanced practice roles has also led to the creation of tools that attempt to define and evaluate their functions. This study aims to identify and explore the existing instruments that enable the domains of Advanced Practice Nursing to be defined. A review of existing international questionnaires and instruments was undertaken, including an analysis of the design process, the domains/dimensions defined, the main results and an exploration of clinimetric properties. Seven studies were analysed but not all proved to be valid, stable or reliable tools. One included tool was able to differentiate between the functions of the general nurse and the advanced practice nurse by the level of activities undertaken within the five domains described. These tools are necessary to evaluate the scope of advanced practice in new nursing roles that correspond to other international models of competencies and practice domains. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  17. A Flexible Online Metadata Editing and Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilar, Raul; Pan, Jerry Yun; Gries, Corinna

    2010-01-01

    A metadata editing and management system is being developed employing state of the art XML technologies. A modular and distributed design was chosen for scalability, flexibility, options for customizations, and the possibility to add more functionality at a later stage. The system consists of a desktop design tool or schema walker used to generate code for the actual online editor, a native XML database, and an online user access management application. The design tool is a Java Swing application that reads an XML schema, provides the designer with options to combine input fields into online forms, and gives the fields user-friendly tags. Based on design decisions, the tool generates code for the online metadata editor. The code generated is an implementation of the XForms standard using the Orbeon Framework. The design tool fulfills two requirements: first, data entry forms based on one schema may be customized at design time; and second, data entry applications may be generated for any valid XML schema without relying on custom information in the schema. However, the customized information generated at design time is saved in a configuration file which may be re-used and changed again in the design tool. Future developments will add functionality to the design tool to integrate help text, tool tips, project specific keyword lists, and thesaurus services. Additional styling of the finished editor is accomplished via cascading style sheets which may be further customized and different look-and-feels may be accumulated through the community process. The customized editor produces XML files in compliance with the original schema; however, data from the current page is saved into a native XML database whenever the user moves to the next screen or pushes the save button, independently of validity. Currently the system uses the open source XML database eXist for storage and management, which comes with third party online and desktop management tools.
However, access to metadata files in the application introduced here is managed in a custom online module, using a MySQL backend accessed by a simple Java Server Faces front end. A flexible system with three grouping options (organization, group, and single editing access) is provided. Three levels were chosen to distribute administrative responsibilities and handle the common situation of an information manager entering the bulk of the metadata but leaving specifics to the actual data provider.

  18. A system level model for preliminary design of a space propulsion solid rocket motor

    NASA Astrophysics Data System (ADS)

    Schumacher, Daniel M.

    Preliminary design of space propulsion solid rocket motors entails a combination of components and subsystems. Expert design tools exist to find near optimal performance of subsystems and components. Conversely, there is no system level preliminary design process for space propulsion solid rocket motors that is capable of synthesizing customer requirements into a high utility design for the customer. The preliminary design process for space propulsion solid rocket motors typically builds on existing designs and pursues a feasible rather than the most favorable design. Classical optimization is an extremely challenging method when dealing with the complex behavior of an integrated system. The complexity and combinations of system configurations make the number of design parameters that must be traded off unmanageable when manual techniques are used. Existing multi-disciplinary optimization approaches generally address estimating ratios and correlations rather than utilizing mathematical models. The developed system level model utilizes the Genetic Algorithm to perform the necessary population searches to efficiently replace the human iterations required during a typical solid rocket motor preliminary design. This research augments, automates, and increases the fidelity of the existing preliminary design process for space propulsion solid rocket motors. The system level aspect of this preliminary design process, and the ability to synthesize space propulsion solid rocket motor requirements into a near optimal design, is achievable. The process of developing the motor performance estimate and the system level model of a space propulsion solid rocket motor is described in detail. The results of this research indicate that the model is valid for use and able to manage a very large number of variable inputs and constraints in pursuit of the best possible design.
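    The genetic-algorithm search described above can be sketched on a toy two-variable motor sizing problem. The objective, constraint, and every constant below are hypothetical stand-ins for the dissertation's much richer performance model:

```python
import random

# Minimal GA sketch: size (length, radius) to maximise a stand-in for
# delivered impulse under a stand-in mass budget. Entirely hypothetical.
random.seed(3)

def fitness(x):
    length, radius = x
    impulse = length * radius ** 2           # proxy for delivered impulse
    mass = 2.0 * length * radius             # proxy for inert mass
    penalty = 1000.0 * max(0.0, mass - 10.0)  # mass budget constraint
    return impulse - penalty

def ga(pop_size=40, gens=60):
    pop = [(random.uniform(0.1, 5.0), random.uniform(0.1, 5.0))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]        # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # uniform crossover of each coordinate, plus Gaussian mutation
            child = tuple(random.choice(pair) + random.gauss(0.0, 0.05)
                          for pair in zip(a, b))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
```

    The penalty term is what lets the population search trade off impulse against the mass budget without the manual iteration the abstract describes; a real implementation would encode many more design parameters per chromosome.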

  19. A Screening Tool to Identify Spasticity in Need of Treatment

    PubMed Central

    Zorowitz, Richard D.; Wein, Theodore H.; Dunning, Kari; Deltombe, Thierry; Olver, John H.; Davé, Shashank J.; Dimyan, Michael A.; Kelemen, John; Pagan, Fernando L.; Evans, Christopher J.; Gillard, Patrick J.; Kissela, Brett M.

    2017-01-01

    Objective To develop a clinically useful patient-reported screening tool for health care providers to identify patients with spasticity in need of treatment regardless of etiology. Design Eleven spasticity experts participated in a modified Delphi panel and reviewed and revised 2 iterations of a screening tool designed to identify spasticity symptoms and impact on daily function and sleep. Spasticity expert panelists evaluated items pooled from existing questionnaires to gain consensus on the screening tool content. The study also included cognitive interviews of 20 patients with varying spasticity etiologies to determine if the draft screening tool was understandable and relevant to patients with spasticity. Results The Delphi panel reached an initial consensus on 21 of 47 items for the screening tool and determined that the tool should have no more than 11 to 15 items and a 1-month recall period for symptom and impact items. After 2 rounds of review, 13 items were selected and modified by the expert panelists. Most patients (n = 16 [80%]) completed the cognitive interview and interpreted the items as intended. Conclusions Through the use of a Delphi panel and patient interviews, a 13-item spasticity screening tool was developed that will be practical and easy to use in routine clinical practice. PMID:27552355

  20. A Pathway to Freedom: An Evaluation of Screening Tools for the Identification of Trafficking Victims.

    PubMed

    Bespalova, Nadejda; Morgan, Juliet; Coverdale, John

    2016-02-01

    Because training residents and faculty to identify human trafficking victims is a major public health priority, the authors review existing assessment tools. PubMed and Google were searched using combinations of search terms including human, trafficking, sex, labor, screening, identification, and tool. Nine screening tools that met the inclusion criteria were found. They varied greatly in length, format, target demographic, supporting resources, and other parameters. Only two tools were designed specifically for healthcare providers. Only one tool was formally assessed to be valid and reliable in a pilot project in trafficking victim service organizations, although it has not been validated in the healthcare setting. This toolbox should facilitate the education of resident physicians and faculty in screening for trafficking victims, assist educators in assessing screening skills, and promote future research on the identification of trafficking victims.

  1. Design optimization of piezoresistive cantilevers for force sensing in air and water

    PubMed Central

    Doll, Joseph C.; Park, Sung-Jin; Pruitt, Beth L.

    2009-01-01

    Piezoresistive cantilevers fabricated from doped silicon or metal films are commonly used for force, topography, and chemical sensing at the micro- and macroscales. Proper design is required to optimize the achievable resolution by maximizing sensitivity while simultaneously minimizing the integrated noise over the bandwidth of interest. Existing analytical design methods are insufficient for modeling complex dopant profiles, design constraints, and nonlinear phenomena such as damping in fluid. Here we present an optimization method based on an analytical piezoresistive cantilever model. We use an existing iterative optimizer to minimize a performance goal, such as minimum detectable force. The design tool is available as open source software. Optimal cantilever design and performance are found to depend strongly on the measurement bandwidth and the constraints applied. We discuss results for silicon piezoresistors fabricated by epitaxy and diffusion, but the method can be applied to any dopant profile or material that can be modeled in a similar fashion, or extended to other microelectromechanical systems. PMID:19865512
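    The bandwidth-dependent optimization can be illustrated with a toy sketch: the noise, sensitivity, and resonance expressions below are simplified placeholder scalings (not the paper's dopant-profile model), chosen only to show how the constrained optimum shifts with measurement bandwidth:

```python
import math

def resonant_freq_hz(thickness_um, length_um):
    # Toy scaling: resonance grows with thickness, falls with length squared.
    return 5e7 * thickness_um / length_um ** 2

def min_detectable_force(thickness_um, length_um, bandwidth_hz):
    # Placeholder physics: noise integrates over bandwidth and grows with
    # resistor length; force sensitivity falls as the beam gets thicker.
    noise = math.sqrt(bandwidth_hz) * (1.0 + 0.01 * length_um)
    sensitivity = length_um / (1.0 + thickness_um ** 2)
    return noise / sensitivity

def optimize(bandwidth_hz):
    """Exhaustive sweep of a small bounded design grid, standing in for
    the iterative optimizer, with a resonance >= bandwidth constraint."""
    grid = [(t / 10, l)
            for t in range(1, 51)        # 0.1-5.0 um thickness
            for l in range(10, 201, 5)]  # 10-200 um length
    feasible = [d for d in grid if resonant_freq_hz(*d) >= bandwidth_hz]
    return min(feasible, key=lambda d: min_detectable_force(*d, bandwidth_hz))
```

    Even in this crude model, the optimal geometry returned for a 1 kHz bandwidth differs from that for 10 kHz, mirroring the paper's observation that optimal design depends on the measurement bandwidth and constraints.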

  2. Dynamic programming methods for concurrent design and dynamic allocation of vehicles embedded in a system-of-systems

    NASA Astrophysics Data System (ADS)

    Nusawardhana

    2007-12-01

    Recent developments indicate a changing perspective on how systems or vehicles should be designed. This transition stems from the way decision makers in defense-related agencies address complex problems. Complex problems are now often posed in terms of the capabilities desired, rather than in terms of requirements for a single system. As a result, a set of capabilities is provided through a collection of several individual, independent systems. This collection of individual independent systems is often referred to as a "System of Systems" (SoS). Because of the independent nature of the constituent systems in an SoS, approaches to design an SoS, and more specifically to design a new system as a member of an SoS, will likely differ from traditional design approaches for complex, monolithic systems (meaning the constituent parts have no ability for independent operation). Because a system of systems evolves over time, this simultaneous system design and resource allocation problem should be investigated in a dynamic context. Such dynamic optimization problems are similar to conventional control problems. However, this research considers problems that not only seek optimizing policies but also seek the proper system or vehicle to operate under those policies. This thesis presents a framework and a set of analytical tools to solve a class of SoS problems involving the simultaneous design of a new system and the allocation of the new system along with existing systems. This class belongs to the problems of concurrent design and control of a new system, with solutions consisting of both an optimal system design and an optimal control strategy. Rigorous mathematical arguments show that the proposed framework solves the concurrent design and control problem. Many results exist for dynamic optimization problems involving linear systems; in contrast, results for nonlinear dynamic optimization problems are rare.
The proposed framework is equipped with a set of analytical tools to solve several cases of nonlinear optimal control problems: continuous- and discrete-time nonlinear problems, with applications to both optimal regulation and tracking. These tools are useful when mathematical descriptions of the dynamic systems are available. In the absence of such a model, it is often necessary to derive a solution based on computer simulation; in this case, a set of parameterized decision rules may constitute a solution. This thesis presents a method to adjust these parameters based on simultaneous perturbation stochastic approximation using continuous measurements. The set of tools developed here mostly employs exact dynamic programming. However, due to the complexity of SoS problems, this research also develops suboptimal solution approaches, collectively recognized as approximate dynamic programming, for large-scale problems. The thesis presents, explores, and solves problems from the airline industry, in which a new aircraft is to be designed and allocated along with an existing fleet of aircraft. Because the life cycle of an aircraft is on the order of 10 to 20 years, this problem is addressed dynamically so that the new aircraft design is the best design for the fleet over a given time horizon.
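    The simulation-based parameter adjustment can be sketched with simultaneous perturbation stochastic approximation (SPSA), which estimates a gradient from just two noisy measurements per iteration regardless of the number of parameters. The quadratic objective, target values, and gain constants below are illustrative stand-ins, not the thesis's fleet model:

```python
import random

random.seed(0)

TARGET = [1.0, -2.0, 0.5]

def measure(theta):
    # Noisy stand-in for a performance measurement from a simulation run.
    return (sum((t - g) ** 2 for t, g in zip(theta, TARGET))
            + random.gauss(0, 0.001))

def spsa_minimize(theta, iterations=300):
    for k in range(1, iterations + 1):
        a = 0.1 / k ** 0.602      # step-size gain sequence
        c = 0.1 / k ** 0.101      # perturbation-size gain sequence
        delta = [random.choice((-1.0, 1.0)) for _ in theta]
        y_plus = measure([t + c * d for t, d in zip(theta, delta)])
        y_minus = measure([t - c * d for t, d in zip(theta, delta)])
        g = (y_plus - y_minus) / (2 * c)
        # With +/-1 perturbations, 1/delta_i == delta_i.
        theta = [t - a * g * d for t, d in zip(theta, delta)]
    return theta

tuned = spsa_minimize([0.0, 0.0, 0.0])
```

    Only two simulation calls are made per iteration, which is what makes the approach attractive when each measurement is an expensive fleet simulation.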

  3. Global Design Optimization for Aerodynamics and Rocket Propulsion Components

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)

    2000-01-01

    Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, successful application of global optimization methods must address the growth in data requirements as the number of design variables increases, as well as methods for predicting model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. 
Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
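    As a concrete illustration of the polynomial response surface methodology reviewed here, the sketch below fits a one-variable quadratic surface to sampled "experiment" data by least squares (via the normal equations) and reads the optimum off the surface. The data are synthetic, not from the article's test cases:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = c0 + c1*x + c2*x^2 via normal equations."""
    X = [[1.0, x, x * x] for x in xs]
    XtX = [[sum(X[k][i] * X[k][j] for k in range(len(X)))
            for j in range(3)] for i in range(3)]
    Xty = [sum(X[k][i] * ys[k] for k in range(len(X))) for i in range(3)]
    return gauss_solve(XtX, Xty)

# Synthetic design-of-experiment samples of a response with optimum at x = 2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [(x - 2.0) ** 2 + 1.0 for x in xs]
c0, c1, c2 = fit_quadratic(xs, ys)
x_opt = -c1 / (2.0 * c2)
```

    Once fitted, the cheap surrogate surface is searched instead of the expensive simulation or experiment; smoothing out data noise is an inherent side effect of the least-squares fit, as the abstract notes.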

  4. Primer-BLAST: A tool to design target-specific primers for polymerase chain reaction

    PubMed Central

    2012-01-01

    Background: Choosing appropriate primers is probably the single most important factor affecting the polymerase chain reaction (PCR). Specific amplification of the intended target requires that primers do not have matches to other targets in certain orientations and within certain distances that allow undesired amplification. The process of designing specific primers typically involves two stages. First, primers flanking the regions of interest are generated either manually or using software tools; then they are searched against an appropriate nucleotide sequence database using tools such as BLAST to examine the potential targets. However, the latter is not an easy process as one needs to examine many details between primers and targets, such as the number and the positions of matched bases, the primer orientations and distance between forward and reverse primers. The complexity of such analysis usually makes this a time-consuming and very difficult task for users, especially when the primers have a large number of hits. Furthermore, although the BLAST program has been widely used for primer target detection, it is in fact not an ideal tool for this purpose as BLAST is a local alignment algorithm and does not necessarily return complete match information over the entire primer range. Results: We present a new software tool called Primer-BLAST to alleviate the difficulty in designing target-specific primers. This tool combines BLAST with a global alignment algorithm to ensure a full primer-target alignment and is sensitive enough to detect targets that have a significant number of mismatches to primers. Primer-BLAST allows users to design new target-specific primers in one step as well as to check the specificity of pre-existing primers. Primer-BLAST also supports placing primers based on exon/intron locations and excluding single nucleotide polymorphism (SNP) sites in primers. 
Conclusions: We describe a robust and fully implemented general-purpose primer design tool that designs target-specific PCR primers. Primer-BLAST offers flexible options to adjust the specificity threshold and other primer properties. This tool is publicly available at http://www.ncbi.nlm.nih.gov/tools/primer-blast. PMID:22708584
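    The local-versus-global distinction that motivates Primer-BLAST can be illustrated with a minimal Needleman-Wunsch-style scorer: it charges every primer base end to end, so mismatches near the 3' end cannot be silently dropped the way a local aligner might drop them. This is an illustrative sketch, not the tool's actual algorithm:

```python
def global_mismatches(primer, target):
    """Full-length global alignment cost: minimum number of mismatches
    plus gaps needed to align the ENTIRE primer against the target site.
    A local aligner could trim a poorly matching primer end; this cannot."""
    n, m = len(primer), len(target)
    # dp[i][j] = min cost of aligning primer[:i] with target[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i
    for j in range(1, m + 1):
        dp[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if primer[i - 1] == target[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j - 1] + cost,  # match/mismatch
                           dp[i - 1][j] + 1,          # gap in target
                           dp[i][j - 1] + 1)          # gap in primer
    return dp[n][m]
```

    A specificity check in this spirit would accept a primer pair only if every unintended site requires more mismatches than the chosen threshold.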

  5. Framework for architecture-independent run-time reconfigurable applications

    NASA Astrophysics Data System (ADS)

    Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.

    2000-10-01

    Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.

  6. Blood Sugar, Your Pancreas, and Unicorns: The Development of Health Education Materials for Youth With Prediabetes.

    PubMed

    Yazel-Smith, Lisa G; Pike, Julie; Lynch, Dustin; Moore, Courtney; Haberlin, Kathryn; Taylor, Jennifer; Hannon, Tamara S

    2018-05-01

    The obesity epidemic has led to an increase in prediabetes in youth, causing a serious public health concern. Education on diabetes risk and initiation of lifestyle change are the primary treatment modalities. There are few existing age-appropriate health education tools to address diabetes prevention for high-risk youth. The aim was to develop age-appropriate health education tools to help youth better understand type 2 diabetes risk factors and the reversibility of risk. Health education tool development took place in five phases: exploration, design, analysis, refinement, and process evaluation. The project resulted in (1) a booklet designed to increase knowledge of risk, (2) a meme generator that mirrors the booklet graphics and allows youth to create their own meme based on their pancreas' current mood, (3) environmental posters for the clinic, and (4) a brief self-assessment that acts as a conversation starter for the health educators. Patients reported high likability and satisfaction with the health education tools, with the majority of patients giving the materials an "A" rating. The process evaluation indicated a high level of fidelity between how the health education tools were intended to be used and how they were actually used in the clinic setting.

  7. Development and evaluation of a patient-centred measurement tool for surgeons' non-technical skills.

    PubMed

    Yule, J; Hill, K; Yule, S

    2018-06-01

    Non-technical skills are essential for safe and effective surgery. Several tools to assess surgeons' non-technical skills from the clinician's perspective have been developed. However, a reliable measurement tool using a patient-centred approach does not currently exist. The aim of this study was to translate the existing Non-Technical Skills for Surgeons (NOTSS) tool into a patient-centred evaluation tool. Data were gathered from four cohorts of patients using an iterative four-stage mixed-methods research design. Exploratory and confirmatory factor analyses were performed to establish the psychometric properties of the tool, focusing on validity, reliability, usability and parsimony. Some 534 patients were recruited to the study. A total of 24 patient-centred non-technical skill items were developed in stage 1, and reduced to nine items in stage 2 using exploratory factor analysis. In stage 3, confirmatory factor analysis demonstrated that these nine items each loaded on to one of three factors, with excellent internal consistency: decision-making, leadership, and communication and teamwork. In stage 4, validity testing established that the new tool was independent of physician empathy and predictive of surgical quality. Surgical leadership emerged as the most dominant skill that patients could recognize and evaluate. A novel nine-item assessment tool has been developed. The Patients' Evaluation of Non-Technical Skills (PENTS) tool allows valid and reliable measurement of surgeons' non-technical skills from the patient perspective. © 2018 BJS Society Ltd Published by John Wiley & Sons Ltd.

  8. Range pattern matching with layer operations and continuous refinements

    NASA Astrophysics Data System (ADS)

    Tseng, I.-Lun; Lee, Zhao Chuan; Li, Yongfu; Perez, Valerio; Tripathi, Vikas; Ong, Jonathan Yoong Seang

    2018-03-01

    At advanced and mainstream process nodes (e.g., 7nm, 14nm, 22nm, and 55nm process nodes), lithography hotspots can exist in layouts of integrated circuits even if the layouts pass design rule checking (DRC). Existence of lithography hotspots in a layout can cause manufacturability issues, which can result in yield losses of manufactured integrated circuits. In order to detect lithography hotspots existing in physical layouts, pattern matching (PM) algorithms and commercial PM tools have been developed. However, there is still a need to use DRC tools to perform PM operations. In this paper, we propose a PM synthesis methodology, which uses a continuous refinement technique, for the automatic synthesis of a given lithography hotspot pattern into a DRC deck, consisting of layer operation commands, so that an equivalent PM operation can be performed by executing the synthesized deck with a DRC tool. Note that the proposed methodology can deal not only with exact patterns, but also with range patterns. Also, lithography hotspot patterns containing multiple layers can be processed. Experimental results show that the proposed methodology can accurately and efficiently detect lithography hotspots in physical layouts.
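    The notion of a range pattern expressed as a geometric check can be illustrated with a toy spacing rule over axis-aligned rectangles. Real DRC decks operate on full layout layers with far richer layer operations, so the rectangle representation and the spacing range here are simplified assumptions:

```python
def spacing(a, b):
    """Horizontal spacing between two axis-aligned rectangles
    given as (x1, y1, x2, y2); negative means horizontal overlap.
    (Horizontal only, for brevity.)"""
    return max(a[0], b[0]) - min(a[2], b[2])

def hotspot_pairs(rects, lo, hi):
    """Range-pattern check in the spirit of a spacing layer operation:
    flag rectangle pairs whose spacing falls inside the hotspot
    range [lo, hi] -- legal by DRC, yet lithographically risky."""
    return [(i, j)
            for i in range(len(rects))
            for j in range(i + 1, len(rects))
            if lo <= spacing(rects[i], rects[j]) <= hi]
```

    The point of the paper's synthesis is that checks of this shape can be emitted automatically as DRC-deck commands from a given hotspot pattern, rather than hand-written.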

  9. Software engineering techniques and CASE tools in RD13

    NASA Astrophysics Data System (ADS)

    Buono, S.; Gaponenko, I.; Jones, R.; Khodabandeh, A.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Polesello, G.; Aguer, M.; Huet, M.

    1994-12-01

    The RD13 project was approved in April 1991 for the development of a scalable data-taking system suitable for hosting various LHC studies. One of its goals is the exploitation of software engineering techniques, in order to indicate their overall suitability for data acquisition (DAQ), software design and implementation. This paper describes how such techniques have been applied to the development of components of the RD13 DAQ used in test-beam runs at CERN. We describe our experience with the Artifex CASE tool and its associated methodology. The issues raised when code generated by a CASE tool has to be integrated into an existing environment are also discussed.

  10. OM300 Direction Drilling Module

    DOE Data Explorer

    MacGugan, Doug

    2013-08-22

    OM300 Geothermal Directional Drilling Navigation Tool: design and produce a prototype directional drilling navigation tool capable of high-temperature operation in geothermal drilling. Targets include accuracies of 0.1° inclination and tool face and 0.5° azimuth; environmental ruggedness typical of existing oil/gas drilling; multiple selectable sensor ranges (high accuracy at low bandwidth for navigation; high g-range and bandwidth for stick-slip and chirp detection); and selectable serial data communications. The goal is to reduce the cost of drilling in high-temperature geothermal reservoirs. Innovative aspects of the project include Honeywell MEMS Vibrating Beam Accelerometers (VBA), APS flux-gate magnetometers, Honeywell Silicon-On-Insulator (SOI) high-temperature electronics, and a rugged, high-temperature-capable package and assembly process.

  11. A pluggable framework for parallel pairwise sequence search.

    PubMed

    Archuleta, Jeremy; Feng, Wu-chun; Tilevich, Eli

    2007-01-01

    The current and near future of the computing industry is one of multi-core and multi-processor technology. Most existing sequence-search tools have been designed with a focus on single-core, single-processor systems. This discrepancy between software design and hardware architecture substantially hinders sequence-search performance by not allowing full utilization of the hardware. This paper presents a novel framework that will aid the conversion of serial sequence-search tools into a parallel version that can take full advantage of the available hardware. The framework, which is based on a software architecture called mixin layers with refined roles, enables modules to be plugged into the framework with minimal effort. The inherent modular design improves maintenance and extensibility, thus opening up a plethora of opportunities for advanced algorithmic features to be developed and incorporated while routine maintenance of the codebase persists.
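    The mixin-layers idea can be sketched with Python cooperative inheritance: a base layer supplies the serial search behavior, and a plug-in layer refines search() to partition the work, which is the hook where a parallel scheduler would attach. The scoring function is a trivial placeholder, not a real alignment score:

```python
class SequenceSearch:
    """Base layer: serial scoring of a query against a database."""
    def search(self, query, database):
        return [(seq, self.score(query, seq)) for seq in database]

    def score(self, query, seq):
        # Placeholder metric: count of shared characters.
        return len(set(query) & set(seq))

class ChunkingLayer:
    """Mixin layer refining search() to run over database chunks --
    the point where worker processes or threads would be dispatched."""
    def search(self, query, database):
        chunks = [database[i::2] for i in range(2)]
        results = []
        for chunk in chunks:               # serial stand-in for workers
            results.extend(super().search(query, chunk))
        return results

class ParallelSearch(ChunkingLayer, SequenceSearch):
    """Composed tool: same interface, layered behavior."""
```

    Because each layer only refines the role it cares about, a different chunking or scheduling layer can be swapped in without touching the base search code, which is the maintainability point the abstract makes.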

  12. VIP: A knowledge-based design aid for the engineering of space systems

    NASA Technical Reports Server (NTRS)

    Lewis, Steven M.; Bellman, Kirstie L.

    1990-01-01

    The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users rapidly to develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.
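    The behavior described, solving whichever equations become solvable as information arrives, can be sketched as simple forward constraint propagation. The parameter names and sizing relations below are hypothetical examples, not VIP's actual model base:

```python
def propagate(equations, known):
    """Repeatedly solve any equation whose inputs are all known.
    equations: list of (output_name, input_names, function)."""
    known = dict(known)
    progress = True
    while progress:
        progress = False
        for out, inputs, fn in equations:
            if out not in known and all(i in known for i in inputs):
                known[out] = fn(*(known[i] for i in inputs))
                progress = True
    return known

# Hypothetical spacecraft sizing relations.
equations = [
    ("wet_mass", ("dry_mass", "propellant"), lambda d, p: d + p),
    ("accel", ("thrust", "wet_mass"), lambda t, m: t / m),
]
state = propagate(equations, {"dry_mass": 100.0, "propellant": 50.0,
                              "thrust": 3000.0})
```

    Re-running the propagation after altering one input shows every dependent quantity that changes, which is the sensitivity-browsing behavior the abstract describes.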

  13. High performance TWT development for the microwave power module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whaley, D.R.; Armstrong, C.M.; Groshart, G.

    1996-12-31

    Northrop Grumman's ongoing development of microwave power modules (MPM) provides microwave power at various power levels, frequencies, and bandwidths for a variety of applications. Present-day requirements for the vacuum power booster traveling wave tubes of the microwave power module are becoming increasingly demanding, necessitating further enhancement of tube performance. The MPM development program at Northrop Grumman is designed specifically to meet this need through construction and test of a series of new tubes aimed at verifying computation and reaching high-efficiency design goals. Tubes under test incorporate several different helix designs, as well as varying electron gun and magnetic confinement configurations. Current efforts also include further development of state-of-the-art TWT modeling and computational methods at Northrop Grumman, incorporating new, more accurate models into existing design tools and developing new tools to be used in all aspects of traveling wave tube design. The current status of the Northrop Grumman MPM TWT development program will be presented.

  14. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.

  15. AR4VI: AR as an Accessibility Tool for People with Visual Impairments

    PubMed Central

    Coughlan, James M.; Miele, Joshua

    2017-01-01

    Although AR technology has been largely dominated by visual media, a number of AR tools using both visual and auditory feedback have been developed specifically to assist people with low vision or blindness – an application domain that we term Augmented Reality for Visual Impairment (AR4VI). We describe two AR4VI tools developed at Smith-Kettlewell, as well as a number of pre-existing examples. We emphasize that AR4VI is a powerful tool with the potential to remove or significantly reduce a range of accessibility barriers. Rather than being restricted to use by people with visual impairments, AR4VI is a compelling universal design approach offering benefits for mainstream applications as well. PMID:29303163

  17. ST-analyzer: a web-based user interface for simulation trajectory analysis.

    PubMed

    Jeong, Jong Cheol; Jo, Sunhwan; Wu, Emilia L; Qi, Yifei; Monje-Galvan, Viviana; Yeom, Min Sun; Gorenstein, Lev; Chen, Feng; Klauda, Jeffery B; Im, Wonpil

    2014-05-05

    Molecular dynamics (MD) simulation has become one of the key tools to obtain deeper insights into biological systems using various levels of descriptions such as all-atom, united-atom, and coarse-grained models. Recent advances in computing resources and MD programs have significantly accelerated the simulation time and thus increased the amount of trajectory data. Although many laboratories routinely perform MD simulations, analyzing MD trajectories is still time consuming and often a difficult task. ST-analyzer, http://im.bioinformatics.ku.edu/st-analyzer, is a standalone graphical user interface (GUI) toolset to perform various trajectory analyses. ST-analyzer has several outstanding features compared to other existing analysis tools: (i) handling various formats of trajectory files from MD programs, such as CHARMM, NAMD, GROMACS, and Amber, (ii) intuitive web-based GUI environment--minimizing administrative load and reducing burdens on the user from adapting new software environments, (iii) platform independent design--working with any existing operating system, (iv) easy integration into job queuing systems--providing options of batch processing either on the cluster or in an interactive mode, and (v) providing independence between foreground GUI and background modules--making it easier to add personal modules or to recycle/integrate pre-existing scripts utilizing other analysis tools. The current ST-analyzer contains nine main analysis modules that together contain 18 options, including density profile, lipid deuterium order parameters, surface area per lipid, and membrane hydrophobic thickness. This article introduces ST-analyzer with its design, implementation, and features, and also illustrates practical analysis of lipid bilayer simulations. Copyright © 2014 Wiley Periodicals, Inc.
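    ST-analyzer's separation between the front-end GUI and the background analysis modules can be sketched with a module registry, which is what makes plugging in personal modules easy. The module name and trajectory format here are hypothetical illustrations, not ST-analyzer's actual API:

```python
ANALYSIS_MODULES = {}

def register(name):
    """Decorator: plug an analysis function into the toolset by name,
    keeping the front end independent of individual modules."""
    def wrap(fn):
        ANALYSIS_MODULES[name] = fn
        return fn
    return wrap

@register("com_z")
def center_of_mass_z(frames):
    # frames: list of frames, each a list of (x, y, z) coordinates.
    # Returns the mean z coordinate per frame (unit masses assumed).
    return [sum(p[2] for p in f) / len(f) for f in frames]

def run(name, trajectory):
    """Dispatch a registered analysis module on a loaded trajectory."""
    return ANALYSIS_MODULES[name](trajectory)
```

    A GUI or batch-queue front end only ever calls run(), so adding a new analysis (or recycling an existing script) is a matter of registering one more function.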

  18. The OME Framework for genome-scale systems biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palsson, Bernhard O.; Ebrahim, Ali; Federowicz, Steve

    The life sciences are undergoing continuous and accelerating integration with computational and engineering sciences. The biology that many in the field have been trained on may be hardly recognizable in ten to twenty years. One of the major drivers for this transformation is the blistering pace of advancements in DNA sequencing and synthesis. These advances have resulted in unprecedented amounts of new data, information, and knowledge. Many software tools have been developed to deal with aspects of this transformation and each is sorely needed [1-3]. However, few of these tools have been forced to deal with the full complexity of genome-scale models along with high-throughput genome-scale data. This particular situation represents a unique challenge, as it is simultaneously necessary to deal with the vast breadth of genome-scale models and the dizzying depth of high-throughput datasets. It has been observed time and again that as the pace of data generation continues to accelerate, the pace of analysis significantly lags behind [4]. It is also evident that, given the plethora of databases and software efforts [5-12], it is still a significant challenge to work with genome-scale metabolic models, let alone next-generation whole-cell models [13-15]. We work at the forefront of model creation and systems-scale data generation [16-18]. The OME Framework was born out of a practical need to enable genome-scale modeling and data analysis under a unified framework to drive the next generation of genome-scale biological models. Here we present the OME Framework. It exists as a set of Python classes. However, we want to emphasize the importance of the underlying design as an addition to the discussions on specifications of a digital cell. A great deal of work and valuable progress has been made by a number of communities [13, 19-24] towards interchange formats and implementations designed to achieve similar goals. 
While many software tools exist for handling genome-scale metabolic models or for genome-scale data analysis, no implementations exist that explicitly handle data and models concurrently. The OME Framework structures data in a connected loop with models and the components those models are composed of. This results in the first full, practical implementation of a framework that can enable genome-scale design-build-test. Over the coming years many more software packages will be developed and tools will necessarily change. However, we hope that the underlying designs shared here can help to inform the design of future software.

  19. The LINDSAY Virtual Human Project: An immersive Approach to Anatomy and Physiology

    ERIC Educational Resources Information Center

    Tworek, Janet K.; Jamniczky, Heather A.; Jacob, Christian; Hallgrímsson, Benedikt; Wright, Bruce

    2013-01-01

    The increasing number of digital anatomy teaching software packages challenges anatomy educators on how to best integrate these tools for teaching and learning. Realistically, there exists a complex interplay of design, implementation, politics, and learning needs in the development and integration of software for education, each of which may be…

  20. What Is An Expert System? ERIC Digest.

    ERIC Educational Resources Information Center

    Boss, Richard W.

    This digest describes and defines the various components of an expert system, e.g., a computerized tool designed to enhance the quality and availability of knowledge required by decision makers. It is noted that expert systems differ from conventional applications software in the following areas: (1) the existence of the expert systems shell, or…

  1. Think "E" for Engagement: Use Technology Tools to Design Personalized Professional E-Learning

    ERIC Educational Resources Information Center

    Farris, Shari

    2015-01-01

    As faculty chair of early childhood education at Vanguard University of Southern California, the author was challenged each day by questions: How to provide high-impact online professional learning to adult continuing education students? What barriers exist for adult learners seeking meaningful professional learning? How does practice as a…

  2. The Effects of Integrating Social Learning Environment with Online Learning

    ERIC Educational Resources Information Center

    Raspopovic, Miroslava; Cvetanovic, Svetlana; Medan, Ivana; Ljubojevic, Danijela

    2017-01-01

    The aim of this paper is to present the learning and teaching styles using the Social Learning Environment (SLE), which was developed based on the computer supported collaborative learning approach. To avoid burdening learners with multiple platforms and tools, SLE was designed and developed in order to integrate existing systems, institutional…

  3. Landscape analysis software tools

    Treesearch

    Don Vandendriesche

    2008-01-01

    Recently, several new computer programs have been developed to assist in landscape analysis. The “Sequential Processing Routine for Arraying Yields” (SPRAY) program was designed to run a group of stands with particular treatment activities to produce vegetation yield profiles for forest planning. SPRAY uses existing Forest Vegetation Simulator (FVS) software coupled...

  4. Adjoint-Based Mesh Adaptation for the Sonic Boom Signature Loudness

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.; Park, Michael A.

    2017-01-01

    The mesh adaptation functionality of FUN3D is utilized to obtain a mesh optimized to calculate sonic boom ground signature loudness. During this process, the coupling between the discrete-adjoints of the computational fluid dynamics tool FUN3D and the atmospheric propagation tool sBOOM is exploited to form the error estimate. This new mesh adaptation methodology will allow generation of suitable meshes adapted to reduce the estimated errors in the ground loudness, which is an optimization metric employed in supersonic aircraft design. This new output-based adaptation could allow new insights into meshing for sonic boom analysis and design, and complements existing output-based adaptation techniques such as adaptation to reduce estimated errors in off-body pressure functional. This effort could also have implications for other coupled multidisciplinary adjoint capabilities (e.g., aeroelasticity) as well as inclusion of propagation specific parameters such as prevailing winds or non-standard atmospheric conditions. Results are discussed in the context of existing methods and appropriate conclusions are drawn as to the efficacy and efficiency of the developed capability.

  5. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  6. Using Storyboarding Techniques to Identify Design Opportunities: When Students Employ Storyboards, They Are Better Able to Understand the Complexity of a Product's Use and Visualize Areas for Improvement

    ERIC Educational Resources Information Center

    Reeder, Kevin

    2005-01-01

    The movie industry heavily relies on storyboards as an effective way to visually describe the process of a movie. The storyboard visually describes how the movie flows from beginning to end, how the characters are interacting, and where transitions and/or gaps exist in the storyline. The storyboard is an effective tool in industrial design as…

  7. E-TALEN: a web tool to design TALENs for genome engineering.

    PubMed

    Heigwer, Florian; Kerr, Grainne; Walther, Nike; Glaeser, Kathrin; Pelz, Oliver; Breinig, Marco; Boutros, Michael

    2013-11-01

    Use of transcription activator-like effector nucleases (TALENs) is a promising new technique in the field of targeted genome engineering, editing and reverse genetics. Its applications span from introducing knockout mutations to endogenous tagging of proteins and targeted excision repair. Owing to this wide range of possible applications, there is a need for fast and user-friendly TALEN design tools. We developed E-TALEN (http://www.e-talen.org), a web-based tool to design TALENs for experiments of varying scale. E-TALEN enables the design of TALENs against a single target or a large number of target genes. We significantly extended previously published design concepts to consider genomic context and different applications. E-TALEN guides the user through an end-to-end design process of de novo TALEN pairs, which are specific to a certain sequence or genomic locus. Furthermore, E-TALEN offers a functionality to predict targeting and specificity for existing TALENs. Owing to the computational complexity of many of the steps in the design of TALENs, particular emphasis has been put on the implementation of fast yet accurate algorithms. We implemented a user-friendly interface, from the input parameters to the presentation of results. An additional feature of E-TALEN is the in-built sequence and annotation database available for many organisms, including human, mouse, zebrafish, Drosophila and Arabidopsis, which can be extended in the future.

  8. Pressure distribution under flexible polishing tools. I - Conventional aspheric optics

    NASA Astrophysics Data System (ADS)

    Mehta, Pravin K.; Hufnagel, Robert E.

    1990-10-01

    The paper presents a mathematical model, based on Kirchhoff's thin flat plate theory, developed to determine polishing pressure distribution for a flexible polishing tool. A two-layered tool in which bending and compressive stiffnesses are equal is developed, which is formulated as a plate on a linearly elastic foundation. An equivalent eigenvalue problem and solution for a free-free plate are created from the plate formulation. For aspheric, anamorphic optical surfaces, the tool misfit is derived; it is defined as the result of movement from the initial perfect fit on the optic to any other position. The Polisher Design (POD) software for circular tools on aspheric optics is introduced. NASTRAN-based finite element analysis results are compared with the POD software, showing high correlation. By employing existing free-free eigenvalues and eigenfunctions, the work may be extended to rectangular polishing tools as well.
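The abstract names Kirchhoff thin-plate theory and a linearly elastic (Winkler-type) foundation but gives no equations; the classical governing equation such a formulation would presumably start from (standard plate theory, not quoted from the paper) is:

```latex
D\,\nabla^{4} w(x,y) + k\,w(x,y) = q(x,y),
\qquad
D = \frac{E h^{3}}{12\,(1-\nu^{2})}
```

where \(w\) is the plate deflection, \(q\) the applied polishing pressure, \(k\) the foundation (compressive-layer) modulus, \(E\) Young's modulus, \(h\) the plate thickness, and \(\nu\) Poisson's ratio. The free-free eigenfunctions of the homogeneous plate problem then serve as the expansion basis mentioned in the abstract.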

  9. Design and Testing of a Tool for Evaluating the Quality of Diabetes Consumer-Information Web Sites

    PubMed Central

    Steinwachs, Donald; Rubin, Haya R

    2003-01-01

    Background Most existing tools for measuring the quality of Internet health information focus almost exclusively on structural criteria or other proxies for quality information rather than evaluating actual accuracy and comprehensiveness. Objective This research sought to develop a new performance-measurement tool for evaluating the quality of Internet health information, test the validity and reliability of the tool, and assess the variability in diabetes Web site quality. Methods An objective, systematic tool was developed to evaluate Internet diabetes information based on a quality-of-care measurement framework. The principal investigator developed an abstraction tool and trained an external reviewer on its use. The tool included 7 structural measures and 34 performance measures created by using evidence-based practice guidelines and experts' judgments of accuracy and comprehensiveness. Results Substantial variation existed in all categories, with overall scores following a normal distribution and ranging from 15% to 95% (mean was 50% and median was 51%). Lin's concordance correlation coefficient to assess agreement between raters produced a rho of 0.761 (Pearson's r of 0.769), suggesting moderate to high agreement. The average agreement between raters for the performance measures was 0.80. Conclusions Diabetes Web site quality varies widely. Alpha testing of this new tool suggests that it could become a reliable and valid method for evaluating the quality of Internet health sites. Such an instrument could help lay people distinguish between beneficial and misleading information. PMID:14713658
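Lin's concordance correlation coefficient, used above to assess inter-rater agreement, has a standard closed form: rho_c = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2). A small self-contained sketch (the example scores are invented, not the study's data):

```python
from statistics import mean

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two raters' scores."""
    mx, my = mean(x), mean(y)
    n = len(x)
    # Population (biased) variances and covariance, as in Lin's definition
    sx2 = sum((v - mx) ** 2 for v in x) / n
    sy2 = sum((v - my) ** 2 for v in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

# Perfect agreement gives 1.0; a constant offset between raters lowers it
print(lins_ccc([1, 2, 3, 4], [1, 2, 3, 4]))  # 1.0
print(lins_ccc([1, 2, 3], [2, 3, 4]))        # < 1.0 despite Pearson r = 1
```

Unlike Pearson's r, the coefficient penalizes systematic shifts between raters, which is why the paper reports both.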

  10. Focused and Steady-State Characteristics of Shaped Sonic Boom Signatures: Prediction and Analysis

    NASA Technical Reports Server (NTRS)

    Maglieri, Domenic J.; Bobbitt, Percy J.; Massey, Steven J.; Plotkin, Kenneth J.; Kandil, Osama A.; Zheng, Xudong

    2011-01-01

    The objective of this study is to examine the effect of flight, at off-design conditions, on the propagated sonic boom pressure signatures of a small "low-boom" supersonic aircraft. The amplification, or focusing, of the low magnitude "shaped" signatures produced by maneuvers such as the accelerations from transonic to supersonic speeds, climbs, turns, pull-up and pushovers is the concern. To analyze these effects, new and/or improved theoretical tools have been developed, in addition to the use of existing methodology. Several shaped signatures are considered in the application of these tools to the study of selected maneuvers and off-design conditions. The results of these applications are reported in this paper as well as the details of the new analytical tools. Finally, the magnitude of the focused boom problem for "low boom" supersonic aircraft designs has been more accurately quantified and potential "mitigations" suggested. In general, "shaped boom" signatures, designed for cruise flight, such as asymmetric and symmetric flat-top and initial-shock ramp waveforms retain their basic shape during transition flight. Complex and asymmetric and symmetric initial shock ramp waveforms provide lower magnitude focus boom levels than N-waves or asymmetric and symmetric flat-top signatures.

  11. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    NASA Astrophysics Data System (ADS)

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.

  12. Grade 1 to 6 Thai students' existing ideas about light: Across-age study

    NASA Astrophysics Data System (ADS)

    Horasirt, Yupaporn; Yuenyong, Chokchai

    2018-01-01

    This paper aimed to investigate Grade 1 to 6 Thai (6-12 years old) students' existing ideas about light, sight, vision, and sources of light. The participants included 36 Grade 1 to 6 students (6 students in each Grade) who were studying at a primary school in Khon Kaen. The study used a descriptive qualitative research design. The tools included a two-tiered test about light and open-ended questions. Students' responses were categorized to identify their existing ideas about light. Findings indicated that young students held various existing ideas about light that could be categorized into 6 different groups relating to sight, vision, and sources of light. The paper discusses these students' existing ideas as a basis for developing constructivist learning about light in the Thai context.

  13. Implementing an integrated engineering data base system: A developer's experience and the application to IPAD

    NASA Technical Reports Server (NTRS)

    Bruce, E. A.

    1980-01-01

    The software developed by the IPAD project, a new and very powerful tool for the implementation of integrated Computer Aided Design (CAD) systems in the aerospace engineering community, is discussed. The IPAD software is a tool and, as such, can be well applied or misapplied in any particular environment. The many benefits of an integrated CAD system are well documented, but there are few such systems in existence, especially in the mechanical engineering disciplines, and therefore little available experience to guide the implementor.

  14. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and offer increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. 
By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth scientists seeking to leverage the capabilities of multiple spacecraft working in support of a common goal.
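The abstract does not give TAT-C's actual cost relationships. As an illustration only, a parametric CER typically takes a power-law form in a design parameter such as mass, and per-unit estimates can be aggregated over a constellation with a learning-curve discount. All coefficients below are invented placeholders, not TAT-C values:

```python
def cer_cost(mass_kg, a=1.2, b=0.65):
    """Hypothetical power-law CER: cost (in $M) as a function of dry mass.
    The coefficients a and b are illustrative, not from any real model."""
    return a * mass_kg ** b

def constellation_cost(masses, learning=0.9):
    """Sum per-unit CER estimates, applying a simple learning-curve
    discount to each successive (identical) spacecraft."""
    return sum(cer_cost(m) * learning ** i for i, m in enumerate(masses))

# Four identical 150 kg spacecraft cost less than 4x a single unit
print(round(constellation_cost([150] * 4), 1))
```

The analogous-estimation side of the dual approach would instead look up costs of historically similar spacecraft, which is why the module aggregates both perspectives.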

  15. Ergonomic risk assessment with DesignCheck to evaluate assembly work in different phases of the vehicle development process.

    PubMed

    Winter, Gabriele; Schaub, Karlheinz G; Großmann, Kay; Laun, Gerhard; Landau, Kurt; Bruder, Ralph

    2012-01-01

    Occupational hazards exist if the design of the work situation is not in accordance with ergonomic design principles. At assembly lines, ergonomics is applied to the design of work equipment and tasks and to work organisation. Ignoring ergonomic principles in the planning and design of assembly work leads to unfavourable working postures, action forces, and material handling. Disorders of the musculoskeletal system are a common occurrence throughout Europe and are a particular challenge against the background of disabled workers. Changes in workers' capabilities have to be considered in the conception of redesigned and new assembly lines. Ergonomics is therefore becoming progressively more important in the planning and design of vehicles: the objective of ergonomic design in different stages of the vehicle development process is to achieve an optimal adaptation of the assembly work to workers. Hence the ergonomic screening tool "Design Check" (DC) was developed to identify ergonomic deficits in workplace layouts. The screening tool is based on the current ergonomic state of the art in the design of physical work and relevant EU legal requirements. It was tested within a federal German research project at selected work stations on the assembly lines at Dr.-Ing. h.c. F. Porsche AG, Stuttgart. The application of the screening tool DC has meanwhile been transferred to other parts of Porsche AG, and it has also been established as a standard ergonomic method for assessing assembly work in different phases of the vehicle development process.

  16. Predicting Minimum Control Speed on the Ground (VMCG) and Minimum Control Airspeed (VMCA) of Engine Inoperative Flight Using Aerodynamic Database and Propulsion Database Generators

    NASA Astrophysics Data System (ADS)

    Hadder, Eric Michael

    There are many computer-aided engineering tools and software used by aerospace engineers to design and predict specific parameters of an airplane. These tools help a design engineer predict and calculate parameters such as lift, drag, pitching moment, takeoff range, maximum takeoff weight, maximum flight range, and more. However, there are very limited ways to predict and calculate the minimum control speeds of an airplane in engine inoperative flight. Simple and complicated solutions exist, yet there is neither a standard technique nor consistency throughout the aerospace industry. To further complicate this subject, airplane designers have the option of using an Automatic Thrust Control System (ATCS), which directly alters the minimum control speeds of an airplane. This work addresses this issue with a tool used to predict and calculate the Minimum Control Speed on the Ground (VMCG) as well as the Minimum Control Airspeed (VMCA) of any existing or design-stage airplane. With simple line art of an airplane, a program called VORLAX is used to generate an aerodynamic database used to calculate the stability derivatives of an airplane. Using another program called Numerical Propulsion System Simulation (NPSS), a propulsion database is generated to use with the aerodynamic database to calculate both VMCG and VMCA. This tool was tested using two airplanes, the Airbus A320 and the Lockheed Martin C130J-30 Super Hercules. The A320 does not use an Automatic Thrust Control System (ATCS), whereas the C130J-30 does use an ATCS. The tool was able to properly calculate and match known values of VMCG and VMCA for both of the airplanes. The fact that this tool was able to calculate the known values of VMCG and VMCA for both airplanes means that this tool would be able to predict the VMCG and VMCA of an airplane in the preliminary stages of design.
This would allow design engineers the ability to use an Automatic Thrust Control System (ATCS) as part of the design of an airplane and still have the ability to predict the VMCG and VMCA of the airplane.
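The abstract does not detail the calculation itself. To first order, VMCA follows from a static yawing-moment balance in which full rudder deflection offsets the asymmetric-thrust moment: (1/2) rho V^2 S b Cn_dr dr = T y. A hedged sketch with invented placeholder values (not actual A320 or C-130J data):

```python
from math import sqrt

def vmca(thrust_N, engine_arm_m, rho=1.225, S_m2=122.6, span_m=34.1,
         cn_dr_per_rad=0.08, dr_max_rad=0.44):
    """First-order VMCA estimate from a static yawing-moment balance:
    rudder yawing moment at full deflection equals the asymmetric-thrust
    moment. All geometry and stability values here are illustrative
    placeholders, not data from the thesis or either airplane."""
    n_thrust = thrust_N * engine_arm_m  # engine-out yawing moment (N*m)
    return sqrt(2 * n_thrust / (rho * S_m2 * span_m * cn_dr_per_rad * dr_max_rad))

# One engine at 120 kN of thrust, 5.75 m off the centerline
print(round(vmca(120e3, 5.75), 1), "m/s")
```

A full analysis would also include bank angle, sideslip, and aileron contributions, which is why the thesis builds complete aerodynamic and propulsion databases rather than relying on a single balance equation.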

  17. Numerical framework for the modeling of electrokinetic flows

    NASA Astrophysics Data System (ADS)

    Deshpande, Manish; Ghaddar, Chahid; Gilbert, John R.; St. John, Pamela M.; Woudenberg, Timothy M.; Connell, Charles R.; Molho, Joshua; Herr, Amy; Mungal, Godfrey; Kenny, Thomas W.

    1998-09-01

    This paper presents a numerical framework for design-based analyses of electrokinetic flow in interconnects. Electrokinetic effects, which can be broadly divided into electrophoresis and electroosmosis, are of importance in providing a transport mechanism in microfluidic devices for both pumping and separation. Models for the electrokinetic effects can be derived and coupled to the fluid dynamic equations through appropriate source terms. In the design of practical microdevices, however, accurate coupling of the electrokinetic effects requires the knowledge of several material and physical parameters, such as the diffusivity and the mobility of the solute in the solvent. Additionally wall-based effects such as chemical binding sites might exist that affect the flow patterns. In this paper, we address some of these issues by describing a synergistic numerical/experimental process to extract the parameters required. Experiments were conducted to provide the numerical simulations with a mechanism to extract these parameters based on quantitative comparisons with each other. These parameters were then applied in predicting further experiments to validate the process. As part of this research, we have created NetFlow, a tool for micro-fluid analyses. The tool can be validated and applied in existing technologies by first creating test structures to extract representations of the physical phenomena in the device, and then applying them in the design analyses to predict correct behavior.
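The electroosmotic contribution described above is commonly modeled with the Helmholtz-Smoluchowski slip relation, u = -eps*zeta*E/eta, with electrophoretic drift superposed. A small illustrative sketch (the zeta potential and mobility values are assumptions for demonstration, not parameters extracted in the paper):

```python
def electroosmotic_velocity(E_V_per_m, zeta_V=-0.05,
                            eps=7.08e-10, eta=1.0e-3):
    """Helmholtz-Smoluchowski slip velocity u = -eps*zeta*E/eta.
    eps is the permittivity of water (F/m), eta its viscosity (Pa*s);
    the zeta potential value is an illustrative assumption."""
    return -eps * zeta_V * E_V_per_m / eta

def solute_velocity(E_V_per_m, mu_ep=2.0e-8):
    """Total solute velocity: electroosmotic bulk flow plus
    electrophoretic drift with an assumed mobility mu_ep (m^2/V/s)."""
    return electroosmotic_velocity(E_V_per_m) + mu_ep * E_V_per_m

# An applied field of 30 kV/m in a microchannel
print(f"{solute_velocity(3.0e4) * 1e3:.3f} mm/s")
```

Parameters like zeta and mu_ep are exactly the quantities the paper's synergistic numerical/experimental process is designed to extract before they can be fed into a full flow simulation.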

  18. A Systematic Review of Reporting Tools Applicable to Sexual and Reproductive Health Programmes: Step 1 in Developing Programme Reporting Standards

    PubMed Central

    Ali, Moazzam; Chandra-Mouli, Venkatraman; Tran, Nhan; Gülmezoglu, A. Metin

    2015-01-01

    Background Complete and accurate reporting of programme preparation, implementation and evaluation processes in the field of sexual and reproductive health (SRH) is essential to understand the impact of SRH programmes, as well as to guide their replication and scale-up. Objectives To provide an overview of existing reporting tools and identify core items used in programme reporting with a focus on programme preparation, implementation and evaluation processes. Methods A systematic review was completed for the period 2000–2014. Reporting guidelines, checklists and tools, irrespective of study design, applicable for reporting on programmes targeting SRH outcomes, were included. Two independent reviewers screened the title and abstract of all records. Full texts were assessed in duplicate, followed by data extraction on the focus, content area, year of publication, validation and description of reporting items. Data was synthesized using an iterative thematic approach, where items related to programme preparation, implementation and evaluation in each tool were extracted and aggregated into a consolidated list. Results Out of the 3,656 records screened for title and abstracts, full texts were retrieved for 182 articles, out of which 108 were excluded. Seventy-four full text articles corresponding to 45 reporting tools were retained for synthesis. The majority of tools were developed for reporting on intervention research (n = 15), randomized controlled trials (n = 8) and systematic reviews (n = 7). We identified a total of 50 reporting items, across three main domains and corresponding sub-domains: programme preparation (objective/focus, design, piloting); programme implementation (content, timing/duration/location, providers/staff, participants, delivery, implementation outcomes), and programme evaluation (process evaluation, implementation barriers/facilitators, outcome/impact evaluation). 
Conclusions Over the past decade a wide range of tools have been developed to improve the reporting of health research. Development of Programme Reporting Standards (PRS) for SRH can fill a significant gap in existing reporting tools. This systematic review is the first step in the development of such standards. In the next steps, we will draft a preliminary version of the PRS based on the aggregate list of identified items, and finalize the tool using a consensus process among experts and user-testing. PMID:26418859

  19. A Systematic Review of Reporting Tools Applicable to Sexual and Reproductive Health Programmes: Step 1 in Developing Programme Reporting Standards.

    PubMed

    Kågesten, Anna; Tunçalp, Ӧzge; Ali, Moazzam; Chandra-Mouli, Venkatraman; Tran, Nhan; Gülmezoglu, A Metin

    2015-01-01

    Complete and accurate reporting of programme preparation, implementation and evaluation processes in the field of sexual and reproductive health (SRH) is essential to understand the impact of SRH programmes, as well as to guide their replication and scale-up. To provide an overview of existing reporting tools and identify core items used in programme reporting with a focus on programme preparation, implementation and evaluation processes. A systematic review was completed for the period 2000-2014. Reporting guidelines, checklists and tools, irrespective of study design, applicable for reporting on programmes targeting SRH outcomes, were included. Two independent reviewers screened the title and abstract of all records. Full texts were assessed in duplicate, followed by data extraction on the focus, content area, year of publication, validation and description of reporting items. Data was synthesized using an iterative thematic approach, where items related to programme preparation, implementation and evaluation in each tool were extracted and aggregated into a consolidated list. Out of the 3,656 records screened for title and abstracts, full texts were retrieved for 182 articles, out of which 108 were excluded. Seventy-four full text articles corresponding to 45 reporting tools were retained for synthesis. The majority of tools were developed for reporting on intervention research (n = 15), randomized controlled trials (n = 8) and systematic reviews (n = 7). We identified a total of 50 reporting items, across three main domains and corresponding sub-domains: programme preparation (objective/focus, design, piloting); programme implementation (content, timing/duration/location, providers/staff, participants, delivery, implementation outcomes), and programme evaluation (process evaluation, implementation barriers/facilitators, outcome/impact evaluation). Over the past decade a wide range of tools have been developed to improve the reporting of health research. 
Development of Programme Reporting Standards (PRS) for SRH can fill a significant gap in existing reporting tools. This systematic review is the first step in the development of such standards. In the next steps, we will draft a preliminary version of the PRS based on the aggregate list of identified items, and finalize the tool using a consensus process among experts and user-testing.

  20. A four stage approach for ontology-based health information system design.

    PubMed

    Kuziemsky, Craig E; Lau, Francis

    2010-11-01

    To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. PathCase-SB architecture and database design

    PubMed Central

    2011-01-01

    Background Integration of metabolic pathway resources and regulatory metabolic network models, and deploying new tools on the integrated platform, can help perform more effective and more efficient systems biology research on understanding the regulation in metabolic networks. Therefore, the tasks of (a) integrating under a single database environment regulatory metabolic networks and existing models, and (b) building tools to help with modeling and analysis are desirable and intellectually challenging computational tasks. Description PathCase Systems Biology (PathCase-SB) has been built and released. The PathCase-SB database provides data and an API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate data of selected biological data sources on the web (currently, the BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions The PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889

  2. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which are reassembled from the tool building blocks into a flexible, multi-user interface set of tools.
This set of tools targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|Speedshop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting edge systems. Work done under this project at Wisconsin can be divided into two categories, new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.« less
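    The “pipeline of tool building blocks” idea above can be sketched in a few lines. This is a hypothetical illustration only: the block names (Sampler, Aggregator, Reporter) and the pipeline runner are invented for this sketch and are not part of Open|SpeedShop's actual API.

    ```python
    # Hypothetical sketch of composing performance-tool "building blocks"
    # into a pipeline; each block consumes the previous block's output.

    class Sampler:
        """Pretend data-acquisition block: passes through (rank, time) samples."""
        def run(self, data):
            return [(rank, t) for rank, t in data]

    class Aggregator:
        """Online-aggregation block: reduces per-rank samples to totals."""
        def run(self, samples):
            totals = {}
            for rank, t in samples:
                totals[rank] = totals.get(rank, 0.0) + t
            return totals

    class Reporter:
        """Presentation block: reports the slowest rank."""
        def run(self, totals):
            return max(totals, key=totals.get)

    def run_pipeline(blocks, data):
        # Thread the data through each building block in order.
        for block in blocks:
            data = block.run(data)
        return data

    raw = [(0, 1.2), (1, 3.4), (0, 0.6), (1, 0.1)]
    slowest = run_pipeline([Sampler(), Aggregator(), Reporter()], raw)
    print(slowest)  # rank 1 accumulated the largest total time
    ```

    Because every block exposes the same `run` interface, blocks can be swapped or recombined per machine and per performance question, which is the interoperability property the abstract describes.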

  3. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  4. Rig Diagnostic Tools

    NASA Technical Reports Server (NTRS)

    Soileau, Kerry M.; Baicy, John W.

    2008-01-01

    Rig Diagnostic Tools is a suite of applications designed to allow an operator to monitor the status and health of complex networked systems using a unique interface between Java applications and UNIX scripts. The suite consists of Java applications, C scripts, VxWorks applications, UNIX utilities, C programs, and configuration files. The UNIX scripts retrieve data from the system and write them to a certain set of files. The Java side monitors these files and presents the data in user-friendly formats for operators to use in making troubleshooting decisions. This design allows for rapid prototyping and expansion of higher-level displays without affecting the basic data-gathering applications. The suite is designed to be extensible, with the ability to add new system components in building block fashion without affecting existing system applications. This allows for monitoring of complex systems for which unplanned shutdown time comes at a prohibitive cost.

  5. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.

    2000-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).

  6. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).

  7. A Tool for Multiple Targeted Genome Deletions that Is Precise, Scar-Free, and Suitable for Automation.

    PubMed

    Aubrey, Wayne; Riley, Michael C; Young, Michael; King, Ross D; Oliver, Stephen G; Clare, Amanda

    2015-01-01

    Many advances in synthetic biology require the removal of a large number of genomic elements from a genome. Most existing deletion methods leave behind markers, and as there are a limited number of markers, such methods can only be applied a fixed number of times. Deletion methods that recycle markers generally are either imprecise (remove untargeted sequences), or leave scar sequences which can cause genome instability and rearrangements. No existing marker recycling method is automation-friendly. We have developed a novel openly available deletion tool that consists of: 1) a method for deleting genomic elements that can be repeatedly used without limit, is precise, scar-free, and suitable for automation; and 2) software to design the method's primers. Our tool is sequence agnostic and could be used to delete large numbers of coding sequences, promoter regions, transcription factor binding sites, terminators, etc., in a single genome. We have validated our tool on the deletion of non-essential open reading frames (ORFs) from S. cerevisiae. The tool is applicable to arbitrary genomes, and we provide primer sequences for the deletion of: 90% of the ORFs from the S. cerevisiae genome, 88% of the ORFs from the S. pombe genome, and 85% of the ORFs from the L. lactis genome.

  8. A Tool for Multiple Targeted Genome Deletions that Is Precise, Scar-Free, and Suitable for Automation

    PubMed Central

    Aubrey, Wayne; Riley, Michael C.; Young, Michael; King, Ross D.; Oliver, Stephen G.; Clare, Amanda

    2015-01-01

    Many advances in synthetic biology require the removal of a large number of genomic elements from a genome. Most existing deletion methods leave behind markers, and as there are a limited number of markers, such methods can only be applied a fixed number of times. Deletion methods that recycle markers generally are either imprecise (remove untargeted sequences), or leave scar sequences which can cause genome instability and rearrangements. No existing marker recycling method is automation-friendly. We have developed a novel openly available deletion tool that consists of: 1) a method for deleting genomic elements that can be repeatedly used without limit, is precise, scar-free, and suitable for automation; and 2) software to design the method’s primers. Our tool is sequence agnostic and could be used to delete large numbers of coding sequences, promoter regions, transcription factor binding sites, terminators, etc., in a single genome. We have validated our tool on the deletion of non-essential open reading frames (ORFs) from S. cerevisiae. The tool is applicable to arbitrary genomes, and we provide primer sequences for the deletion of: 90% of the ORFs from the S. cerevisiae genome, 88% of the ORFs from the S. pombe genome, and 85% of the ORFs from the L. lactis genome. PMID:26630677

  9. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for Dallas-Fort Worth, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and to climb to cruise altitude along the most efficient routes.

  10. Process evaluation of software using the international classification of external causes of injuries for collecting burn injury data at burn centers in the United States.

    PubMed

    Villaveces, Andrés; Peck, Michael; Faraklas, Iris; Hsu-Chang, Naiwei; Joe, Victor; Wibbenmeyer, Lucy

    2014-01-01

    Detailed information on the cause of burns is necessary to construct effective prevention programs. The International Classification of External Causes of Injury (ICECI) is a data collection tool that allows comprehensive categorization of multiple facets of injury events. The objective of this study was to conduct a process evaluation of software designed to improve the ease of use of the ICECI so as to identify key additional variables useful for understanding the occurrence of burn injuries, and compare this software with existing data-collection practices conducted for burn injuries. The authors completed a process evaluation of the implementation and ease of use of the software in six U.S. burn centers. They also collected preliminary burn injury data and compared them with existing variables reported to the American Burn Association's National Burn Repository (NBR). The authors accomplished their goals of 1) creating a data-collection tool for the ICECI, which can be linked to existing operational programs of the NBR, 2) training registrars in the use of this tool, 3) establishing quality-control mechanisms for ensuring accuracy and reliability, 4) incorporating ICECI data entry into the weekly routine of the burn registrar, and 5) demonstrating the quality differences between data collected using this tool and the NBR. Using this or similar tools with the ICECI structure or key selected variables can improve the quantity and quality of data on burn injuries in the United States and elsewhere and thus can be more useful in informing prevention strategies.

  11. A new method for designing dual foil electron beam forming systems. I. Introduction, concept of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work, existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve the overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor intensive task, as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry, using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of the system performance as a function of the foil parameters. The new method, while being computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality, as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real life design problem, as described in Part II of this work.
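    The systematic parameter scan described above can be sketched as a grid search over foil parameters with a figure of merit per configuration. Everything concrete here is invented for illustration: `simulate` is a stand-in for a real Monte Carlo transport run, and the merit function and thicknesses are arbitrary.

    ```python
    # Illustrative sketch of a systematic scan over dual-foil parameters.
    # simulate() is a placeholder for a Monte Carlo run; its formulas and the
    # merit weighting below are invented for this example.

    def simulate(t1, t2):
        # Pretend transport result: (beam-profile flatness error, beam loss).
        flatness = abs(t1 - 0.3) + abs(t2 - 0.7)   # 0 means perfectly flat
        loss = 0.5 * t1 + 0.2 * t2                 # thicker foils lose more beam
        return flatness, loss

    def figure_of_merit(t1, t2):
        flatness, loss = simulate(t1, t2)
        return flatness + 0.1 * loss               # weighted penalty; lower is better

    # Scan a grid of candidate foil thicknesses and keep the best pair.
    grid = [i / 10 for i in range(1, 10)]
    best = min(((t1, t2) for t1 in grid for t2 in grid),
               key=lambda p: figure_of_merit(*p))
    print(best)
    ```

    The real method replaces `simulate` with full Monte Carlo runs in a realistic geometry, which is why the approach is computationally intensive but requires little designer intervention.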

  12. Remote Control of Neuronal Signaling

    PubMed Central

    Rogan, Sarah C.

    2011-01-01

    A significant challenge for neuroscientists is to determine how both electrical and chemical signals affect the activity of cells and circuits and how the nervous system subsequently translates that activity into behavior. Remote, bidirectional manipulation of those signals with high spatiotemporal precision is an ideal approach to addressing that challenge. Neuroscientists have recently developed a diverse set of tools that permit such experimental manipulation with varying degrees of spatial, temporal, and directional control. These tools use light, peptides, and small molecules to primarily activate ion channels and G protein-coupled receptors (GPCRs) that in turn activate or inhibit neuronal firing. By monitoring the electrophysiological, biochemical, and behavioral effects of such activation/inhibition, researchers can better understand the links between brain activity and behavior. Here, we review the tools that are available for this type of experimentation. We describe the development of the tools and highlight exciting in vivo data. We focus primarily on designer GPCRs (receptors activated solely by synthetic ligands, designer receptors exclusively activated by designer drugs) and microbial opsins (e.g., channelrhodopsin-2, halorhodopsin, Volvox carteri channelrhodopsin) but also describe other novel techniques that use orthogonal receptors, caged ligands, allosteric modulators, and other approaches. These tools differ in the direction of their effect (activation/inhibition, hyperpolarization/depolarization), their onset and offset kinetics (milliseconds/minutes/hours), the degree of spatial resolution they afford, and their invasiveness. Although none of these tools is perfect, each has advantages and disadvantages, which we describe, and they are all still works in progress. We conclude with suggestions for improving upon the existing tools. PMID:21415127

  13. A Multi-Collaborative Ambient Assisted Living Service Description Tool

    PubMed Central

    Falcó, Jorge L.; Vaquerizo, Esteban; Artigas, José Ignacio

    2014-01-01

    Collaboration among different stakeholders is a key factor in the design of Ambient Assisted Living (AAL) environments and services. Throughout several AAL projects we have found repeated difficulties in this collaboration and have learned lessons by the experience of solving real situations. This paper highlights identified critical items for collaboration among technicians, users, company and institutional stakeholders, and proposes, as a communication tool for a project steering committee, a service description tool which presents information from the different fields in a format comprehensible to the others. It was first generated in the MonAMI project to promote understanding among different workgroups, proven useful there, and further tested later in some other smaller AAL projects. The concept of scalable service description has proven useful for understanding across different disciplines and for participatory decision making throughout the projects, adapting to singularities and to the partial successes or failures of each action. This paper introduces the tool, relates it to existing methodologies for cooperation in AAL, and describes it with an example to offer it to the AAL community. Further work on this tool will significantly improve results in user-centered design of sustainable services in AAL. PMID:24897409

  14. Podcasting in medical education: can we turn this toy into an effective learning tool?

    PubMed

    Zanussi, Lauren; Paget, Mike; Tworek, Janet; McLaughlin, Kevin

    2012-10-01

    Advances in information technology have changed how we deliver medical education, sometimes for the better, sometimes not. Technologies that were designed for purposes other than education, such as podcasting, are now frequently used in medical education. In this article, the authors discuss the pros and cons of adapting existing technologies for medical education, caution against limiting evaluation of technologies to the level of rater satisfaction, and suggest a research agenda for formally evaluating the role of existing and future technologies in medical education.

  15. SolarTherm: A flexible Modelica-based simulator for CSP systems

    NASA Astrophysics Data System (ADS)

    Scott, Paul; Alonso, Alberto de la Calle; Hinkley, James T.; Pye, John

    2017-06-01

    Annual performance simulations provide a valuable tool for analysing the viability and overall impact of different concentrating solar power (CSP) component and system designs. However, existing tools work best with conventional systems and are difficult or impossible to adapt when novel components, configurations and operating strategies are of interest. SolarTherm is a new open source simulation tool that fulfils this need for the solar community. It includes a simulation framework and a library of flexible CSP components and control strategies that can be adapted or replaced with new designs to meet the special needs of end users. This paper provides an introduction to SolarTherm and a comparison of models for an energy-based trough system and a physical tower system to those in the well-established and widely-used simulator SAM. Differences were found in some components where the inner workings of SAM are undocumented or not well understood, while the other parts show strong agreement. These results help to validate the fundamentals of SolarTherm and demonstrate that, while at an early stage of development, it is already a useful tool for performing annual simulations.
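    An annual performance simulation of the kind described above boils down to stepping a plant model through a year of weather data and accumulating output. The sketch below is purely illustrative: the flat-efficiency plant model, field area, and three-step "year" are invented, and SolarTherm itself uses Modelica component models rather than this simplification.

    ```python
    # Illustrative annual-simulation loop: hourly DNI data drives a crude
    # energy-based plant model; yearly output is the sum over all steps.
    # All numbers and the model itself are invented for this sketch.

    def plant_output_mw(dni_w_m2, field_area_m2=1e5, efficiency=0.15):
        # Field thermal power times a flat solar-to-electric conversion efficiency.
        return dni_w_m2 * field_area_m2 * efficiency / 1e6  # MW

    # Toy "year" of three hourly DNI values instead of the usual 8760.
    dni_series = [0.0, 600.0, 900.0]  # W/m^2
    annual_mwh = sum(plant_output_mw(dni) for dni in dni_series)  # 1 h per step
    print(annual_mwh)
    ```

    The value of a flexible framework is that `plant_output_mw` can be replaced by an arbitrarily detailed component model or control strategy without changing the surrounding annual loop.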

  16. Automated riverine landscape characterization: GIS-based tools for watershed-scale research, assessment, and management.

    PubMed

    Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C

    2013-09-01

    River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB®, is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.
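    The classification step described above, grouping river sections by hydrogeomorphic similarity, can be sketched with a toy example. The reach attributes, the two-variable distance, and the greedy grouping rule are all invented for illustration; RESonate extracts 13 variables inside ArcGIS and uses formal hierarchical classification.

    ```python
    # Hedged sketch: group river reaches into hydrogeomorphic zones by
    # attribute similarity. Attributes and thresholds are invented.

    reaches = {
        "A": (120.0, 0.002),   # (valley width in m, channel slope)
        "B": (115.0, 0.0025),
        "C": (15.0, 0.030),
        "D": (18.0, 0.028),
    }

    def distance(a, b):
        # Normalized Euclidean distance between reach attribute vectors.
        return (((a[0] - b[0]) / 100.0) ** 2
                + ((a[1] - b[1]) / 0.01) ** 2) ** 0.5

    # Greedy single-pass grouping: join a reach to the first group whose
    # representative is within the threshold, else start a new group.
    groups = []
    for name, attrs in reaches.items():
        for g in groups:
            if distance(attrs, reaches[g[0]]) < 1.0:
                g.append(name)
                break
        else:
            groups.append([name])
    print(groups)  # wide/flat reaches separate from narrow/steep ones
    ```

    Each resulting group plays the role of a functional process zone: a valley-scale stratum inside which reaches share hydrogeomorphic character.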

  17. Using patient engagement in the design and rationale of a trial for women with depression in obstetrics and gynecology practices.

    PubMed

    Poleshuck, Ellen; Wittink, Marsha; Crean, Hugh; Gellasch, Tara; Sandler, Mardy; Bell, Elaine; Juskiewicz, Iwona; Cerulli, Catherine

    2015-07-01

    Significant health disparities exist among socioeconomically disadvantaged women, who experience elevated rates of depression and increased risk for poor depression treatment engagement and outcomes. We aimed to use stakeholder input to develop innovative methods for a comparative effectiveness trial to address the needs of socioeconomically disadvantaged women with depression in women's health practices. Using a community advisory board, focus groups, and individual patient input, we determined the feasibility and acceptability of an electronic psychosocial screening and referral tool; developed and finalized a prioritization tool for women with depression; and piloted the prioritization tool. Two intervention approaches, enhanced screening and referral using an electronic psychosocial screening, and mentoring using the prioritization tool, were developed as intervention options for socioeconomically disadvantaged women attending women's health practices. We describe the developmental steps and the final design for the comparative effectiveness trial evaluating both intervention approaches. Stakeholder input allowed us to develop an acceptable clinical trial of two patient-centered interventions with patient-driven outcomes. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.
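    The guidance idea above, scoring samples against known performance problems to surface clusters of interest, can be sketched minimally. The sample records and the remote-access heuristic below are invented for illustration; the real tool works on hardware memory-access samples with many more attributes.

    ```python
    # Minimal sketch of scoring memory-access samples against a known
    # performance problem (long-latency remote NUMA accesses). The data
    # and the scoring heuristic are invented for this example.

    samples = [
        {"addr": 0x1000, "latency": 12,  "numa_local": True},
        {"addr": 0x2000, "latency": 340, "numa_local": False},
        {"addr": 0x2040, "latency": 310, "numa_local": False},
        {"addr": 0x3000, "latency": 15,  "numa_local": True},
    ]

    def score(sample):
        """Higher score = more suspicious: long-latency, remote accesses."""
        s = sample["latency"] / 100.0
        if not sample["numa_local"]:
            s += 1.0
        return s

    # Samples exceeding the threshold form the "cluster of interest"
    # that a visualization would highlight for the user.
    suspicious = [s for s in samples if score(s) > 1.0]
    print(len(suspicious))  # the two remote, high-latency accesses stand out
    ```

    In a visual tool, such scores would drive cues (color, ranking) across the linked views rather than a simple printed list.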

  19. CMS-2 Reverse Engineering and ENCORE/MODEL Integration

    DTIC Science & Technology

    1992-05-01

    Automated extraction of design information from an existing software system written in CMS-2 can be used to document that system as-built. The extracted information is provided by a commercially available CASE tool: information describing the software system design is automatically extracted and presented in the displays in Figures 1, 2, and 3.

  20. Engine dynamic analysis with general nonlinear finite element codes

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1991-01-01

    A general engine dynamic analysis as a standard design study computational tool is described for the prediction and understanding of complex engine dynamic behavior. Improved definition of engine dynamic response provides valuable information and insights leading to reduced maintenance and overhaul costs on existing engine configurations. Application of advanced engine dynamic simulation methods provides a considerable cost reduction in the development of new engine designs by eliminating some of the trial and error process done with engine hardware development.

  1. Understanding and Mitigating Vortex-Dominated, Tip-Leakage and End-Wall Losses in a Transonic Splittered Rotor Stage

    DTIC Science & Technology

    2015-04-23

    Given the blade geometry parameters, the TPL design tool was initiated by running the MATLAB script (*.m) Main_SpeedLine_Auto, with SolidWorks used for solid model generation of the blade shapes. With solid models of the gas-path air wedge generated, computational analysis followed. The design was constrained by the existing TCR geometry (287 mm / 11.3 in, 12 passages), and a blade tip-down design approach was used.

  2. EUV focus sensor: design and modeling

    NASA Astrophysics Data System (ADS)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3-NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80-nm, with aperture widths of 12-40-nm (1-3 wavelengths), and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  3. EUV Focus Sensor: Design and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3-NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80-nm, with aperture widths of 12-40-nm (1-3 wavelengths), and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  4. A graphical approach to radio frequency quadrupole design

    NASA Astrophysics Data System (ADS)

    Turemen, G.; Unel, G.; Yasatekin, B.

    2015-07-01

    The design of a radio frequency quadrupole, an important section of all ion accelerators, and the calculation of its beam dynamics properties can be achieved using the existing computational tools. These programs, originally designed in the 1980s, show effects of aging in their user interfaces and in their output. The authors believe there is room for improvement, both in design techniques using a graphical approach and in the amount of analytical calculation performed before resorting to CPU-burning finite element analysis techniques. Additionally, an emphasis on the graphical method of controlling the evolution of the relevant parameters using the drag-to-change paradigm is bound to be beneficial to the designer. A computer code, named DEMIRCI, has been written in C++ to demonstrate these ideas. This tool has been used in the design of the Turkish Atomic Energy Authority (TAEK)'s 1.5 MeV proton beamline at the Saraykoy Nuclear Research and Training Center (SANAEM). DEMIRCI starts with a simple analytical model, calculates the RFQ behavior, and produces 3D design files that can be fed to a milling machine. The paper discusses the experience gained during the design process of the SANAEM Project Prometheus (SPP) RFQ and underlines some of DEMIRCI's capabilities.

  5. The Critical Incident Technique: An Effective Tool for Gathering Experience from Practicing Engineers

    ERIC Educational Resources Information Center

    Hanson, James H.; Brophy, Patrick D.

    2012-01-01

    Not all knowledge and skills that educators want to pass to students exists yet in textbooks. Some still resides only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…

  6. Blended Learning in Design Education: An Analysis of Students' Experiences within the Disciplinary Differences Framework

    ERIC Educational Resources Information Center

    Pektas, Sule Tasli; Gürel, Meltem Ö.

    2014-01-01

    Blended learning has already become an indispensable part of education in many fields. However, the majority of existing research on blended learning has assumed homogeneity of disciplines. This study suggests that research highlighting disciplinary effects and differences is much needed to effectively develop courses and tools consonant with the…

  7. Designing a Qualitative Data Collection Strategy (QDCS) for Africa - Phase 1: A Gap Analysis of Existing Models, Simulations, and Tools Relating to Africa

    DTIC Science & Technology

    2012-06-01

    generalized behavioral model characterized after the fictional Seldon equations (the one elaborated upon by Isaac Asimov in the 1951 novel, The...Foundation). Asimov described the Seldon equations as essentially statistical models with historical data of a sufficient size and variability that they

  8. Evaluation of a Mobile Learning Organiser for University Students

    ERIC Educational Resources Information Center

    Corlett, Dan; Sharples, Mike; Bull, Susan; Chan, Tony

    2005-01-01

    This paper describes a 10-month trial of a mobile learning organiser, developed for use by university students. Implemented on a wireless-enabled Pocket PC hand-held computer, the organiser makes use of existing mobile applications as well as tools designed specifically for students to manage their learning. The trial set out to identify the…

  9. Evaluation of Computerised Reading-Assistance Systems for Reading Japanese Texts--From a Linguistic Point of View

    ERIC Educational Resources Information Center

    Toyoda, Etsuko

    2016-01-01

    For second-language learners to effectively and efficiently gather information from online texts in their target language, a well-designed computerised system to assist their reading is essential. While many articles and websites which introduce electronic second-language learning tools exist, evaluation of their functions in relation to the…

  10. Podcasting in Medical Education: Can We Turn This Toy into an Effective Learning Tool?

    ERIC Educational Resources Information Center

    Zanussi, Lauren; Paget, Mike; Tworek, Janet; McLaughlin, Kevin

    2012-01-01

    Advances in information technology have changed how we deliver medical education, sometimes for the better, sometimes not. Technologies that were designed for purposes other than education, such as podcasting, are now frequently used in medical education. In this article, the authors discuss the pros and cons of adapting existing technologies for…

  11. Marrying Two Existing Software Packages into an Efficient Online Tutoring Tool

    ERIC Educational Resources Information Center

    Byrne, Timothy

    2007-01-01

    Many teachers today use Learning Management Systems (LMS), several of which are open-source. Specific examples are Claroline and Moodle. However, they are not specifically designed for language learning, and hence not entirely suitable. In this article, I will compare two uses of the Claroline LMS available at Louvain-la-Neuve within the framework…

  12. The Teaching of Anthropogenic Climate Change and Earth Science via Technology-Enabled Inquiry Education

    ERIC Educational Resources Information Center

    Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark

    2016-01-01

    A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models,…

  13. EPCAL: ETS Platform for Collaborative Assessment and Learning. Research Report. ETS RR-17-49

    ERIC Educational Resources Information Center

    Hao, Jiangang; Liu, Lei; von Davier, Alina A.; Lederer, Nathan; Zapata-Rivera, Diego; Jaki, Peter; Bakkenson, Michael

    2017-01-01

    Most existing software tools for online collaboration are designed to support the collaboration itself instead of the study of collaboration with a systematic team and task management system. In this report, we identify six important features for a platform to facilitate the study of online collaboration. We then introduce the Educational Testing…

  14. Optimizing RF gun cavity geometry within an automated injector design system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofler, Alicia; Evtushenko, Pavel

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized cw superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search in parallel a large parameter space (often non-linear) and in a relatively short time identify promising regions of the space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper will describe an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization and provide examples of its application to existing RF and SRF gun designs.
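    A minimal sketch of the evolutionary-algorithm idea described above, with a toy objective standing in for the beam-dynamics evaluation of a cavity geometry (the target vector, bounds, and all parameters are hypothetical and have nothing to do with APISA's actual configuration):

```python
import random

def evaluate(geometry):
    # Toy objective standing in for a beam-dynamics simulation: squared
    # distance to a hypothetical optimal set of cavity dimensions.
    target = [1.0, 0.5, 2.0]
    return sum((g - t) ** 2 for g, t in zip(geometry, target))

def evolve(pop_size=40, generations=60, n_params=3, sigma=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 3.0) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate)
        parents = pop[: pop_size // 2]                  # truncation selection
        children = [[g + rng.gauss(0.0, sigma) for g in p] for p in parents]
        pop = parents + children                        # elitist replacement
    return min(pop, key=evaluate)

best = evolve()
print(best, evaluate(best))
```

    Because the parents survive each generation, the best candidate never regresses; in a real injector design loop, `evaluate` would invoke the (expensive) cavity and beam simulation in parallel over the population.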

  15. Disrupted rhythms and mobile ICT in a surgical department.

    PubMed

    Hasvold, Per Erlend; Scholl, Jeremiah

    2011-08-01

    This paper presents a study of mobile information and communication technology (ICT) for healthcare professionals in a surgical ward. The purpose of the study was to create a participatory design process to investigate factors that affect the acceptance of mobile ICT in a surgical ward. Observations, interviews, a participatory design process, and pilot testing of a prototype of a co-constructed application were used. Informal rhythms existed at the department that ensured people met and interacted several times throughout the day. These gatherings allowed for opportunistic encounters that were extensively used for dialogue, problem solving, coordination, and message and logistics handling. A prototype based on handheld mobile computers was introduced. The tool supported information-seeking functionality that previously required local mobility. By making the nurses more freely mobile, the tool disrupted these informal rhythms. This created dissatisfaction with the system, and led to discussion and the introduction of other arenas to solve coordination and other problems. Mobile ICT tools may break down informal communication and coordination structures. This may reduce the efficiency of the new tools, or contribute to resistance towards such systems. In some situations, however, such "disrupted rhythms" may be overcome by including additional sociotechnical mechanisms in the overall design to counteract this negative side-effect. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Object-oriented Approach to High-level Network Monitoring and Management

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    2000-01-01

    An absolute prerequisite for the management of large computer networks is the ability to measure their performance. Unless we monitor a system, we cannot hope to manage and control its performance. In this paper, we describe a network monitoring system that we are currently designing and implementing. Keeping in mind the complexity of the task and the required flexibility for future changes, we use an object-oriented design methodology. The system is built using the APIs offered by the HP OpenView system. We are investigating methods to build high-level monitoring systems that are built on top of existing monitoring tools. Due to the heterogeneous nature of the underlying systems at NASA Langley Research Center, we use an object-oriented approach for the design: first, we use UML (Unified Modeling Language) to model users' requirements; second, we identify the existing capabilities of the underlying monitoring system; third, we try to map the former with the latter.

  17. An assessment of survey measures used across key epidemiologic studies of United States Gulf War I Era Veterans

    PubMed Central

    2013-01-01

    Over the past two decades, 12 large epidemiologic studies and 2 registries have focused on U.S. veterans of the 1990–1991 Gulf War Era. We conducted a review of these studies’ research tools to identify existing gaps and overlaps of efforts to date, and to advance development of the next generation of Gulf War Era survey tools. Overall, we found that many of the studies used similar instruments. Questions regarding exposures were more similar across studies than other domains, while neurocognitive and psychological tools were the most variable. Many studies focused on self-reported survey results, with a range of validation practices. However, physical exams, biomedical assessments, and specimen storage were not common. This review suggests that while research may be able to pool data from past surveys, future surveys need to consider how their design can yield data comparable with previous surveys. Additionally, data that incorporate recent technologies in specimen and genetic analyses would greatly enhance such survey data. When combined with existing data on deployment-related exposures and post-deployment health conditions, longitudinal follow-up of existing studies within this collaborative framework could represent an important step toward improving the health of veterans. PMID:23302181

  18. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
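    As an illustration of the "statistical disproportionality data mining signal scores" mentioned above, the sketch below computes the proportional reporting ratio (PRR), a standard disproportionality measure in pharmacovigilance; the counts are hypothetical, and this is not necessarily the statistic the FDA prototype uses:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for a drug-event pair.
    a: reports (here, citations) with the drug and the event
    b: reports with the drug, without the event
    c: reports with the event, without the drug
    d: reports with neither"""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts derived from MeSH co-indexing in a literature corpus:
score = prr(a=20, b=480, c=100, d=9400)
print(f"PRR = {score:.2f}")
```

    A PRR well above 1 suggests the event is reported disproportionately often with the drug; in practice, thresholds on the score and the raw count `a` are combined before a pair is flagged as a candidate signal.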

  19. The COLA Collision Avoidance Method

    NASA Astrophysics Data System (ADS)

    Assmann, K.; Berger, J.; Grothkopp, S.

    2009-03-01

    In the following we present a collision avoidance method named COLA. The method has been designed to predict collisions of Earth-orbiting spacecraft in any orbit, including orbit changes, with other space-borne objects. The point in time of a collision and the collision probability are determined. To guarantee effective processing, the COLA method uses a modular design and is composed of several components which are either developed within this work or deduced from existing algorithms: a filtering module, the close approach determination, the collision detection, and the collision probability calculation. A software tool which implements the COLA method has been verified using various test cases built from sample missions. This software has been implemented in the C++ programming language and serves as a universal collision detection tool at LSE Space Engineering & Operations AG.
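    The close approach determination step can be illustrated with the standard linear-relative-motion approximation (a sketch of the general idea only, not the COLA implementation, which propagates full orbits):

```python
def closest_approach(r0, v):
    """Time and distance of closest approach assuming linear relative motion
    (valid only in the vicinity of the encounter).
    r0: relative position at t = 0, v: relative velocity (3-vectors)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    t_star = -dot(r0, v) / dot(v, v)          # minimises |r0 + v*t|
    r_min = [x + t_star * y for x, y in zip(r0, v)]
    return t_star, dot(r_min, r_min) ** 0.5

# Objects 10 km apart closing at 7 km/s along x, with a 1 km cross-track offset:
t, d = closest_approach([10.0, 1.0, 0.0], [-7.0, 0.0, 0.0])
print(t, d)
```

    The resulting miss distance, together with the combined position covariance of the two objects, is what feeds a collision probability calculation of the kind COLA performs.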

  20. A Multirate Control Strategy to the Slow Sensors Problem: An Interactive Simulation Tool for Controller Assisted Design

    PubMed Central

    Salt, Julián; Cuenca, Ángel; Palau, Francisco; Dormido, Sebastián

    2014-01-01

    In many control applications, the sensor technology used for the measurement of the variable to be controlled is not able to maintain a restricted sampling period. In this context, the assumption of a regular and uniform sampling pattern is questionable. Moreover, if the control action can be updated faster than the output measurement frequency in order to fulfill the proposed closed-loop behavior, the solution is usually a multirate controller. There are some known aspects to be careful of when a multirate (MR) system is designed: the proper multiplicity between input-output sampling periods, the proper controller structure, the existence of ripples, and other issues. A useful way to save time and achieve good results is to have a computer-assisted design tool, and an interactive simulation tool for MR systems seems to be the right solution. In this paper this kind of simulation application is presented. It allows an easy understanding of the performance degradation or improvement obtained when changing the multirate sampling pattern parameters. The tool was developed using Sysquake, a Matlab-like language with fast execution and powerful graphic facilities, and it can be delivered as an executable. The paper also includes a detailed explanation of MR treatment and presents the design of four different MR controllers with flexible structure that can be adapted to different schemes. The Smith predictor in these MR schemes is also explained, justified, and used when time delays appear. Finally, some interesting observations obtained using this interactive tool are included. PMID:24583971
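    The effect of a slow sensor in a fast control loop, the situation the tool above is designed to explore interactively, can be sketched with a toy dual-rate simulation (all plant and controller parameters below are hypothetical):

```python
def simulate_dual_rate(n_steps=200, n_ratio=5, a=0.95, b=0.1, k=2.0):
    """Toy dual-rate loop: the control action updates every step, but the
    sensor only refreshes every n_ratio steps (its value is held between
    samples)."""
    x = 1.0          # plant state, to be regulated to zero
    y_held = x       # last available measurement
    history = []
    for step in range(n_steps):
        if step % n_ratio == 0:          # the slow sensor fires
            y_held = x
        u = -k * y_held                  # fast controller acts on held sample
        x = a * x + b * u                # first-order discrete plant
        history.append(x)
    return history

traj = simulate_dual_rate()
print(abs(traj[-1]))   # converges for these parameters
```

    With these values the state decays between sensor updates; raising the gain `k` (e.g. to 4.0) makes the held-measurement loop unstable, which is exactly the kind of sensitivity to multirate sampling pattern parameters an interactive tool makes visible.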

  1. Lung cancer in symptomatic patients presenting in primary care: a systematic review of risk prediction tools

    PubMed Central

    Schmidt-Hansen, Mia; Berendse, Sabine; Hamilton, Willie; Baldwin, David R

    2017-01-01

    Background: Lung cancer is the leading cause of cancer deaths. Around 70% of patients first presenting to specialist care have advanced disease, at which point current treatments have little effect on survival. The issue for primary care is how to recognise patients earlier and investigate appropriately. This requires an assessment of the risk of lung cancer. Aim: The aim of this study was to systematically review the existing risk prediction tools for patients presenting in primary care with symptoms that may indicate lung cancer. Design and setting: Systematic review of primary care data. Method: Medline, PreMedline, Embase, the Cochrane Library, Web of Science, and ISI Proceedings (1980 to March 2016) were searched. The final list of included studies was agreed between two of the authors, who also appraised and summarised them. Results: Seven studies with between 1482 and 2 406 127 patients were included. The tools were all based on UK primary care data, but differed in complexity of development, number/type of variables examined/included, and outcome time frame. There were four multivariable tools with internal validation areas under the curve between 0.88 and 0.92. The tools all had a number of limitations, and none have been externally validated, or had their clinical and cost impact examined. Conclusion: There is insufficient evidence for the recommendation of any one of the available risk prediction tools. However, some multivariable tools showed promising discrimination. What is needed to guide clinical practice is both external validation of the existing tools and a comparative study, so that the best tools can be incorporated into clinical decision tools used in primary care. PMID:28483820
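    The discrimination statistic reported for the multivariable tools (area under the curve, 0.88 to 0.92) has a simple probabilistic reading: it is the chance that a randomly chosen case receives a higher risk score than a randomly chosen non-case. A sketch with hypothetical scores:

```python
def auc(case_scores, control_scores):
    """Mann-Whitney reading of the AUC: the probability that a randomly
    chosen case scores higher than a randomly chosen control
    (ties count half)."""
    wins = ties = 0
    for p in case_scores:
        for n in control_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

# Hypothetical risk scores from a prediction tool:
score = auc([0.9, 0.7, 0.6], [0.5, 0.4, 0.6, 0.2])
print(score)
```

    External validation, the gap the review identifies, amounts to recomputing this statistic (and calibration measures) on a population the tool was not developed on.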

  2. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, developed tools, and potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.

  3. Rapid Prototyping of High Performance Signal Processing Applications

    NASA Astrophysics Data System (ADS)

    Sane, Nimish

    Advances in embedded systems for digital signal processing (DSP) are enabling many scientific projects and commercial applications. At the same time, these applications are key to driving advances in many important kinds of computing platforms. In this region of high performance DSP, rapid prototyping is critical for faster time-to-market (e.g., in the wireless communications industry) or time-to-science (e.g., in radio astronomy). DSP system architectures have evolved from being based on application specific integrated circuits (ASICs) to incorporate reconfigurable off-the-shelf field programmable gate arrays (FPGAs), the latest multiprocessors such as graphics processing units (GPUs), or heterogeneous combinations of such devices. We, thus, have a vast design space to explore based on performance trade-offs, and expanded by the multitude of possibilities for target platforms. In order to allow systematic design space exploration, and develop scalable and portable prototypes, model based design tools are increasingly used in design and implementation of embedded systems. These tools allow scalable high-level representations, model based semantics for analysis and optimization, and portable implementations that can be verified at higher levels of abstractions and targeted toward multiple platforms for implementation. The designer can experiment using such tools at an early stage in the design cycle, and employ the latest hardware at later stages. In this thesis, we have focused on dataflow-based approaches for rapid DSP system prototyping. This thesis contributes to various aspects of dataflow-based design flows and tools as follows: 1. We have introduced the concept of topological patterns, which exploits commonly found repetitive patterns in DSP algorithms to allow scalable, concise, and parameterizable representations of large scale dataflow graphs in high-level languages. 
We have shown how an underlying design tool can systematically exploit a high-level application specification consisting of topological patterns in various aspects of the design flow. 2. We have formulated the core functional dataflow (CFDF) model of computation, which can be used to model a wide variety of deterministic dynamic dataflow behaviors. We have also presented key features of the CFDF model and tools based on these features. These tools provide support for heterogeneous dataflow behaviors, an intuitive and common framework for functional specification, support for functional simulation, portability from several existing dataflow models to CFDF, integrated emphasis on minimally-restricted specification of actor functionality, and support for efficient static, quasi-static, and dynamic scheduling techniques. 3. We have developed a generalized scheduling technique for CFDF graphs based on decomposition of a CFDF graph into static graphs that interact at run-time. Furthermore, we have refined this generalized scheduling technique using a new notion of "mode grouping," which better exposes the underlying static behavior. We have also developed a scheduling technique for a class of dynamic applications that generates parameterized looped schedules (PLSs), which can handle dynamic dataflow behavior without major limitations on compile-time predictability. 4. We have demonstrated the use of dataflow-based approaches for design and implementation of radio astronomy DSP systems using an application example of a tunable digital downconverter (TDD) for spectrometers. Design and implementation of this module has been an integral part of this thesis work. This thesis demonstrates a design flow that consists of a high-level software prototype, analysis, and simulation using the dataflow interchange format (DIF) tool, and integration of this design with the existing tool flow for the target implementation on an FPGA platform, called interconnect break-out board (IBOB). 
We have also explored the trade-off between low hardware cost for fixed configurations of digital downconverters and flexibility offered by TDD designs. 5. This thesis has contributed significantly to the development and release of the latest version of a graph package oriented toward models of computation (MoCGraph). Our enhancements to this package include support for tree data structures, and generalized schedule trees (GSTs), which provide a useful data structure for a wide variety of schedule representations. Our extensions to the MoCGraph package provided key support for the CFDF model, and functional simulation capabilities in the DIF package.
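    The dataflow semantics underlying these contributions can be sketched in a few lines: actors fire when enough tokens are available on their input edges. This toy simulation illustrates only the basic firing rule; the DIF and CFDF frameworks described above add dynamic modes, scheduling analysis, and code generation on top of it:

```python
from collections import deque

class Actor:
    """An actor with a fixed consumption rate; fn maps the consumed tokens
    to the list of tokens it produces."""
    def __init__(self, consume, fn):
        self.consume, self.fn = consume, fn

def run(source_tokens, actors):
    """Chain of actors connected by FIFO edges; fires until quiescent."""
    edges = [deque(source_tokens)] + [deque() for _ in actors]
    fired = True
    while fired:
        fired = False
        for i, actor in enumerate(actors):
            while len(edges[i]) >= actor.consume:      # SDF-style firing rule
                ins = [edges[i].popleft() for _ in range(actor.consume)]
                edges[i + 1].extend(actor.fn(ins))
                fired = True
    return list(edges[-1])

# Downsample-by-2 followed by a gain of 2 (a toy stand-in for the kind of
# rate-changing stages found in a digital downconverter):
downsample = Actor(2, lambda ins: [ins[0]])
scale = Actor(1, lambda ins: [2 * ins[0]])
result = run([1, 2, 3, 4, 5, 6], [downsample, scale])
print(result)   # [2, 6, 10]
```

    Because each actor's rates are fixed here, a static schedule could be computed at compile time; the CFDF model generalizes this by letting an actor's rates depend on its current mode.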

  4. Community Health Environment Scan Survey (CHESS): a novel tool that captures the impact of the built environment on lifestyle factors.

    PubMed

    Wong, Fiona; Stevens, Denise; O'Connor-Duffany, Kathleen; Siegel, Karen; Gao, Yue

    2011-03-07

    Novel efforts and accompanying tools are needed to tackle the global burden of chronic disease. This paper presents an approach to describe the environments in which people live, work, and play. The Community Health Environment Scan Survey (CHESS) is an empirical assessment tool that measures the availability and accessibility of healthy lifestyle options. CHESS reveals existing community assets as well as opportunities for change, shaping community intervention planning efforts by focusing on community-relevant opportunities to address the three key risk factors for chronic disease (i.e. unhealthy diet, physical inactivity, and tobacco use). The CHESS tool was developed following a review of existing auditing tools and in consultation with experts. It is based on the social-ecological model and is adaptable to diverse settings in developed and developing countries throughout the world. For illustrative purposes, baseline results from the Community Interventions for Health (CIH) Mexico site are used, where the CHESS tool assessed 583 food stores and 168 restaurants. Comparisons between individual-level survey data from schools and community-level CHESS data are made to demonstrate the utility of the tool in strategically guiding intervention activities. The environments where people live, work, and play are key factors in determining their diet, levels of physical activity, and tobacco use. CHESS is the first tool of its kind that systematically and simultaneously examines how built environments encourage/discourage healthy eating, physical activity, and tobacco use. CHESS can help to design community interventions to prevent chronic disease and guide healthy urban planning. © 2011 Fiona Wong et al.

  5. A requirements specification for a software design support system

    NASA Technical Reports Server (NTRS)

    Noonan, Robert E.

    1988-01-01

    Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed that instead an extensible SDSS that directly implements only minimal database and graphical facilities be constructed. In particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.

  6. Application of Reduced Order Transonic Aerodynamic Influence Coefficient Matrix for Design Optimization

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley W.

    2009-01-01

    Supporting the Aeronautics Research Mission Directorate guidelines, the National Aeronautics and Space Administration [NASA] Dryden Flight Research Center is developing a multidisciplinary design, analysis, and optimization [MDAO] tool. This tool will leverage existing tools and practices, and allow the easy integration and adoption of new state-of-the-art software. Designing today's modern aircraft at transonic speeds is a challenging task due to the computation time required for unsteady aeroelastic analysis using a Computational Fluid Dynamics [CFD] code. Design approaches in this speed regime are mainly based on manual trial and error. Because of the time required for unsteady CFD computations in the time domain, this considerably slows down the whole design process. These analyses are usually performed repeatedly to optimize the final design. As a result, there is considerable motivation to be able to perform aeroelastic calculations more quickly and inexpensively. This paper describes the development of an unsteady transonic aeroelastic design methodology for design optimization using a reduced modeling method and unsteady aerodynamic approximation. The method requires that the unsteady transonic aerodynamics be represented in the frequency or Laplace domain. A dynamically linear assumption is used for creating Aerodynamic Influence Coefficient [AIC] matrices in the transonic speed regime. Unsteady CFD computations are needed only for the important columns of an AIC matrix, which correspond to the primary modes for flutter. Order reduction techniques, such as Guyan reduction and the improved reduction system, are used to reduce the size of the problem. Transonic flutter can then be found by classic methods, such as rational function approximation, p-k, p, and root-locus. Such a methodology could be incorporated into the MDAO tool for design optimization at a reasonable computational cost.
    The proposed technique is verified using the Aerostructures Test Wing 2, which was actually designed, built, and tested at NASA Dryden Flight Research Center. The results from the full-order model and the approximate reduced-order model are analyzed and compared.
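    Guyan reduction, one of the order reduction techniques named above, statically condenses a matrix onto a set of retained ("master") degrees of freedom via K_red = K_mm - K_ms K_ss^{-1} K_sm. A minimal sketch (the example system is a toy, not the Aerostructures Test Wing model):

```python
import numpy as np

def guyan_reduce(K, masters):
    """Static (Guyan) condensation onto the master DOFs:
    K_red = K_mm - K_ms @ inv(K_ss) @ K_sm."""
    slaves = [i for i in range(K.shape[0]) if i not in masters]
    Kmm = K[np.ix_(masters, masters)]
    Kms = K[np.ix_(masters, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    Kss = K[np.ix_(slaves, slaves)]
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

# Two unit springs in series, fixed at one end; DOFs are [middle, tip].
# Condensing out the middle node must recover the series stiffness 0.5.
K = np.array([[2.0, -1.0],
              [-1.0, 1.0]])
K_red = guyan_reduce(K, masters=[1])
print(K_red)   # [[0.5]]
```

    The condensation is exact for static loads applied at the masters; for dynamic problems it is an approximation, which is why refinements such as the improved reduction system exist.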

  7. Verification of a rapid mooring and foundation design tool

    DOE PAGES

    Weller, Sam D.; Hardwick, Jon; Gomez, Steven; ...

    2018-02-15

    Marine renewable energy devices require mooring and foundation systems that are suitable in terms of device operation and are also robust and cost effective. In the initial stages of mooring and foundation development a large number of possible configuration permutations exist. Filtering of unsuitable designs is possible using information specific to the deployment site (i.e. bathymetry, environmental conditions) and device (i.e. mooring and/or foundation system role and cable connection requirements). The identification of a final solution requires detailed analysis, which includes load cases based on extreme environmental statistics following certification guidance processes. Static and/or quasi-static modelling of the mooring and/or foundation system serves as an intermediate design filtering stage enabling dynamic time-domain analysis to be focused on a small number of potential configurations. Mooring and foundation design is therefore reliant on logical decision making throughout this stage-gate process. The open-source DTOcean (Optimal Design Tools for Ocean Energy Arrays) Tool includes a mooring and foundation module, which automates the configuration selection process for fixed and floating wave and tidal energy devices. As far as the authors are aware, this is one of the first tools to be developed for the purpose of identifying potential solutions during the initial stages of marine renewable energy design. While the mooring and foundation module does not replace a full design assessment, it provides, in addition to suitable configuration solutions, assessments in terms of reliability, economics and environmental impact. This article provides insight into the solution identification approach used by the module and features the verification of both the mooring system calculations and the foundation design using commercial software. Several case studies are investigated: a floating wave energy converter and several anchoring systems.
    It is demonstrated that the mooring and foundation module is able to provide device and/or site developers with rapid mooring and foundation design solutions to appropriate design criteria.
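    The quasi-static modelling stage mentioned above often starts from the classic catenary equations for a slack mooring line. A sketch with hypothetical line properties (an illustration of the general method, not the DTOcean module's code):

```python
import math

def catenary_fairlead(H, w, x):
    """Quasi-static catenary from the touchdown point to the fairlead.
    H: horizontal tension (N), w: submerged weight per metre (N/m),
    x: horizontal distance from touchdown (m).
    Returns (fairlead height z, suspended length s, fairlead tension T)."""
    a = H / w                            # catenary parameter
    z = a * (math.cosh(x / a) - 1.0)
    s = a * math.sinh(x / a)
    T = H + w * z                        # tension grows with height
    return z, s, T

# Hypothetical chain: 100 kN horizontal tension, 800 N/m submerged weight,
# fairlead 150 m (horizontally) from the touchdown point.
z, s, T = catenary_fairlead(H=1.0e5, w=800.0, x=150.0)
print(f"z = {z:.1f} m, s = {s:.1f} m, T = {T / 1e3:.0f} kN")
```

    Sweeping such closed-form results over candidate line weights and lengths is what makes the intermediate filtering stage cheap, so that only a handful of configurations need full dynamic time-domain analysis.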

  8. Verification of a rapid mooring and foundation design tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weller, Sam D.; Hardwick, Jon; Gomez, Steven

    Marine renewable energy devices require mooring and foundation systems that are suitable in terms of device operation and are also robust and cost effective. In the initial stages of mooring and foundation development a large number of possible configuration permutations exist. Filtering of unsuitable designs is possible using information specific to the deployment site (i.e. bathymetry, environmental conditions) and device (i.e. mooring and/or foundation system role and cable connection requirements). The identification of a final solution requires detailed analysis, which includes load cases based on extreme environmental statistics following certification guidance processes. Static and/or quasi-static modelling of the mooring and/or foundation system serves as an intermediate design filtering stage enabling dynamic time-domain analysis to be focused on a small number of potential configurations. Mooring and foundation design is therefore reliant on logical decision making throughout this stage-gate process. The open-source DTOcean (Optimal Design Tools for Ocean Energy Arrays) Tool includes a mooring and foundation module, which automates the configuration selection process for fixed and floating wave and tidal energy devices. As far as the authors are aware, this is one of the first tools to be developed for the purpose of identifying potential solutions during the initial stages of marine renewable energy design. While the mooring and foundation module does not replace a full design assessment, it provides, in addition to suitable configuration solutions, assessments in terms of reliability, economics and environmental impact. This article provides insight into the solution identification approach used by the module and features the verification of both the mooring system calculations and the foundation design using commercial software. Several case studies are investigated: a floating wave energy converter and several anchoring systems.
    It is demonstrated that the mooring and foundation module is able to provide device and/or site developers with rapid mooring and foundation design solutions to appropriate design criteria.

  9. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    PubMed

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put in a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. 
Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs.
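As a hedged illustration of how several performance indices might be folded into a single overall score, the sketch below ranks each algorithm on every index and averages the ranks. This is only one plausible scheme; the actual scoring rules used by PCTFPeval are defined in the authors' BMC Systems Biology 2014 paper.

```python
# Hypothetical sketch: combine per-index performance values into an overall
# score by average rank across indices (1 = best). Not PCTFPeval's actual
# scoring formulas.

def rank_scores(values, higher_is_better=True):
    """Return 1-based ranks (1 = best) for a list of index values."""
    order = sorted(range(len(values)), key=lambda i: values[i],
                   reverse=higher_is_better)
    ranks = [0] * len(values)
    for position, idx in enumerate(order, start=1):
        ranks[idx] = position
    return ranks

def overall_score(index_table):
    """index_table: {index_name: [value per algorithm]} -> average rank per algorithm."""
    n_algos = len(next(iter(index_table.values())))
    totals = [0.0] * n_algos
    for values in index_table.values():
        for i, r in enumerate(rank_scores(values)):
            totals[i] += r
    return [t / len(index_table) for t in totals]

table = {"precision": [0.8, 0.6, 0.9], "recall": [0.5, 0.7, 0.6]}
print(overall_score(table))  # lower average rank = better overall
```

With the two illustrative indices above, the third algorithm ranks best on precision and second on recall, so it receives the lowest (best) average rank.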

  10. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast

    PubMed Central

    2015-01-01

    Background Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put in a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. Results The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. 
Conclusions Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs. PMID:26677932

  11. International standards on working postures and movements ISO 11226 and EN 1005-4.

    PubMed

    Delleman, N J; Dul, J

    2007-11-01

    Standards organizations have given considerable attention to the problem of work-related musculoskeletal disorders. The publication of international standards for evaluating working postures and movements, ISO 11226 in 2000 and EN 1005-4 in 2005, may be considered as a support for those involved in preventing and controlling these disorders. The former is a tool for evaluating existing work situations, whereas the latter is a tool for evaluation during a design/engineering process. Key publications and considerations that led to the content of the standards are presented, followed by examples of application.

  12. Custom implant design for large cranial defects.

    PubMed

    Marreiros, Filipe M M; Heuzé, Y; Verius, M; Unterhofer, C; Freysinger, W; Recheis, W

    2016-12-01

    The aim of this work was to introduce a computer-aided design (CAD) tool that enables the design of large skull defect (>100 [Formula: see text]) implants. Functional and aesthetically correct custom implants are extremely important for patients with large cranial defects. For these cases, preoperative fabrication of implants is recommended to avoid problems of donor site morbidity, sufficiency of donor material and quality. Finally, crafting the correct shape is a non-trivial task increasingly complicated by defect size. We present a CAD tool to design such implants for the neurocranium. A combination of geometric morphometrics and radial basis functions, namely thin-plate splines, allows semiautomatic implant generation. The method uses symmetry and the best fitting shape to estimate missing data directly within the radiologic volume data. In addition, this approach delivers correct implant fitting via a boundary fitting approach. This method generates a smooth implant surface, free of sharp edges, that follows the main contours of the boundary, enabling accurate implant placement in the defect site intraoperatively. The present approach is evaluated and compared to existing methods. On average, 89.29 % of missing landmarks (range 72.64-100 %) were estimated with an error less than or equal to 1 mm. In conclusion, the results show that our CAD tool can generate patient-specific implants with high accuracy.
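The symmetry step of such an approach can be illustrated in isolation: a missing landmark on the defect side is estimated by mirroring its intact counterpart across an assumed midsagittal plane. This is a minimal sketch under that assumption, not the authors' implementation, which further combines symmetry with thin-plate-spline warping and boundary fitting.

```python
# Illustrative only: estimate a missing contralateral landmark by reflecting
# its intact counterpart across a symmetry plane given by a point `origin`
# and a unit normal `n`.

def reflect_point(p, origin, n):
    """Reflect 3D point p across the plane through `origin` with unit normal `n`."""
    d = sum((pi - oi) * ni for pi, oi, ni in zip(p, origin, n))  # signed distance
    return tuple(pi - 2.0 * d * ni for pi, ni in zip(p, n))

# Plane x = 0 (normal along x): a landmark at (12, 30, 5) mm on the intact side
# predicts a missing landmark at (-12, 30, 5) mm on the defect side.
print(reflect_point((12.0, 30.0, 5.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```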

  13. Investigation of aerodynamic design issues with regions of separated flow

    NASA Technical Reports Server (NTRS)

    Gally, Tom

    1993-01-01

    Existing aerodynamic design methods have generally concentrated on the optimization of airfoil or wing shapes to produce a minimum drag while satisfying some basic constraints such as lift, pitching moment, or thickness. Since the minimization of drag almost always precludes the existence of separated flow, the evaluation and validation of these design methods for their robustness and accuracy when separated flow is present has not been aggressively pursued. However, two new applications for these design tools may be expected to include separated flow and the issues of aerodynamic design with this feature must be addressed. The first application of the aerodynamic design tools is the design of airfoils or wings to provide an optimal performance over a wide range of flight conditions (multipoint design). While the definition of 'optimal performance' in the multipoint setting is currently being hashed out, it is recognized that given a wide range of flight conditions, it will not be possible to ensure a minimum drag constraint at all conditions, and in fact some amount of separated flow (presumably small) may have to be allowed at the more demanding flight conditions. Thus a multipoint design method must be tolerant of the existence of separated flow and may include some controls upon its extent. The second application is in the design of wings with extended high speed buffet boundaries of their flight envelopes. Buffet occurs on a wing when regions of flow separation have grown to the extent that their time varying pressures induce possible destructive effects upon the wing structure or adversely affect either the aircraft controllability or passenger comfort. A conservative approach to the expansion of the buffet flight boundary is to simply expand the flight envelope of nonseparated flow under the assumption that buffet will also thus be alleviated. 
However, having the ability to design a wing with separated flow and thus to control the location, extent and severity of the separated flow regions may allow aircraft manufacturers to gain an advantage in the early design stages of an aircraft, when configuration changes are relatively inexpensive to make. The goal of the summer research at NASA Langley Research Center (LaRC) was twofold: first, to investigate a particular airfoil design problem observed under conditions of strong shock induced flow separation on the upper surface of an airfoil at transonic conditions; and second, to suggest and investigate design methodologies for the prediction (or detection) and control of flow separation. The context of both investigations was to use an existing two dimensional Navier-Stokes flow solver and the constrained direct/iterative surface curvature (CDISC) design algorithm developed at LaRC. As a lead in to the primary task, it was necessary to gain a familiarity with both the design method and the computational analysis and to perform the FORTRAN coding needed to couple them together.

  14. siMacro: A Fast and Easy Data Processing Tool for Cell-Based Genomewide siRNA Screens.

    PubMed

    Singh, Nitin Kumar; Seo, Bo Yeun; Vidyasagar, Mathukumalli; White, Michael A; Kim, Hyun Seok

    2013-03-01

    Growing numbers of studies employ cell line-based systematic short interfering RNA (siRNA) screens to study gene functions and to identify drug targets. As multiple sources of variation unique to siRNA screens exist, there is a growing demand for a computational tool that generates normalized values and standardized scores. However, only a few tools have been available so far, with limited usability. Here, we present siMacro, a fast and easy-to-use Microsoft Office Excel-based tool with a graphic user interface, designed to process single-condition or two-condition synthetic screen datasets. siMacro normalizes position and batch effects, censors outlier samples, and calculates Z-scores and robust Z-scores, with a spreadsheet output of >120,000 samples in under 1 minute.
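The two score types named in the abstract can be sketched as follows. The robust variant uses the median and the median absolute deviation (scaled by 1.4826 for consistency with the standard deviation under normality); siMacro's exact normalization details may differ.

```python
import statistics

# Sketch of standard vs. robust Z-scores. Robust Z-scores resist outlier
# wells, which matters in screens where a few strong hits would otherwise
# inflate the mean and standard deviation.

def z_scores(values):
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return [(v - mu) / sd for v in values]

def robust_z_scores(values):
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return [(v - med) / (1.4826 * mad) for v in values]

readings = [0.9, 1.0, 1.1, 1.0, 5.0]  # one outlier well
# The outlier stands out far more strongly under the robust score:
print(robust_z_scores(readings)[-1] > z_scores(readings)[-1])  # True
```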

  15. siMacro: A Fast and Easy Data Processing Tool for Cell-Based Genomewide siRNA Screens

    PubMed Central

    Singh, Nitin Kumar; Seo, Bo Yeun; Vidyasagar, Mathukumalli; White, Michael A.

    2013-01-01

    Growing numbers of studies employ cell line-based systematic short interfering RNA (siRNA) screens to study gene functions and to identify drug targets. As multiple sources of variation unique to siRNA screens exist, there is a growing demand for a computational tool that generates normalized values and standardized scores. However, only a few tools have been available so far, with limited usability. Here, we present siMacro, a fast and easy-to-use Microsoft Office Excel-based tool with a graphic user interface, designed to process single-condition or two-condition synthetic screen datasets. siMacro normalizes position and batch effects, censors outlier samples, and calculates Z-scores and robust Z-scores, with a spreadsheet output of >120,000 samples in under 1 minute. PMID:23613684

  16. A Tutorial for Analyzing Human Reaction Times: How to Filter Data, Manage Missing Values, and Choose a Statistical Model

    ERIC Educational Resources Information Center

    Lachaud, Christian Michel; Renaud, Olivier

    2011-01-01

    This tutorial for the statistical processing of reaction times collected through a repeated-measure design is addressed to researchers in psychology. It aims at making explicit some important methodological issues, at orienting researchers to the existing solutions, and at providing them with some evaluation tools for choosing the most robust and…

  17. Supporting Teaching and Learning Via the Web: Transforming Hard-Copy Linear Mindsets into Web-Flexible Creative Thinking.

    ERIC Educational Resources Information Center

    Borkowski, Ellen Yu; Henry, David; Larsen, Lida L.; Mateik, Deborah

    This paper describes a four-tiered approach to supporting University of Maryland faculty in the development of instructional materials to be delivered via the World Wide Web. The approach leverages existing equipment and staff by the design of Web posting, editing, and management tools for use on the campus-wide information server,…

  18. The Magic Bullet: A Tool for Assessing and Evaluating Learning Potential in Games

    ERIC Educational Resources Information Center

    Becker, Katrin

    2011-01-01

    This paper outlines a simple and effective model that can be used to evaluate and design educational digital games. It also facilitates the formulation of strategies for using existing games in learning contexts. The model categorizes game goals and learning objectives into one or more of four possible categories. An overview of the model is…

  19. Science Writing Heuristics Embedded in Green Chemistry: A Tool to Nurture Environmental Literacy among Pre-University Students

    ERIC Educational Resources Information Center

    Shamuganathana, Sheila; Karpudewan, Mageswary

    2017-01-01

    Existing studies report on the importance of instilling environmental literacy among students from an early stage of schooling to enable them to adopt more pro-environmental behaviors in the near future. This quasi-experimental study was designed to compare the level of environmental literacy among two groups of students: the experimental group (N…

  20. Simulation of recreational use in backcountry settings: an aid to management planning

    Treesearch

    David N. Cole

    2002-01-01

    Simulation models of recreation use patterns can be a valuable tool to managers of backcountry areas, such as wilderness areas and national parks. They can help fine-tune existing management programs, particularly in places that ration recreation use or that require the use of designated campsites. They can assist managers in evaluating the likely effects of increasing...

  1. Offshore wind farm layout optimization

    NASA Astrophysics Data System (ADS)

    Elkinton, Christopher Neil

    Offshore wind energy technology is maturing in Europe and is poised to make a significant contribution to the U.S. energy production portfolio. Building on the knowledge the wind industry has gained to date, this dissertation investigates the influences of different site conditions on offshore wind farm micrositing---the layout of individual turbines within the boundaries of a wind farm. For offshore wind farms, these conditions include, among others, the wind and wave climates, water depths, and soil conditions at the site. An analysis tool has been developed that is capable of estimating the cost of energy (COE) from offshore wind farms. For this analysis, the COE has been divided into several modeled components: major costs (e.g. turbines, electrical interconnection, maintenance, etc.), energy production, and energy losses. By treating these component models as functions of site-dependent parameters, the analysis tool can investigate the influence of these parameters on the COE. Some parameters result in simultaneous increases of both energy and cost. In these cases, the analysis tool was used to determine the value of the parameter that yielded the lowest COE and, thus, the best balance of cost and energy. The models have been validated and generally compare favorably with existing offshore wind farm data. The analysis technique was then paired with optimization algorithms to form a tool with which to design offshore wind farm layouts for which the COE was minimized. Greedy heuristic and genetic optimization algorithms have been tuned and implemented. The use of these two algorithms in series has been shown to produce the best, most consistent solutions. The influences of site conditions on the COE have been studied further by applying the analysis and optimization tools to the initial design of a small offshore wind farm near the town of Hull, Massachusetts. 
The results of an initial full-site analysis and optimization were used to constrain the boundaries of the farm. A more thorough optimization highlighted the features of the area that would result in a minimized COE. The results showed reasonable layout designs and COE estimates that are consistent with existing offshore wind farms.
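The greedy stage of such a layout optimization can be sketched with a toy model: turbines are placed one at a time on a grid, each time choosing the cell that minimizes a stand-in cost-of-energy (COE) metric. The cost and wake-penalty formulas below are invented placeholders, not the dissertation's component models.

```python
import itertools

def energy(layout):
    """Ideal energy minus a crude proximity penalty standing in for wake losses."""
    base = 100.0 * len(layout)
    penalty = sum(25.0 / (1 + (a[0] - b[0])**2 + (a[1] - b[1])**2)
                  for a, b in itertools.combinations(layout, 2))
    return base - penalty

def coe(layout):
    cost = 50.0 + 30.0 * len(layout)   # fixed + per-turbine cost (arbitrary units)
    return cost / energy(layout)

def greedy_layout(grid, n_turbines):
    """Place turbines one at a time, each at the cell minimizing COE so far."""
    layout = []
    for _ in range(n_turbines):
        free = [c for c in grid if c not in layout]
        layout.append(min(free, key=lambda c: coe(layout + [c])))
    return layout

grid = [(x, y) for x in range(4) for y in range(4)]
best = greedy_layout(grid, 3)
print(best, round(coe(best), 4))
```

As expected, the greedy rule spreads the turbines apart, yielding a lower COE than a naive tightly packed layout such as three adjacent cells.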

  2. Use of Semantic Technology to Create Curated Data Albums

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Li, Xiang; Sainju, Roshan; Bakare, Rohan; Basyal, Sabin

    2014-01-01

    One of the continuing challenges in any Earth science investigation is the discovery and access of useful science content from the increasingly large volumes of Earth science data and related information available online. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. Those who know exactly the data sets they need can obtain the specific files using these systems. However, in cases where researchers are interested in studying an event of research interest, they must manually assemble a variety of relevant data sets by searching the different distributed data systems. Consequently, there is a need to design and build specialized search and discovery tools in Earth science that can filter through large volumes of distributed online data and information and only aggregate the relevant resources needed to support climatology and case studies. This paper presents a specialized search and discovery tool that automatically creates curated Data Albums. The tool was designed to enable key elements of the search process such as dynamic interaction and sense-making. The tool supports dynamic interaction via different modes of interactivity and visual presentation of information. The compilation of information and data into a Data Album is analogous to a shoebox within the sense-making framework. This tool automates most of the tedious information/data gathering tasks for researchers. Data curation by the tool is achieved via an ontology-based, relevancy ranking algorithm that filters out nonrelevant information and data. The curation enables better search results as compared to the simple keyword searches provided by existing data systems in Earth science.
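A minimal sketch of term-weighted relevancy ranking in the spirit described above, assuming invented event terms and weights rather than the tool's actual ontology: resources mentioning terms tied to the event type score higher, and low-scoring resources are filtered out.

```python
# Hypothetical term weights for a hurricane case study (illustrative only).
HURRICANE_TERMS = {"precipitation": 3.0, "sea surface temperature": 2.5,
                   "wind speed": 2.0, "aerosol": 0.5}

def relevancy(text, term_weights):
    """Sum the weights of all ontology terms found in the resource text."""
    text = text.lower()
    return sum(w for term, w in term_weights.items() if term in text)

def curate(resources, term_weights, threshold=2.0):
    """Keep resources above the relevancy threshold, best first."""
    scored = [(relevancy(r, term_weights), r) for r in resources]
    return [r for score, r in sorted(scored, reverse=True) if score >= threshold]

docs = ["Daily precipitation and wind speed fields for the storm",
        "Stratospheric aerosol climatology"]
print(curate(docs, HURRICANE_TERMS))  # the aerosol climatology is filtered out
```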

  3. Use of Semantic Technology to Create Curated Data Albums

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Li, Xiang; Sainju, Roshan; Bakare, Rohan; Basyal, Sabin; Fox, Peter (Editor); Norack, Tom (Editor)

    2014-01-01

    One of the continuing challenges in any Earth science investigation is the discovery and access of useful science content from the increasingly large volumes of Earth science data and related information available online. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. Those who know exactly the data sets they need can obtain the specific files using these systems. However, in cases where researchers are interested in studying an event of research interest, they must manually assemble a variety of relevant data sets by searching the different distributed data systems. Consequently, there is a need to design and build specialized search and discovery tools in Earth science that can filter through large volumes of distributed online data and information and only aggregate the relevant resources needed to support climatology and case studies. This paper presents a specialized search and discovery tool that automatically creates curated Data Albums. The tool was designed to enable key elements of the search process such as dynamic interaction and sense-making. The tool supports dynamic interaction via different modes of interactivity and visual presentation of information. The compilation of information and data into a Data Album is analogous to a shoebox within the sense-making framework. This tool automates most of the tedious information/data gathering tasks for researchers. Data curation by the tool is achieved via an ontology-based, relevancy ranking algorithm that filters out non-relevant information and data. The curation enables better search results as compared to the simple keyword searches provided by existing data systems in Earth science.

  4. Automatic Design of Digital Synthetic Gene Circuits

    PubMed Central

    Marchisio, Mario A.; Stelling, Jörg

    2011-01-01

    De novo computational design of synthetic gene circuits that achieve well-defined target functions is a hard task. Existing, brute-force approaches run optimization algorithms on the structure and on the kinetic parameter values of the network. However, more direct rational methods for automatic circuit design are lacking. Focusing on digital synthetic gene circuits, we developed a methodology and a corresponding tool for in silico automatic design. For a given truth table that specifies a circuit's input–output relations, our algorithm generates and ranks several possible circuit schemes without the need for any optimization. Logic behavior is reproduced by the action of regulatory factors and chemicals on the promoters and on the ribosome binding sites of biological Boolean gates. Simulations of circuits with up to four inputs show a faithful and unequivocal truth table representation, even under parametric perturbations and stochastic noise. A comparison with already implemented circuits, in addition, reveals the potential for simpler designs with the same function. Therefore, we expect the method to help both in devising new circuits and in simplifying existing solutions. PMID:21399700
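As a software analogy for generating logic from a user-supplied truth table, the sketch below derives a sum-of-products Boolean expression; the tool's actual mapping of such logic onto promoters and ribosome binding sites of biological Boolean gates is not reproduced here.

```python
# Illustrative step only: from a truth table to a sum-of-products expression.

def sum_of_products(names, table):
    """table: {input tuple -> output bit}; returns a SOP expression string."""
    terms = []
    for inputs, output in sorted(table.items()):
        if output:
            literals = [n if bit else f"NOT {n}" for n, bit in zip(names, inputs)]
            terms.append("(" + " AND ".join(literals) + ")")
    return " OR ".join(terms) if terms else "0"

# A two-input XOR circuit specification:
xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print(sum_of_products(["A", "B"], xor_table))
# (NOT A AND B) OR (A AND NOT B)
```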

  5. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.
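The core of explicit-state model checking, which tools like JPF perform over actual program states, can be illustrated with a toy reachability check over a hand-written state graph: breadth-first exploration reports whether any "error" state is reachable from the initial state.

```python
from collections import deque

# Toy explicit-state reachability check. Real model checkers enumerate program
# states (thread interleavings, heap contents); here the state graph is given
# explicitly as a dictionary.

def reachable_error(initial, successors, is_error):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if is_error(state):
            return True
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# States are (t1_pc, t2_pc) pairs for two tiny threads; (1, 1) is the
# forbidden state we want to prove unreachable (here it is reachable).
succ = {(0, 0): [(1, 0), (0, 1)], (1, 0): [(1, 1)], (0, 1): [(1, 1)], (1, 1): []}
print(reachable_error((0, 0), lambda s: succ[s], lambda s: s == (1, 1)))  # True
```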

  6. Middle Mississippi River decision support system: user's manual

    USGS Publications Warehouse

    Rohweder, Jason J.; Zigler, Steven J.; Fox, Timothy J.; Hulse, Steven N.

    2005-01-01

    This user's manual describes the Middle Mississippi River Decision Support System (MMRDSS) and gives detailed examples on its use. The MMRDSS provides a framework to assist decision makers regarding natural resource issues in the Middle Mississippi River floodplain. The MMRDSS is designed to provide users with a spatially explicit tool for tasks, such as inventorying existing knowledge, developing models to investigate the potential effects of management decisions, generating hypotheses to advance scientific understanding, and developing scientifically defensible studies and monitoring. The MMRDSS also includes advanced tools to assist users in evaluating differences in complexity, connectivity, and structure of aquatic habitats among river reaches. The Environmental Systems Research Institute ArcView 3.x platform was used to create and package the data and tools of the MMRDSS.

  7. Developing tools and resources for the biomedical domain of the Greek language.

    PubMed

    Vagelatos, Aristides; Mantzari, Elena; Pantazara, Mavina; Tsalidis, Christos; Kalamara, Chryssoula

    2011-06-01

    This paper presents the design and implementation of terminological and specialized textual resources that were produced in the framework of the Greek research project "IATROLEXI". The aim of the project was to create the critical infrastructure for the Greek language, i.e. linguistic resources and tools for use in high level Natural Language Processing (NLP) applications in the domain of biomedicine. The project was built upon existing resources developed by the project partners and further enhanced within its framework, i.e. a Greek morphological lexicon of about 100,000 words, and language processing tools such as a lemmatiser and a morphosyntactic tagger. Additionally, it developed new assets, such as a specialized corpus of biomedical texts and an ontology of medical terminology.

  8. Infrastructure for the design and fabrication of MEMS for RF/microwave and millimeter wave applications

    NASA Astrophysics Data System (ADS)

    Nerguizian, Vahe; Rafaf, Mustapha

    2004-08-01

    This article describes and provides valuable information for companies and universities with strategies to start fabricating MEMS for RF/Microwave and millimeter wave applications. The present work shows the infrastructure developed for RF/Microwave and millimeter wave MEMS platforms, which helps the identification, evaluation and selection of design tools and fabrication foundries taking into account packaging and testing. The selected and implemented simple infrastructure models, based on surface and bulk micromachining, yield inexpensive and innovative approaches for distributed choices of MEMS operating tools. To suit the needs of different educational or industrial institutions, these models may be modified for specific resource changes using a carefully analyzed iteration process. The inputs of the project are evaluation selection criteria and information sources such as financial, technical, availability, accessibility, simplicity, versatility and practical considerations. The outputs of the project are the selection of different MEMS design tools or software (solid modeling, electrostatic/electromagnetic and others, compatible with existing standard RF/Microwave design tools) and different MEMS manufacturing foundries. Typical RF/Microwave and millimeter wave MEMS solutions are introduced on the platform during the evaluation and development phases of the project for the validation of realistic results and operational decision making choices. The encountered challenges during the investigation and the development steps are identified and the dynamic behavior of the infrastructure is emphasized. The inputs (resources) and the outputs (demonstrated solutions) are presented in tables and flow chart mode diagrams.
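The criteria-driven selection the article describes can be sketched as a weighted decision matrix; the criteria, weights, and candidate ratings below are illustrative assumptions, not the authors' actual values.

```python
# Illustrative weighted decision matrix for choosing among design tools or
# foundries. Criteria echo those named in the abstract; weights are invented.
CRITERIA = {"financial": 0.3, "technical": 0.3, "availability": 0.2, "simplicity": 0.2}

def weighted_score(ratings, weights):
    """ratings: {criterion: 0-10 rating}; returns the weighted sum."""
    return sum(weights[c] * ratings[c] for c in weights)

candidates = {
    "foundry_A": {"financial": 8, "technical": 6, "availability": 9, "simplicity": 7},
    "foundry_B": {"financial": 5, "technical": 9, "availability": 6, "simplicity": 6},
}
best = max(candidates, key=lambda name: weighted_score(candidates[name], CRITERIA))
print(best)
```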

  9. DeviceEditor visual biological CAD canvas

    PubMed Central

    2012-01-01

    Background Biological Computer Aided Design (bioCAD) assists the de novo design and selection of existing genetic components to achieve a desired biological activity, as part of an integrated design-build-test cycle. To meet the emerging needs of Synthetic Biology, bioCAD tools must address the increasing prevalence of combinatorial library design, design rule specification, and scar-less multi-part DNA assembly. Results We report the development and deployment of web-based bioCAD software, DeviceEditor, which provides a graphical design environment that mimics the intuitive visual whiteboard design process practiced in biological laboratories. The key innovations of DeviceEditor include visual combinatorial library design, direct integration with scar-less multi-part DNA assembly design automation, and a graphical user interface for the creation and modification of design specification rules. We demonstrate how biological designs are rendered on the DeviceEditor canvas, and we present effective visualizations of genetic component ordering and combinatorial variations within complex designs. Conclusions DeviceEditor liberates researchers from DNA base-pair manipulation, and enables users to create successful prototypes using standardized, functional, and visual abstractions. Open and documented software interfaces support further integration of DeviceEditor with other bioCAD tools and software platforms. DeviceEditor saves researcher time and institutional resources through correct-by-construction design, the automation of tedious tasks, design reuse, and the minimization of DNA assembly costs. PMID:22373390

  10. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    NASA Astrophysics Data System (ADS)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well known barriers exist to the adoption of an environmental approach in the product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both the quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (customer's needs and requirements fulfilment) and the identification of the key environmental aspects in the product's life cycle. A simplified check list allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  11. A knowledge-based design framework for airplane conceptual and preliminary design

    NASA Astrophysics Data System (ADS)

    Anemaat, Wilhelmus A. J.

The goal of the work described herein is to develop the second generation of Advanced Aircraft Analysis (AAA) into an object-oriented structure that can be used in different environments. One such environment is the third generation of AAA with its own user interface; the other environment, with the same AAA methods (i.e., the knowledge), is the AAA-AML program. AAA-AML automates the initial airplane design process using current AAA methods in combination with AMRaven methodologies for dependency tracking and knowledge management, using the TechnoSoft Adaptive Modeling Language (AML). This leads to the following benefits: (1) Reduced design time: computer-aided design methods can reduce design and development time and replace tedious hand calculations. (2) A better product through improved design: more alternative designs can be evaluated in the same time span, which can lead to improved quality. (3) Reduced design cost: less training and fewer calculation errors yield substantial savings in design time and related cost. (4) Improved efficiency: the design engineer can avoid technically correct but irrelevant calculations on incomplete or out-of-sync information, particularly if the process enables robust geometry earlier. Although numerous advancements in knowledge-based design have been developed for detailed design, no comparable integrated knowledge-based conceptual and preliminary airplane design system currently exists. The third-generation AAA methods have been tested over a ten-year period on many different airplane designs, and using AAA methods has demonstrated significant time savings. The AAA-AML system will be exercised and tested using 27 existing airplanes ranging from single-engine propeller aircraft, business jets, airliners, and UAVs to fighters. Data from the various sizing methods will be compared with AAA results to validate these methods.
One new design, a Light Sport Aircraft (LSA), will be developed as an exercise in using the tool to design a new airplane. Using these tools shows an improvement in efficiency over using separate programs, owing to the automatic recalculation with any change of input data. The direct visual feedback of 3D geometry in AAA-AML leads to quicker resolution of problems than conventional methods.

  12. An Overview of Tools for Creating, Validating and Using PDS Metadata

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hardman, S. H.; Padams, J.; Mafi, J. N.; Cecconi, B.

    2017-12-01

NASA's Planetary Data System (PDS) has defined information models for creating metadata that describe bundles, collections and products for all the assets acquired by planetary science projects. Version 3 of the PDS Information Model (commonly known as "PDS3") is widely adopted and describes most of the existing planetary archive. Recently, PDS released version 4 of the Information Model (commonly known as "PDS4"), which is designed to improve the consistency, efficiency and discoverability of information. To aid in creating, validating and using PDS4 metadata, the PDS and a few associated groups have developed a variety of tools. In addition, some third-party tools, both free and commercial, can be used to create and work with PDS4 metadata. We present an overview of these tools, describe those currently under development and provide guidance as to which tools may be most useful for missions, instrument teams and individual researchers.
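As a rough illustration of the kind of check metadata tooling performs, the snippet below parses a made-up, heavily simplified XML fragment and verifies that two identification fields are present. Real PDS4 labels are validated against the PDS4 XML schemas and Schematron rules by dedicated tools; this sketch checks only well-formedness and field presence, and the label content here is invented, not a conforming PDS4 label.

```python
import xml.etree.ElementTree as ET

# Invented, simplified fragment -- not a conforming PDS4 label.
label = """<Product_Observational>
  <Identification_Area>
    <logical_identifier>urn:nasa:pds:example:data:img_001</logical_identifier>
    <version_id>1.0</version_id>
  </Identification_Area>
</Product_Observational>"""

# ET.fromstring raises ParseError if the label is not well-formed XML.
root = ET.fromstring(label)

# Presence check for two required-style identification fields.
lid = root.findtext("Identification_Area/logical_identifier")
vid = root.findtext("Identification_Area/version_id")
print(lid is not None and vid is not None)  # True
```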

  13. Yucca Mountain licensing support network archive assistant.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunlavy, Daniel M.; Bauer, Travis L.; Verzi, Stephen J.

    2008-03-01

This report describes the Licensing Support Network (LSN) Assistant--a set of tools for categorizing e-mail messages and documents, and for investigating and correcting existing archives of categorized e-mail messages and documents. The two main tools in the LSN Assistant are the LSN Archive Assistant (LSNAA) tool, for recategorizing manually labeled e-mail messages and documents, and the LSN Realtime Assistant (LSNRA) tool, for categorizing new e-mail messages and documents. This report focuses on the LSNAA tool, which has two main components. The first is the Sandia Categorization Framework, which is responsible for providing categorizations for documents in an archive and storing them in an appropriate Categorization Database. The second is the user interface, which primarily interacts with the Categorization Database and provides a way to find and correct categorization errors in the database. A procedure for applying the LSNAA tool and an example use case, in which the tool is applied to a set of e-mail messages, are provided. Performance results of the categorization model designed for this example use case are presented.
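As a generic illustration of the kind of statistical text categorization such a framework might perform (the report does not specify the Sandia Categorization Framework's internals, and the model below is emphatically not it), here is a minimal multinomial naive Bayes classifier with Laplace smoothing; the training documents and labels are invented:

```python
import math
from collections import Counter, defaultdict

# Minimal multinomial naive Bayes text categorizer -- a generic sketch,
# not the Sandia Categorization Framework. Documents/labels are invented.

def train(labeled_docs):
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in labeled_docs:
        class_counts[label] += 1
        for w in text.lower().split():
            word_counts[label][w] += 1
            vocab.add(w)
    return class_counts, word_counts, vocab

def classify(text, model):
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best_label, best_lp = None, float("-inf")
    for label in class_counts:
        lp = math.log(class_counts[label] / total)          # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():
            # Laplace (add-one) smoothing for unseen words
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best_label, best_lp = label, lp
    return best_label

model = train([
    ("budget meeting schedule", "administrative"),
    ("waste repository geology report", "technical"),
])
print(classify("geology of the repository site", model))  # technical
```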

  14. indCAPS: A tool for designing screening primers for CRISPR/Cas9 mutagenesis events.

    PubMed

    Hodgens, Charles; Nimchuk, Zachary L; Kieber, Joseph J

    2017-01-01

    Genetic manipulation of organisms using CRISPR/Cas9 technology generally produces small insertions/deletions (indels) that can be difficult to detect. Here, we describe a technique to easily and rapidly identify such indels. Sequence-identified mutations that alter a restriction enzyme recognition site can be readily distinguished from wild-type alleles using a cleaved amplified polymorphic sequence (CAPS) technique. If a restriction site is created or altered by the mutation such that only one allele contains the restriction site, a polymerase chain reaction (PCR) followed by a restriction digest can be used to distinguish the two alleles. However, in the case of most CRISPR-induced alleles, no such restriction sites are present in the target sequences. In this case, a derived CAPS (dCAPS) approach can be used in which mismatches are purposefully introduced in the oligonucleotide primers to create a restriction site in one, but not both, of the amplified templates. Web-based tools exist to aid dCAPS primer design, but when supplied sequences that include indels, the current tools often fail to suggest appropriate primers. Here, we report the development of a Python-based, species-agnostic web tool, called indCAPS, suitable for the design of PCR primers used in dCAPS assays that is compatible with indels. This tool should have wide utility for screening editing events following CRISPR/Cas9 mutagenesis as well as for identifying specific editing events in a pool of CRISPR-mediated mutagenesis events. This tool was field-tested in a CRISPR mutagenesis experiment targeting a cytokinin receptor (AHK3) in Arabidopsis thaliana. The tool suggested primers that successfully distinguished between wild-type and edited alleles of a target locus and facilitated the isolation of two novel ahk3 null alleles. Users can access indCAPS and design PCR primers to employ dCAPS to identify CRISPR/Cas9 alleles at http://indcaps.kieber.cloudapps.unc.edu/.
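The underlying CAPS test is simple to state in code: a PCR-plus-digest screen distinguishes two alleles only when the enzyme's recognition site occurs in exactly one amplicon. The sketch below illustrates that predicate only; the EcoRI recognition sequence is real, but the allele sequences are invented, and this is not the indCAPS primer-design algorithm.

```python
# EcoRI's recognition site (GAATTC) is real; the allele sequences are
# invented. This shows the CAPS screening predicate only, not indCAPS.

def caps_distinguishes(site, wt_seq, mut_seq):
    """True if the restriction site occurs in exactly one allele, so a
    digest of the PCR products yields different band patterns."""
    return (site in wt_seq) != (site in mut_seq)

ECORI = "GAATTC"
wt = "ATCGGAATTCGGAT"    # site intact: digest cuts this amplicon
mut = "ATCGGATTCGGAT"    # 1-bp deletion destroys the site: no cut
print(caps_distinguishes(ECORI, wt, mut))  # True
```

When no natural site differs between alleles, the dCAPS trick is to design a primer with deliberate mismatches so that a site is created in one amplified template but not the other, which is the case indCAPS automates for indels.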

  15. Tools for assessing risk of reporting biases in studies and syntheses of studies: a systematic review

    PubMed Central

    Page, Matthew J; McKenzie, Joanne E; Higgins, Julian P T

    2018-01-01

    Background Several scales, checklists and domain-based tools for assessing risk of reporting biases exist, but it is unclear how much they vary in content and guidance. We conducted a systematic review of the content and measurement properties of such tools. Methods We searched for potentially relevant articles in Ovid MEDLINE, Ovid Embase, Ovid PsycINFO and Google Scholar from inception to February 2017. One author screened all titles, abstracts and full text articles, and collected data on tool characteristics. Results We identified 18 tools that include an assessment of the risk of reporting bias. Tools varied in regard to the type of reporting bias assessed (eg, bias due to selective publication, bias due to selective non-reporting), and the level of assessment (eg, for the study as a whole, a particular result within a study or a particular synthesis of studies). Various criteria are used across tools to designate a synthesis as being at ‘high’ risk of bias due to selective publication (eg, evidence of funnel plot asymmetry, use of non-comprehensive searches). However, the relative weight assigned to each criterion in the overall judgement is unclear for most of these tools. Tools for assessing risk of bias due to selective non-reporting guide users to assess a study, or an outcome within a study, as ‘high’ risk of bias if no results are reported for an outcome. However, assessing the corresponding risk of bias in a synthesis that is missing the non-reported outcomes is outside the scope of most of these tools. Inter-rater agreement estimates were available for five tools. Conclusion There are several limitations of existing tools for assessing risk of reporting biases, in terms of their scope, guidance for reaching risk of bias judgements and measurement properties. Development and evaluation of a new, comprehensive tool could help overcome present limitations. PMID:29540417

  16. Science Opportunity Analyzer (SOA): Science Planning Made Simple

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Polanskey, Carol A.

    2004-01-01

For the first time at JPL, the Cassini mission to Saturn is using distributed science operations for developing its experiments. Remote scientists needed the ability to: (a) identify observation opportunities; (b) create accurate, detailed designs for their observations; (c) verify that their designs meet their objectives; (d) check their observations against project flight rules and constraints; and (e) communicate their observations to other scientists. Many existing tools provide one or more of these functions, but the Science Opportunity Analyzer (SOA) was built to unify these tasks in a single application. SOA is accurate: it utilizes the JPL Navigation and Ancillary Information Facility (NAIF) SPICE software toolkit, which provides high-fidelity modeling and facilitates rapid adaptation to other flight projects. It is portable, available on Unix, Windows and Linux, and adaptable: designed as a multi-mission tool, implemented in Java, Java 3D and other innovative technologies, it can be readily adapted to other flight projects. In conclusion, SOA is easy to use, requiring only six simple steps, and its ability to show the same accurate information in multiple ways (multiple visualization formats, data plots, listings and file output) is essential to meeting the needs of a diverse, distributed science operations environment.

  17. An Interactive Preliminary Design System of High Speed Forebody and Inlet Flows

    NASA Technical Reports Server (NTRS)

    Liou, May-Fun; Benson, Thomas J.; Trefny, Charles J.

    2010-01-01

This paper demonstrates a simulation-based aerodynamic design process for a high-speed inlet. A genetic algorithm is integrated into the design process to facilitate single-objective optimization. The objective function is the total pressure recovery and is obtained using a PNS solver, chosen for its computational efficiency. The system developed uses existing software for geometry definition, mesh generation and CFD analysis. The process, which produces increasingly desirable designs in each genetic evolution over many generations, is carried out automatically. A generic two-dimensional inlet is created as a showcase to demonstrate the capabilities of this tool. A parameterized study of the geometric shape and size of the showcase is also presented.
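The GA-driven loop described above can be caricatured in a few lines: score candidate geometries, keep the fitter half, and breed perturbed children. In the sketch below the "geometry" is just two ramp angles and the objective is an invented stand-in for the solver-computed total pressure recovery, so this is a toy of the optimization loop only, not the paper's system.

```python
import random

def objective(angles):
    # Invented stand-in for total pressure recovery; peaks at (8, 12).
    a1, a2 = angles
    return -(a1 - 8.0) ** 2 - (a2 - 12.0) ** 2

def evolve(pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0.0, 20.0), rng.uniform(0.0, 20.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective, reverse=True)   # fittest first
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            p, q = rng.sample(parents, 2)       # crossover: average + noise
            children.append(tuple(
                min(20.0, max(0.0, 0.5 * (x + y) + rng.gauss(0.0, 0.5)))
                for x, y in zip(p, q)
            ))
        pop = parents + children
    return max(pop, key=objective)

best = evolve()
print(best)  # converges near (8.0, 12.0)
```

In the real process each objective evaluation is a CFD run, which is why the paper stresses the computing efficiency of the flow solver.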

  18. Genetic Constructor: An Online DNA Design Platform.

    PubMed

    Bates, Maxwell; Lachoff, Joe; Meech, Duncan; Zulkower, Valentin; Moisy, Anaïs; Luo, Yisha; Tekotte, Hille; Franziska Scheitz, Cornelia Johanna; Khilari, Rupal; Mazzoldi, Florencio; Chandran, Deepak; Groban, Eli

    2017-12-15

    Genetic Constructor is a cloud Computer Aided Design (CAD) application developed to support synthetic biologists from design intent through DNA fabrication and experiment iteration. The platform allows users to design, manage, and navigate complex DNA constructs and libraries, using a new visual language that focuses on functional parts abstracted from sequence. Features like combinatorial libraries and automated primer design allow the user to separate design from construction by focusing on functional intent, and design constraints aid iterative refinement of designs. A plugin architecture enables contributions from scientists and coders to leverage existing powerful software and connect to DNA foundries. The software is easily accessible and platform agnostic, free for academics, and available in an open-source community edition. Genetic Constructor seeks to democratize DNA design, manufacture, and access to tools and services from the synthetic biology community.

  19. Evolution-Inspired Computational Design of Symmetric Proteins.

    PubMed

    Voet, Arnout R D; Simoncini, David; Tame, Jeremy R H; Zhang, Kam Y J

    2017-01-01

    Monomeric proteins with a number of identical repeats creating symmetrical structures are potentially very valuable building blocks with a variety of bionanotechnological applications. As such proteins do not occur naturally, the emerging field of computational protein design serves as an excellent tool to create them from nonsymmetrical templates. Existing pseudo-symmetrical proteins are believed to have evolved from oligomeric precursors by duplication and fusion of identical repeats. Here we describe a computational workflow to reverse-engineer this evolutionary process in order to create stable proteins consisting of identical sequence repeats.

  20. Computational analysis of liquid hypergolic propellant rocket engines

    NASA Technical Reports Server (NTRS)

    Krishnan, A.; Przekwas, A. J.; Gross, K. W.

    1992-01-01

The combustion process in liquid rocket engines depends on a number of complex phenomena such as atomization, vaporization, spray dynamics, mixing, and reaction mechanisms. A computational tool to study their mutual interactions is developed to help analyze these processes, with a view to improving existing designs and optimizing future designs of the thrust chamber. The focus of the article is on the analysis of the Variable Thrust Engine for the Orbit Maneuvering Vehicle. This engine uses a hypergolic liquid bipropellant combination of monomethyl hydrazine as fuel and nitrogen tetroxide as oxidizer.

  1. Influence of Immersive Human Scale Architectural Representation on Design Judgment

    NASA Astrophysics Data System (ADS)

    Elder, Rebecca L.

Unrealistic visual representations of architecture within our existing environments have lost all reference to the human senses. As design tools, visual and auditory stimuli can be utilized to determine humans' perception of design. This experiment renders varying building inputs within different sites, simulated with corresponding immersive visual and audio sensory cues. Introducing audio has been shown to influence the way a person perceives a space, yet most inhabitants rely strictly on their sense of vision to make design judgments. Though not as apparent, users prefer spaces that have a better quality of sound and comfort. Through a series of questions, we can begin to analyze whether a design is fit for both an acoustic and a visual environment.

  2. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    NASA Astrophysics Data System (ADS)

    Gunduz, Mustafa Emre

Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization process may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study proposes a method that enables engineers in several design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, when the system is set up properly, the CAD model of the design, including all subsystems, is updated automatically as soon as a new part or assembly is added, or whenever an analysis or optimization is performed and the geometry needs to be modified. Designers and engineers are then involved only in checking the latest design for errors or adding and removing features. Such a design process takes dramatically less time to complete and should therefore reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features; however, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions, and the design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified.
With these variables, the Latin Hypercube Sampling method is used to probe the design space for several local minima and maxima. After analysis of numerous samples, an optimum configuration of the design that is more stable than the initial design is reached. The above process requires several software tools: CATIA as the CAD tool, ANSYS as the FEA tool, VABS for obtaining the cross-sectional structural properties, and DYMORE for the frequency and dynamic analysis of the rotor. MATLAB codes are also employed to generate input files and read output files of DYMORE. All these tools are connected using ModelCenter.
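Latin Hypercube Sampling itself is straightforward: divide each design variable's range into n equal strata and draw exactly one sample per stratum per variable, pairing strata across variables at random. A minimal sketch in plain Python over the unit cube (the variable count and sample count are illustrative, not taken from the study):

```python
import random

# Minimal Latin Hypercube Sampling over [0, 1)^d: each dimension is cut
# into n strata, and each stratum is sampled exactly once per dimension.

def latin_hypercube(n, d, seed=0):
    rng = random.Random(seed)
    # one independent random permutation of the strata per dimension
    perms = []
    for _ in range(d):
        p = list(range(n))
        rng.shuffle(p)
        perms.append(p)
    # sample i takes stratum perms[j][i] in dimension j, jittered uniformly
    return [[(perms[j][i] + rng.random()) / n for j in range(d)]
            for i in range(n)]

pts = latin_hypercube(5, 2)
for j in range(2):
    strata = sorted(int(p[j] * 5) for p in pts)
    print(strata)  # [0, 1, 2, 3, 4] -- one point per stratum, each axis
```

The stratification is what lets a modest number of expensive analysis runs cover the whole design space more evenly than plain random sampling.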

  3. Development of a critical appraisal tool to assess the quality of cross-sectional studies (AXIS)

    PubMed Central

    Downes, Martin J; Brennan, Marnie L; Williams, Hywel C; Dean, Rachel S

    2016-01-01

Objectives The aim of this study was to develop a critical appraisal (CA) tool that addressed study design and reporting quality as well as the risk of bias in cross-sectional studies (CSSs). In addition, the aim was to produce a help document to guide the non-expert user through the tool. Design An initial scoping review of the published literature and key epidemiological texts was undertaken prior to the formation of a Delphi panel to establish key components for a CA tool for CSSs. A consensus of 80% was required from the Delphi panel for any component to be included in the final tool. Results An initial list of 39 components was identified through examination of existing resources. An international Delphi panel of 18 medical and veterinary experts was established. After 3 rounds of the Delphi process, the Appraisal tool for Cross-Sectional Studies (AXIS tool) was developed by consensus and consisted of 20 components. A detailed explanatory document was also developed with the tool, giving an expanded explanation of each question and providing simple interpretations and examples of the epidemiological concepts being examined in each question to aid non-expert users. Conclusions CA of the literature is a vital step in evidence synthesis, and therefore in evidence-based decision-making, in a number of different disciplines. The AXIS tool is unique in that it was developed for use across disciplines to aid the inclusion of CSSs in systematic reviews, guidelines and clinical decision-making. PMID:27932337

  4. Bat detective-Deep learning tools for bat acoustic signal detection.

    PubMed

    Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E

    2018-03-01

Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends, there is a critical need for accurate, reliable, and open-source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio, which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat-specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that, with careful, non-trivial design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.
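The "localize calls before classifying" problem the pipeline solves can be illustrated, in grossly simplified form, with a matched-filter detector that slides a template over a signal and reports windows whose correlation exceeds a threshold. The actual system uses a CNN over spectrograms; this toy, with invented data and threshold, stands in for the detection step only.

```python
# Grossly simplified stand-in for call detection: slide a template over
# the signal, score each window by correlation, report hits. Data and
# threshold are invented; the real pipeline uses a CNN on spectrograms.

def detect(signal, template, threshold):
    n = len(template)
    hits = []
    for i in range(len(signal) - n + 1):
        score = sum(s * t for s, t in zip(signal[i:i + n], template))
        if score >= threshold:
            hits.append(i)
    return hits

template = [0.0, 1.0, -1.0, 0.0]   # crude "call" shape
signal = [0.0] * 10
signal[4:8] = template             # embed one call at sample 4
print(detect(signal, template, threshold=1.5))  # [4]
```

The learned detector's advantage over such fixed templates is precisely robustness to the noise and call variability the abstract highlights.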

  5. Bat detective—Deep learning tools for bat acoustic signal detection

    PubMed Central

    Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.

    2018-01-01

Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends, there is a critical need for accurate, reliable, and open-source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio, which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat-specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that, with careful, non-trivial design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076

  6. Meeting report: Ocean 'omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013).

    PubMed

    Gilbert, Jack A; Dick, Gregory J; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R M; DeLong, Edward F

    2014-06-15

The National Science Foundation's EarthCube End User Workshop was held at the USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community focusing on microbial and physical oceanography research, with a particular emphasis on 'omic research. The assembled researchers outlined existing concerns regarding the vast data resources being generated and how we will deal with these resources as their volume and diversity increase. Particular attention was focused on tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, and on the development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyberinfrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) the data management, analytical and associated cyberinfrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting, and the outcome of this report, is a definition of the 'omics tools, technologies and infrastructures that facilitate continued advances in ocean science biology, marine biogeochemistry, and biological oceanography.

  7. Meeting report: Ocean ‘omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013)

    PubMed Central

    Gilbert, Jack A; Dick, Gregory J.; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R. M.

    2014-01-01

The National Science Foundation’s EarthCube End User Workshop was held at the USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community focusing on microbial and physical oceanography research, with a particular emphasis on ‘omic research. The assembled researchers outlined existing concerns regarding the vast data resources being generated and how we will deal with these resources as their volume and diversity increase. Particular attention was focused on tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, and on the development of shared, interoperable, “big-data capable” analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyberinfrastructure constraints, (ii) the current and future ocean ‘omics science grand challenges and questions, and (iii) the data management, analytical and associated cyberinfrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting, and the outcome of this report, is a definition of the ‘omics tools, technologies and infrastructures that facilitate continued advances in ocean science biology, marine biogeochemistry, and biological oceanography. PMID:25197495

  8. TaN resistor process development and integration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero, Kathleen; Martinez, Marino John; Clevenger, Jascinda

This paper describes the development and implementation of an integrated resistor process based on reactively sputtered tantalum nitride. Image-reversal lithography was shown to be a superior method for liftoff patterning of these films. The results of a response-surface DOE for the sputter deposition of the films are discussed. Several approaches to stabilization baking were examined, and the advantages of the hot-plate method are shown. In support of a new capability to produce special-purpose HBT-based Small-Scale Integrated Circuits (SSICs), we developed our existing TaN resistor process, designed for research prototyping, into one with greater maturity and robustness. Included in this work was the migration of our TaN deposition process from a research-oriented tool to a tool more suitable for production. Also included was the implementation and optimization of a liftoff process for the sputtered TaN to avoid the complicating effects of subtractive etching over potentially sensitive surfaces. Finally, the method and conditions for stabilization baking of the resistors were experimentally determined to complete the full implementation of the resistor module. Much of the work described involves the migration between sputter deposition tools - from a Kurt J. Lesker CMS-18 to a Denton Discovery 550. Though they use nominally the same deposition technique (reactive sputtering of Ta with N+ in an RF-excited Ar plasma), they differ substantially in their design and produce clearly different results in terms of resistivity, conformity of the film, and the difference between as-deposited and stabilized films. We describe the design of and results from the design-of-experiments (DOE)-based method of process optimization on the new tool and compare this to what had been used on the old tool.

  9. Communication Styles of Interactive Tools for Self-Improvement.

    PubMed

    Niess, Jasmin; Diefenbach, Sarah

Interactive products for self-improvement (e.g., online trainings to reduce stress, fitness gadgets) have become increasingly popular among consumers and healthcare providers. In line with the idea of positive computing, these tools aim to support their users on the way to improved well-being and human flourishing. As an interdisciplinary domain, the design of self-improvement technologies requires psychological, technological, and design expertise: one needs to know how to support people in behavior change, and one needs to find ways to do this through technology design. However, as recent reviews show, the integration of these disciplines still leaves room for improvement. Many existing technologies for self-improvement neglect psychological theory on behavior change; motivational factors in particular are not sufficiently considered. To counteract this, we suggest a focus on the dialog and emerging communication between product and user, considering the self-improvement tool as an interactive coach and advisor. The present qualitative interview study (N = 18) explored the user experience of self-improvement technologies, with a special focus on the perceived dialog between tool and user, which we analyzed in terms of models from communication psychology. Our findings show that users are sensitive to the way the product "speaks to them" and consider this essential for their experience and successful change. Analysis revealed different communication styles of self-improvement tools (e.g., helpful-cooperative, rational-distanced, critical-aggressive), each linked to specific emotional consequences. These findings form one starting point for a more psychologically founded design of self-improvement technology. On a more general level, our approach aims to contribute to a better integration of psychological and technological knowledge and, in consequence, to supporting users on their way to enhanced well-being.

  10. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. 
While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.
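The CO2-based demand-controlled ventilation strategy evaluated in the case study can be illustrated with a minimal single-zone mass balance. The sketch below is not CONTAM's multizone model: it assumes one well-mixed zone, a constant occupant CO2 source, and a proportional controller that ramps outdoor airflow toward a 1000 ppm setpoint; all numbers are illustrative.

```python
# Minimal single-zone CO2 mass balance with demand-controlled ventilation
# (illustrative sketch, not CONTAM's multizone model).

V = 500.0            # zone volume, m^3
c_out = 400e-6       # outdoor CO2 mole fraction (400 ppm)
gen = 5.0e-3 / 60.0  # occupant CO2 generation, m^3/s (illustrative, ~15 occupants)
q_min, q_max = 0.05, 0.5    # outdoor airflow bounds, m^3/s
setpoint = 1000e-6   # controller setpoint (1000 ppm)
dt = 60.0            # time step, s

def control(c):
    """Proportional DCV: ramp airflow from q_min to q_max as zone CO2
    rises from the outdoor level toward the setpoint."""
    frac = (c - c_out) / (setpoint - c_out)
    return min(q_max, max(q_min, q_min + frac * (q_max - q_min)))

c = c_out
for _ in range(240):  # four hours of one-minute steps (explicit Euler)
    q = control(c)
    # zone balance: V * dc/dt = gen + q * (c_out - c)
    c += dt * (gen + q * (c_out - c)) / V

print(f"settled near {c * 1e6:.0f} ppm CO2 at {q * 1000:.0f} L/s outdoor air")
```

At steady state the controller settles where occupant generation balances dilution, here around 700 ppm; the CONTAM/TRNSYS coupling adds the inter-zone airflow and energy terms this sketch omits.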

  11. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies

    PubMed Central

    Dols, W. Stuart; Emmerich, Steven J.; Polidoro, Brian J.

    2016-01-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. Practical Application: CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. 
While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system. PMID:27099405

  12. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned chiefly as the selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and the establishment of a strategy for binding the software modules into an easy-to-use tool kit.

  13. Design and development of a unit element microstrip antenna for aircraft collision avoidance system

    NASA Astrophysics Data System (ADS)

    De, Debajit; Sahu, Prasanna Kumar

    2017-10-01

    Aircraft/traffic alert and collision avoidance system (ACAS/TCAS) is an airborne system designed to serve as a last line of defense for avoiding mid-air collisions between aircraft. In the existing system, four monopole stub-elements are used as the ACAS directional antenna and one blade-type element is used as the ACAS omnidirectional antenna. The existing ACAS antenna has some drawbacks, such as low gain, large beamwidth, and frequency and beam tuning/scanning issues. Antenna issues like unwanted signal reception can make it difficult to identify possible threats. In this paper, the focus is on the design and development of a unit element microstrip antenna which can be used for the ACAS application and overcome the limitations associated with the existing techniques. Two proposed antenna models are presented here: a single-feed and a dual-feed microstrip dual patch slotted antenna. These are designed and simulated in the CST Microwave Studio tool. The performance and other antenna characteristics have been explored from the simulation results, followed by antenna fabrication and measurement. A good reflection coefficient, Voltage Standing Wave Ratio (VSWR), narrow beamwidth, directional radiation pattern, and high gain and directivity make the proposed antenna a good candidate for this application.
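The figures of merit cited above (reflection coefficient, VSWR, return loss) are related by standard transmission-line formulas; the short sketch below shows the usual conversions for a 50-ohm feed. The impedance value is purely illustrative, not a measurement from the proposed antenna.

```python
# Standard relations linking feed impedance, reflection coefficient,
# VSWR, and return loss. Values are illustrative only.
import math

def reflection_coefficient(z_load, z0=50.0):
    """Complex reflection coefficient at the feed for reference impedance z0."""
    return (z_load - z0) / (z_load + z0)

def vswr(gamma):
    """Voltage standing wave ratio from the reflection coefficient magnitude."""
    return (1 + abs(gamma)) / (1 - abs(gamma))

def return_loss_db(gamma):
    """Return loss in dB (larger is a better match)."""
    return -20.0 * math.log10(abs(gamma))

# Example: a hypothetical feed impedance of 45 + 5j ohms on a 50-ohm line
g = reflection_coefficient(45 + 5j)
print(f"|Gamma| = {abs(g):.3f}, VSWR = {vswr(g):.2f}, RL = {return_loss_db(g):.1f} dB")
```

A VSWR below about 2 (return loss above roughly 10 dB) is the common acceptance threshold when judging a feed match.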

  14. Extending the enterprise evolution contextualisation model

    NASA Astrophysics Data System (ADS)

    de Vries, Marné; van der Merwe, Alta; Gerber, Aurona

    2017-07-01

    Enterprise engineering (EE) emerged as a new discipline to encourage comprehensive and consistent enterprise design. Since EE is multidisciplinary, various researchers study enterprises from different perspectives, which resulted in a plethora of applicable literature and terminology, but without shared meaning. Previous research specifically focused on the fragmentation of knowledge for designing and aligning the information and communication technology (ICT) subsystem of the enterprise in order to support the business organisation subsystem of the enterprise. As a solution for this fragmented landscape, a business-IT alignment model (BIAM) was developed inductively from existing business-IT alignment approaches. Since most of the existing alignment frameworks addressed the alignment between the ICT subsystem and the business organisation subsystem, BIAM also focused on the alignment between these two subsystems. Yet, the emerging EE discipline intends to address a broader scope of design, evident in the existing approaches that incorporate a broader scope of design/alignment/governance. A need was identified to address the knowledge fragmentation of the EE knowledge base by adapting BIAM to an enterprise evolution contextualisation model (EECM), to contextualise a broader set of approaches, as identified by Lapalme. The main contribution of this article is the incremental development and evaluation of EECM. We also present guiding indicators/prerequisites for applying EECM as a contextualisation tool.

  15. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship-Observational Studies.

    PubMed

    Snyder, Graham M; Young, Heather; Varman, Meera; Milstone, Aaron M; Harris, Anthony D; Munoz-Price, Silvia

    2016-10-01

    Observational studies compare outcomes among subjects with and without an exposure of interest, without intervention from study investigators. Observational studies can be designed as a prospective or retrospective cohort study or as a case-control study. In healthcare epidemiology, these observational studies often take advantage of existing healthcare databases, making them more cost-effective than clinical trials and allowing analyses of rare outcomes. This paper addresses the importance of selecting a well-defined study population, highlights key considerations for study design, and offers potential solutions including biostatistical tools that are applicable to observational study designs. Infect Control Hosp Epidemiol 2016;1-6.

  16. Preliminary Design Considerations for Access and Operations in Earth-Moon L1/L2 Orbits

    NASA Technical Reports Server (NTRS)

    Folta, David C.; Pavlak, Thomas A.; Haapala, Amanda F.; Howell, Kathleen C.

    2013-01-01

    Within the context of manned spaceflight activities, Earth-Moon libration point orbits could support lunar surface operations and serve as staging areas for future missions to near-Earth asteroids and Mars. This investigation examines preliminary design considerations including Earth-Moon L1/L2 libration point orbit selection, transfers, and stationkeeping costs associated with maintaining a spacecraft in the vicinity of L1 or L2 for a specified duration. Existing tools in multi-body trajectory design, dynamical systems theory, and orbit maintenance are leveraged in this analysis to explore end-to-end concepts for manned missions to Earth-Moon libration points.
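For orientation, the locations of the Earth-Moon L1 and L2 points themselves follow from the circular restricted three-body problem (CR3BP). The sketch below finds the collinear equilibria by root-finding on the rotating-frame axial acceleration; it is a textbook calculation, not the multi-body trajectory design process used in the paper, and the mass ratio is rounded.

```python
# Locating the collinear Earth-Moon L1 and L2 points in the CR3BP,
# nondimensional rotating frame (Earth-Moon distance = 1).

MU = 0.01215  # Earth-Moon mass ratio m_moon / (m_earth + m_moon), rounded

def collinear_accel(x, mu=MU):
    """Net rotating-frame acceleration on the x-axis (zero at L1, L2, L3).
    Earth sits at x = -mu, the Moon at x = 1 - mu."""
    r1, r2 = x + mu, x - 1 + mu
    return x - (1 - mu) * r1 / abs(r1)**3 - mu * r2 / abs(r2)**3

def bisect(f, lo, hi, tol=1e-12):
    """Simple bisection; assumes a sign change on [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x_L1 = bisect(collinear_accel, 0.5, 1 - MU - 1e-6)   # between Earth and Moon
x_L2 = bisect(collinear_accel, 1 - MU + 1e-6, 2.0)   # beyond the Moon
print(f"L1 at x = {x_L1:.4f}, L2 at x = {x_L2:.4f}")
```

In these units L1 falls near 0.837 and L2 near 1.156, i.e., roughly 60,000 km on either side of the Moon, which is where the libration point orbits in the study are centered.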

  17. Efficient Multidisciplinary Analysis Approach for Conceptual Design of Aircraft with Large Shape Change

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.

    2009-01-01

    The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. 
Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation with a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.

  18. Demonstration of a Large-Scale Tank Assembly via Circumferential Friction Stir Welds

    NASA Technical Reports Server (NTRS)

    Jones, Clyde S.; Adams, Glynn; Colligan, Kevin

    2000-01-01

    A collaborative effort between NASA/Marshall Space Flight Center and the Michoud Unit of Lockheed Martin Space Systems Company was undertaken to demonstrate assembly of a large-scale aluminum tank using circumferential friction stir welds. The hardware used to complete this demonstration was fabricated as a study of near-net- shape technologies. The tooling used to complete this demonstration was originally designed for assembly of a tank using fusion weld processes. This presentation describes the modifications and additions that were made to the existing fusion welding tools required to accommodate circumferential friction stir welding, as well as the process used to assemble the tank. The tooling modifications include design, fabrication and installation of several components. The most significant components include a friction stir weld unit with adjustable pin length capabilities, a continuous internal anvil for 'open' circumferential welds, a continuous closeout anvil, clamping systems, an external reaction system and the control system required to conduct the friction stir welds and integrate the operation of the tool. The demonstration was intended as a development task. The experience gained during each circumferential weld was applied to improve subsequent welds. Both constant and tapered thickness 14-foot diameter circumferential welds were successfully demonstrated.

  19. In-flight Evaluation of Aerodynamic Predictions of an Air-launched Space Booster

    NASA Technical Reports Server (NTRS)

    Curry, Robert E.; Mendenhall, Michael R.; Moulton, Bryan

    1992-01-01

    Several analytical aerodynamic design tools that were applied to the Pegasus (registered trademark) air-launched space booster were evaluated using flight measurements. The study was limited to existing codes and was conducted with limited computational resources. The flight instrumentation was constrained to have minimal impact on the primary Pegasus missions. Where appropriate, the flight measurements were compared with computational data. Aerodynamic performance and trim data from the first two flights were correlated with predictions. Local measurements in the wing and wing-body interference region were correlated with analytical data. This complex flow region includes the effect of aerothermal heating magnification caused by the presence of a corner vortex and interaction of the wing leading edge shock and fuselage boundary layer. The operation of the first two missions indicates that the aerodynamic design approach for Pegasus was adequate, and data show that acceptable margins were available. Additionally, the correlations provide insight into the capabilities of these analytical tools for more complex vehicles in which the design margins may be more stringent.

  20. In-flight evaluation of aerodynamic predictions of an air-launched space booster

    NASA Technical Reports Server (NTRS)

    Curry, Robert E.; Mendenhall, Michael R.; Moulton, Bryan

    1993-01-01

    Several analytical aerodynamic design tools that were applied to the Pegasus air-launched space booster were evaluated using flight measurements. The study was limited to existing codes and was conducted with limited computational resources. The flight instrumentation was constrained to have minimal impact on the primary Pegasus missions. Where appropriate, the flight measurements were compared with computational data. Aerodynamic performance and trim data from the first two flights were correlated with predictions. Local measurements in the wing and wing-body interference region were correlated with analytical data. This complex flow region includes the effect of aerothermal heating magnification caused by the presence of a corner vortex and interaction of the wing leading edge shock and fuselage boundary layer. The operation of the first two missions indicates that the aerodynamic design approach for Pegasus was adequate, and data show that acceptable margins were available. Additionally, the correlations provide insight into the capabilities of these analytical tools for more complex vehicles in which design margins may be more stringent.

  1. Examining Menstrual Tracking to Inform the Design of Personal Informatics Tools

    PubMed Central

    Epstein, Daniel A.; Lee, Nicole B.; Kang, Jennifer H.; Agapie, Elena; Schroeder, Jessica; Pina, Laura R.; Fogarty, James; Kientz, Julie A.; Munson, Sean A.

    2017-01-01

    We consider why and how women track their menstrual cycles, examining their experiences to uncover design opportunities and extend the field's understanding of personal informatics tools. To understand menstrual cycle tracking practices, we collected and analyzed data from three sources: 2,000 reviews of popular menstrual tracking apps, a survey of 687 people, and follow-up interviews with 12 survey respondents. We find that women track their menstrual cycle for varied reasons that include remembering and predicting their period as well as informing conversations with healthcare providers. Participants described six methods of tracking their menstrual cycles, including use of technology, awareness of their premenstrual physiological states, and simply remembering. Although women find apps and calendars helpful, these methods are ineffective when predictions of future menstrual cycles are inaccurate. Designs can create feelings of exclusion for gender and sexual minorities. Existing apps also generally fail to consider life stages that women experience, including young adulthood, pregnancy, and menopause. Our findings encourage expanding the field's conceptions of personal informatics. PMID:28516176

  2. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  3. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided throughout the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. 
The case-study includes reaction steps typically used by the pharmaceutical industry featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be useful to efficiently design novel or already existing API manufacturing processes taking advantage of continuous processes. Copyright © 2012 Elsevier B.V. All rights reserved.
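The process mass intensity (PMI) metric used to evaluate the retrofit is a simple ratio, sketched below with made-up masses chosen to mirror the reported halving (the actual Lundbeck figures are not given in the abstract).

```python
# Process mass intensity (PMI): kg of total material input (reagents,
# solvents, auxiliaries) per kg of API produced. Numbers are illustrative.
def pmi(material_inputs_kg, product_kg):
    return sum(material_inputs_kg) / product_kg

batch_pmi = pmi([120.0, 300.0, 80.0], 10.0)        # hypothetical batch process
continuous_pmi = pmi([60.0, 150.0, 40.0], 10.0)    # same output, half the inputs
print(batch_pmi, continuous_pmi)   # 50.0 then 25.0: PMI cut in half
```

Because PMI counts every input stream, solvent recovery and telescoped continuous steps are the usual levers for reducing it.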

  4. Application of Weighted Gene Co-expression Network Analysis for Data from Paired Design.

    PubMed

    Li, Jianqiang; Zhou, Doudou; Qiu, Weiliang; Shi, Yuliang; Yang, Ji-Jiang; Chen, Shi; Wang, Qing; Pan, Hui

    2018-01-12

    Investigating how genes jointly affect complex human diseases is important, yet challenging. The network approach (e.g., weighted gene co-expression network analysis (WGCNA)) is a powerful tool. However, genomic data usually contain substantial batch effects, which could mask true genomic signals. Paired design is a powerful tool that can reduce batch effects. However, it is currently unclear how to appropriately apply WGCNA to genomic data from paired design. In this paper, we modified the current WGCNA pipeline to analyse high-throughput genomic data from paired design. We illustrated the modified WGCNA pipeline by analysing the miRNA dataset provided by Shiah et al. (2014), which contains forty oral squamous cell carcinoma (OSCC) specimens and their matched non-tumourous epithelial counterparts. OSCC is the sixth most common cancer worldwide. The modified WGCNA pipeline identified two sets of novel miRNAs associated with OSCC, in addition to the existing miRNAs reported by Shiah et al. (2014). Thus, this work will be of great interest to readers of various scientific disciplines, in particular, genetic and genomic scientists as well as medical scientists working on cancer.
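The core idea of networking paired data can be sketched in a few lines: difference each subject's tumour and matched normal profiles so that shared subject-level (batch-like) effects cancel, then build the usual WGCNA soft-threshold adjacency on the differences. This is an illustrative reconstruction on simulated data, not the authors' exact modified pipeline.

```python
# WGCNA-style adjacency on paired samples: per-subject tumour-minus-normal
# differencing cancels subject-level effects before the network is built.
# Illustrative sketch on simulated data, not the authors' exact pipeline.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_genes = 40, 100

subject_effect = rng.normal(0, 2.0, size=(n_subjects, 1))   # shared batch-like shift
tumour = subject_effect + rng.normal(size=(n_subjects, n_genes))
normal = subject_effect + rng.normal(size=(n_subjects, n_genes))

# Paired differencing removes the subject effect before networking.
diff = tumour - normal                      # (subjects x genes)

beta = 6                                    # soft-threshold power
cor = np.corrcoef(diff, rowvar=False)       # gene-gene correlation matrix
adjacency = np.abs(cor) ** beta             # unsigned WGCNA adjacency
np.fill_diagonal(adjacency, 0.0)

# Connectivity (row sums) is the usual starting point for module detection.
connectivity = adjacency.sum(axis=1)
print("mean connectivity:", round(float(connectivity.mean()), 3))
```

Module detection (hierarchical clustering of the topological overlap matrix) would then proceed exactly as in standard WGCNA, but on the difference matrix.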

  5. Considerations for Developing Interfaces for Collecting Patient-Reported Outcomes That Allow the Inclusion of Individuals With Disabilities

    PubMed Central

    Harniss, Mark; Amtmann, Dagmar; Cook, Debbie; Johnson, Kurt

    2010-01-01

    PROMIS (Patient-Reported Outcome Measurement Information System) is developing a set of tools for collecting patient reported outcomes, including computerized adaptive testing that can be administered using different modes, such as computers or phones. The user interfaces for these tools will be designed using the principles of universal design to ensure that they are accessible to all users, including those with disabilities. We review the rationale for making health assessment instruments accessible to users with disabilities, briefly review the standards and guidelines that exist to support developers in the creation of user interfaces with accessibility in mind, and describe the usability and accessibility testing PROMIS will conduct with content experts and users with and without disabilities. Finally, we discuss threats to validity and reliability presented by universal design principles. We argue that the social and practical benefits of interfaces designed to include a broad range of potential users, including those with disabilities, seem to outweigh the need for standardization. Suggestions for future research are also included. PMID:17443119

  6. Rethinking the outpatient medication list: increasing patient activation and education while architecting for centralization and improved medication reconciliation.

    PubMed

    Pandolfe, Frank; Wright, Adam; Slack, Warner V; Safran, Charles

    2018-05-17

    Identify barriers impacting the time-consuming and error-fraught process of medication reconciliation. Design and implement an electronic medication management system where patients and trusted healthcare proxies can participate in establishing and maintaining an inclusive and up-to-date list of medications. A patient-facing electronic medication manager was deployed within InfoSAGE, an existing AHRQ-funded research project focused on elder care management, allowing patients and patients' proxies the ability to build and maintain an accurate and up-to-date medication list. Free and open-source tools available from the U.S. government were used to embed the tenets of centralization, interoperability, data federation, and patient activation into the design. Using patient-centered design and free, open-source tools, we implemented a web- and mobile-enabled patient-facing medication manager for complex medication management. Patient and caregiver participation are essential to improve medication safety. Our medication manager is an early step towards a patient-facing medication manager that has been designed with data federation and interoperability in mind.

  7. Design enhancement tools in MSC/NASTRAN

    NASA Technical Reports Server (NTRS)

    Wallerstein, D. V.

    1984-01-01

    Design sensitivity is the calculation of derivatives of constraint functions with respect to design variables. While a knowledge of these derivatives is useful in its own right, the derivatives are required in many efficient optimization methods. Constraint derivatives are also required in some reanalysis methods. It is shown where the sensitivity coefficients fit into the scheme of a basic organization of an optimization procedure. The analyzer is taken to be MSC/NASTRAN. The terminator program monitors the termination criteria and ends the optimization procedure when the criteria are satisfied. This program can reside in several places: in the optimizer itself, in user-written code, or as part of the MSC/EOS (Engineering Operating System) currently under development. Since several excellent optimization codes exist, and since their use requires very specialized technical knowledge, the optimizer under the new MSC/EOS is considered to be selected and supplied by the user to meet his specific needs and preferences. The one exception to this is a fully stressed design (FSD) based on simple scaling. The gradients are currently supplied by various design sensitivity options now existing in MSC/NASTRAN's design sensitivity analysis (DSA).
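As a generic illustration of what a design sensitivity coefficient is, the sketch below approximates the constraint Jacobian dg/dx by forward differences on a toy two-bar stress example. MSC/NASTRAN's DSA computes these derivatives semi-analytically rather than by differencing, and all names and numbers here are hypothetical.

```python
# Forward-difference design sensitivities: derivatives of constraint
# functions g(x) with respect to design variables x. This is the simplest
# possible sketch, not NASTRAN's semi-analytic method.
def sensitivities(g, x, h=1e-6):
    """Return the m x n Jacobian J[i][j] = dg_i/dx_j by forward differences."""
    g0 = g(x)
    cols = []
    for j in range(len(x)):
        xp = list(x)
        xp[j] += h
        gj = g(xp)
        cols.append([(gi - g0i) / h for gi, g0i in zip(gj, g0)])
    # transpose so rows index constraints, columns index design variables
    return [list(row) for row in zip(*cols)]

# Toy example: axial stress sigma_i = P_i / A_i for two bars, with the
# cross-sectional areas A1, A2 as design variables (hypothetical values).
P = (1000.0, 2000.0)
def stress_constraints(areas):
    return [p / a for p, a in zip(P, areas)]

J = sensitivities(stress_constraints, [2.0, 4.0])
# analytic check: d(sigma_i)/d(A_i) = -P_i / A_i**2, i.e. about -250 and -125
```

An optimizer consuming J would then scale each area in the direction that relaxes its most violated stress constraint, which is exactly the role the sensitivity options play in the optimization loop described above.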

  8. Design and analysis of a magneto-rheological damper for an all terrain vehicle

    NASA Astrophysics Data System (ADS)

    Krishnan Unni, R.; Tamilarasan, N.

    2018-02-01

    A shock absorber design intended to replace the existing conventional shock absorber with a controllable system using a Magneto-rheological damper is introduced for an All Terrain Vehicle (ATV) that was designed for Baja SAE competitions. Suspension is a vital part of an All Terrain Vehicle, as it endures varied surfaces and requires utmost attention during design. COMSOL Multiphysics software, which is suited to coupled-physics problems, was the tool used in the design and analysis phase of the Magneto-rheological damper for the considered application, and the model is optimized via a Taguchi approach using DOE software. The magneto-rheological damper is designed to maximize the damping force within the measured geometric constraints of the All Terrain Vehicle.

  9. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production

    PubMed Central

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such tools by stakeholders. PMID:28804490

  10. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production.

    PubMed

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D; Pittelkow, Cameron M

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field and regional scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots, which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such tools by stakeholders.

  11. CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.F. Beesley

    The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.

  12. Decision making about healthcare-related tests and diagnostic test strategies. Paper 3: a systematic review shows limitations in most tools designed to assess quality and develop recommendations.

    PubMed

    Mustafa, Reem A; Wiercioch, Wojtek; Falavigna, Maicon; Zhang, Yuan; Ivanova, Liudmila; Arevalo-Rodriguez, Ingrid; Cheung, Adrienne; Prediger, Barbara; Ventresca, Matthew; Brozek, Jan; Santesso, Nancy; Bossuyt, Patrick; Garg, Amit X; Lloyd, Nancy; Lelgemann, Monika; Bühler, Diedrich; Schünemann, Holger J

    2017-12-01

    The objective of this study was to identify and describe critical appraisal tools designed for assessing the quality of evidence (QoE) and/or strength of recommendations (SoRs) related to health care-related tests and diagnostic strategies (HCTDSs). We conducted a systematic review to identify tools applied in guidelines, methodological articles, and systematic reviews to assess HCTDS. We screened 5,534 titles and abstracts, 1,004 full-text articles, and abstracted data from 330 references. We identified 29 tools and 14 modifications of existing tools for assessing QoE and SoR. Twenty-three out of 29 tools acknowledge the importance of assessing the QoE and SoR separately, but in 8, the SoR is based solely on QoE. When making decisions about the use of tests, patient values and preferences and impact on resource utilization were considered in 6 and 8 tools, respectively. There is also confusion about the terminology that describes the various factors that influence the QoE and SoR. Although at least one approach includes all relevant criteria for assessing QoE and determining SoR, more detailed guidance about how to operationalize these assessments and make related judgments will be beneficial. There is a need for a better description of the framework for using evidence to make decisions and develop recommendations about HCTDS. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Relating MBSE to Spacecraft Development: A NASA Pathfinder

    NASA Technical Reports Server (NTRS)

    Othon, Bill

    2016-01-01

    The NASA Engineering and Safety Center (NESC) has sponsored a Pathfinder Study to investigate how Model Based Systems Engineering (MBSE) and Model Based Engineering (MBE) techniques can be applied by NASA spacecraft development projects. The objectives of this Pathfinder Study included analyzing both the products of the modeling activity and the process and tool chain through which the spacecraft design activities are executed. Several aspects of MBSE methodology and process were explored. Adoption and consistent use of the MBSE methodology within an existing development environment can be difficult. The Pathfinder Team evaluated the possibility that an "MBSE Template" could be developed as both a teaching tool and a baseline that future NASA projects could leverage. Elements of this template include spacecraft system component libraries, data dictionaries and ontology specifications, as well as software services that operate on the models themselves. The Pathfinder Study also evaluated the tool chain aspects of development. Two chains were considered: 1. The Development tool chain, through which SysML model development was performed and controlled, and 2. The Analysis tool chain, through which both static and dynamic system analyses were performed. Of particular interest was the ability to exchange data between SysML and other engineering tools such as CAD and Dynamic Simulation tools. For this study, the team selected a Mars Lander vehicle as the element to be designed. The paper will discuss what system models were developed, how data was captured and exchanged, and what analyses were conducted.

  14. Direct surgeon control of the computer in the operating room.

    PubMed

    Onceanu, Dumitru; Stewart, A James

    2011-01-01

    This paper describes the design and evaluation of a joystick-like device that allows direct surgeon control of the computer in the operating room. The device contains no electronic parts, is easy to use, is unobtrusive, has no physical connection to the computer, and makes use of an existing surgical tool. The device was tested in comparison to a mouse and to verbal dictation.

  15. The Development and Perpetuation of Professional Learning Communities in Two Elementary Schools: The Role of the Principal and Impact on Teaching and Learning

    ERIC Educational Resources Information Center

    Maynor, Chad Edward

    2010-01-01

    Professional learning communities (PLCs) provide schools with a tool to meet the professional development needs of their teachers through ongoing, job-embedded staff development designed to improve instruction and student learning. While research exists on the development of PLCs, there is a gap in the literature concerning the principal's role in…

  16. Enhancing Fire Control Decision Making with the Patriot Cognitive Skills Trainer: Development and Validation

    DTIC Science & Technology

    2017-11-01

    existing instruction. In addition, the methodology used to identify decision-triggers may be applied to other Army domains to develop instruction... ADDIE is an instructional design framework used as a descriptive guideline for building effective training and performance support tools... and evaluate information, and create a solution.

  17. Designing healthcare information technology to catalyse change in clinical care.

    PubMed

    Lester, William T; Zai, Adrian H; Grant, Richard W; Chueh, Henry C

    2008-01-01

    The gap between best practice and actual patient care continues to be a pervasive problem in our healthcare system. Efforts to improve on this knowledge-performance gap have included computerised disease management programs designed to improve guideline adherence. However, current computerised reminder and decision support interventions directed at changing physician behaviour have had only a limited and variable effect on clinical outcomes. Further, immediate pay-for-performance financial pressures on institutions have created an environment where disease management systems are often created under duress, appended to existing clinical systems and poorly integrated into the existing workflow, potentially limiting their real-world effectiveness. The authors present a review of disease management as well as a conceptual framework to guide the development of more effective health information technology (HIT) tools for translating clinical information into clinical action.

  18. Invitation to a forum: architecting operational `next generation' earth monitoring satellites based on best modeling, existing sensor capabilities, with constellation efficiencies to secure trusted datasets for the next 20 years

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Bell, Raymond M.; Grant, David A.; Lentz, Christopher A.

    2012-09-01

    Architecting the operational Next Generation of earth monitoring satellites based on matured climate modeling, reuse of existing sensor and satellite capabilities, attention to affordability, and evolutionary improvements integrated with constellation efficiencies becomes our collective goal for an open architectural design forum. Understanding the earth's climate and collecting requisite signatures over the next 30 years is a shared mandate by many of the world's governments. But there remains a daunting challenge to bridge scientific missions to 'operational' systems that truly support the demands of decision makers, scientific investigators and global users' requirements for trusted data. In this paper we will suggest an architectural structure that takes advantage of current earth modeling examples, including cross-model verification, and a first order set of critical climate parameters and metrics that, in turn, are matched up with existing space borne collection capabilities and sensors. The tools used and the frameworks offered are designed to allow collaborative overlays by other stakeholders nominating different critical parameters and their own threaded connections to existing international collection experience. These aggregate design suggestions will be held up to group review and prioritized as potential constellation solutions, including incremental and spiral developments, with cost benefits and organizational opportunities. This Part IV effort is focused on being an inclusive 'Next Gen Constellation' design discussion and is the natural extension to earlier papers.

  19. The Papillomavirus Episteme: a central resource for papillomavirus sequence data and analysis.

    PubMed

    Van Doorslaer, Koenraad; Tan, Qina; Xirasagar, Sandhya; Bandaru, Sandya; Gopalan, Vivek; Mohamoud, Yasmin; Huyen, Yentram; McBride, Alison A

    2013-01-01

    The goal of the Papillomavirus Episteme (PaVE) is to provide an integrated resource for the analysis of papillomavirus (PV) genome sequences and related information. The PaVE is a freely accessible, web-based tool (http://pave.niaid.nih.gov) created around a relational database, which enables storage, analysis and exchange of sequence information. From a design perspective, the PaVE adopts an Open Source software approach and stresses the integration and reuse of existing tools. Reference PV genome sequences have been extracted from publicly available databases and reannotated using a custom-created tool. To date, the PaVE contains 241 annotated PV genomes, 2245 genes and regions, 2004 protein sequences and 47 protein structures, which users can explore, analyze or download. The PaVE provides scientists with the data and tools needed to accelerate scientific progress for the study and treatment of diseases caused by PVs.

  20. Image tools for UNIX

    NASA Technical Reports Server (NTRS)

    Banks, David C.

    1994-01-01

    This talk features two simple and useful tools for digital image processing in the UNIX environment: xv and pbmplus. The xv image viewer, which runs under the X Window System, reads and writes images in a number of different file formats. The view area supports a pop-up control panel. The 'algorithms' menu lets you blur an image. The xv control panel also activates the color editor, which displays the image's color map (if one exists). The xv image viewer is available through the internet. The pbmplus package is a set of tools designed to perform image processing from within a UNIX shell. The acronym 'pbm' stands for portable bit map. Like xv, the pbmplus tools can convert images from and to many different file formats. The source code and manual pages for pbmplus are also available through the internet. This software is in the public domain.
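
    The 'portable bitmap' format at the heart of pbmplus is simple enough to sketch directly. The helper below writes the plain (ASCII) PBM variant, whose layout (a "P1" magic number, the image dimensions, then rows of 0/1 bits) is fixed by the format; the function itself is an illustrative stand-in, not part of pbmplus.

```python
# Write a plain (ASCII) PBM image: "P1", then width and height, then
# rows of 0/1 bits (1 = black). Any pbmplus converter can consume the
# resulting text. The helper itself is hypothetical, for illustration.

def to_plain_pbm(pixels):
    """Serialize a 2-D list of 0/1 values as a plain PBM string."""
    height = len(pixels)
    width = len(pixels[0]) if height else 0
    lines = ["P1", f"{width} {height}"]  # magic number, then dimensions
    lines += [" ".join(str(bit) for bit in row) for row in pixels]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    checker = [[0, 1, 0, 1],
               [1, 0, 1, 0]]
    print(to_plain_pbm(checker))
```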

  1. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in a dramatic increase in sequencing data, which, in turn, requires efficient management of computational resources such as computing time and memory, as well as rapid prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
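
    The core of such a platform is set arithmetic over genomic intervals. As a minimal illustration of one such operation (a simplified stand-in, not the GenomicTools implementation or its API), the sketch below intersects two sorted lists of half-open regions on a single chromosome:

```python
# Illustrative sketch of interval intersection, the kind of mathematical
# operation between sets of genomic regions that GenomicTools provides.
# Assumes sorted, non-overlapping, half-open (start, end) intervals on
# one chromosome.

def intersect_regions(a, b):
    """Return the intersection of two sorted interval lists."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        start = max(a[i][0], b[j][0])
        end = min(a[i][1], b[j][1])
        if start < end:            # non-empty overlap
            out.append((start, end))
        if a[i][1] < b[j][1]:      # advance whichever interval ends first
            i += 1
        else:
            j += 1
    return out
```

    Union and difference admit the same single linear sweep, which is one reason such pipelines can keep memory requirements low even on large datasets.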

  2. Don't Forget the Doctor: Gastroenterologists' Preferences on the Development of mHealth Tools for Inflammatory Bowel Disease.

    PubMed

    van Mierlo, Trevor; Fournier, Rachel; Fedorak, Richard

    2015-01-21

    Inflammatory bowel disease (IBD) encompasses a number of disorders of the gastrointestinal tract. Treatment for IBD is lifelong and complex, and the majority of IBD patients seek information on the Internet. However, research has found existing digital resources to be of questionable quality and that patients find content lacking. Gastroenterologists are frontline sources of information for North American IBD patients, but their opinions and preferences for digital content, design, and utility have not been investigated. The purpose of this study is to systematically explore gastroenterologists' perceptions of, and design preferences for, mHealth tools. Our goal was to critically assess these issues and elicit expert feedback by seeking consensus with Canadian gastroenterologists. Using a qualitative approach, a closed meeting with 7 gastroenterologists was audio recorded and field notes were taken. To synthesize results, an anonymous questionnaire was collected at the end of the session. Participant-led discussion themes included methodological approaches to non-adherence, concordance, patient-centricity, and attributes of digital tools that would be actively supported and promoted. Survey results indicated that 4 of the 7 gastroenterologists had experienced patients bringing digital resources to a visit, but 5 found digital patient resources to be inaccurate or irrelevant. All participants agreed that digital tools were of increasing importance and could be leveraged to aid in consultations and save time. When asked to assess digital attributes that they would be confident to refer patients to, all 7 indicated that the inclusion of evidence-based facts was of greatest importance. Patient peer-support networks were deemed an asset but only if closely monitored by experts. 
When asked about interventions, nearly all (6/7) preferred tools that addressed a mix of compliance and concordance, and only one supported the development of tools that focused on compliance. Participants confirmed that they would actively refer patients and other physicians to digital resources. However, while a number of digital IBD tools exist, gastroenterologists would be reluctant to endorse them. Gastroenterologists appear eager to use digital resources that they believe benefit the physician-patient relationship, but despite the trend of patient-centric tools that focus on concordance (shared decision making and enlightened communication between patients and their health care providers), they would prefer digital tools that highlight compliance (patient following orders). This concordance gap highlights an issue of disparity in digital health: patients may not use tools that physicians promote, and physicians may not endorse tools that patients will use. Further research investigating the concordance gap, and tensions between physician preferences and patient needs, is required.

  3. Estimating flood hydrographs and volumes for Alabama streams

    USGS Publications Warehouse

    Olin, D.A.; Atkins, J.B.

    1988-01-01

    The hydraulic design of highway drainage structures involves an evaluation of the effect of the proposed highway structures on lives, property, and stream stability. Flood hydrographs and associated flood volumes are useful tools in evaluating these effects. For design purposes, the Alabama Highway Department needs information on flood hydrographs and volumes associated with flood peaks of specific recurrence intervals (design floods) at proposed or existing bridge crossings. This report will provide the engineer with a method to estimate flood hydrographs, volumes, and lagtimes for rural and urban streams in Alabama with drainage areas less than 500 sq mi. Existing computer programs and methods to estimate flood hydrographs and volumes for ungaged streams have been developed in Georgia. These computer programs and methods were applied to streams in Alabama. The report gives detailed instructions on how to estimate flood hydrographs for ungaged rural or urban streams in Alabama with drainage areas less than 500 sq mi, without significant in-channel storage or regulations. (USGS)

  4. X-ray optics simulation and beamline design for the APS upgrade

    NASA Astrophysics Data System (ADS)

    Shi, Xianbo; Reininger, Ruben; Harder, Ross; Haeffner, Dean

    2017-08-01

    The upgrade of the Advanced Photon Source (APS) to a Multi-Bend Achromat (MBA) will increase the brightness of the APS by between two and three orders of magnitude. The APS upgrade (APS-U) project includes a list of feature beamlines that will take full advantage of the new machine. Many of the existing beamlines will be also upgraded to profit from this significant machine enhancement. Optics simulations are essential in the design and optimization of these new and existing beamlines. In this contribution, the simulation tools used and developed at APS, ranging from analytical to numerical methods, are summarized. Three general optical layouts are compared in terms of their coherence control and focusing capabilities. The concept of zoom optics, where two sets of focusing elements (e.g., CRLs and KB mirrors) are used to provide variable beam sizes at a fixed focal plane, is optimized analytically. The effects of figure errors on the vertical spot size and on the local coherence along the vertical direction of the optimized design are investigated.
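
    In the idealized thin-lens picture (a sketch only; real beamline design relies on the analytical and wave-optics tools the abstract describes), the zoom-optics idea can be written down directly. Each focusing stage images the source with

```latex
\frac{1}{p_i} + \frac{1}{q_i} = \frac{1}{f_i},
\qquad m_i = \frac{q_i}{p_i}, \qquad i = 1, 2,
```

    so a source of size s arrives at the fixed focal plane with size s' = s m_1 m_2. Tuning the two focal lengths f_1 and f_2 trades m_1 against m_2, varying the spot size while the element positions, and hence the total path p_1 + q_1 + p_2 + q_2, stay fixed.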

  5. Progress on the Multiphysics Capabilities of the Parallel Electromagnetic ACE3P Simulation Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kononenko, Oleksiy

    2015-03-26

    ACE3P is a 3D parallel simulation suite being developed at SLAC National Accelerator Laboratory. Effectively utilizing supercomputer resources, ACE3P has become a key tool for the coupled electromagnetic, thermal, and mechanical research and design of particle accelerators. Based on the existing finite-element infrastructure, a massively parallel eigensolver has been developed for modal analysis of mechanical structures. It complements the set of multiphysics tools in ACE3P and, in particular, can be used for the comprehensive study of microphonics in accelerating cavities, ensuring the operational reliability of a particle accelerator.
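
    After finite-element discretization, modal analysis of this kind reduces to the generalized eigenproblem K x = λ M x for stiffness and mass matrices. The toy 2-DOF sketch below (hypothetical matrices, not ACE3P code) shows in miniature the computation a massively parallel eigensolver performs at scale:

```python
# Generalized eigenproblem K x = lambda M x for a toy 2-DOF spring-mass
# chain; the matrices are hypothetical placeholders for the large
# finite-element systems a parallel eigensolver handles.
import numpy as np

K = np.array([[2.0, -1.0],
              [-1.0, 1.0]])   # stiffness matrix
m = np.array([2.0, 1.0])      # lumped (diagonal) masses

# With a diagonal mass matrix, reduce to a standard symmetric problem:
# A = M^(-1/2) K M^(-1/2) has the same eigenvalues as the pair (K, M).
d = 1.0 / np.sqrt(m)
A = K * np.outer(d, d)

eigvals = np.linalg.eigvalsh(A)   # ascending eigenvalues
freqs = np.sqrt(eigvals)          # natural angular frequencies
```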

  6. Addiction Science: Uncovering Neurobiological Complexity

    PubMed Central

    Volkow, N. D.; Baler, R. D.

    2013-01-01

    Until very recently, addiction research was limited by existing tools and strategies that were inadequate for studying the inherent complexity at each of the different phenomenological levels. However, powerful new tools (e.g., optogenetics and designer drug receptors) and high-throughput protocols are starting to give researchers the potential to systematically interrogate “all” genes, epigenetic marks, and neuronal circuits. These advances, combined with imaging technologies (both for preclinical and clinical studies) and a paradigm shift towards open access, have spurred an unlimited growth of datasets transforming the way we investigate the neurobiology of substance use disorders (SUD) and the factors that modulate risk and resilience. PMID:23688927

  7. Extending software repository hosting to code review and testing

    NASA Astrophysics Data System (ADS)

    Gonzalez Alvarez, A.; Aparicio Cotarelo, B.; Lossent, A.; Andersen, T.; Trzcinska, A.; Asbury, D.; Høimyr, N.; Meinhard, H.

    2015-12-01

    We will describe how CERN's services around Issue Tracking and Version Control have evolved, and what the plans for the future are. We will describe the services' main design, integration and structure, giving special attention to the new requirements from the community of users in terms of collaboration and integration tools, and to how we address this challenge when defining new services: GitLab for collaboration and code review, replacing our current Gitolite service, and Jenkins for Continuous Integration. These new services complement the existing ones to create a new global "development tool stack" where each working group can place its particular development workflow.

  8. BeadArray Expression Analysis Using Bioconductor

    PubMed Central

    Ritchie, Matthew E.; Dunning, Mark J.; Smith, Mike L.; Shi, Wei; Lynch, Andy G.

    2011-01-01

    Illumina whole-genome expression BeadArrays are a popular choice in gene profiling studies. Aside from the vendor-provided software tools for analyzing BeadArray expression data (GenomeStudio/BeadStudio), there exists a comprehensive set of open-source analysis tools in the Bioconductor project, many of which have been tailored to exploit the unique properties of this platform. In this article, we explore a number of these software packages and demonstrate how to perform a complete analysis of BeadArray data in various formats. The key steps of importing data, performing quality assessments, preprocessing, and annotation in the common setting of assessing differential expression in designed experiments will be covered. PMID:22144879

  9. PhytoCRISP-Ex: a web-based and stand-alone application to find specific target sequences for CRISPR/CAS editing.

    PubMed

    Rastogi, Achal; Murik, Omer; Bowler, Chris; Tirichine, Leila

    2016-07-01

    With the emerging interest in phytoplankton research, the need to establish genetic tools for the functional characterization of genes is indispensable. The CRISPR/Cas9 system is now well recognized as an efficient and accurate reverse genetic tool for genome editing. Several computational tools have been published allowing researchers to find candidate target sequences for the engineering of CRISPR vectors, while searching for possible off-targets for the predicted candidates. These tools provide built-in genome databases of common model organisms that are used for CRISPR target prediction. Although their predictions are highly sensitive, their applicability to non-model genomes, most notably protists, is inadequate. This motivated us to design a new CRISPR target finding tool, PhytoCRISP-Ex. Our software offers CRISPR target predictions using an extended list of phytoplankton genomes and also delivers a user-friendly standalone application that can be used for any genome. The software attempts to integrate, for the first time, most available phytoplankton genome information and provide a web-based platform for Cas9 target prediction within them with high sensitivity. By offering a standalone version, PhytoCRISP-Ex maintains an independence to be used with any organism and widens its applicability in high-throughput pipelines. PhytoCRISP-Ex outperforms all the existing tools by computing the availability of restriction sites over the most probable Cas9 cleavage sites, which can be ideal for mutant screens. PhytoCRISP-Ex is a simple, fast and accurate web interface with 13 pre-indexed and continually updated phytoplankton genomes. The software was also designed as a UNIX-based standalone application that allows the user to search for target sequences in the genomes of a variety of other species.
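
    The basic scan such tools perform is well defined for SpCas9: find every 20-nt protospacer immediately followed by an NGG PAM. The sketch below illustrates that scan on the forward strand only; it is a simplified stand-in, not the PhytoCRISP-Ex algorithm, which additionally indexes genomes, screens off-targets, and scores restriction sites near the cleavage position.

```python
# Find candidate SpCas9 targets: 20-nt protospacers followed by an NGG
# PAM, forward strand only. Illustrative stand-in, not PhytoCRISP-Ex.
import re

def find_cas9_targets(seq, guide_len=20):
    """Return (position, protospacer, PAM) tuples for NGG sites."""
    pattern = r"(?=([ACGT]{%d})([ACGT]GG))" % guide_len
    # a lookahead keeps overlapping candidate sites from being skipped
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(pattern, seq.upper())]
```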

  10. Design of Scalable and Effective Earth Science Collaboration Tool

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.

    2014-12-01

    Collaborative research is growing rapidly. Many tools, including IDEs, are now beginning to incorporate new collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis. In particular, a drastic reduction in software development time, resulting in reduced cost, has been highlighted. Recently, we have witnessed the rise of applications that allow users to share their content. Most of these applications scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut cost, expand the knowledge base, and scale research experiments. To address these needs, we developed the Earth science collaboration workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools to minimize the learning curve. During the development of the CWB, we understood that Earth science collaboration tasks are varied, and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to perform collaboration across time and location dimensions. We have used cloud technology for scaling the collaboration. The cloud has been a highly utilized and valuable tool for Earth science researchers. Among other usages, the cloud is used for sharing research results, Earth science data, and virtual machine images, allowing CWB to create and maintain research environments and networks to enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB. 
We will also discuss the integration of collaborative code development use cases for data search and discovery using NASA DAAC and simulation of satellite observations using NASA Earth Observing System Simulation Suite (NEOS3).

  11. The potential application of the blackboard model of problem solving to multidisciplinary design

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1989-01-01

    The potential application of the blackboard model of problem solving to multidisciplinary design is discussed. Multidisciplinary design problems are complex, poorly structured, and lack a predetermined decision path from the initial starting point to the final solution. The final solution is achieved using data from different engineering disciplines. Ideally, for the final solution to be the optimum solution, there must be a significant amount of communication among the different disciplines plus intradisciplinary and interdisciplinary optimization. In reality, this is not what happens in today's sequential approach to multidisciplinary design. Therefore it is highly unlikely that the final solution is the true optimum solution from an interdisciplinary optimization standpoint. A multilevel decomposition approach is suggested as a technique to overcome the problems associated with the sequential approach, but no tool currently exists with which to fully implement this technique. A system based on the blackboard model of problem solving appears to be an ideal tool for implementing this technique because it offers an incremental problem solving approach that requires no a priori determined reasoning path. Thus it has the potential of finding a more optimum solution for the multidisciplinary design problems found in today's aerospace industries.
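
    A minimal control loop conveys the essence of the blackboard model: knowledge sources watch a shared data structure and contribute increments whenever they can, with no predetermined reasoning path. The sketch below is purely illustrative; the two "disciplines" and their numbers are hypothetical.

```python
# Minimal blackboard-style control loop: knowledge sources inspect the
# shared blackboard and contribute when applicable; the loop ends at
# quiescence, with no a priori reasoning path. All names and values
# here are hypothetical illustrations.

def run_blackboard(blackboard, knowledge_sources, max_cycles=100):
    for _ in range(max_cycles):
        progressed = False
        for applicable, contribute in knowledge_sources:
            if applicable(blackboard):
                contribute(blackboard)   # incremental contribution
                progressed = True
        if not progressed:               # quiescence: no source can act
            break
    return blackboard

# Two toy "disciplines" cooperating on a shared design state.
aero = (lambda bb: "wing_area" not in bb,
        lambda bb: bb.update(wing_area=30.0))
structures = (lambda bb: "wing_area" in bb and "spar_mass" not in bb,
              lambda bb: bb.update(spar_mass=0.1 * bb["wing_area"]))

state = run_blackboard({}, [aero, structures])
```

    Each knowledge source acts only once the blackboard contains what it needs, so the interdisciplinary data flow emerges from the state of the solution rather than from a fixed sequential ordering of the disciplines.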

  12. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet, there is a lack of structured methodology that quantifies how changes in the design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the next while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. 
Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for composite part production cost. The replacement is needed because TCM estimates production costs from part weights, reflecting subtractive manufacturing of metallic origin such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST), used similarly to the Activity-Based Costing (ABC) approach to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional method of estimating time by analogy or from response surface equations fitted to historical process data. The MOST concept provides a tailored study of an individual process, typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up. The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST.
To relieve this constraint, this study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during process selection; and 2) a new user interface to guide parameter selection when building the process using MOST. Also included in this study is a demonstration of how HLCET and its constituents can be integrated with the Georgia Tech Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example to gain insight into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, with its utility demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
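The hybrid estimating concept in this record can be sketched in a few lines: start from a weight-based baseline and substitute a process-based estimate wherever higher-fidelity process data exist. The cost-estimating relationship, labor rate, and part data below are invented for illustration and are not HLCET's or TCM's actual equations:

```python
# Hypothetical sketch of hybrid cost estimating: a weight-based baseline
# (TCM-style cost-estimating relationship) is replaced, part by part, with
# a process-based estimate (ACCEM-style) as manufacturing knowledge grows.

def weight_based_cost(weight_lb, coeff=1200.0, exponent=0.85):
    """Toy cost-estimating relationship: cost = coeff * weight^exponent."""
    return coeff * weight_lb ** exponent

def process_based_cost(labor_hours, rate_per_hour, material_cost):
    """Bottom-up estimate from process labor hours and material cost."""
    return labor_hours * rate_per_hour + material_cost

def hybrid_estimate(parts):
    """Sum part costs, preferring process-based data where available."""
    total = 0.0
    for part in parts:
        if "labor_hours" in part:  # higher-fidelity process data exists
            total += process_based_cost(part["labor_hours"],
                                        part["rate"], part["material"])
        else:                      # fall back to the weight-based baseline
            total += weight_based_cost(part["weight_lb"])
    return total

parts = [
    {"name": "metallic spar", "weight_lb": 150.0},
    {"name": "composite skin", "weight_lb": 80.0,
     "labor_hours": 400.0, "rate": 95.0, "material": 12000.0},
]
print(round(hybrid_estimate(parts)))
```

As design iterations add process knowledge, more parts gain `labor_hours` entries and the estimate shifts smoothly from weight-based to process-based fidelity.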

  13. Bayesian networks for satellite payload testing

    NASA Astrophysics Data System (ADS)

    Przytula, Krzysztof W.; Hagen, Frank; Yung, Kar

    1999-11-01

    Satellite payloads are rapidly increasing in complexity, resulting in commensurate growth in the cost of manufacturing and operation. A need exists for a software tool that would assist engineers in the production and operation of satellite systems. We have designed and implemented a software tool that performs part of this task. The tool aids a test engineer in debugging satellite payloads during system testing. At this stage of satellite integration and testing, both the tested payload and the testing equipment are complicated systems consisting of a very large number of components and devices. When an error is detected during execution of a test procedure, the tool presents to the engineer a ranked list of potential sources of the error and a list of recommended further tests. On this basis, the engineer decides whether to perform some of the recommended additional tests or to replace the suspect component. The tool has been installed in a payload testing facility. The tool is based on Bayesian networks, a graphical method of representing uncertainty in terms of probabilistic influences. The Bayesian network was configured using detailed flow diagrams of testing procedures and block diagrams of the payload and testing hardware. The conditional and prior probability values were initially obtained from experts and refined in later stages of design. The Bayesian network provided a very informative model of the payload and testing equipment and inspired many new ideas regarding future test procedures and testing equipment configurations. The tool is the first step in developing a family of tools for various phases of satellite integration and operation.
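The core diagnostic step, ranking candidate fault sources after an observed test failure, reduces to Bayes' rule over priors and likelihoods. The component names, priors, and likelihoods below are invented for illustration; the actual tool derives them from test-procedure flow diagrams and expert elicitation:

```python
# Minimal sketch of posterior fault ranking for a single observed failure:
# P(fault | failure) is proportional to P(failure | fault) * P(fault).

def rank_faults(priors, likelihoods):
    """Return (component, posterior) pairs sorted most-suspect first."""
    joint = {c: priors[c] * likelihoods[c] for c in priors}
    z = sum(joint.values())                      # normalizing constant
    posterior = {c: p / z for c, p in joint.items()}
    return sorted(posterior.items(), key=lambda kv: kv[1], reverse=True)

priors = {"receiver": 0.02, "cable": 0.05, "test_rig": 0.01}
likelihoods = {"receiver": 0.9, "cable": 0.3, "test_rig": 0.8}  # P(failure | fault)
for component, p in rank_faults(priors, likelihoods):
    print(f"{component}: {p:.2f}")
```

A full Bayesian network generalizes this by propagating evidence from multiple tests through the dependency structure, rather than treating one failure in isolation.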

  14. Utilization of an agility assessment module in analysis and optimization of preliminary fighter configuration

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Biezad, Daniel

    1996-01-01

    A study has been conducted to develop and to analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparison with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivity of agility criteria to thrust loading, wing loading, and thrust vectoring was investigated. The module can compare the agility potential of different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements in preliminary design.

  15. DEVELOPMENT OF A SOFTWARE DESIGN TOOL FOR HYBRID SOLAR-GEOTHERMAL HEAT PUMP SYSTEMS IN HEATING- AND COOLING-DOMINATED BUILDINGS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yavuzturk, C. C.; Chiasson, A. D.; Filburn, T. P.

    This project provides an easy-to-use, menu-driven software tool for designing hybrid solar-geothermal heat pump systems (GHP) for both heating- and cooling-dominated buildings. No such design tool currently exists. In heating-dominated buildings, the design approach takes advantage of glazed solar collectors to effectively balance the annual thermal loads on the ground with renewable solar energy. In cooling-dominated climates, the design approach takes advantage of relatively low-cost, unglazed solar collectors as the heat-rejecting component. The primary benefit of hybrid GHPs is the reduced initial cost of the ground heat exchanger (GHX). Furthermore, solar thermal collectors can be used to balance the ground loads over the annual cycle, thus making the GHX fully sustainable; in heating-dominated buildings, the hybrid energy source (i.e., solar) is renewable, in contrast to a typical fossil fuel boiler or electric resistance as the hybrid component; in cooling-dominated buildings, use of unglazed solar collectors as a heat rejecter allows for passive heat rejection, in contrast to a cooling tower that consumes a significant amount of energy to operate; and hybrid GHPs can expand the market by allowing a reduced GHX footprint in both heating- and cooling-dominated climates. The design tool allows for the straightforward design of innovative GHP systems that currently pose a significant design challenge. The project lays the foundations for the proper and reliable design of hybrid GHP systems, which would otherwise involve a series of difficult and cumbersome steps without a system simulation approach and an automated optimization scheme. As new technologies and design concepts emerge, sophisticated design tools and methodologies must accompany them and be made usable for practitioners. A lack of reliable design tools makes practitioners reluctant to implement more complex systems.
A menu-driven software tool for the design of hybrid solar GHP systems is provided that is based on mathematically robust, validated models. An automated optimization tool is used to balance ground loads and is incorporated into the simulation engine. With knowledge of the building loads, the thermal properties of the ground, the borehole heat exchanger configuration, the heat pump peak hourly and seasonal COP for heating and cooling, the critical heat pump design entering fluid temperature, and the thermal performance of a solar collector, the total GHX length can be calculated along with the area of a supplemental solar collector array and the corresponding reduced GHX length. An economic analysis module allows for the calculation of the lowest capital cost combination of solar collector area and GHX length. Acknowledgments: This project was funded by the United States Department of Energy under DOE-DE-FOA-0000116, Recovery Act Geothermal Technologies Program: Ground Source Heat Pumps. The lead contractor, the University of Hartford, was supported by the University of Dayton and Oak Ridge National Laboratory. All funding and support for this project, as well as the contributions of graduate and undergraduate students from the contributing institutions, are gratefully acknowledged.
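The load-balancing idea for a heating-dominated building can be illustrated with a back-of-the-envelope sizing: choose a collector area whose annual solar injection offsets the net heat extracted from the ground. All numbers and the simple yield model below are assumptions for illustration, not values or equations from the actual design tool:

```python
# Hedged sketch: size a supplemental solar collector array so that annual
# solar heat injection cancels the net annual ground-load imbalance.

def collector_area_for_balance(annual_extraction_kwh,
                               annual_rejection_kwh,
                               solar_yield_kwh_per_m2,
                               collector_efficiency):
    """Collector area (m^2) needed to zero out the net ground load."""
    imbalance = annual_extraction_kwh - annual_rejection_kwh
    if imbalance <= 0:
        return 0.0  # cooling-dominated or already balanced
    usable_yield = solar_yield_kwh_per_m2 * collector_efficiency
    return imbalance / usable_yield

area = collector_area_for_balance(
    annual_extraction_kwh=60_000,   # heat pulled from the ground per year
    annual_rejection_kwh=20_000,    # heat returned during the cooling season
    solar_yield_kwh_per_m2=1_100,   # annual insolation on the collector plane
    collector_efficiency=0.5,
)
print(f"supplemental collector area ~ {area:.0f} m^2")
```

A balanced annual ground load is what lets the GHX be shortened without long-term ground temperature drift, which is the capital-cost trade the economic module explores.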

  16. PhiSiGns: an online tool to identify signature genes in phages and design PCR primers for examining phage diversity.

    PubMed

    Dwivedi, Bhakti; Schmieder, Robert; Goldsmith, Dawn B; Edwards, Robert A; Breitbart, Mya

    2012-03-04

    Phages (viruses that infect bacteria) have gained significant attention because of their abundance, diversity, and important ecological roles. However, the lack of a universal gene shared by all phages presents a challenge for phage identification and characterization, especially in environmental samples where it is difficult to culture phage-host systems. Homologous conserved genes (or "signature genes") present in groups of closely related phages can be used to explore phage diversity and define evolutionary relationships amongst these phages. Bioinformatic approaches are needed to identify candidate signature genes and design PCR primers to amplify those genes from environmental samples; however, no existing computational tool serves this purpose. Here we present PhiSiGns, a web-based and standalone application that performs a pairwise comparison of each gene present in user-selected phage genomes, identifies signature genes, generates alignments of these genes, and designs potential PCR primer pairs. PhiSiGns is available at http://www.phantome.org/phisigns/ and http://phisigns.sourceforge.net/, with a link to the source code. Here we describe the specifications of PhiSiGns and demonstrate its application with a case study. PhiSiGns provides phage biologists with a user-friendly tool to identify signature genes and design PCR primers to amplify related genes from uncultured phages in environmental samples. This bioinformatics tool will facilitate the development of novel signature genes for use as molecular markers in studies of phage diversity, phylogeny, and evolution.
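The signature-gene idea can be sketched crudely: a gene is a candidate signature if a close homolog occurs in every selected genome. The toy identity measure and the phage gene data below are assumptions for illustration; PhiSiGns itself performs proper pairwise sequence comparisons and alignments:

```python
# Illustrative sketch of signature-gene identification across phage genomes.

def identity(a, b):
    """Fraction of matching positions over the shorter of two sequences."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n if n else 0.0

def signature_genes(genomes, threshold=0.8):
    """Genes of the first genome with a close homolog in all other genomes."""
    reference, *others = genomes
    candidates = []
    for name, seq in reference.items():
        if all(any(identity(seq, s) >= threshold for s in g.values())
               for g in others):
            candidates.append(name)
    return candidates

# Invented example: two tiny "genomes" mapping gene names to sequences.
phage_a = {"terminase": "ATGACCGTTAAG", "portal": "ATGGGCCATTAA"}
phage_b = {"terL":      "ATGACCGTTAAG", "tail":   "ATGTTTGGGAAA"}
print(signature_genes([phage_a, phage_b]))
```

A real pipeline would use alignment-based similarity and then design degenerate primers against the conserved regions of each signature-gene alignment.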

  17. PhiSiGns: an online tool to identify signature genes in phages and design PCR primers for examining phage diversity

    PubMed Central

    2012-01-01

    Background Phages (viruses that infect bacteria) have gained significant attention because of their abundance, diversity, and important ecological roles. However, the lack of a universal gene shared by all phages presents a challenge for phage identification and characterization, especially in environmental samples where it is difficult to culture phage-host systems. Homologous conserved genes (or "signature genes") present in groups of closely related phages can be used to explore phage diversity and define evolutionary relationships amongst these phages. Bioinformatic approaches are needed to identify candidate signature genes and design PCR primers to amplify those genes from environmental samples; however, no existing computational tool serves this purpose. Results Here we present PhiSiGns, a web-based and standalone application that performs a pairwise comparison of each gene present in user-selected phage genomes, identifies signature genes, generates alignments of these genes, and designs potential PCR primer pairs. PhiSiGns is available at http://www.phantome.org/phisigns/ and http://phisigns.sourceforge.net/, with a link to the source code. Here we describe the specifications of PhiSiGns and demonstrate its application with a case study. Conclusions PhiSiGns provides phage biologists with a user-friendly tool to identify signature genes and design PCR primers to amplify related genes from uncultured phages in environmental samples. This bioinformatics tool will facilitate the development of novel signature genes for use as molecular markers in studies of phage diversity, phylogeny, and evolution. PMID:22385976

  18. Testing the Birth Unit Design Spatial Evaluation Tool (BUDSET) in Australia: a pilot study.

    PubMed

    Foureur, Maralyn J; Leap, Nicky; Davis, Deborah L; Forbes, Ian F; Homer, Caroline E S

    2011-01-01

    To pilot test the Birth Unit Design Spatial Evaluation Tool (BUDSET) in an Australian maternity care setting and determine whether such an instrument can measure the optimality of different birth settings. Optimally designed spaces for giving birth are likely to influence a woman's ability to experience physiologically normal labor and birth. This is important in the current industrialized environment, where increased caesarean section rates are causing concern. Measuring the optimality of a birth space is currently impossible because few tools are available. A quantitative study was undertaken to pilot test the discriminant ability of the BUDSET in eight maternity units in New South Wales, Australia. Five auditors trained in its use assessed the birth units with the BUDSET, which is based on 18 design principles and is divided into four domains (Fear Cascade, Facility, Aesthetics, and Support) with three to eight assessable items in each. Data were independently collected in eight birth units. Values for each of the domains were aggregated to provide an overall Optimality Score for each birth unit. A range of Optimality Scores was derived for the birth units (from 51 to 77 out of a possible 100 points). The BUDSET identified units with low-scoring domains; essentially these were older units and conventional labor ward settings. The BUDSET provides a way to assess the optimality of birth units and determine which domain areas may need improvement. There is potential for improvements to existing birth spaces, and considerable improvement can be made with simple, low-cost modifications. Further research is needed to validate the tool.
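The scoring scheme described, item ratings aggregated within four domains into a 0-100 Optimality Score, can be sketched as follows. The item weights, rating scale, and example ratings are invented; the published tool defines 18 design principles with three to eight assessable items per domain:

```python
# Hedged sketch of BUDSET-style aggregation into an Optimality Score.

def optimality_score(domain_ratings, max_rating=5):
    """Mean item rating per domain (0-1), averaged and scaled to 100."""
    per_domain = {d: sum(r) / (len(r) * max_rating)
                  for d, r in domain_ratings.items()}
    overall = 100 * sum(per_domain.values()) / len(per_domain)
    return overall, per_domain

# Invented auditor ratings for one birth unit, on a 1-5 scale.
ratings = {
    "Fear Cascade": [4, 3, 5, 4],
    "Facility":     [3, 3, 4],
    "Aesthetics":   [2, 3, 2, 3],
    "Support":      [4, 4, 5],
}
score, domains = optimality_score(ratings)
low = min(domains, key=domains.get)
print(f"Optimality Score: {score:.0f}; weakest domain: {low}")
```

Reporting the weakest domain alongside the overall score is what lets the tool point at targeted, low-cost modifications rather than whole-unit redesigns.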

  19. Marshall Space Flight Center's Virtual Reality Applications Program 1993

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1993-01-01

    A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process in which perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies.
These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.

  20. Case-Based Capture and Reuse of Aerospace Design Rationale

    NASA Technical Reports Server (NTRS)

    Leake, David B.

    1998-01-01

    The goal of this project is to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project applies case-based reasoning (CBR) and concept mapping (CMAP) tools to the task of capturing, organizing, and interactively accessing experiences or "cases" encapsulating the methods and rationale underlying expert aerospace design. As stipulated in the award, Indiana University and Ames personnel are collaborating on performance of research and determining the direction of research, to assure that the project focuses on high-value tasks. In the first five months of the project, we have made two visits to Ames Research Center to consult with our NASA collaborators, to learn about the advanced aerospace design tools being developed there, and to identify specific needs for intelligent design support. These meetings identified a number of task areas for applying CBR and concept mapping technology. We jointly selected a first task area to focus on: Acquiring the convergence criteria that experts use to guide the selection of useful data from a set of numerical simulations of high-lift systems. During the first funding period, we developed two software systems. First, we have adapted a CBR system developed at Indiana University into a prototype case-based reasoning shell to capture and retrieve information about design experiences, with the sample task of capturing and reusing experts' intuitive criteria for determining convergence (work conducted at Indiana University). Second, we have also adapted and refined existing concept mapping tools that will be used to clarify and capture the rationale underlying those experiences, to facilitate understanding of the expert's reasoning and guide future reuse of captured information (work conducted at the University of West Florida). 
The tools we have developed are designed to be the basis for a general framework for facilitating tasks within systems developed by the Advanced Design Technologies Testbed (ADTT) project at ARC. The tenets of our framework are (1) that the systems developed should leverage a designer's knowledge, rather than attempting to replace it; (2) that learning and user feedback must play a central role, so that the system can adapt to how it is used, and (3) that the learning and feedback processes must be as natural and as unobtrusive as possible. In the second funding period we will extend our current work, applying the tools to capturing higher-level design rationale.
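The case-based retrieval step at the heart of a CBR shell can be sketched as weighted nearest-neighbor lookup over stored design cases. The feature names, weights, and cases below are invented for illustration and are not the project's actual case representation:

```python
# Minimal sketch of case-based retrieval: index past design experiences by
# numeric feature vectors and return the most similar stored case.

def similarity(a, b, weights):
    """Weighted inverse-distance similarity over shared numeric features."""
    d = sum(w * abs(a[f] - b[f]) for f, w in weights.items())
    return 1.0 / (1.0 + d)

def retrieve(case_base, query, weights):
    """Return the stored case whose features best match the query."""
    return max(case_base,
               key=lambda c: similarity(c["features"], query, weights))

# Invented cases capturing convergence-criteria experience for CFD runs.
case_base = [
    {"rationale": "loosen residual tolerance", "features": {"mach": 0.2, "aoa": 8.0}},
    {"rationale": "refine mesh near flap",     "features": {"mach": 0.8, "aoa": 2.0}},
]
weights = {"mach": 1.0, "aoa": 0.1}
best = retrieve(case_base, {"mach": 0.25, "aoa": 7.0}, weights)
print(best["rationale"])
```

In a full CBR system, the retrieved case's rationale (here captured via concept maps) guides adaptation to the new problem, and the adapted solution is stored back as a new case.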

  1. Geographic information system-based healthcare waste management planning for treatment site location and optimal transportation routeing.

    PubMed

    Shanmugasundaram, Jothiganesh; Soulalay, Vongdeuane; Chettiyappan, Visvanathan

    2012-06-01

    In the Lao People's Democratic Republic (Lao PDR), the growth of healthcare centres, and the environmental hazards and public health risks that typically accompany them, has increased the need for healthcare waste (HCW) management planning. Effective planning of an HCW management system, including components such as treatment plant siting and an optimized routeing system for the collection and transportation of waste, is deemed important. National government offices in developing countries often lack the proper tools and methodologies because of the high costs usually associated with them. This study therefore demonstrates the use of an inexpensive GIS modelling tool for healthcare waste management in the country. Two scenarios were designed for this study of HCW management: (a) locating centralized treatment plants and designing optimum travel routes for waste collection from nearby healthcare facilities; and (b) utilizing existing hospital incinerators and designing optimum routes for collecting waste from nearby healthcare facilities. Spatial analysis paved the way to understanding the spatial distribution of healthcare wastes and identifying hotspots of higher waste-generating locations. Optimal route models were designed for collecting and transporting HCW to treatment plants, which also highlighted constraints in collecting and transporting waste for treatment and disposal. The proposed model can be used as a decision support tool for the efficient management of hospital wastes by government healthcare waste management authorities and hospitals.
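At its core, GIS network routeing of the kind described reduces to least-cost path search over a weighted road graph. The sketch below uses Dijkstra's algorithm on an invented toy network; the study itself used GIS network analysis over real road data:

```python
# Small sketch of optimal routeing: least-cost path from a healthcare
# facility to a treatment plant over a weighted road network.
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm; returns (total_distance, node list)."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (dist + w, nxt, path + [nxt]))
    return float("inf"), []

# Invented road segments with distances in km.
roads = {
    "clinic":   {"junction": 4.0, "hospital": 9.0},
    "junction": {"hospital": 3.0, "plant": 7.0},
    "hospital": {"plant": 2.0},
}
dist, route = shortest_path(roads, "clinic", "plant")
print(dist, " -> ".join(route))
```

Collection routeing for many facilities layers a vehicle-routeing formulation on top of such pairwise shortest paths, which is where the constraints the study highlights (road access, vehicle capacity) enter.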

  2. Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.

    PubMed

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang

    2016-11-01

    Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types, with both location and sensor type treated simultaneously as decision variables. Our method combines linear uncertainty quantification with a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and by parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which was established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently and with easily accessible tools prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
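The decision problem, choosing (location, sensor type) pairs jointly rather than one at a time, can be sketched with a brute-force search standing in for the paper's genetic algorithm. The data-worth table and the crude co-location penalty below are invented; the paper computes data worth from linear uncertainty quantification:

```python
# Hedged sketch of multilocation, multitype sensor-placement optimization.
from itertools import combinations

# Assumed data-worth (variance reduction) of each candidate (well, type).
worth = {("w1", "head"): 0.20, ("w1", "conc"): 0.35,
         ("w2", "head"): 0.25, ("w2", "conc"): 0.15,
         ("w3", "head"): 0.30, ("w3", "conc"): 0.10}

def best_design(candidates, n_sensors, redundancy=0.8):
    """Pick n sensors maximizing total worth, discounting co-located pairs."""
    def value(design):
        total = sum(worth[c] for c in design)
        locations = [loc for loc, _ in design]
        # crude interdependency model: co-located sensors share information
        overlap = len(locations) - len(set(locations))
        return total * (redundancy ** overlap)
    return max(combinations(candidates, n_sensors), key=value)

design = best_design(sorted(worth), 2)
print(design)
```

The point the paper makes is precisely that such interdependencies mean greedy one-sensor-at-a-time placement can be suboptimal, which motivates a global (genetic) search over whole designs.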

  3. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
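The performance-versus-operability trade the tool exposes can be illustrated with a toy limiter: the power-management command tracks thrust demand, and a transient limiter backs it off to protect surge margin. The control law, gains, and limits below are assumptions for illustration, not TTECTrA's actual control design:

```python
# Hedged sketch of a thrust controller with a transient (surge) limiter.

def limited_fuel_command(thrust_error, n_dot, kp=0.002, n_dot_max=300.0,
                         wf_min=0.2, wf_max=3.0):
    """Proportional thrust tracking, clipped by an acceleration limiter."""
    wf = kp * thrust_error          # power management: track thrust demand
    if n_dot > n_dot_max:           # limiter: rotor accelerating too fast,
        wf *= n_dot_max / n_dot     # back off fuel to protect surge margin
    return min(max(wf, wf_min), wf_max)

# Aggressive demand with a high spool acceleration: the limiter engages.
print(limited_fuel_command(thrust_error=1000.0, n_dot=450.0))
```

Simulating the closed loop with and without the limiter is what reveals the transient trade-off (faster thrust response versus surge-margin protection) that steady-state systems analysis cannot see.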

  4. HiC-bench: comprehensive and reproducible Hi-C data analysis designed for parameter exploration and benchmarking.

    PubMed

    Lazaris, Charalampos; Kelly, Stephen; Ntziachristos, Panagiotis; Aifantis, Iannis; Tsirigos, Aristotelis

    2017-01-05

    Chromatin conformation capture techniques have evolved rapidly over the last few years and have provided new insights into genome organization at an unprecedented resolution. Analysis of Hi-C data is complex and computationally intensive, involving multiple tasks and requiring robust quality assessment. This has led to the development of several tools and methods for processing Hi-C data. However, most of the existing tools do not cover all aspects of the analysis and offer only a few quality assessment options. Additionally, the availability of a multitude of tools leaves scientists wondering how these tools and their associated parameters can be used optimally, and how potential discrepancies can be interpreted and resolved. Most importantly, investigators need to be assured that slight changes in parameters and/or methods do not affect the conclusions of their studies. To address these issues (compare, explore, and reproduce), we introduce HiC-bench, a configurable computational platform for comprehensive and reproducible analysis of Hi-C sequencing data. HiC-bench performs all common Hi-C analysis tasks, such as alignment, filtering, contact matrix generation and normalization, identification of topological domains, and scoring and annotation of specific interactions, using both published tools and our own. We have also embedded various tasks that perform quality assessment and visualization. HiC-bench is implemented as a data flow platform with an emphasis on analysis reproducibility. Additionally, the user can readily perform parameter exploration and comparison of different tools in a combinatorial manner that takes into account all desired parameter settings in each pipeline task. This unique feature facilitates the design and execution of complex benchmark studies that may involve combinations of multiple tool/parameter choices in each step of the analysis.
To demonstrate the usefulness of our platform, we performed a comprehensive benchmark of existing and new TAD callers, exploring different matrix correction methods, parameter settings, and sequencing depths. Users can extend our pipeline by adding more tools as they become available. HiC-bench is an easy-to-use and extensible platform for comprehensive analysis of Hi-C datasets. We expect that it will facilitate current analyses and help scientists formulate and test new hypotheses in the field of three-dimensional genome organization.
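The combinatorial parameter-exploration idea is essentially a Cartesian product over per-step choices. The step names, tool names, and parameter values below are illustrative assumptions, not HiC-bench's actual configuration schema:

```python
# Sketch of combinatorial pipeline-branch enumeration: one branch per
# combination of tool/parameter choices across all pipeline steps.
from itertools import product

pipeline_options = {
    "alignment":  ["bowtie2"],
    "correction": ["ic", "naive"],
    "tad_caller": ["caller_a", "caller_b"],
    "resolution": [40_000, 100_000],
}

def enumerate_branches(options):
    """Yield one {step: choice} dict per combination of choices."""
    steps = list(options)
    for combo in product(*(options[s] for s in steps)):
        yield dict(zip(steps, combo))

branches = list(enumerate_branches(pipeline_options))
print(len(branches))  # product of the option counts per step
for b in branches[:2]:
    print(b)
```

Running every branch and caching intermediate results per choice is what makes benchmark studies over many tool/parameter combinations tractable and reproducible.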

  5. Simplified spacecraft vulnerability assessments at component level in early design phase at the European Space Agency's Concurrent Design Facility

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schäfer, Frank K.; Cardone, Tiziana; Ferreira, Ivo; Gerené, Sam; Destefanis, Roberto; Grassi, Lilith

    2016-12-01

    During recent years, state-of-the-art risk assessment of the threat posed to spacecraft by micrometeoroids and space debris has been expanded to the analysis of failure modes of internal spacecraft components. This method can now be used to perform risk analyses for satellites that assess various failure levels, from failure of specific sub-systems to catastrophic break-up. This assessment methodology is based on triple-wall ballistic limit equations (BLEs), specifically the Schäfer-Ryan-Lambert (SRL) BLE, which describes failure threshold levels for satellite components following a hypervelocity impact. The methodology is implemented in the Particle Impact Risk and Vulnerability Analysis Tool (PIRAT). During a recent European Space Agency (ESA) funded study, PIRAT's functionality was expanded to provide an interface to ESA's Concurrent Design Facility (CDF). The additions include a geometry importer and an OCDT (Open Concurrent Design Tool) interface. The new interface provides both expanded geometrical flexibility, via external computer-aided design (CAD) modelling, and easy import of existing data without extensive preparation of the model. The reduced effort required to perform vulnerability analyses makes them feasible during the early design phase, when modifications to a satellite design can still be undertaken with relatively little extra effort. The integration of PIRAT in the CDF marks the first time that vulnerability analyses can be performed in-session in ESA's CDF and, more generally, the first time that comprehensive vulnerability studies can be applied cost-effectively in the early design phase.

  6. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    NASA Astrophysics Data System (ADS)

    Iqbal, Liaquat Ullah

    An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering, and Manufacturing (CAD, CAE, and CAM) tools such as CATIA, FLUENT, ANSYS, and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low- to medium-fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for geometry, aerodynamics, and structural and environmental performance supports the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift-over-drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact on aircraft performance more accurately. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and hybrid wing body (HWB). Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of claimed aerodynamic performance gains and assessment of the risk of structural failure due to the different pressure distribution in the fuselage compared with a tube-and-wing design. The higher fidelity aerodynamics and structural models can also lead to better cost estimates that help reduce financial risk.
This helps in achieving better designs with reduced risk in lesser time and cost. The approach is shown to eliminate the traditional boundary between the conceptual and the preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples for the validation and utilization of the Multidisciplinary Design and Optimization (MDO) Tool are presented using missions for the Medium and High Altitude Long Range/Endurance Unmanned Aerial Vehicles (UAVs).

  7. Self-assembly kinetics of microscale components: A parametric evaluation

    NASA Astrophysics Data System (ADS)

    Carballo, Jose M.

    The goal of the present work is to develop and evaluate a parametric model of a basic microscale Self-Assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging, and recent research efforts have proposed adapting nanoscale SA processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly, capillary) to spontaneously assemble components. However, there are challenges to implementing microscale SA as a commercial process: the lack of design tools prevents simple process optimization. Previous efforts have each characterized a specific aspect of the SA process, but existing microscale SA models do not characterize the inter-component interactions. All existing models have reduced the outcome of an SA interaction to an experimentally derived value specific to a particular configuration, instead of evaluating that outcome as a function of component-level parameters (such as speed, geometry, bonding energy, and direction). The present study parameterizes the outcome of interactions and evaluates the effect of key parameters, closing a gap left by existing microscale SA models and adding a key piece toward a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for the probability of assembly of basic SA interactions, where a basic SA interaction is defined as the event in which a single part arrives at an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation, and incidence angle for the component and the assembly site. Second, an experimental SA system was designed and implemented to create individual SA interactions while controlling process parameters independently.
    SA experiments measured the outcome of SA interactions while studying the independent effects of each parameter. As a first step toward a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low-kinetic-energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. These results indicate that SA can be modeled as an energy-based process, because path-dependence effects are small. Assembly probability is linearly related to orientation probability; the proportionality constant is based on the area fraction of the sites, with an amplification factor that accounts for the ability of capillary forces to align parts with only very small areas of contact when kinetic energy is low. The results provide unprecedented insight into SA interactions, and the present study is a key step toward completing a basic model of a general SA process. Moreover, the outcome of this work can complement existing SA process models to create a complete design tool for microscale SA systems. In addition to the SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that major contributors to experimental variation are the stochastic nature of SA interactions and the limited sample size of the experiments. Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize uncertainty in future SA experiments.
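    The linear relation described above can be sketched in a few lines. The function and parameter names below are hypothetical illustrations of the stated model (assembly probability proportional to orientation probability, with the constant set by the site area fraction and an amplification factor), not code from the thesis.

```python
def assembly_probability(p_orientation, site_area_fraction, amplification):
    """Hedged sketch of the reported energy-based model: assembly
    probability is linear in orientation probability, with the
    proportionality constant given by the site area fraction scaled
    by a capillary-alignment amplification factor."""
    p = amplification * site_area_fraction * p_orientation
    return min(p, 1.0)  # clamp: a probability cannot exceed one
```

    For example, with an amplification factor of 2.0 and a site area fraction of 0.2, a part that arrives correctly oriented half the time would assemble with probability 0.2 under this sketch.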

  8. Collaboration across eight research centers: unanticipated benefits and outcomes for project managers.

    PubMed

    Perez, Norma A; Weathers, Benita; Willis, Marilyn; Mendez, Jacqueline

    2013-02-01

    Managers of transdisciplinary collaborative research lack suitable didactic material to support the implementation of research methodologies and to build ongoing partnerships with community representatives and peers, both between and within academic centers. This article provides insight into the collaborative efforts of project managers involved in multidisciplinary research and their subsequent development of a tool kit for research project managers and directors. Project managers from the 8 Centers for Population Health and Health Disparities across the nation participated in monthly teleconferences to share experiences and offer advice on how to achieve high participation rates and maintain community involvement, in collaboration with researchers and community leaders, toward the common goal of decreasing health inequities. In the process, the managers recognized and seized the opportunity to produce a tool kit designed for future project managers and directors. Project managers in geographically distinct locations maintained a commitment to work together over 4 years and subsequently built upon an existing communications network to design a tool kit that could be disseminated easily to a diverse audience.

  9. Scaffolding the design of accessible eLearning content: a user-centered approach and cognitive perspective.

    PubMed

    Catarci, Tiziana; De Giovanni, Loredana; Gabrielli, Silvia; Kimani, Stephen; Mirabella, Valeria

    2008-08-01

    There exist various guidelines for facilitating the design, preparation, and deployment of accessible eLearning applications and contents. However, such guidelines prevalently address accessibility in a rather technical sense, without giving sufficient consideration to the cognitive aspects and issues related to the use of eLearning materials by learners with disabilities. In this paper we describe how a user-centered design process was applied to develop a method and set of guidelines for didactical experts to scaffold their creation of accessible eLearning content, based on a more sound approach to accessibility. The paper also discusses possible design solutions for tools supporting eLearning content authors in the adoption and application of the proposed approach.

  10. Design of High Field Solenoids made of High Temperature Superconductors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartalesi, Antonio; /Pisa U.

    2010-12-01

    This thesis starts from the analytical mechanical analysis of a superconducting solenoid loaded by self-generated Lorentz forces. A finite element model is also proposed and verified against the analytical results. To study the anisotropic behavior of a coil made of layers of superconductor and insulation, a finite element meso-mechanical model is proposed and designed. The resulting material properties are then used in the main solenoid analysis. In parallel, design work is performed as well: an existing Insert Test Facility (ITF) is adapted and structurally verified to support a coil made of YBa2Cu3O7, a High Temperature Superconductor (HTS). Finally, a technological winding process is proposed and the required tooling is designed.

  11. Integrating opto-thermo-mechanical design tools: open engineering's project presentation

    NASA Astrophysics Data System (ADS)

    De Vincenzo, P.; Klapka, Igor

    2017-11-01

    An integrated numerical simulation package dedicated to the analysis of the coupled interactions of optical devices is presented. To reduce human intervention during data transfers, it is based on in-memory communication between the structural analysis software OOFELIE and the optical design application ZEMAX. It allows the automated enhancement of an existing optical design with information on the deformations of optical surfaces due to thermomechanical loads. From the knowledge of these deformations, a grid of points or a decomposition based on Zernike polynomials can be generated for each surface. These data are then applied to the optical design. Finally, indicators can be retrieved from ZEMAX in order to compare the optical performance with that of the system in its nominal configuration.

  12. Multidisciplinary Design and Analysis for Commercial Aircraft

    NASA Technical Reports Server (NTRS)

    Cummings, Russell M.; Freeman, H. JoAnne

    1999-01-01

    Multidisciplinary design and analysis (MDA) has become the normal mode of operation within most aerospace companies, but the impact of this change has largely not been reflected at many universities. In an effort to determine whether the emergence of multidisciplinary design concepts should influence engineering curricula, NASA asked several universities (Virginia Tech, Georgia Tech, Clemson, BYU, and Cal Poly) to investigate the practicality of introducing MDA concepts within their undergraduate curricula. A multidisciplinary team of faculty, students, and industry partners evaluated the aeronautical engineering curriculum at Cal Poly. A variety of ways were found to introduce MDA themes into the curriculum without adding courses or units to the existing program. Both analytic and educational tools for multidisciplinary design of aircraft have been developed and implemented.

  13. Design automation for integrated nonlinear logic circuits (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Van Vaerenbergh, Thomas; Pelc, Jason; Santori, Charles; Bose, Ranojoy; Kielpinski, Dave; Beausoleil, Raymond G.

    2016-05-01

    A key enabler of the IT revolution of the late 20th century was the development of electronic design automation (EDA) tools allowing engineers to manage the complexity of electronic circuits with transistor counts now reaching into the billions. Recently, we have been developing large-scale nonlinear photonic integrated logic circuits for next-generation all-optical information processing. At this time, a sufficiently powerful EDA-style software tool chain to design circuits of this complexity does not yet exist. Here we describe a hierarchical approach to automating the design and validation of photonic integrated circuits, which can scale to several orders of magnitude higher complexity than the state of the art. Most photonic integrated circuits developed today consist of a small number of components and only limited hierarchy. For example, a simple photonic transceiver may contain on the order of 10 building-block components, consisting of grating couplers for photonic I/O, modulators, and signal splitters/combiners. Because such circuits are relatively easy to lay out by hand (or by simple script), existing photonic design tools have relatively little automation in comparison to electronics tools. But demonstrating all-optical logic will require significantly more complex photonic circuits containing up to 1,000 components, which are infeasible to design manually. Our design framework is based on Python software from Luceda Photonics, which provides an environment to describe components, simulate their behavior, and export design files (GDS) to foundries for fabrication. At a fundamental level, a photonic component is described as a parametric cell (PCell), much as in electronics design. PCells are described by the geometric characteristics of their layout. A critical part of the design framework is the implementation of PCells as Python objects.
    PCell objects can then use inheritance to simplify design, and hierarchical designs can be made by creating composite PCells (modules) which consist of primitive building-block PCells (components). To automatically produce layouts, we built on a construct provided by Luceda called a PlaceAndAutoRoute cell: we create a module by supplying a list of child cells and a list of the desired connections between them (e.g., the out0 port of a microring is connected to a grating coupler). This functionality allowed us to write algorithms that automatically lay out the components: for instance, by laying out the first component and walking through the list of connections, checking whether the next component is already placed. The placement and orientation of each new component are determined by minimizing the length of the connecting waveguide. Our photonic circuits also utilize electrical signals to tune the photonic elements (setting propagation phases or microring resonant frequencies via thermo-optical tuning): the algorithm also routes the contacts for the metal heaters to contact pads at the edge of the circuit being designed, where they can be reached by electrical probes. We are currently validating a test run fabricated over the summer, and will use detailed characterization results to prepare our final design cycle, in which we aim to demonstrate complex operational logic circuits containing ~50-100 nonlinear resonators.
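    The greedy placement walk described above can be sketched as follows. This is a simplified stand-in for the actual Luceda-based implementation: the component names, the fixed pitch, and the collinear layout are invented for illustration, and real placement would minimize waveguide length in two dimensions.

```python
def auto_place(components, connections, pitch=100.0):
    """Sketch of a connection-list placement walk: place the first
    component at the origin, then walk the connection list, placing
    each not-yet-placed neighbor one pitch away from its partner.
    components: list of names; connections: list of (a, b) name pairs.
    Returns {name: (x, y)} positions along a simple horizontal line."""
    placed = {}
    if components:
        placed[components[0]] = (0.0, 0.0)
    for a, b in connections:
        # whichever endpoint is already placed anchors the other one
        for src, dst in ((a, b), (b, a)):
            if src in placed and dst not in placed:
                x, y = placed[src]
                placed[dst] = (x + pitch, y)  # short connecting waveguide
    return placed
```

    Walking a transceiver-like chain of grating coupler, microring, and output coupler places the three parts at successive pitch intervals; a production version would also handle branching connectivity and orientation.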

  14. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection.

    PubMed

    Kasaie, Parastu; Mathema, Barun; Kelton, W David; Azman, Andrew S; Pennington, Jeff; Dowdy, David W

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission ("recent transmission proportion"), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional 'n-1' approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the 'n-1' technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the 'n-1' model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models' performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data.
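    For context, the traditional 'n-1' estimator that the regression tools aim to correct can be written in one line; this is the textbook formulation, not the authors' code. Each observed cluster of size n is assumed to contribute n-1 recently transmitted cases.

```python
def n_minus_1_estimate(total_cases, clustered_cases, n_clusters):
    """'n-1' estimate of the recent transmission proportion:
    (clustered cases - number of clusters) / total cases."""
    return (clustered_cases - n_clusters) / total_cases
```

    For example, 40 clustered cases in 10 clusters among 100 total cases gives an estimate of 0.3; the abstract's point is that this value is biased, particularly downward, when sampling coverage or study duration is low.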

  15. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection

    PubMed Central

    Kasaie, Parastu; Mathema, Barun; Kelton, W. David; Azman, Andrew S.; Pennington, Jeff; Dowdy, David W.

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission (“recent transmission proportion”), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional ‘n-1’ approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the ‘n-1’ technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the ‘n-1’ model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models’ performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data. PMID:26679499

  16. An assessment of separable fluid connector system parameters to perform a connector system design optimization study

    NASA Technical Reports Server (NTRS)

    Prasthofer, W. P.

    1974-01-01

    The key to optimization of design where there are a large number of variables, all of which may not be known precisely, lies in the mathematical tool of dynamic programming developed by Bellman. This methodology can lead to optimized solutions for the design of critical systems in a minimum amount of time, even when there are a great number of acceptable configurations to be considered. To demonstrate the usefulness of dynamic programming, an analytical method is developed for evaluating the relationships among the numerous existing connector designs to find the optimum configuration. The data utilized in the study were generated from 900 flanges designed for six subsystems of the S-IB stage of the Saturn IB launch vehicle.
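    As a generic illustration of the Bellman recursion invoked above (the stage structure and cost function here are invented for the example, not taken from the Saturn flange data), a stage-wise minimization over candidate configurations can be written as:

```python
def min_total_cost(stages, transition_cost):
    """Bellman-style stage-wise dynamic program.
    stages: list of lists of candidate options, one list per design stage.
    transition_cost(a, b): cost of following option a with option b
    (e.g., an interface/compatibility penalty between connector choices).
    Returns the minimum total cost over all stage-by-stage configurations,
    examining sums of options-per-stage rather than their product."""
    # best cost to reach each option in the current stage
    best = {opt: 0.0 for opt in stages[0]}
    for nxt in stages[1:]:
        best = {b: min(best[a] + transition_cost(a, b) for a in best)
                for b in nxt}
    return min(best.values())
```

    The recursion keeps only the cheapest way to reach each option at each stage, which is what lets dynamic programming search a combinatorially large configuration space in linear-in-stages time.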

  17. Complex systems in metabolic engineering.

    PubMed

    Winkler, James D; Erickson, Keesha; Choudhury, Alaksh; Halweg-Edwards, Andrea L; Gill, Ryan T

    2015-12-01

    Metabolic engineers manipulate intricate biological networks to build efficient biological machines. The inherent complexity of this task, derived from the extensive and often unknown interconnectivity between and within these networks, often prevents researchers from achieving desired performance. Other fields have developed methods to tackle the issue of complexity for their unique subset of engineering problems, but to date, there has not been extensive and comprehensive examination of how metabolic engineers use existing tools to ameliorate this effect on their own research projects. In this review, we examine how complexity affects engineering at the protein, pathway, and genome levels within an organism, and the tools for handling these issues to achieve high-performing strain designs. Quantitative complexity metrics and their applications to metabolic engineering versus traditional engineering fields are also discussed. We conclude by predicting how metabolic engineering practices may advance in light of an explicit consideration of design complexity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Knowledge-based environment for optical system design

    NASA Astrophysics Data System (ADS)

    Johnson, R. Barry

    1991-01-01

    Optical systems are extensively utilized by industry, government, and military organizations. The conceptual design, engineering design, fabrication, and testing of these systems presently requires significant time, typically on the order of 3-5 years. The Knowledge-Based Environment for Optical System Design (KB-OSD) Program has as its principal objectives the development of a methodology and tool(s) that will make a notable reduction in the development time of optical system projects, reduce technical risk, and reduce overall cost. KB-OSD can be considered as a computer-based optical design associate for system engineers and design engineers. By utilizing artificial intelligence technology coupled with extensive design/evaluation computer application programs and knowledge bases, the KB-OSD will provide the user with assistance and guidance to accomplish such activities as (i) develop system-level and hardware-level requirements from mission requirements, (ii) formulate conceptual designs, (iii) construct a statement of work for an RFP, (iv) develop engineering-level designs, (v) evaluate an existing design, and (vi) explore the sensitivity of a system to changing scenarios. The KB-OSD comprises a variety of computer platforms, including a Stardent Titan supercomputer, numerous design programs (lens design, coating design, thermal, materials, structural, atmospherics, etc.), databases, and heuristic knowledge bases. An important element of the KB-OSD Program is the inclusion of the knowledge of individual experts in various areas of optics and optical system engineering. This knowledge is obtained by KB-OSD knowledge engineers performing

  19. Development of a quality assessment tool for systematic reviews of observational studies (QATSO) of HIV prevalence in men having sex with men and associated risk behaviours

    PubMed Central

    Wong, William CW; Cheung, Catherine SK; Hart, Graham J

    2008-01-01

    Background Systematic reviews based on the critical appraisal of observational and analytic studies on HIV prevalence and risk factors for HIV transmission among men having sex with men are very useful for health care decisions and planning. Such appraisal is particularly difficult, however, as the quality assessment tools available for use with observational and analytic studies are poorly established. Methods We reviewed the existing quality assessment tools for systematic reviews of observational studies and developed a concise quality assessment checklist to help standardise decisions regarding the quality of studies, with careful consideration of issues such as external and internal validity. Results A pilot version of the checklist was developed based on epidemiological principles, reviews of study designs, and existing checklists for the assessment of observational studies. The Quality Assessment Tool for Systematic Reviews of Observational Studies (QATSO) Score consists of five items: External validity (1 item), reporting (2 items), bias (1 item) and confounding factors (1 item). Expert opinions were sought and it was tested on manuscripts that fulfil the inclusion criteria of a systematic review. Like all assessment scales, QATSO may oversimplify and generalise information yet it is inclusive, simple and practical to use, and allows comparability between papers. Conclusion A specific tool that allows researchers to appraise and guide study quality of observational studies is developed and can be modified for similar studies in the future. PMID:19014686

  20. Simple Nutrition Screening Tool for Pediatric Inpatients.

    PubMed

    White, Melinda; Lawson, Karen; Ramsey, Rebecca; Dennis, Nicole; Hutchinson, Zoe; Soh, Xin Ying; Matsuyama, Misa; Doolan, Annabel; Todd, Alwyn; Elliott, Aoife; Bell, Kristie; Littlewood, Robyn

    2016-03-01

    Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Two affirmative answers to the PNST questions were found to maximize the specificity and sensitivity to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with the sensitivity and specificity being less than 69%. The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk. © 2014 American Society for Parenteral and Enteral Nutrition.
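    The decision rule and accuracy figures reported above are simple to express. The sketch below uses hypothetical function names (it is not the published tool): it applies the two-affirmative-answer threshold and computes sensitivity and specificity against a reference assessment such as the pediatric SGNA.

```python
def pnst_at_risk(answers):
    """Two or more affirmative answers to the four screening questions
    flag nutrition risk, per the threshold reported above."""
    return sum(answers) >= 2

def sensitivity_specificity(predicted, actual):
    """predicted/actual: parallel lists of booleans
    (screened at risk / malnourished by the reference assessment)."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    fp = sum(p and (not a) for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)
```

    Sensitivity is the fraction of reference-positive patients the screen catches; specificity is the fraction of reference-negative patients it correctly clears, matching the 77.8% and 82.1% figures reported against the pediatric SGNA.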

  1. Understanding the Potential Content and Structure of an International Convention on the Human Rights of People with Disabilities: Sample Treaty Provisions Drawn from Existing International Instruments. A Reference Tool.

    ERIC Educational Resources Information Center

    Lord, Janet E.

    This document is designed to prepare advocates in the international disability community for productive participation in the development of international conventions on the human rights of people with disabilities. Knowledge of the standard categories of international law provisions will help participants address issues related to the structure of…

  2. Advanced Concepts Theory Annual Report 1989

    DTIC Science & Technology

    1990-03-29

    kinetic energy to x-ray conversion and are being evaluated using nickel array implosion calculations. ...Maxwell Laboratory aluminum array implosion...general, we need to evaluate the degree of machine PRS decoupling produced by runaway electrons, and the existence of a corona may be a relevant aspect of...the tools necessary to carry out data analysis and interpretation and (4) promote the design and evaluation of new experiments and new improved loads

  3. Resilient Software Systems

    DTIC Science & Technology

    2015-06-01

    and tools, called model-integrated computing (MIC) [3] relies on the use of domain-specific modeling languages for creating models of the system to be...hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling...are produced one-off and not for the mass market, the scope for price reduction based on the market demands is non-existent. Processes to create

  4. Candidate R&D Thrusts for the Software Technology Initiative.

    DTIC Science & Technology

    1981-05-01

    computer-aided design and manufacturing efforts provide examples of multiple representations and multiple manipulation modes. R&D difficulties exist in...farfetched, but the potential payoffs are enormous. References: Birk, J., and R. Kelley. Research Needed to Advance the State of Knowledge in Robotics. In...and specification languages would be beneficial. This R&D effort may also result in fusion with management tools with which an acquisition manager

  5. IGA: A Simplified Introduction and Implementation Details for Finite Element Users

    NASA Astrophysics Data System (ADS)

    Agrawal, Vishal; Gautam, Sachin S.

    2018-05-01

    Isogeometric analysis (IGA) is a recently introduced technique that employs the Computer Aided Design (CAD) concept of Non-Uniform Rational B-Splines (NURBS) to bridge the substantial bottleneck between the CAD and finite element analysis (FEA) fields. The direct transfer of exact CAD models into the analysis alleviates the issues originating from geometrical discontinuities and thus significantly reduces the design-to-analysis time in comparison with the traditional FEA technique. Since its inception, research in the field of IGA has been accelerating, and the technique has been applied to various problems. However, employing CAD tools in the area of FEA requires adapting the existing implementation procedure to the framework of IGA. The use of IGA also requires in-depth knowledge of both the CAD and FEA fields, which can be overwhelming for a beginner. Hence, in this paper, a simplified introduction and implementation details for incorporating the NURBS-based IGA technique within an existing FEA code are presented. It is shown that, with little modification, the available standard code structure of FEA can be adapted for IGA. For a clear and concise explanation of these modifications, a step-by-step implementation of a benchmark plate with a circular hole under in-plane tension is included.
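    As one concrete example of the CAD machinery IGA borrows, the B-spline basis functions underlying NURBS follow the standard Cox-de Boor recursion. The sketch below is a textbook implementation for illustration, not code from the paper.

```python
def bspline_basis(i, p, u, U):
    """Cox-de Boor recursion for the i-th B-spline basis function of
    degree p, evaluated at parameter u over knot vector U. These basis
    functions replace Lagrange shape functions in NURBS-based IGA."""
    if p == 0:
        # piecewise-constant base case: indicator of the knot span
        return 1.0 if U[i] <= u < U[i + 1] else 0.0
    left = right = 0.0
    if U[i + p] != U[i]:  # skip terms with zero-width support (0/0 -> 0)
        left = (u - U[i]) / (U[i + p] - U[i]) * bspline_basis(i, p - 1, u, U)
    if U[i + p + 1] != U[i + 1]:
        right = ((U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1])
                 * bspline_basis(i + 1, p - 1, u, U))
    return left + right
```

    On the open knot vector [0, 0, 0, 1, 1, 1] the three quadratic basis functions reduce to the Bernstein polynomials (1-u)^2, 2u(1-u), and u^2, and they sum to one at any interior u (partition of unity), which is the property FEA assembly routines rely on.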

  6. Traceability of patient records usage: barriers and opportunities for improving user interface design and data management.

    PubMed

    Cruz-Correia, Ricardo; Lapão, Luís; Rodrigues, Pedro Pereira

    2011-01-01

    Although IT governance practices (like ITIL, which recommends the use of audit logs for proper service-level management) are being introduced in many hospitals to cope with increasing information quality and safety requirements, the maturity level of hospital IT departments is still not sufficient for frequent use of audit logs. This paper addresses issues related to the existence of audit trails (AT) in patient records, describes the hospitals' scenario, and produces recommendations. Representatives from four hospitals were interviewed regarding the use of AT in their hospital information systems. Very few AT are known to exist in these hospitals (an average of 1 per hospital among an estimated 21 existing information systems). CIOs should be much more concerned with the existence and maintenance of AT. Recommendations include server clock synchronization and the use of advanced log visualization tools.

  7. Aero-Thermo-Structural Design Optimization of Internally Cooled Turbine Blades

    NASA Technical Reports Server (NTRS)

    Dulikravich, G. S.; Martin, T. J.; Dennis, B. H.; Lee, E.; Han, Z.-X.

    1999-01-01

    A set of robust and computationally affordable inverse shape design and automatic constrained optimization tools has been developed for improving the performance of internally cooled gas turbine blades. The design methods are applicable to the aerodynamics, heat transfer, and thermoelasticity aspects of the turbine blade. Maximum use of existing proven disciplinary analysis codes is possible with this design approach. Preliminary computational results demonstrate possibilities to design blades with minimized total pressure loss and maximized aerodynamic loading. At the same time, these blades are capable of sustaining significantly higher inlet hot gas temperatures while requiring remarkably lower coolant mass flow rates. These results suggest that it is possible to design internally cooled turbine blades that will cost less to manufacture, will have a longer life span, and will perform as well as, if not better than, film-cooled turbine blades.

  8. Selection of software for mechanical engineering undergraduates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheah, C. T.; Yin, C. S.; Halim, T.

    A major problem with the undergraduate mechanical engineering course is students' limited exposure to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as suitable for undergraduate work in mechanical engineering, e.g., solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  9. A summary of NASA/Air Force full scale engine research programs using the F100 engine

    NASA Technical Reports Server (NTRS)

    Deskin, W. J.; Hurrell, H. G.

    1979-01-01

    A full scale engine research (FSER) program conducted with the F100 engine is presented. The program mechanism is described and the F100 test vehicles utilized are illustrated. Technology items were addressed in the areas of swirl augmentation, flutter phenomena, advanced electronic control logic theory, strain gage technology, and distortion sensitivity. The associated test programs are described. The FSER approach utilizes existing state-of-the-art engine hardware to evaluate advanced technology concepts and problem areas. Aerodynamic phenomena previously not considered by design systems were identified and incorporated into industry design tools.

  10. Electrical coupled Morris-Lecar neurons: From design to pattern analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binczak, S.; Behdad, R.; Rossé, M.

    2016-06-08

    In this study, an experimental electronic neuron based on the Morris-Lecar model is presented, intended to serve as an experimental unit for studying the collective behavior of robustly coupled neurons. The circuit design is derived from the ionic currents of the model. Weak coupling of such neurons, simulated in Multisim, can generate clusters that depend on the neurons' boundary conditions and initial conditions. For this study, we work in the region close to the fold bifurcation of limit cycles, where two limit cycles exist: one stable and one unstable.
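    The two-variable Morris-Lecar dynamics behind such a circuit can be reproduced numerically. The sketch below integrates the standard model with forward Euler; all parameter values are a textbook set from the modelling literature, used purely as illustrative assumptions (the paper's actual circuit values and operating regime are not given here).

```python
import math

# Two-variable Morris-Lecar membrane model. Parameters are a standard
# set from the modelling literature (assumed, not the paper's values).
C, g_L, V_L = 20.0, 2.0, -60.0        # capacitance (uF/cm^2), leak conductance/reversal
g_Ca, V_Ca = 4.4, 120.0               # Ca conductance / reversal (mV)
g_K, V_K = 8.0, -84.0                 # K conductance / reversal (mV)
V1, V2, V3, V4 = -1.2, 18.0, 2.0, 30.0
phi, I_ext = 0.04, 90.0               # recovery rate, applied current

def m_inf(V):                          # instantaneous Ca activation
    return 0.5 * (1.0 + math.tanh((V - V1) / V2))

def w_inf(V):                          # steady-state K activation
    return 0.5 * (1.0 + math.tanh((V - V3) / V4))

def simulate(T=1000.0, dt=0.05, V0=-60.0, w0=0.0):
    """Forward-Euler integration; returns the membrane voltage trace (mV)."""
    V, w, trace = V0, w0, []
    for _ in range(int(T / dt)):
        I_ion = (g_L * (V - V_L) + g_Ca * m_inf(V) * (V - V_Ca)
                 + g_K * w * (V - V_K))
        dV = (I_ext - I_ion) / C
        # dw/dt = phi * (w_inf - w) / tau_w, with 1/tau_w = cosh((V - V3)/(2*V4))
        dw = phi * (w_inf(V) - w) * math.cosh((V - V3) / (2.0 * V4))
        V, w = V + dt * dV, w + dt * dw
        trace.append(V)
    return trace

V_trace = simulate()
```

    Coupling several such units (e.g. through a resistive term proportional to the voltage difference between neighbours) is the natural next step toward the clustering experiments the abstract describes.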

  11. Friction-Stir Welding of Large Scale Cryogenic Fuel Tanks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jones, Clyde S., III; Venable, Richard A.

    1998-01-01

    The Marshall Space Flight Center has established a facility for the joining of large-scale aluminum-lithium alloy 2195 cryogenic fuel tanks using the friction-stir welding process. Longitudinal welds, approximately five meters in length, were made possible by retrofitting an existing vertical fusion weld system, designed to fabricate tank barrel sections ranging from two to ten meters in diameter. The structural design requirements of the tooling, clamping, and spindle travel system are described in this paper. Process controls and real-time data acquisition, which were critical elements contributing to successful weld operation, are also described.

  12. Modeling Tools Predict Flow in Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "Because rocket engines operate under extreme temperature and pressure, they present a unique challenge to designers who must test and simulate the technology. To this end, CRAFT Tech Inc., of Pipersville, Pennsylvania, won Small Business Innovation Research (SBIR) contracts from Marshall Space Flight Center to develop software to simulate cryogenic fluid flows and related phenomena. CRAFT Tech enhanced its CRUNCH CFD (computational fluid dynamics) software to simulate phenomena in various liquid propulsion components and systems. Today, both government and industry clients in the aerospace, utilities, and petrochemical industries use the software for analyzing existing systems as well as designing new ones."

  13. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drug and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
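    The abstract does not specify which disproportionality statistic the prototype computes. The proportional reporting ratio (PRR) is one widely used choice in pharmacovigilance, sketched below on hypothetical 2x2 report counts; both the statistic and the numbers are illustrative assumptions, not details from the paper.

```python
# Proportional reporting ratio (PRR): compares the event rate among a
# drug's reports with the event rate among all other drugs' reports.
# A PRR well above 1 is conventionally read as a potential safety signal.

def prr(a, b, c, d):
    """a: reports with drug AND event, b: drug without the event,
    c: other drugs with the event, d: other drugs without the event."""
    drug_rate = a / (a + b)           # event rate among the drug's reports
    other_rate = c / (c + d)          # event rate among all other reports
    return drug_rate / other_rate

# Hypothetical counts: 20 of 100 reports for the drug mention the event,
# vs 100 of 10,000 reports for all other drugs.
score = prr(20, 80, 100, 9900)        # -> 20.0, a strong signal
```

    In the tool described here, the counts would come from MeSH-indexed drug-adverse event pairs rather than spontaneous reports, but the arithmetic is the same.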

  14. Tools for visually exploring biological networks.

    PubMed

    Suderman, Matthew; Hallett, Michael

    2007-10-15

    Many tools exist for visually exploring biological networks including well-known examples such as Cytoscape, VisANT, Pathway Studio and Patika. These systems play a key role in the development of integrative biology, systems biology and integrative bioinformatics. The trend in the development of these tools is to go beyond 'static' representations of cellular state, towards a more dynamic model of cellular processes through the incorporation of gene expression data, subcellular localization information and time-dependent behavior. We provide a comprehensive review of the relative advantages and disadvantages of existing systems with two goals in mind: to aid researchers in efficiently identifying the appropriate existing tools for data visualization; to describe the necessary and realistic goals for the next generation of visualization tools. In view of the first goal, we provide in the Supplementary Material a systematic comparison of more than 35 existing tools in terms of over 25 different features. Supplementary data are available at Bioinformatics online.

  15. Resources for Functional Genomics Studies in Drosophila melanogaster

    PubMed Central

    Mohr, Stephanie E.; Hu, Yanhui; Kim, Kevin; Housden, Benjamin E.; Perrimon, Norbert

    2014-01-01

    Drosophila melanogaster has become a system of choice for functional genomic studies. Many resources, including online databases and software tools, are now available to support design or identification of relevant fly stocks and reagents or analysis and mining of existing functional genomic, transcriptomic, proteomic, etc. datasets. These include large community collections of fly stocks and plasmid clones, “meta” information sites like FlyBase and FlyMine, and an increasing number of more specialized reagents, databases, and online tools. Here, we introduce key resources useful to plan large-scale functional genomics studies in Drosophila and to analyze, integrate, and mine the results of those studies in ways that facilitate identification of highest-confidence results and generation of new hypotheses. We also discuss ways in which existing resources can be used and might be improved and suggest a few areas of future development that would further support large- and small-scale studies in Drosophila and facilitate use of Drosophila information by the research community more generally. PMID:24653003

  16. Bayesian ISOLA: new tool for automated centroid moment tensor inversion

    NASA Astrophysics Data System (ADS)

    Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John

    2017-08-01

    We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval; data selection, in which station components with instrumental disturbances are rejected; and full-waveform inversion in a space-time grid around a provided hypocentre. A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequency ranges. The method is tested on synthetic and observed data. It is applied to a data set from the Swiss seismic network, and the results are compared with the existing high-quality MT catalogue. The software package, programmed in Python, is designed to be as versatile as possible in order to be applicable to various networks ranging from local to regional. The method can be applied either to the everyday network data flow or to process large pre-existing earthquake catalogues and data sets.

  17. Applications of automatic differentiation in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

    Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR, the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
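    ADIFOR itself works by source transformation of Fortran code. As a language-agnostic illustration of the chain-rule propagation that such tools automate, here is a minimal forward-mode AD sketch using dual numbers; the function `f` is an arbitrary example of my own, not code from the paper.

```python
import math

# A dual number carries a value and a derivative part; arithmetic on
# duals applies the chain and product rules automatically, which is the
# same bookkeeping ADIFOR generates as explicit derivative code.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot          # f(x) and f'(x) so far
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__
    def __pow__(self, n):                      # integer powers only
        return Dual(self.val ** n, n * self.val ** (n - 1) * self.dot)

def sin(x):
    # chain rule: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def f(x):                                      # arbitrary differentiable example
    return x * sin(x) + x ** 2

x = Dual(1.5, 1.0)                             # seed the derivative dx/dx = 1
y = f(x)
# y.val is f(1.5); y.dot is the exact f'(1.5) = sin(1.5) + 1.5*cos(1.5) + 3
```

    Like ADIFOR's generated code, this gives derivatives exact to machine precision, with none of the step-size tuning that divided differences require.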

  18. A supportive architecture for CFD-based design optimisation

    NASA Astrophysics Data System (ADS)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application across various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, alongside analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. An integrated architecture for CFD-based design optimisation is therefore desirable; however, our review of existing work has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing, and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided; the results show that the proposed architecture and the developed algorithms perform successfully and efficiently on a design optimisation with over 200 design variables.

  19. Applying the cognitive theory of multimedia learning: an analysis of medical animations.

    PubMed

    Yue, Carole; Kim, Jessie; Ogawa, Rikke; Stark, Elena; Kim, Sara

    2013-04-01

    Instructional animations play a prominent role in medical education, but the degree to which these teaching tools follow empirically established learning principles, such as those outlined in the cognitive theory of multimedia learning (CTML), is unknown. These principles provide guidelines for designing animations in a way that promotes optimal cognitive processing and facilitates learning, but the application of these learning principles in current animations has not yet been investigated. A large-scale review of existing educational tools in the context of this theoretical framework is necessary to examine if and how instructional medical animations adhere to these principles and where improvements can be made. We conducted a comprehensive review of instructional animations in the health sciences domain and examined whether these animations met the three main goals of CTML: managing essential processing, minimising extraneous processing, and facilitating generative processing. We also identified areas for pedagogical improvement. Through Google keyword searches, we identified 4455 medical animations for review. After the application of exclusion criteria, 860 animations from 20 developers were retained. We randomly sampled and reviewed 50% of the identified animations. Many animations did not follow the recommended multimedia learning principles, particularly those that support the management of essential processing. We also noted an excess of extraneous visual and auditory elements and few opportunities for learner interactivity. Many unrealised opportunities exist for improving the efficacy of animations as learning tools in medical education; instructors can look to effective examples to select or design animations that incorporate the established principles of CTML. © Blackwell Publishing Ltd 2013.

  20. The COA360: a tool for assessing the cultural competency of healthcare organizations.

    PubMed

    LaVeist, Thomas A; Relosa, Rachel; Sawaya, Nadia

    2008-01-01

    The U.S. Census Bureau projects that by 2050, non-Hispanic whites will be in the numerical minority. This rapid diversification requires healthcare organizations to pay closer attention to cross-cultural issues if they are to meet the healthcare needs of the nation and continue to maintain a high standard of care. Although scorecards and benchmarking are widely used to gauge healthcare organizations' performance in various areas, these tools have been underused in relation to cultural preparedness or initiatives. The likely reason for this is the lack of a validated tool specifically designed to examine cultural competency. Existing validated cultural competency instruments evaluate individuals, not organizations. In this article, we discuss a study to validate the Cultural Competency Organizational Assessment--360, or COA360, an instrument designed to appraise a healthcare organization's cultural competence. The Office of Minority Health and the Joint Commission have each developed standards for measuring the cultural competency of organizations. The COA360 is designed to assess adherence to both of these sets of standards. For this validation study, we enlisted a panel of national experts. The panel rated each dimension of the COA360, and the combination of items for each of the scale's 14 dimensions was rated above 4.13 (on a 5-point scale). Our conclusion points to the validity of the COA360. As such, it is a valuable tool not only for assessing a healthcare organization's cultural readiness but also for benchmarking its progress in addressing cultural and diversity issues.

  1. Tools for assessing risk of reporting biases in studies and syntheses of studies: a systematic review.

    PubMed

    Page, Matthew J; McKenzie, Joanne E; Higgins, Julian P T

    2018-03-14

    Several scales, checklists and domain-based tools for assessing risk of reporting biases exist, but it is unclear how much they vary in content and guidance. We conducted a systematic review of the content and measurement properties of such tools. We searched for potentially relevant articles in Ovid MEDLINE, Ovid Embase, Ovid PsycINFO and Google Scholar from inception to February 2017. One author screened all titles, abstracts and full text articles, and collected data on tool characteristics. We identified 18 tools that include an assessment of the risk of reporting bias. Tools varied in regard to the type of reporting bias assessed (eg, bias due to selective publication, bias due to selective non-reporting), and the level of assessment (eg, for the study as a whole, a particular result within a study or a particular synthesis of studies). Various criteria are used across tools to designate a synthesis as being at 'high' risk of bias due to selective publication (eg, evidence of funnel plot asymmetry, use of non-comprehensive searches). However, the relative weight assigned to each criterion in the overall judgement is unclear for most of these tools. Tools for assessing risk of bias due to selective non-reporting guide users to assess a study, or an outcome within a study, as 'high' risk of bias if no results are reported for an outcome. However, assessing the corresponding risk of bias in a synthesis that is missing the non-reported outcomes is outside the scope of most of these tools. Inter-rater agreement estimates were available for five tools. There are several limitations of existing tools for assessing risk of reporting biases, in terms of their scope, guidance for reaching risk of bias judgements and measurement properties. Development and evaluation of a new, comprehensive tool could help overcome present limitations. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. 
No commercial use is permitted unless otherwise expressly granted.

  2. Conducting systematic reviews of economic evaluations.

    PubMed

    Gomersall, Judith Streak; Jadotte, Yuri Tertilus; Xue, Yifan; Lockwood, Suzi; Riddle, Dru; Preda, Alin

    2015-09-01

    In 2012, a working group was established to review and enhance the Joanna Briggs Institute (JBI) guidance for conducting systematic reviews of evidence from economic evaluations addressing questions about health intervention cost-effectiveness. The objective is to present the outcomes of the working group. The group conducted three activities to inform the new guidance: a review of literature on the utility/futility of systematic reviews of economic evaluations and consideration of its implications for updating the existing methodology; an assessment of the critical appraisal tool in the existing guidance against criteria that promote validity in economic evaluation research and against two other commonly used tools; and a workshop. The debate in the literature on the limitations/value of systematic review of economic evidence cautions that systematic reviews of economic evaluation evidence are unlikely to generate one-size-fits-all answers to questions about the cost-effectiveness of interventions and their comparators. Informed by this finding, the working group adjusted the framing of the objective definition in the existing JBI methodology. The shift is away from defining the objective as determining a single cost-effectiveness measure and toward summarizing study estimates of cost-effectiveness and, informed by consideration of the included studies' characteristics (patient, setting, intervention components, etc.), identifying conditions conducive to lowering costs and maximizing health benefits. The existing critical appraisal tool was included in the new guidance. The new guidance also recommends that a tool designed specifically for appraising model-based studies be used together with the generic appraisal tool for economic evaluations when evaluating model-based evaluations. 
The guidance produced by the group offers reviewers guidance for each step of the systematic review process, which are the same steps followed in JBI reviews of other types of evidence. The updated JBI guidance will be useful for researchers wanting to synthesize evidence about economic questions, either as stand-alone reviews or part of comprehensive or mixed method evidence reviews. Although the updated methodology produced by the work of the working group has improved the JBI guidance for systematic reviews of economic evaluations, there are areas where further work is required. These include adjusting the critical appraisal tool to separate out questions addressing intervention cost and effectiveness measurement; providing more explicit guidance for assessing generalizability of findings; and offering a more robust method for evidence synthesis that facilitates achieving the more ambitious review objectives.

  3. Modular modelling with Physiome standards

    PubMed Central

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

    Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks, but relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples, including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. 
To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole‐cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233

  4. Data base architecture for instrument characteristics critical to spacecraft conceptual design

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Allen, Cheryl L.

    1990-01-01

    Spacecraft designs are driven by the payloads and mission requirements that they support. Many of the payload characteristics, such as mass, power requirements, communication requirements, moving parts, and so forth directly affect the choices for the spacecraft structural configuration and its subsystem design and component selection. The conceptual design process, which translates mission requirements into early spacecraft concepts, must be tolerant of frequent changes in the payload complement and resource requirements. A computer data base was designed and implemented for the purposes of containing the payload characteristics pertinent for spacecraft conceptual design, tracking the evolution of these payloads over time, and enabling the integration of the payload data with engineering analysis programs for improving the efficiency in producing spacecraft designs. In-house tools were used for constructing the data base and for performing the actual integration with an existing program for optimizing payload mass locations on the spacecraft.

  5. BGFit: management and automated fitting of biological growth curves.

    PubMed

    Veríssimo, André; Paixão, Laura; Neves, Ana Rute; Vinga, Susana

    2013-09-25

    Existing tools to model cell growth curves do not offer a flexible, integrative approach to managing large datasets and automatically estimating parameters. Due to the increase in experimental time-series data from microbiology and oncology, software that allows researchers to easily organize experimental data and simultaneously extract relevant parameters in an efficient way is crucial. BGFit provides a unified web-based platform where a rich set of dynamic models can be fitted to experimental time-series data, further allowing users to efficiently manage the results in a structured and hierarchical way. The data management system allows users to organize projects, experiments, and measurement data, and to define teams with different editing and viewing permissions. Several dynamic and algebraic models are already implemented, such as polynomial regression and the Gompertz, Baranyi, Logistic, and Live Cell Fraction models, and users can easily add new models, expanding the current set. BGFit allows users to easily manage their data and models in an integrated way, even if they are not familiar with databases or existing computational tools for parameter estimation. BGFit is designed with a flexible architecture that focuses on extensibility and leverages free software together with existing tools and methods, allowing users to compare and evaluate different data modeling techniques. The application is described in the context of fitting bacterial and tumor cell growth data, but it is applicable to any type of two-dimensional data, e.g. physical chemistry and macroeconomic time series, and is fully scalable to a high number of projects and to greater data and model complexity.
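    As a rough illustration of the kind of fitting BGFit automates, the sketch below evaluates the Gompertz model in the Zwietering parameterization and recovers assumed parameters from synthetic data by a crude grid search. This is a stand-in of my own, not BGFit's estimation machinery, and the parameter values are arbitrary.

```python
import math

def gompertz(t, A, mu, lam):
    """Zwietering Gompertz growth curve.
    A: asymptote, mu: maximum growth rate, lam: lag time."""
    return A * math.exp(-math.exp(mu * math.e / A * (lam - t) + 1.0))

# Synthetic noiseless "measurements" generated from assumed parameters.
TRUE = (10.0, 1.0, 2.0)
ts = [0.5 * i for i in range(41)]             # sample times 0 .. 20 h
ys = [gompertz(t, *TRUE) for t in ts]

def fit(ts, ys):
    """Coarse grid search minimizing the sum of squared errors; a real
    fitter would use nonlinear least squares instead."""
    best, best_sse = None, float("inf")
    for A in (8.0, 10.0, 12.0):
        for mu in (0.5, 1.0, 1.5):
            for lam in (1.0, 2.0, 3.0):
                sse = sum((gompertz(t, A, mu, lam) - y) ** 2
                          for t, y in zip(ts, ys))
                if sse < best_sse:
                    best, best_sse = (A, mu, lam), sse
    return best

params = fit(ts, ys)                          # recovers (10.0, 1.0, 2.0)
```

    Swapping in another growth law (Baranyi, logistic, etc.) only changes the model function, which mirrors how BGFit treats models as pluggable components.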

  6. Assessing the health workforce implications of health policy and programming: how a review of grey literature informed the development of a new impact assessment tool.

    PubMed

    Nove, Andrea; Cometto, Giorgio; Campbell, James

    2017-11-09

    In their adoption of WHA resolution 69.19, World Health Organization Member States requested all bilateral and multilateral initiatives to conduct impact assessments of their funding to human resources for health. The High-Level Commission for Health Employment and Economic Growth similarly proposed that official development assistance for health, education, employment and gender are best aligned to creating decent jobs in the health and social workforce. No standard tools exist for assessing the impact of global health initiatives on the health workforce, but tools exist from other fields. The objectives of this paper are to describe how a review of grey literature informed the development of a draft health workforce impact assessment tool and to introduce the tool. A search of grey literature yielded 72 examples of impact assessment tools and guidance from a wide variety of fields including gender, health and human rights. These examples were reviewed, and information relevant to the development of a health workforce impact assessment was extracted from them using an inductive process. A number of good practice principles were identified from the review. These informed the development of a draft health workforce impact assessment tool, based on an established health labour market framework. The tool is designed to be applied before implementation. It consists of a relatively short and focused screening module to be applied to all relevant initiatives, followed by a more in-depth assessment to be applied only to initiatives for which the screening module indicates that significant implications for HRH are anticipated. It thus aims to strike a balance between maximising rigour and minimising administrative burden. The application of the new tool will help to ensure that health workforce implications are incorporated into global health decision-making processes from the outset and to enhance positive HRH impacts and avoid, minimise or offset negative impacts.

  7. The development of a plant risk evaluation (PRE) tool for assessing the invasive potential of ornamental plants.

    PubMed

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W; Reichard, Sarah; DiTomaso, Joseph M

    2015-01-01

    Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to their release into the environment. To be accepted as a tool for evaluating ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing non-invasive plants as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency with which the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when "needs further evaluation" classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When "needs further evaluation" classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program.

  8. The Development of a Plant Risk Evaluation (PRE) Tool for Assessing the Invasive Potential of Ornamental Plants

    PubMed Central

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W.; Reichard, Sarah; DiTomaso, Joseph M.

    2015-01-01

    Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when “needs further evaluation” classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When “needs further evaluation” classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program. 
PMID:25803830

  9. Scientific Platform as a Service - Tools and solutions for efficient access to and analysis of oceanographic data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hansen, Morten W.; Korosov, Anton

    2017-04-01

Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the CF (Climate and Forecast) metadata conventions, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. In parallel with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This in turn allows the development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog storing granular metadata that describes the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker).
The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
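The catalog-plus-API pattern described above can be sketched in a few lines. The class and field names below are illustrative assumptions, not the actual SPaaS schema: a minimal in-memory stand-in for the distributed metadata catalog, searchable by CF variable name and bounding box.

```python
from dataclasses import dataclass

@dataclass
class DatasetRecord:
    """Granular metadata for one dataset (field names are illustrative)."""
    dataset_id: str
    source: str     # e.g. "satellite", "model", "in-situ"
    variable: str   # CF standard name
    bbox: tuple     # (lon_min, lat_min, lon_max, lat_max)
    url: str        # OPeNDAP endpoint

class MetadataCatalog:
    """Minimal in-memory stand-in for a distributed metadata catalog."""
    def __init__(self):
        self._records = []

    def register(self, record):
        self._records.append(record)

    def search(self, variable=None, bbox=None):
        """Return records matching a CF variable name and intersecting a bbox."""
        hits = []
        for r in self._records:
            if variable and r.variable != variable:
                continue
            if bbox and not self._intersects(r.bbox, bbox):
                continue
            hits.append(r)
        return hits

    @staticmethod
    def _intersects(a, b):
        # Two boxes overlap unless one lies entirely outside the other.
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

catalog = MetadataCatalog()
catalog.register(DatasetRecord("s1a-001", "satellite", "sea_surface_temperature",
                               (-10.0, 50.0, 10.0, 70.0),
                               "https://example.org/opendap/s1a-001"))
catalog.register(DatasetRecord("topaz-042", "model", "sea_water_salinity",
                               (-30.0, 40.0, 40.0, 80.0),
                               "https://example.org/opendap/topaz-042"))

north_sea_sst = catalog.search(variable="sea_surface_temperature",
                               bbox=(0.0, 52.0, 8.0, 60.0))
```

A client would then stream only the matching subsets from the returned OPeNDAP URLs rather than downloading whole files.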

  10. Information Technology: A Tool to Cut Health Care Costs

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.

    1996-01-01

Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and/or improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in Tool Command Language/Toolkit (Tcl/Tk). While the initial version of the prototype had patient information hard-coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing the MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability to provide improved treatment at reduced cost. The move to computerized patient records is well underway: several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.
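The move from hard-coded patient data to an object-oriented record model can be illustrated with a small sketch. All class and field names below are hypothetical, not taken from the MRS (which was written in Tcl/Tk); Python is used here purely for illustration.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LabResult:
    """One laboratory measurement with its unit and collection date."""
    test_name: str
    value: float
    unit: str
    taken_on: date

@dataclass
class PatientRecord:
    """Object-oriented patient record; replaces hard-coded patient data."""
    patient_id: str
    name: str
    lab_results: list = field(default_factory=list)

    def add_result(self, result):
        self.lab_results.append(result)

    def latest(self, test_name):
        """Most recent result for a given test, or None if never recorded."""
        matching = [r for r in self.lab_results if r.test_name == test_name]
        return max(matching, key=lambda r: r.taken_on, default=None)

record = PatientRecord("P-0001", "Jane Doe")
record.add_result(LabResult("glucose", 5.4, "mmol/L", date(1996, 3, 1)))
record.add_result(LabResult("glucose", 6.1, "mmol/L", date(1996, 5, 12)))
```

In a database-backed version, the same interface would be preserved while `lab_results` is loaded from tables rather than kept in memory.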

  11. Computer based guidance in the modern operating room: a historical perspective.

    PubMed

    Bova, Frank

    2010-01-01

    The past few decades have seen the introduction of many different and innovative approaches aimed at enhancing surgical technique. As microprocessors have decreased in size and increased in processing power, more sophisticated systems have been developed. Some systems have attempted to provide enhanced instrument control while others have attempted to provide tools for surgical guidance. These systems include robotics, image enhancements, and frame-based and frameless guidance procedures. In almost every case the system's design goals were achieved and surgical outcomes were enhanced, yet a vast majority of today's surgical procedures are conducted without the aid of these advances. As new tools are developed and existing tools refined, special attention to the systems interface and integration into the operating room environment will be required before increased utilization of these technologies can be realized.

  12. Systems biology driven software design for the research enterprise.

    PubMed

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-06-25

In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high-throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a lightweight software architecture can become the focal point through which scientists can both access and analyse the plethora of experimentally derived data.
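The integration strategy named above, a common identity system plus dynamically discovered services, can be sketched as follows. The registry, data-type strings, and record layout are invented for illustration and are not the paper's actual architecture.

```python
class ServiceRegistry:
    """Toy dynamic service discovery, keyed by the data type a service handles."""
    def __init__(self):
        self._services = {}

    def publish(self, data_type, name, handler):
        self._services.setdefault(data_type, {})[name] = handler

    def discover(self, data_type):
        """List the names of services registered for a given data type."""
        return sorted(self._services.get(data_type, {}))

    def invoke(self, data_type, name, payload):
        return self._services[data_type][name](payload)

# Common identity: every record carries a globally unique ID, so services
# built by different groups can refer to the same entity without sharing
# an object model.
def normalize(record):
    return {"id": record["id"], "values": sorted(record["values"])}

registry = ServiceRegistry()
registry.publish("expression_profile", "normalize", normalize)

result = registry.invoke("expression_profile", "normalize",
                         {"id": "urn:lab42:sample:7", "values": [3, 1, 2]})
```

A new group joins the system by publishing its own handlers under the shared data-type names, without modifying existing tools.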

  13. Smart Markets for Transferable Pumping Rights

    NASA Astrophysics Data System (ADS)

    Brozovic, N.; Young, R.

    2016-12-01

While no national policy on groundwater use exists in the United States, local groundwater management is emerging across the country in response to concerns and conflicts over declining well yields, land subsidence, and the depletion of hydrologically connected surface waters. Management strategies include well drilling moratoria, pumping restrictions, and restrictions on the expansion of irrigated land. To provide flexibility to groundwater users, local regulatory authorities have increasingly begun to allow the transfer of groundwater rights as a cost-effective management tool. Markets can be a versatile risk management tool, helping communities to cope with scarcity, to meet goals for sustainability, and to grow resilient local economies. For example, active groundwater rights transfers exist in the High Plains region of the United States. Yet several barriers to trade exist: high search costs for interested parties, complicated requirements for regulatory compliance, and reluctance to share sensitive financial information. Additionally, groundwater pumping leads to several kinds of spatial and intertemporal externalities, such as stream depletion. Indeed, groundwater management schemes that reallocate water between alternate pumping locations are often explicitly designed to change the distribution and magnitude of pumping externalities. Reallocation may be designed to minimize unwanted impacts on third parties or to encourage trades that reduce the magnitude of externalities. We discuss how smart markets can deal with complex biophysical constraints while also encouraging active trading, thereby ensuring local goals for aquifer sustainability while growing local economies. Smart markets address these issues by providing a centralized hub for trading, automating the process of regulatory compliance by matching only buyers and sellers eligible to trade as specified in the regulations, and maintaining anonymous, confidential bidding.
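One way a smart market can match only regulator-approved trades is a greedy double auction restricted to eligible buyer-seller pairs. This is an illustrative sketch, not the authors' mechanism; the eligibility set, participant names, and prices are all hypothetical.

```python
def match_orders(bids, asks, eligible):
    """Greedy single-unit matching: highest bids meet lowest asks,
    restricted to regulator-approved (buyer, seller) pairs."""
    bids = sorted(bids, key=lambda b: -b[1])   # (buyer, price), high first
    asks = sorted(asks, key=lambda a: a[1])    # (seller, price), low first
    trades, used_sellers = [], set()
    for buyer, bid_price in bids:
        for seller, ask_price in asks:
            if seller in used_sellers or bid_price < ask_price:
                continue
            if (buyer, seller) not in eligible:
                continue  # e.g. the trade would worsen stream depletion
            price = (bid_price + ask_price) / 2  # split the surplus
            trades.append((buyer, seller, price))
            used_sellers.add(seller)
            break
    return trades

# Pairs allowed to trade under the (hypothetical) local regulations.
eligible = {("B1", "S1"), ("B2", "S2")}
trades = match_orders(bids=[("B1", 120), ("B2", 100)],
                      asks=[("S1", 80), ("S2", 90)],
                      eligible=eligible)
```

Because eligibility is checked inside the matching loop, regulatory compliance is automated rather than verified trade-by-trade after the fact, and bidders never see each other's prices.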

  14. Sandia Advanced MEMS Design Tools, Version 2.2.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yarberry, Victor; Allen, James; Lantz, Jeffery

    2010-01-19

The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a five-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode: a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu: a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: a check has been added to catch invalid block names. DRC features: added username/password validation and a method to update the user's password. SNL_DRC_WIDTH: a value to control the width of the DRC error lines. SNL_BIAS_VALUE: a value used to offset selected geometry. SNL_PROCESS_NAME: a value to specify the process name. Documentation changes: the documentation has been updated to include the new features. While some files on the CD are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ameen, Raed Fawzi Mohammed, E-mail: MohammedAmeenRF@cardiff.ac.uk; Department of Civil Engineering, College of Engineering, University of Karbala; Mourshed, Monjur, E-mail: MourshedM@cardiff.ac.uk

Cities are responsible for the depletion of natural resources and agricultural lands, and for 70% of global CO2 emissions. There are significant risks to cities from the impacts of climate change in addition to existing vulnerabilities, primarily because of rapid urbanization. Urban design and development are generally considered the instruments that shape the future of the city; they determine the pattern of a city's resource usage and its resilience to change, climatic or otherwise. Cities are inherently dynamic and require the participation and engagement of their diverse stakeholders for the effective management of change, which enables wider stakeholder involvement and buy-in at various stages of the development process. Sustainability assessment of urban design and development is increasingly seen as indispensable for informed decision-making. A sustainability assessment tool also acts as a driver for the uptake of sustainable pathways by recognizing excellence through its rating system and by creating market demand for sustainable products and processes. This research reviews six widely used sustainability assessment tools for urban design and development: BREEAM Communities, LEED-ND, CASBEE-UD, SBToolPT-UP, Pearl Community Rating System (PCRS) and GSAS/QSAS, to identify, compare and contrast their aims, structure, assessment methodology, scoring, weighting and suitability for application in different geographical contexts. Strengths and weaknesses of each tool are critically discussed. The study highlights the disparity between local and international contexts for global sustainability assessment tools. Despite their similar emphasis on environmental aspects, the tools differ in the relative importance and share of mandatory vs. optional indicators in both the environmental and social dimensions.
PCRS and GSAS/QSAS are new incarnations, but have widely varying shares of mandatory indicators, at 45.4% and 11.36% respectively, compared to 30% in BREEAM Communities. Considerations of economic and cultural aspects are only marginal in the reviewed sustainability assessment tools. However, the newly developed tools such as GSAS/QSAS and PCRS diverge from their predecessors in their consideration of cultural aspects. - Highlights: • Reviews six urban sustainability assessment methods: LEED-ND, BREEAM Communities, CASBEE-UD, SBToolPT-UP, PCRS, GSAS/QSAS. • The reviewed methods are biased towards the environmental, followed by the social and economic dimensions of sustainability. • Water issues are highlighted in the Middle East, but natural hazards are emphasized only in CASBEE and BREEAM Communities. • SBToolPT-UP, the most recent of the group, puts more weight (7.32%) on cultural aspects. • The share of mandatory indicators is highest (45.4%) in the Pearl Community Rating System (PCRS).
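The interplay between weighted optional credits and mandatory indicators that the review emphasizes can be sketched generically. The weights, indicator names, and pass/fail rule below are invented and do not correspond to any of the six reviewed tools.

```python
def rate(scores, weights, mandatory):
    """Weighted aggregate score on a 0-100 scale; certification is denied
    outright if any mandatory indicator is unmet (scored 0), mimicking
    mandatory-credit schemes in rating systems."""
    for ind in mandatory:
        if scores.get(ind, 0) == 0:
            return None  # certification denied regardless of other scores
    total_w = sum(weights.values())
    return sum(scores.get(i, 0) * w for i, w in weights.items()) / total_w

# Hypothetical indicator weights (integers chosen for exact arithmetic).
weights = {"energy": 3, "water": 3, "culture": 1, "transport": 3}
scores = {"energy": 80, "water": 60, "culture": 50, "transport": 70}
result = rate(scores, weights, mandatory=["water"])
```

The small weight on "culture" mirrors the review's finding that cultural aspects carry only marginal weight in most tools, while a larger mandatory set (as in PCRS) makes the hard pass/fail gate dominate the weighted score.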

  16. Biomimetics: process, tools and practice.

    PubMed

    Fayemi, P E; Wanieck, K; Zollfrank, C; Maranzana, N; Aoussat, A

    2017-01-23

Biomimetics applies principles and strategies abstracted from biological systems to engineering and technological design. With a huge potential for innovation, biomimetics could evolve into a key process in businesses. Yet challenges remain within the process of biomimetics, especially from the perspective of potential users. We work to clarify the understanding of the process of biomimetics. Therefore, we briefly summarize the terminology of biomimetics and bioinspiration. The implementation of biomimetics requires a stated process. Therefore, we present a model of the problem-driven process of biomimetics that can be used for problem-solving activity. The process of biomimetics can be facilitated by existing tools and creative methods. We mapped a set of tools to the biomimetic process model and set up assessment sheets to evaluate the theoretical and practical value of these tools. We analyzed the tools in interdisciplinary research workshops and present the characteristics of the tools. We also present the attempt at a utility tree which, once finalized, could be used to guide users through the process by choosing tools appropriate to their own expertise. The aim of this paper is to foster dialogue and facilitate closer collaboration within the field of biomimetics.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haapio, Appu, E-mail: appu.haapio@vtt.fi

Requirements for building assessment tools have increased; assessing building components or separate buildings is no longer enough. Neighbourhoods, the built environment, public transportation, and services should be considered simultaneously. The number of people living in urban areas is high and increasing rapidly. Urbanisation is a major concern due to its detrimental effects on the environment. The aim of this study is to clarify the field of assessment tools for urban communities by analysing the current situation. The focus is on internationally well-known assessment tools: BREEAM Communities, CASBEE for Urban Development and LEED for Neighborhood Development. Interest in certification systems is increasing amongst the authorities, and especially amongst global investors and property developers. Achieved certifications are expected to bring measurable publicity for the developers. The assessment of urban areas enables the comparison of municipalities and urban areas, and notably supports decision-making processes. Authorities, city planners, and designers would benefit most from the use of the tools during the decision-making process. - Highlights: • The urban assessment tools have a strong linkage to their region. • The tools promote complementary building and the retrofitting of existing sites. • Sharing knowledge and experiences is important in the development of the tools.

  18. FT-NIR: A Tool for Process Monitoring and More.

    PubMed

    Martoccia, Domenico; Lutz, Holger; Cohen, Yvan; Jerphagnon, Thomas; Jenelten, Urban

    2018-03-30

    With ever-increasing pressure to optimize product quality, to reduce cost and to safely increase production output from existing assets, all combined with regular changes in terms of feedstock and operational targets, process monitoring with traditional instruments reaches its limits. One promising answer to these challenges is in-line, real time process analysis with spectroscopic instruments, and above all Fourier-Transform Near Infrared spectroscopy (FT-NIR). Its potential to afford decreased batch cycle times, higher yields, reduced rework and minimized batch variance is presented and application examples in the field of fine chemicals are given. We demonstrate that FT-NIR can be an efficient tool for improved process monitoring and optimization, effective process design and advanced process control.

  19. ARC-2007-ACD07-0140-001

    NASA Image and Video Library

    2007-07-31

David L. Iverson of NASA Ames Research Center, Moffett Field, California, led development of computer software to monitor the condition of the gyroscopes that keep the International Space Station (ISS) properly oriented in space as the ISS orbits Earth. The gyroscopes are flywheels that control the station's attitude without the use of propellant fuel. NASA computer scientists designed the new software, the Inductive Monitoring System, to detect warning signs that precede a gyroscope's failure. According to NASA officials, engineers will add the new software tool to a group of existing tools to identify and track problems related to the gyroscopes. If the software detects warning signs, it will quickly warn the space station's mission control center.
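The Inductive Monitoring System learns clusters of nominal sensor values from archived telemetry and flags readings that fall far from every learned cluster. The sketch below is a drastically simplified stand-in (greedy point clustering with Chebyshev distance), not NASA's implementation; the tolerance and sensor data are invented.

```python
def dist(a, b):
    """Chebyshev distance: the largest per-sensor deviation."""
    return max(abs(x - y) for x, y in zip(a, b))

def train_clusters(nominal, tol):
    """Greedily group nominal sensor vectors: a vector within tol of an
    existing cluster representative joins it, otherwise it starts a new one."""
    clusters = []
    for v in nominal:
        for c in clusters:
            if dist(v, c) <= tol:
                break
        else:
            clusters.append(v)
    return clusters

def monitor(clusters, reading, tol):
    """Return 0.0 for a nominal reading, else its deviation from the
    nearest nominal cluster (a warning signal for mission control)."""
    d = min(dist(reading, c) for c in clusters)
    return d if d > tol else 0.0

# Hypothetical (speed, temperature) telemetry from healthy operation.
nominal = [(100.0, 35.0), (101.0, 36.0), (150.0, 40.0)]
clusters = train_clusters(nominal, tol=5.0)
```

The appeal of this inductive approach is that it needs no failure model: anything sufficiently unlike past healthy behavior is treated as a warning sign.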

  20. The Cultural Ecogram: A Tool for Enhancing Culturally Anchored Shared Understanding in the Treatment of Ethnic Minority Families

    PubMed Central

    YASUI, MIWA

    2015-01-01

Ethnic and racial disparities in mental health care continue to exist, highlighting the increasing concern within clinical practice as to how clinicians can effectively integrate the central role of culture and context into the treatment delivery process for culturally diverse children and families. The current paper presents the Cultural Ecogram, a clinical engagement tool designed to facilitate the development of a culturally anchored shared understanding, as one method that may foster clinician-client shared understanding of the client's cultural, ethnic and racial context, which is central to the effective implementation of treatments with ethnic minority children and families. PMID:26273233

  1. JUST in time health emergency interventions: an innovative approach to training the citizen for emergency situations using virtual reality techniques and advanced IT tools (the VR Tool).

    PubMed

    Manganas, A; Tsiknakis, M; Leisch, E; Ponder, M; Molet, T; Herbelin, B; Magnetat-Thalmann, N; Thalmann, D; Fato, M; Schenone, A

    2004-01-01

This paper reports on the second of the two systems developed by JUST, a collaborative project supported by the European Union under the Information Society Technologies (IST) Programme. The most innovative contribution of the project has been the design and development of a complementary training course for non-professional health emergency operators, which supports the traditional learning phase and aims to improve the retention capability of the trainees. This was achieved with the use of advanced information technology techniques, which provide adequate support and can help to overcome the present weaknesses of existing training mechanisms.

  2. New approaches for real time decision support systems

    NASA Technical Reports Server (NTRS)

    Hair, D. Charles; Pickslay, Kent

    1994-01-01

    NCCOSC RDT&E Division (NRaD) is conducting research into ways of improving decision support systems (DSS) that are used in tactical Navy decision making situations. The research has focused on the incorporation of findings about naturalistic decision-making processes into the design of the DSS. As part of that research, two computer tools were developed that model the two primary naturalistic decision-making strategies used by Navy experts in tactical settings. Current work is exploring how best to incorporate the information produced by those tools into an existing simulation of current Navy decision support systems. This work has implications for any applications involving the need to make decisions under time constraints, based on incomplete or ambiguous data.

  3. ProtaBank: A repository for protein design and engineering data.

    PubMed

    Wang, Connie Y; Chang, Paul M; Ary, Marie L; Allen, Benjamin D; Chica, Roberto A; Mayo, Stephen L; Olafson, Barry D

    2018-03-25

    We present ProtaBank, a repository for storing, querying, analyzing, and sharing protein design and engineering data in an actively maintained and updated database. ProtaBank provides a format to describe and compare all types of protein mutational data, spanning a wide range of properties and techniques. It features a user-friendly web interface and programming layer that streamlines data deposition and allows for batch input and queries. The database schema design incorporates a standard format for reporting protein sequences and experimental data that facilitates comparison of results across different data sets. A suite of analysis and visualization tools are provided to facilitate discovery, to guide future designs, and to benchmark and train new predictive tools and algorithms. ProtaBank will provide a valuable resource to the protein engineering community by storing and safeguarding newly generated data, allowing for fast searching and identification of relevant data from the existing literature, and exploring correlations between disparate data sets. ProtaBank invites researchers to contribute data to the database to make it accessible for search and analysis. ProtaBank is available at https://protabank.org. © 2018 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
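A standard format for reporting protein mutational data, the kind of record ProtaBank aggregates, might look like the following sketch. The notation parser and record layout are assumptions for illustration, not ProtaBank's actual schema or API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Mutation:
    """One substitution in standard notation, e.g. G46A = Gly at position 46 to Ala."""
    wild_type: str
    position: int
    variant: str

def parse_mutation(text):
    """Parse 'G46A'-style single-substitution notation."""
    return Mutation(text[0], int(text[1:-1]), text[-1])

def apply_mutations(sequence, mutations):
    """Apply substitutions to a 1-indexed sequence, checking each wild-type
    residue so that mismatched reports are caught at deposition time."""
    seq = list(sequence)
    for m in mutations:
        assert seq[m.position - 1] == m.wild_type, f"wild-type mismatch at {m.position}"
        seq[m.position - 1] = m.variant
    return "".join(seq)

# Hypothetical 4-residue sequence with two reported substitutions.
mutant = apply_mutations("MKTG", [parse_mutation("K2A"), parse_mutation("G4S")])
```

Storing mutations in a normalized form like this, rather than as free text, is what makes results from different data sets comparable and searchable.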

  4. Combat Stories Map: A Historical Repository and After Action Tool for Capturing, Storing, and Analyzing Georeferenced Individual Combat Narratives

    DTIC Science & Technology

    2016-06-01

Despite the proliferation of technology and near-global Internet accessibility, a web-based program incorporating interactive maps to record personal combat experiences does not exist. The Combat Stories Map addresses this deficiency. The Combat Stories Map is a web-based Geographic Information System specifically designed to capture, store, and analyze georeferenced individual combat narratives.

  5. The process of co-creating the interface for VENSTER, an interactive artwork for nursing home residents with dementia.

    PubMed

    Jamin, Gaston; Luyten, Tom; Delsing, Rob; Braun, Susy

    2017-10-17

    Interactive art installations might engage nursing home residents with dementia. The main aim of this article was to describe the challenging design process of an interactive artwork for nursing home residents, in co-creation with all stakeholders and to share the used methods and lessons learned. This process is illustrated by the design of the interface of VENSTER as a case. Nursing home residents from the psychogeriatric ward, informal caregivers, client representatives, health care professionals and members of the management team were involved in the design process, which consisted of three phases: (1) identify requirements, (2) develop a prototype and (3) conduct usability tests. Several methods were used (e.g. guided co-creation sessions, "Wizard of Oz"). Each phase generated "lessons learned", which were used as the departure point of the next phase. Participants hardly paid attention to the installation and interface. There, however, seemed to be an untapped potential for creating an immersive experience by focussing more on the content itself as an interface (e.g. creating specific scenes with cues for interaction, scenes based on existing knowledge or prior experiences). "Fifteen lessons learned" which can potentially assist the design of an interactive artwork for nursing home residents suffering from dementia were derived from the design process. This description provides tools and best practices for stakeholders to make (better) informed choices during the creation of interactive artworks. It also illustrates how co-design can make the difference between designing a pleasurable experience and a meaningful one. Implications for rehabilitation Co-design with all stakeholders can make the difference between designing a pleasurable experience and a meaningful one. There seems to be an untapped potential for creating an immersive experience by focussing more on the content itself as an interface (e.g. 
creating specific scenes with cues for interaction, scenes based on existing knowledge or prior experiences). Content as an interface proved to be a crucial part of the overall user experience. The case-study provides tools and best practices (15 "lessons learned") for stakeholders to make (better) informed choices during the creation of interactive artworks.

  6. The effects of design details on cost and weight of fuselage structures

    NASA Technical Reports Server (NTRS)

    Swanson, G. D.; Metschan, S. L.; Morris, M. R.; Kassapoglou, C.

    1993-01-01

    Crown panel design studies showing the relationship between panel size, cost, weight, and aircraft configuration are compared to aluminum design configurations. The effects of a stiffened sandwich design concept are also discussed. This paper summarizes the effect of a design cost model in assessing the cost and weight relationships for fuselage crown panel designs. Studies were performed using data from existing aircraft to assess the effects of different design variables on the cost and weight of transport fuselage crown panel design. Results show a strong influence of load levels, panel size, and material choices on the cost and weight of specific designs. A design tool being developed under the NASA ACT program is used in the study to assess these issues. The effects of panel configuration comparing postbuckled and buckle resistant stiffened laminated structure is compared to a stiffened sandwich concept. Results suggest some potential economy with stiffened sandwich designs for compression dominated structure with relatively high load levels.

  7. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, traditionally used in implementation that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.
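A driver diagram is essentially a tree linking an aim to its drivers and change ideas; tracking per-node status across sites is what enables the cross-site pooling the authors propose. Below is a minimal sketch with invented labels and status values, not the paper's actual diagram.

```python
class Driver:
    """Node in a driver diagram: an aim, a driver, or a change idea."""
    def __init__(self, label, status="untested"):
        self.label = label
        self.status = status  # e.g. "untested", "active", "effective", "abandoned"
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def report(self, indent=0):
        """Flatten the diagram into indented status lines, one per node,
        suitable for comparing the same intervention across sites."""
        lines = ["  " * indent + f"{self.label} [{self.status}]"]
        for c in self.children:
            lines.extend(c.report(indent + 1))
        return lines

# Hypothetical diagram loosely inspired by the South African example.
aim = Driver("Reduce mother-to-child HIV transmission", "active")
supply = aim.add(Driver("Reliable test-kit supply"))
supply.add(Driver("Weekly stock audits", "effective"))
lines = aim.report()
```

Updating node statuses as implementation proceeds turns the diagram into a living record of which change ideas worked in which context.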

  8. Development of FAST.Farm: A New Multiphysics Engineering Tool for Wind Farm Design and Analysis: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonkman, Jason; Annoni, Jennifer; Hayman, Greg

    2017-01-01

This paper presents the development of FAST.Farm, a new multiphysics tool applicable to engineering problems in research and industry involving wind farm performance and cost optimization, which is needed to address the underperformance, failures, and expenses currently plaguing the wind industry. Achieving wind cost-of-energy targets - which requires improvements in wind farm performance and reliability, together with reduced uncertainty and expenditures - has remained elusive because of the complicated nature of the wind farm design problem, especially the sophisticated interaction between atmospheric phenomena, wake dynamics, and array effects. FAST.Farm aims to balance the need for accurate modeling of the relevant physics for predicting power performance and loads against maintaining low computational cost, so as to support a highly iterative and probabilistic design process and system-wide optimization. FAST.Farm makes use of FAST to model the aero-hydro-servo-elastics of distinct turbines in the wind farm, and it is based on some of the principles of the Dynamic Wake Meandering (DWM) model, but avoids many of the limitations of existing DWM implementations.
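FAST.Farm's DWM-based wake treatment is far richer than anything shown here, but the accuracy-versus-cost trade-off it navigates can be illustrated with the classic Jensen (Park) top-hat wake model, one of the cheap engineering models such tools aim to improve upon. The thrust coefficient, rotor diameter, and wind speed below are illustrative operating points, not FAST.Farm inputs.

```python
import math

def jensen_deficit(ct, rotor_d, x, k=0.05):
    """Fractional velocity deficit at downstream distance x (Jensen/Park
    top-hat model): (1 - sqrt(1 - Ct)) / (1 + 2*k*x/D)^2."""
    if x <= 0:
        return 0.0  # no wake upstream of the rotor
    a = 0.5 * (1.0 - math.sqrt(1.0 - ct))  # axial induction from thrust coeff.
    return 2.0 * a / (1.0 + 2.0 * k * x / rotor_d) ** 2

def waked_speed(u_inf, deficits):
    """Root-sum-square superposition of overlapping wake deficits."""
    return u_inf * (1.0 - math.sqrt(sum(d * d for d in deficits)))

# Deficit 5 rotor diameters behind a turbine with Ct = 0.8, D = 126 m,
# and the waked inflow speed for a downstream turbine at 8 m/s freestream.
d1 = jensen_deficit(ct=0.8, rotor_d=126.0, x=630.0)
u = waked_speed(8.0, [d1])
```

Models of this cost class run in microseconds per evaluation, which is what makes iterative, probabilistic farm-level optimization tractable; the DWM principles underlying FAST.Farm add the wake meandering and dynamics such static models miss.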

  9. Earth Observations, Models and Geo-Design in Support of SDG Implementation and Monitoring

    NASA Astrophysics Data System (ADS)

    Plag, H. P.; Jules-Plag, S.

    2016-12-01

    Implementation and Monitoring of the United Nations' Sustainable Development Goals (SDGs) requires support from Earth observation and scientific communities. Applying a goal-based approach to determine the data needs of the Targets and Indicators associated with the SDGs demonstrates that integration of environmental data with socio-economic and statistical data is required. Large data gaps exist for the built environment. A Geo-Design platform can provide the infrastructure and conceptual model for the data integration. The development of policies and actions to foster the implementation of SDGs in many cases requires research and the development of tools to answer "what if" questions. Here, agent-based models and model webs combined with a Geo-Design platform are promising avenues. This advanced combined infrastructure can also play a crucial role in the necessary capacity building. We will use the example of SDG 5 (Gender equality) to illustrate these approaches. SDG 11 (Sustainable Cities and Communities) is used to underline the cross-goal linkages and the joint benefits of Earth observations, data integration, and modeling tools for multiple SDGs.

  10. Combining Domain-driven Design and Mashups for Service Development

    NASA Astrophysics Data System (ADS)

    Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni

    This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims to improve the productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, the Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates application generation from the domain model. This metaframework follows an object-centric approach and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations to enrich the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.
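    The annotation-based enrichment described above can be illustrated with a language-shifted sketch: Python decorators standing in for the Java annotations the Roma Metaframework actually uses. The entity, aspect names, and metadata below are hypothetical illustrations, not Roma's real API.

    ```python
    def aspect(name, **meta):
        """Attach a cross-cutting concern (e.g. security, service) to a domain method."""
        def mark(fn):
            fn.__aspects__ = getattr(fn, "__aspects__", {})
            fn.__aspects__[name] = meta
            return fn
        return mark

    class PhoneNumber:
        """Hypothetical domain entity for a phone-number portability scenario."""
        def __init__(self, number, operator):
            self.number, self.operator = number, operator

        @aspect("security", role="admin")     # access control concern
        @aspect("service", exposed=True)      # expose as a web service operation
        def port_to(self, new_operator):
            self.operator = new_operator

    # A framework could scan the domain model for aspects and generate the web layer:
    aspects = PhoneNumber.port_to.__aspects__
    ```

    The point of the pattern is that the domain class stays plain domain logic; the framework reads the attached metadata and generates the security and service plumbing around it.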

  11. Assessing ergonomic risks of software: Development of the SEAT.

    PubMed

    Peres, S Camille; Mehta, Ranjana K; Ritchey, Paul

    2017-03-01

    Software utilizing interaction designs that require extensive dragging or clicking of icons may increase users' risks for upper extremity cumulative trauma disorders. The purpose of this research is to develop a Self-report Ergonomic Assessment Tool (SEAT) for assessing the risks of software interaction designs and facilitating mitigation of those risks. A 28-item self-report measure was developed by combining and modifying items from existing industrial ergonomic tools. Data were collected from 166 participants after they completed four different tasks that varied by method of input (touch or keyboard and mouse) and type of task (selecting or typing). Principal component analysis found distinct factors associated with stress (i.e., demands) and strain (i.e., response). Repeated measures analyses of variance showed that participants could discriminate the different strain induced by the input methods and tasks. However, participants' ability to discriminate between the stressors associated with that strain was mixed. Further validation of the SEAT is necessary, but these results indicate that the SEAT may be a viable method of assessing ergonomic risks presented by software design. Copyright © 2016 Elsevier Ltd. All rights reserved.
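    The factor extraction the SEAT study describes (principal components separating stress from strain items) can be sketched generically. This is a minimal illustration on synthetic data, not the study's 28 items; the two latent factors, six items, and noise level are all assumptions, and NumPy is assumed to be available.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 166  # respondents, matching the study's sample size
    # Hypothetical 6-item survey: items 0-2 load on "stress", items 3-5 on "strain"
    stress = rng.normal(size=(n, 1))
    strain = rng.normal(size=(n, 1))
    X = np.hstack([
        stress + 0.3 * rng.normal(size=(n, 3)),
        strain + 0.3 * rng.normal(size=(n, 3)),
    ])

    # Principal component analysis on the standardized items
    Z = (X - X.mean(0)) / X.std(0)
    cov = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]           # largest eigenvalue first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    explained = eigvals / eigvals.sum()
    print(f"variance explained by first two components: {explained[:2].sum():.2f}")
    ```

    With two clean latent factors, the first two components dominate; the loadings in `eigvecs[:, :2]` would then be inspected (and typically rotated) to label the factors, as the study did with its stress and strain components.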

  12. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments

    PubMed Central

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-01-01

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including reliance on separate BLAST-like tools for homology tests and lack of support for primer ranking, TaqMan probes, and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the best-ranked primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
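    MRPrimerW's actual constraint set and homology testing are far more extensive than anything shown here, but the kind of single-primer filtering step the abstract refers to can be sketched minimally. The Wallace-rule melting temperature and the GC/Tm thresholds below are common textbook defaults chosen for illustration, not MRPrimerW's parameters.

    ```python
    def gc_content(seq: str) -> float:
        """Fraction of G/C bases in the primer."""
        return sum(b in "GC" for b in seq) / len(seq)

    def wallace_tm(seq: str) -> float:
        """Rough melting temperature (Wallace rule): 2*(A+T) + 4*(G+C) degrees C."""
        return 2 * sum(b in "AT" for b in seq) + 4 * sum(b in "GC" for b in seq)

    def passes_filters(seq: str, gc=(0.4, 0.6), tm=(50.0, 65.0)) -> bool:
        """Apply single-primer filtering constraints (illustrative thresholds)."""
        return gc[0] <= gc_content(seq) <= gc[1] and tm[0] <= wallace_tm(seq) <= tm[1]

    candidates = [
        "ATGCGTACGTTAGCCGTAGC",  # balanced composition
        "AAAAAATTTTTTAAAAAATT",  # too AT-rich: fails GC filter
        "GCGCGCGCGCGCGCGCGCGC",  # too GC-rich: fails GC filter
    ]
    valid = [s for s in candidates if passes_filters(s)]
    ```

    A real pipeline like MRPrimerW applies many such constraints (self-complementarity, 3' end stability, pair compatibility) and then runs genome-wide homology tests on every survivor, which is where the MapReduce-scale computation comes in.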

  13. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, enormous resources were expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes influencing performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high-performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  14. Fast vaccine design and development based on correlates of protection (COPs)

    PubMed Central

    van Els, Cécile; Mjaaland, Siri; Næss, Lisbeth; Sarkadi, Julia; Gonczol, Eva; Smith Korsholm, Karen; Hansen, Jon; de Jonge, Jørgen; Kersten, Gideon; Warner, Jennifer; Semper, Amanda; Kruiswijk, Corine; Oftung, Fredrik

    2014-01-01

    New and reemerging infectious diseases call for innovative and efficient control strategies, of which fast vaccine design and development represent an important element. In emergency situations, when time is limited, identification and use of correlates of protection (COPs) may play a key role as a strategic tool for accelerated vaccine design, testing, and licensure. We propose that general rules for COP-based vaccine design can be extracted from the existing knowledge of protective immune responses against a large spectrum of relevant viral and bacterial pathogens. Herein, we focus on the applicability of this approach by reviewing the established and upcoming COPs for influenza in the context of traditional and a wide array of new vaccine concepts. The lessons learnt from this field may be applied more generally to COP-based accelerated vaccine design for emerging infections. PMID:25424803

  15. Virtual Teams and Human Work Interaction Design - Learning to Work in and Designing for Virtual Teams

    NASA Astrophysics Data System (ADS)

    Orngreen, Rikke; Clemmensen, Torkil; Pejtersen, Annelise Mark

    The boundaries and work processes for how virtual teams interact are changing: from an orientation toward tools and stand-alone applications to the use of multiple generic platforms chosen and redesigned for the specific context. These are often designed both by professional software developers and by the individual members of the virtual teams, rather than determined at a single organizational level. There may be no impact of the technology per se on individuals, groups, or organizations, as the technology for virtual teams rather enhances situational ambiguity and disrupts existing task-artifact cycles. This ambiguous situation calls for new methods of empirical work analysis and interaction design that can help us understand how organizations, teams, and individuals learn to organize, design, and work in virtual teams in various networked contexts.

  16. Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers

    NASA Technical Reports Server (NTRS)

    Kenny, Sean (Technical Monitor); Wertz, Julie

    2002-01-01

    As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can aid the designer in making high-level architecture decisions. Once these key components have been identified, two main approaches exist for improving a system through them: add redundancy or improve the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two, in varying proportions for each component. Therefore, this research tries to answer the question of how to divide funds between adding redundancy and improving the reliability of components so as to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.
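    The redundancy-versus-reliability allocation searched with simulated annealing in the abstract can be sketched on a toy system. Everything below (the three components, their costs, the budget, and the "an upgrade halves the failure probability" model) is a hypothetical illustration, not the thesis's actual SSI model.

    ```python
    import math
    import random

    # Hypothetical three-component series system. For each component i:
    # BASE_R[i] = baseline reliability, DUP_COST[i] = cost of one redundant copy,
    # UPG_COST[i] = cost of one reliability upgrade (halves failure probability).
    BASE_R   = [0.90, 0.85, 0.95]
    DUP_COST = [4.0, 3.0, 5.0]
    UPG_COST = [2.0, 2.5, 1.5]
    BUDGET   = 10.0

    def system_reliability(design):
        """design[i] = (copies, upgrades); series system of parallel groups."""
        r = 1.0
        for (copies, upgrades), base in zip(design, BASE_R):
            fail = (1.0 - base) * 0.5 ** upgrades   # per-copy failure probability
            r *= 1.0 - fail ** copies               # parallel redundancy
        return r

    def cost(design):
        return sum((c - 1) * DUP_COST[i] + u * UPG_COST[i]
                   for i, (c, u) in enumerate(design))

    def neighbor(design):
        """Randomly add or remove one redundant copy or one upgrade."""
        cand = [list(cu) for cu in design]
        i, move = random.randrange(3), random.randrange(4)
        if move == 0:
            cand[i][0] += 1
        elif move == 1 and cand[i][0] > 1:
            cand[i][0] -= 1
        elif move == 2:
            cand[i][1] += 1
        elif move == 3 and cand[i][1] > 0:
            cand[i][1] -= 1
        return [tuple(cu) for cu in cand]

    def anneal(steps=5000, t0=0.05):
        random.seed(1)
        cur = [(1, 0)] * 3                   # baseline: no redundancy, no upgrades
        best, best_r = cur, system_reliability(cur)
        for k in range(steps):
            t = t0 * (1 - k / steps) + 1e-9  # linear cooling schedule
            cand = neighbor(cur)
            if cost(cand) > BUDGET:
                continue                     # enforce the budget constraint
            dr = system_reliability(cand) - system_reliability(cur)
            if dr >= 0 or random.random() < math.exp(dr / t):
                cur = cand
                if system_reliability(cur) > best_r:
                    best, best_r = cur, system_reliability(cur)
        return best, best_r

    best, best_r = anneal()
    ```

    The search explores mixes of redundancy and upgrades under the budget, occasionally accepting reliability-reducing moves early on (high temperature) to escape local optima, which is the essential mechanism distinguishing simulated annealing from a greedy allocation.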

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Womeldorff, Geoffrey Alan; Payne, Joshua Estes; Bergen, Benjamin Karl

    These are slides for a presentation on PARTISN Research and FleCSI Updates. The following topics are covered: SNAP vs PARTISN, Background Research, Production Code (structural design and changes, kernel design and implementation, lessons learned), NuT IMC Proxy, FleCSI Update (design and lessons learned). It can all be summarized in the following manner: Kokkos was shown to be effective in FY15 in implementing a C++ version of SNAP's kernel. This same methodology was applied to a production code, PARTISN. This was a much more complex endeavour than in FY15 for many reasons: a C++ kernel embedded in Fortran, overloading Fortran memory allocations, general language interoperability, and a fully fleshed-out production code versus a simplified proxy code. Lessons learned are legion. In no particular order: Interoperability between Fortran and C++ was really not that hard, and was a useful engineering effort. Tracking down all necessary memory allocations for a kernel in a production code is pretty hard. Modifying a production code to work for more than a handful of use cases is also pretty hard. Figuring out the toolchain that will allow a successful implementation of design decisions is quite hard if making use of "bleeding edge" design choices. In terms of performance, production code concurrency architecture can be a virtual showstopper, being too complex to easily rewrite and test in a short period of time, or depending on tool features which do not exist yet. Ultimately, while the tools used in this work were not successful in speeding up the production code, they helped to identify how the work would be done and provided requirements back to the tools.

  18. Using implementation tools to design and conduct quality improvement projects for faster and more effective improvement.

    PubMed

    Ovretveit, John; Mittman, Brian; Rubenstein, Lisa; Ganz, David A

    2017-10-09

    Purpose The purpose of this paper is to enable improvers to use recent knowledge from implementation science to carry out improvement changes more effectively. It also highlights the importance of converting research findings into practical tools and guidance for improvers so as to make research easier to apply in practice. Design/methodology/approach This study provides an illustration of how a quality improvement (QI) team project can make use of recent findings from implementation research so as to make their improvement changes more effective and sustainable. The guidance is based on a review and synthesis of improvement and implementation methods. Findings The paper illustrates how research can help a quality project team in the phases of problem definition and preparation, in design and planning, in implementation, and in sustaining and spreading a QI. Examples of the use of different ideas and methods are cited where they exist. Research limitations/implications The example is illustrative, and there is limited experimental evidence of whether using all the steps and tools in the one approach proposed does enable a quality team to be more effective. Evidence supporting individual guidance proposals is cited where it exists. Practical implications If the steps proposed and illustrated in the paper were followed, it is possible that quality projects could avoid waste by ensuring the conditions they need for success are in place, and sustain and spread improvement changes more effectively. Social implications More patients could benefit more quickly from more effective implementation of proven interventions. Originality/value The paper is the first to describe how improvement and implementation science can be combined in a tangible way that practical improvers can use in their projects. It shows how QI project teams can take advantage of recent advances in improvement and implementation science to make their work more effective and sustainable.

  19. Don’t Forget the Doctor: Gastroenterologists’ Preferences on the Development of mHealth Tools for Inflammatory Bowel Disease

    PubMed Central

    2015-01-01

    Background Inflammatory bowel disease (IBD) encompasses a number of disorders of the gastrointestinal tract. Treatment for IBD is lifelong and complex, and the majority of IBD patients seek information on the Internet. However, research has found existing digital resources to be of questionable quality and that patients find content lacking. Gastroenterologists are frontline sources of information for North American IBD patients, but their opinions and preferences for digital content, design, and utility have not been investigated. The purpose of this study is to systematically explore gastroenterologists’ perceptions of, and design preferences for, mHealth tools. Objective Our goal was to critically assess these issues and elicit expert feedback by seeking consensus with Canadian gastroenterologists. Methods Using a qualitative approach, a closed meeting with 7 gastroenterologists was audio recorded and field notes taken. To synthesize results, an anonymous questionnaire was collected at the end of the session. Participant-led discussion themes included methodological approaches to non-adherence, concordance, patient-centricity, and attributes of digital tools that would be actively supported and promoted. Results Survey results indicated that 4 of the 7 gastroenterologists had experienced patients bringing digital resources to a visit, but 5 found digital patient resources to be inaccurate or irrelevant. All participants agreed that digital tools were of increasing importance and could be leveraged to aid in consultations and save time. When asked to assess digital attributes that they would be confident to refer patients to, all 7 indicated that the inclusion of evidence-based facts was of greatest importance. Patient peer-support networks were deemed an asset, but only if closely monitored by experts. When asked about interventions, nearly all (6/7) preferred tools that addressed a mix of compliance and concordance, and only one supported the development of tools that focused on compliance. Participants confirmed that they would actively refer patients and other physicians to digital resources. However, while a number of digital IBD tools exist, gastroenterologists would be reluctant to endorse them. Conclusions Gastroenterologists appear eager to use digital resources that they believe benefit the physician-patient relationship, but despite the trend of patient-centric tools that focus on concordance (shared decision making and enlightened communication between patients and their health care providers), they would prefer digital tools that highlight compliance (patient following orders). This concordance gap highlights an issue of disparity in digital health: patients may not use tools that physicians promote, and physicians may not endorse tools that patients will use. Further research investigating the concordance gap, and tensions between physician preferences and patient needs, is required. PMID:25608628

  20. Oral biopharmaceutics tools - time for a new initiative - an introduction to the IMI project OrBiTo.

    PubMed

    Lennernäs, H; Aarons, L; Augustijns, P; Beato, S; Bolger, M; Box, K; Brewster, M; Butler, J; Dressman, J; Holm, R; Julia Frank, K; Kendall, R; Langguth, P; Sydor, J; Lindahl, A; McAllister, M; Muenster, U; Müllertz, A; Ojala, K; Pepin, X; Reppas, C; Rostami-Hodjegan, A; Verwei, M; Weitschies, W; Wilson, C; Karlsson, C; Abrahamsson, B

    2014-06-16

    OrBiTo is a new European project within the IMI programme in the area of oral biopharmaceutics tools that includes world-leading scientists from nine European universities, one regulatory agency, one non-profit research organization, and four SMEs, together with scientists from twelve pharmaceutical companies. The OrBiTo project will address key gaps in our knowledge of gastrointestinal (GI) drug absorption and deliver a framework for rational application of predictive biopharmaceutics tools for oral drug delivery. This will be achieved through novel prospective investigations to define new methodologies as well as refinement of existing tools. Extensive validation of novel and existing biopharmaceutics tools will be performed using active pharmaceutical ingredient (API), formulations, and supporting datasets from industry partners. A combination of high-quality in vitro or in silico characterizations of API and formulations will be integrated into physiologically based in silico biopharmaceutics models capturing the full complexity of GI drug absorption. This approach gives an unparalleled opportunity to initiate a transformational change in industrial research and development to achieve model-based pharmaceutical product development in accordance with the Quality by Design concept. Benefits include an accelerated and more efficient drug candidate selection and formulation development process, particularly for challenging projects such as low-solubility molecules (BCS II and IV) and enhanced and modified-release formulations, as well as allowing optimization of clinical product performance for patient benefit. In addition, the tools emerging from OrBiTo are expected to significantly reduce demand for animal experiments in the future, as well as reducing the number of human bioequivalence studies required to bridge formulations after manufacturing or composition changes. Copyright © 2013 Elsevier B.V. All rights reserved.
