Science.gov

Sample records for adaptive automation design

  1. Adaptive Automation Design and Implementation

    DTIC Science & Technology

    2015-09-17

    function instantiation. Finally, we develop five analysis tools for isolating effective AA points within a human-machine system. A function is an action... analysis tools allowing designers to identify points within a function network where the transitions between human and machine entities can facilitate... based on the four stages of human information processing: sensory processing, perception/working memory, decision making, and response selection

  2. Team-Centered Perspective for Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2003-01-01

    Automation represents a very active area of human factors research. The journal, Human Factors, published a special issue on automation in 1985. Since then, hundreds of scientific studies have been published examining the nature of automation and its interaction with human performance. However, despite a dramatic increase in research investigating human factors issues in aviation automation, there remain areas that need further exploration. This NASA Technical Memorandum describes a new area of automation design and research, called adaptive automation. It discusses the concepts and outlines the human factors issues associated with the new method of adaptive function allocation. The primary focus is on human-centered design, and specifically on ensuring that adaptive automation is designed from a team-centered perspective. The document shows that adaptive automation has many human factors issues common to traditional automation design. Much like the introduction of other new technologies and paradigm shifts, adaptive automation presents an opportunity to remediate current problems but poses new ones for human-automation interaction in aerospace operations. The review here is intended to communicate the philosophical perspective and direction of adaptive automation research conducted under the Aerospace Operations Systems (AOS), Physiological and Psychological Stressors and Factors (PPSF) project.

  3. Adaptive and Adaptable Automation Design: A Critical Review of the Literature and Recommendations for Future Research

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kaber, David B.

    2006-01-01

    This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload and situation awareness, the effectiveness of various automation invocation philosophies and strategies to function allocation in adaptive systems, and the role of user modeling in adaptive interface design and the performance implications of adaptive interface technology.

  4. An Approach to Automated Fusion System Design and Adaptation

    PubMed Central

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-01-01

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum. PMID:28300762

  5. Theory and Design of Adaptive Automation in Aviation Systems

    DTIC Science & Technology

    1992-07-17

    and KOALAS programs at the Naval Air Development Center. The latter is an intelligent, man-in-the-loop architecture that is presently being... This approach has been applied as a model in the Knowledgeable Operator Analysis-Linked System (KOALAS) for decision aiding and has in turn been... Barrett, C.L. (1988) The Knowledgeable Operator Analysis-Linked Advisory System (KOALAS) Approach to Decision Support System Design. Analysis and Synthesis

  6. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    PubMed

    Eom, Hwisoo; Lee, Sang Hun

    2015-06-12

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.
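
    The compatibility notion described above can be illustrated with a small, hedged sketch: treat the machine model and the display (interface) model as labelled transition systems and flag any event for which two machine transitions that look identical on the display end in different display states. The ACC mode names, events, and display mapping below are illustrative assumptions, not the models from the paper.

```python
# Hypothetical machine-model transitions: (mode, driver/vehicle event) -> next mode.
machine_transitions = {
    ("cruise", "press_set"): "follow",
    ("follow", "lead_vehicle_lost"): "cruise",
    ("cruise", "tap_brake"): "standby",
    ("follow", "tap_brake"): "override",
}
# Interface model: what the driver sees for each machine mode (assumed mapping).
display_of = {"cruise": "ACC ON", "follow": "ACC ON", "standby": "ACC OFF", "override": "ACC ON"}

def is_compatible(transitions, display_of):
    """Same displayed state + same event must always lead to one displayed state."""
    seen = {}
    for (state, event), nxt in transitions.items():
        key = (display_of[state], event)
        target = display_of[nxt]
        if seen.setdefault(key, target) != target:
            return False, key  # potential mode confusion on this displayed state/event
    return True, None

print(is_compatible(machine_transitions, display_of))
# (False, ('ACC ON', 'tap_brake')): braking looks identical on the display but ends in
# different modes, so the machine model or the interface model would need modification.
```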

  7. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles

    PubMed Central

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  8. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems.

  9. Three Experiments Examining the Use of Electroencephalogram,Event-Related Potentials, and Heart-Rate Variability for Real-Time Human-Centered Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    Adaptive automation represents an advanced form of human-centered automation design. The approach to automation provides for real-time and model-based assessments of human-automation interaction, determines whether the human has entered into a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not matured, numerous challenges remain, including what the criteria are for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.

  10. Effects of adaptive task allocation on monitoring of automated systems.

    PubMed

    Parasuraman, R; Mouloua, M; Molloy, R

    1996-12-01

    The effects of adaptive task allocation on monitoring for automation failure during multitask flight simulation were examined. Participants monitored an automated engine status task while simultaneously performing tracking and fuel management tasks over three 30-min sessions. Two methods of adaptive task allocation, both involving temporary return of the automated engine status task to the human operator ("human control"), were examined as a possible countermeasure to monitoring inefficiency. For the model-based adaptive group, the engine status task was allocated to all participants in the middle of the second session for 10 min, following which it was again returned to automation control. The same occurred for the performance-based adaptive group, but only if an individual participant's monitoring performance up to that point did not meet a specified criterion. For the nonadaptive control groups, the engine status task remained automated throughout the experiment. All groups had low probabilities of detection of automation failures for the first 40 min spent with automation. However, following the 10-min intervening period of human control, both adaptive groups detected significantly more automation failures during the subsequent blocks under automation control. The results show that adaptive task allocation can enhance monitoring of automated systems. Both model-based and performance-based allocation improved monitoring of automation. Implications for the design of automated systems are discussed.
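
    A minimal sketch of the two invocation strategies contrasted above (model-based vs. performance-based return of the monitoring task to the operator). The window timing, criterion value, and function names are illustrative assumptions rather than the study's exact parameters.

```python
# Hypothetical sketch of the two adaptive-allocation strategies described above.
# Thresholds and timings are illustrative assumptions, not the study's parameters.

def model_based_allocation(elapsed_min: float) -> str:
    """Return the engine-status task to the operator for a fixed mid-session window."""
    return "human" if 40.0 <= elapsed_min < 50.0 else "automation"

def performance_based_allocation(elapsed_min: float, detection_rate: float,
                                 criterion: float = 0.5) -> str:
    """Return the task to the operator only if monitoring performance falls below criterion."""
    in_window = 40.0 <= elapsed_min < 50.0
    return "human" if (in_window and detection_rate < criterion) else "automation"

if __name__ == "__main__":
    print(model_based_allocation(45.0))             # human
    print(performance_based_allocation(45.0, 0.8))  # automation (criterion met)
    print(performance_based_allocation(45.0, 0.3))  # human (below criterion)
```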

  11. Operator versus computer control of adaptive automation

    NASA Technical Reports Server (NTRS)

    Hilburn, Brian; Molloy, Robert; Wong, Dick; Parasuraman, Raja

    1993-01-01

    Adaptive automation refers to real-time allocation of functions between the human operator and automated subsystems. The article reports the results of a series of experiments whose aim is to examine the effects of adaptive automation on operator performance during multi-task flight simulation, and to provide an empirical basis for evaluations of different forms of adaptive logic. The combined results of these studies suggest several things. First, it appears that either excessively long, or excessively short, adaptation cycles can limit the effectiveness of adaptive automation in enhancing operator performance of both primary flight and monitoring tasks. Second, occasional brief reversions to manual control can counter some of the monitoring inefficiency typically associated with long cycle automation, and further, that benefits of such reversions can be sustained for some time after return to automated control. Third, no evidence was found that the benefits of such reversions depend on the adaptive logic by which long-cycle adaptive switches are triggered.

  12. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.
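
    As a rough illustration of the kind of evolutionary search described above (not the authors' implementation), the sketch below evolves short single-qubit gate sequences toward a target unitary; the gate set, fitness measure, and GA parameters are assumptions chosen only for illustration.

```python
# Illustrative evolutionary search over gate sequences, in the spirit of the approach above.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # Pauli-X
T = np.diag([1, np.exp(1j * np.pi / 4)])       # T gate
GATES = [H, X, T]

def circuit_unitary(genome):
    """Compose the single-qubit gates indexed by the genome into one unitary."""
    U = np.eye(2, dtype=complex)
    for g in genome:
        U = GATES[g] @ U
    return U

def fitness(genome, target):
    """Closeness of the evolved circuit to the target unitary (1.0 = exact, up to phase)."""
    U = circuit_unitary(genome)
    return abs(np.trace(target.conj().T @ U)) / 2.0

def evolve(target, pop_size=50, length=6, generations=200, rng=np.random.default_rng(0)):
    pop = rng.integers(0, len(GATES), size=(pop_size, length))
    for _ in range(generations):
        scores = np.array([fitness(g, target) for g in pop])
        pop = pop[np.argsort(-scores)]             # best genomes first
        if fitness(pop[0], target) > 0.999:
            break
        parents = pop[: pop_size // 2]
        children = parents.copy()
        mutate = rng.random(children.shape) < 0.1  # point mutation of gate choices
        children[mutate] = rng.integers(0, len(GATES), size=mutate.sum())
        pop = np.vstack([parents, children])
    return pop[0], fitness(pop[0], target)

best, score = evolve(target=X)  # try to rediscover an X gate from the primitive set
print(best, round(score, 3))
```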

  13. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode, which included a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
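
    As a hedged sketch of the adaptive logic summarized above, the fragment below assumes the commonly cited EEG engagement index beta/(alpha + theta) and illustrative switching thresholds; the actual band definitions, baselines, and cutoffs used in the study are not reproduced here.

```python
# Sketch of engagement-index-driven task switching; band powers and thresholds are illustrative.

def engagement_index(beta_power: float, alpha_power: float, theta_power: float) -> float:
    return beta_power / (alpha_power + theta_power)

def next_tracking_mode(index: float, baseline: float) -> str:
    """Cycle the tracking task between manual, adaptive aiding, and automatic modes."""
    if index < 0.8 * baseline:   # operator disengaging: hand the task back
        return "manual"
    if index > 1.2 * baseline:   # operator overloaded: take the task over
        return "automatic"
    return "adaptive aiding"

print(next_tracking_mode(engagement_index(4.0, 6.0, 5.0), baseline=0.5))  # -> manual
```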

  14. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  15. Design Automation in Synthetic Biology.

    PubMed

    Appleton, Evan; Madsen, Curtis; Roehner, Nicholas; Densmore, Douglas

    2017-04-03

    Design automation refers to a category of software tools for designing systems that work together in a workflow for designing, building, testing, and analyzing systems with a target behavior. In synthetic biology, these tools are called bio-design automation (BDA) tools. In this review, we discuss the BDA tool areas (specify, design, build, test, and learn) and introduce the existing software tools designed to solve problems in these areas. We then detail the functionality of some of these tools and show how they can be used together to create the desired behavior of two types of modern synthetic genetic regulatory networks.

  16. Operator adaptation to changes in system reliability under adaptable automation.

    PubMed

    Chavaillaz, Alain; Sauer, Juergen

    2016-11-25

    This experiment examined how operators coped with a change in system reliability between training and testing. Forty participants were trained for 3 h on a complex process control simulation modelling six levels of automation (LOA). In training, participants either experienced a high- (100%) or low-reliability system (50%). The impact of training experience on operator behaviour was examined during a 2.5 h testing session, in which participants either experienced a high- (100%) or low-reliability system (60%). The results showed that most operators did not often switch between LOA. Most chose an LOA that relieved them of most tasks but maintained their decision authority. Training experience did not have a strong impact on the outcome measures (e.g. performance, complacency). Low system reliability led to decreased performance and self-confidence. Furthermore, complacency was observed under high system reliability. Overall, the findings suggest benefits of adaptable automation because it accommodates different operator preferences for LOA. Practitioner Summary: The present research shows that operators can adapt to changes in system reliability between training and testing sessions. Furthermore, it provides evidence that each operator has his/her preferred automation level. Since this preference varies strongly between operators, adaptable automation seems to be suitable to accommodate these large differences.

  17. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

    declaration to be associated with it. Second, Byron tools can produce useful output from incomplete specifications. These advantages over pure Ada are... implemented. Implementation-independent details are included in Section 2.2. Requirements and Design information for the DARTS implementation of both... The Intelligence Content (I) is an estimate of the Potential Volume. It is independent of the language used and is expected to be invariant over

  18. Automated Core Design

    SciTech Connect

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2005-07-15

    Multistate searching methods are a subfield of distributed artificial intelligence that aims to provide both principles for construction of complex systems involving multiple states and mechanisms for coordination of independent agents' actions. This paper proposes a multistate searching algorithm with reinforcement learning for the automatic core design of a boiling water reactor. The characteristics of this algorithm are that the coupling structure and the coupling operation suitable for the assigned problem are assumed and an optimal solution is obtained by mutual interference in multistate transitions using multiagents. Calculations in an actual plant confirmed that the proposed algorithm increased the convergence ability of the optimization process.

  19. Adaptive control of surface finish in automated turning processes

    NASA Astrophysics Data System (ADS)

    García-Plaza, E.; Núñez, P. J.; Martín, A. R.; Sanz, A.

    2012-04-01

    The primary aim of this study was to design and develop an on-line control system of finished surfaces in automated machining processes by CNC turning. The control system consisted of two basic phases: during the first phase, surface roughness was monitored through cutting force signals; the second phase involved a closed-loop adaptive control system based on data obtained during the monitoring of the cutting process. The system ensures that surface roughness is maintained at optimum values by adjusting the feed rate through communication with the PLC of the CNC machine. A monitoring and adaptive control system has been developed that enables the real-time monitoring of surface roughness during CNC turning operations. The system detects and prevents faults in automated turning processes, and applies corrective measures during the cutting process that raise quality and reliability, reducing the need for quality control.
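
    A minimal sketch of the closed-loop idea above (roughness estimated from monitored signals, feed rate trimmed toward the target and sent to the machine's PLC). The proportional correction, gain, and limits below are illustrative assumptions, not the authors' identified process model.

```python
# Toy closed-loop feed-rate correction for surface roughness; constants are illustrative.

def adjust_feed(feed_mm_rev: float, ra_measured: float, ra_target: float,
                gain: float = 0.3, feed_min: float = 0.05) -> float:
    """Proportional correction of the feed rate to be sent to the CNC/PLC."""
    error = ra_measured - ra_target
    new_feed = feed_mm_rev * (1.0 - gain * error / ra_target)
    return max(feed_min, new_feed)

feed = 0.20
for ra in [1.9, 2.4, 2.8, 2.1]:      # simulated roughness estimates (um) from monitoring
    feed = adjust_feed(feed, ra, ra_target=2.0)
    print(round(feed, 3))
```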

  20. System reliability, performance and trust in adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability.

  1. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  2. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.

  3. Automation design of cemented doublet

    NASA Astrophysics Data System (ADS)

    Romanova, Galina; Ivanova, Tatiana; Korotkova, Natalia

    2015-09-01

    An algorithm and software for cemented doublet synthesis by Slusarev's methodology are presented. Slusarev's methodology is based on lookup tables that allow doublet radii to be calculated from given values of third-order coma, spherical aberration, and chromatic aberration by a specific algorithm. This calculation is automated in this work. The input parameters for the algorithm are the desired values of third-order coma, spherical aberration, and chromatic aberration of the cemented doublet. The software looks up several pairs of optical glasses corresponding to the specified value of chromatic aberration and then calculates the radii of the surfaces for each pair of glasses corresponding to the specified third-order coma and spherical aberration. The resulting third-order aberrations and the real aberrations at the edge of the pupil are calculated for the obtained radii. Several doublets can be analyzed in the result table, and the chosen one can be imported into Zemax. The calculated cemented doublet parameters can be analyzed and optimized in optical system design software. The software makes the first step of optical system design fast and simple. It allows the designer not only to obtain a system free of third-order spherical aberration, coma, and axial color, but also to obtain a required amount of aberration to compensate for aberrations in another part of the optical system. The ability to look up optical glasses automatically, which affects chromatic aberration correction and aberration correction in general, is especially important. Examples of automatic calculation of a cemented doublet and of compensation of aberrations in another part of an optical system are presented in the paper.

  4. Explicit control of adaptive automation under different levels of environmental stress.

    PubMed

    Sauer, Jürgen; Kao, Chung-Shan; Wastell, David; Nickel, Peter

    2011-08-01

    This article examines the effectiveness of three different forms of explicit control of adaptive automation under low- and high-stress conditions, operationalised by different levels of noise. In total, 60 participants were assigned to one of three types of automation design (free, prompted and forced choice). They were trained for 4 h on a highly automated simulation of a process control environment, called AutoCAMS. This was followed by a 4-h testing session under noise exposure and quiet conditions. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that all three modes of explicit control of adaptive automation modes were able to attenuate the negative effects of noise. This was partly due to the fact that operators opted for higher levels of automation under noise. It also emerged that forced choice showed marginal advantages over the two other automation modes. Statement of Relevance: This work is relevant to the design of adaptive automation since it emphasises the need to consider the impact of work-related stressors during task completion. During the presence of stressors, different forms of operator support through automation may be required than under more favourable working conditions.

  5. Generic Automated Multi-function Finger Design

    NASA Astrophysics Data System (ADS)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial in improvement of a robot workcell. Design automation of multi-function fingers is highly demanded by robot industries to overcome the current iterative, time consuming and complex manual design process. However, the existing approaches for the multi-function finger design automation are unable to entirely meet the robot industries’ need. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked with existing approaches.

  6. Adaptive Sampling Designs.

    ERIC Educational Resources Information Center

    Flournoy, Nancy

    Designs for sequential sampling procedures that adapt to cumulative information are discussed. A familiar illustration is the play-the-winner rule in which there are two treatments; after a random start, the same treatment is continued as long as each successive subject registers a success. When a failure occurs, the other treatment is used until…

  7. Rebound: A Framework for Automated Component Adaptation

    NASA Technical Reports Server (NTRS)

    Penix, John; Lau, Sonie (Technical Monitor)

    1998-01-01

    The REBOUND adaptation framework organizes a collection of adaptation tactics in a way that they can be selected based on the components available for adaptation. Adaptation tactics are specified formally in terms of the relationship between the component to be adapted and the resulting adapted component. The tactic specifications are used as matching conditions for specification-based component retrieval, creating a 'retrieval for adaptation' scenario. The results of specification matching are used to guide component adaptation. Several examples illustrate how the framework guides component and tactic selection and how basic tactics are composed to form more powerful tactics.

  8. Automated Hardware Design via Evolutionary Search

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.

    2000-01-01

    The goal of this research is to investigate the application of evolutionary search to the process of automated engineering design. Evolutionary search techniques involve the simulation of Darwinian mechanisms by computer algorithms. In recent years, such techniques have attracted much attention because they are able to tackle a wide variety of difficult problems and frequently produce acceptable solutions. The results obtained are usually functional, often surprising, and typically "messy" because the algorithms are told to concentrate on the overriding objective and not elegance or simplicity. Automated design offers several advantages. First, faster design cycles translate into time and, hence, cost savings. Second, automated design techniques can be made to scale well and hence better deal with increasing amounts of design complexity. Third, design quality can increase because design properties can be specified a priori. For example, size and weight specifications of a device, smaller and lighter than the best known design, might be optimized by the automated design technique. The domain of electronic circuit design is an advantageous platform in which to study automated design techniques because it is a rich design space that is well understood, permitting human-created designs to be compared to machine-generated designs. The goal of the work developed for circuit design was to automatically produce high-level integrated electronic circuit designs whose properties permit physical implementation in silicon. This process entailed designing an effective evolutionary algorithm and solving a difficult multiobjective optimization problem. FY 99 saw many accomplishments in this effort.

  9. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  10. INITIATORS AND TRIGGERING CONDITIONS FOR ADAPTIVE AUTOMATION IN ADVANCED SMALL MODULAR REACTORS

    SciTech Connect

    Katya L Le Blanc; Johanna h Oxstrand

    2014-04-01

    It is anticipated that Advanced Small Modular Reactors (AdvSMRs) will employ high degrees of automation. High levels of automation can enhance system performance, but often at the cost of reduced human performance. Automation can lead to human out-of-the-loop issues, unbalanced workload, complacency, and other problems if it is not designed properly. Researchers have proposed adaptive automation (defined as dynamic or flexible allocation of functions) as a way to get the benefits of higher levels of automation without the human performance costs. Adaptive automation has the potential to balance operator workload and enhance operator situation awareness by allocating functions to the operators in a way that is sensitive to overall workload and capabilities at the time of operation. However, there are still a number of questions regarding how to effectively design adaptive automation to achieve that potential. One of those questions is related to how to initiate (or trigger) a shift in automation in order to provide maximal sensitivity to operator needs without introducing undesirable consequences (such as unpredictable mode changes). Several triggering mechanisms for shifts in adaptive automation have been proposed, including operator-initiated, critical-event, performance-based, physiological-measurement, model-based, and hybrid methods. As part of a larger project to develop design guidance for human-automation collaboration in AdvSMRs, researchers at Idaho National Laboratory have investigated the effectiveness and applicability of each of these triggering mechanisms in the context of AdvSMR. Researchers reviewed the empirical literature on adaptive automation and assessed each triggering mechanism based on the human-system performance consequences of employing that mechanism. Researchers also assessed the practicality and feasibility of using the mechanism in the context of an AdvSMR control room. Results indicate that there are tradeoffs associated with each
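
    The mechanisms listed above can also be combined; a toy sketch of one such hybrid triggering policy is given below, with rules and a workload threshold that are purely illustrative assumptions, not guidance from the report.

```python
# Illustrative hybrid triggering policy combining several of the mechanisms listed above
# (operator-initiated, critical-event, and measurement/model-based triggers).

def should_raise_automation(operator_request: bool, critical_event: bool,
                            workload_estimate: float, workload_limit: float = 0.8) -> bool:
    if operator_request:       # operator-initiated trigger is always honoured
        return True
    if critical_event:         # critical-event trigger (e.g. plant transient)
        return True
    return workload_estimate > workload_limit   # measurement/model-based trigger

print(should_raise_automation(False, False, 0.85))  # True: workload threshold exceeded
```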

  11. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of a variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.

  12. Design considerations for automated packaging operations

    SciTech Connect

    Fahrenholtz, J.; Jones, J.; Kincy, M.

    1993-12-31

    The paper is based on work performed at Sandia National Laboratories to automate DOE packaging operations. It is a general summary of work from several projects which may be applicable to other packaging operations. Examples are provided of robotic operations which have been demonstrated as well as operations that are currently being developed. General design considerations for packages and for automated handling systems are described.

  13. An Automated Approach to Instructional Design Guidance.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    This paper describes the Guided Approach to Instructional Design Advising (GAIDA), an automated instructional design tool that incorporates techniques of artificial intelligence. GAIDA was developed by the U.S. Air Force Armstrong Laboratory to facilitate the planning and production of interactive courseware and computer-based training materials.…

  14. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  15. Automated adaptive inference of phenomenological dynamical models

    NASA Astrophysics Data System (ADS)

    Daniels, Bryan C.; Nemenman, Ilya

    2015-08-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.
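
    The published approach is far richer than any toy example, but the core idea it describes (let model complexity grow only as far as the data justify) can be illustrated with a simple held-out-error sketch; the data, model family, and split below are assumptions chosen only for illustration, not the paper's method.

```python
# Toy illustration of adapting model complexity to the available data:
# fit polynomials of increasing order and keep the one with lowest held-out error.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)   # noisy observations
idx = rng.permutation(x.size)
train, test = idx[:30], idx[30:]

best_order, best_err = None, np.inf
for order in range(1, 10):
    coeffs = np.polyfit(x[train], y[train], order)
    err = np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2)
    if err < best_err:
        best_order, best_err = order, err

print(best_order, round(best_err, 4))   # complexity stops growing once it stops helping
```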

  16. Automated adaptive inference of phenomenological dynamical models

    PubMed Central

    Daniels, Bryan C.; Nemenman, Ilya

    2015-01-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved. PMID:26293508

  17. Automated fully-stressed design with NASTRAN

    NASA Technical Reports Server (NTRS)

    Wallerstein, D. V.; Haggenmacher, G. W.

    1976-01-01

    An automated strength sizing capability is described. The technique determines the distribution of material among the elements of a structural model. The sizing is based on either a fully stressed design or a scaled feasible fully stressed design. Results obtained from the application of the strength sizing to the structural sizing of a composite material wing box using material strength allowables are presented. These results demonstrate the rapid convergence of the structural sizes to a usable design.
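
    The fully stressed design criterion underlying such strength sizing can be stated compactly: each member's size is scaled by the ratio of its computed stress to the allowable stress until every member is at its allowable. The two-member example below is a hedged illustration of that recursion, not the NASTRAN implementation.

```python
# Sketch of the classical fully stressed design recursion: A_new = A * (sigma / sigma_allowable).
# The two-bar example and allowable stress are illustrative assumptions.

def resize(areas, forces, allowable=250.0, iterations=10):
    """Iterate the resizing rule for axial members whose internal forces stay fixed."""
    for _ in range(iterations):
        stresses = [f / a for f, a in zip(forces, areas)]
        areas = [a * s / allowable for a, s in zip(areas, stresses)]
    return areas

print([round(a, 2) for a in resize(areas=[10.0, 10.0], forces=[5000.0, 1500.0])])
# [20.0, 6.0]: each member ends up carrying exactly the allowable stress
```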

  18. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  19. CMOS array design automation techniques

    NASA Technical Reports Server (NTRS)

    Lombardi, T.; Feller, A.

    1976-01-01

    The design considerations and the circuit development for a 4096-bit CMOS SOS ROM chip, the ATL078 are described. Organization of the ATL078 is 512 words by 8 bits. The ROM was designed to be programmable either at the metal mask level or by a directed laser beam after processing. The development of a 4K CMOS SOS ROM fills a void left by available ROM chip types, and makes the design of a totally major high speed system more realizable.

  20. Automated solar collector installation design

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-08-26

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives.

  1. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    PubMed

    Otero-Muras, Irene; Banga, Julio R

    2017-04-12

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.
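
    The framework itself relies on mixed-integer dynamic optimization; the sketch below only illustrates the underlying notion of a Pareto optimal set, filtering a list of hypothetical candidate designs with two cost metrics down to the non-dominated ones. The candidate values are invented purely for illustration.

```python
# Pareto filter: keep designs not dominated by any other (both metrics treated as costs).

def pareto_front(designs):
    front = []
    for i, d in enumerate(designs):
        dominated = any(
            all(o[k] <= d[k] for k in range(len(d))) and any(o[k] < d[k] for k in range(len(d)))
            for j, o in enumerate(designs) if j != i
        )
        if not dominated:
            front.append(d)
    return front

candidates = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 3.0), (6.0, 2.5), (5.0, 2.0)]
print(pareto_front(candidates))   # [(1.0, 9.0), (2.0, 7.0), (4.0, 3.0), (5.0, 2.0)]
```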

  2. Automated mixed traffic vehicle design AMTV 2

    NASA Technical Reports Server (NTRS)

    Johnston, A. R.; Marks, R. A.; Cassell, P. L.

    1982-01-01

    The design of an improved and enclosed Automated Mixed Traffic Transit (AMTT) vehicle is described. AMTT is an innovative concept for low-speed tram-type transit in which suitable vehicles are equipped with sensors and controls to permit them to operate in an automated mode on existing road or walkway surfaces. The vehicle chassis and body design are presented in terms of sketches and photographs. The functional design of the sensing and control system is presented, and modifications which could be made to the baseline design for improved performance, in particular to incorporate a 20-mph capability, are also discussed. The vehicle system is described at the block-diagram-level of detail. Specifications and parameter values are given where available.

  3. Automated design of controlled diffusion blades

    NASA Technical Reports Server (NTRS)

    Sanz, Jose M.

    1989-01-01

    A numerical automation procedure was developed to be used in conjunction with an inverse hodograph method for the design of controlled diffusion blades. With this procedure a cascade of airfoils with a prescribed solidity, inlet Mach No., inlet air flow angle and air flow turning can be produced automatically. The trailing edge thickness of the airfoil, an important quantity in inverse methods, is also prescribed. The automation procedure consists of a multi-dimensional Newton iteration in which the objective design conditions are achieved by acting on the hodograph input parameters of the underlying inverse code. The method, although more general in scope, is applied to the design of axial flow turbomachinery blade sections, both compressors and turbines. A collaborative effort with U.S. Engine Companies to identify designs of interest to the industry will be described.
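
    A generic sketch of the multidimensional Newton iteration described above: input parameters are adjusted until the computed design conditions match their prescribed values. The residual function below is a simple stand-in, not the inverse hodograph code, and the finite-difference Jacobian is one common way such an iteration might be driven.

```python
# Multidimensional Newton iteration with a finite-difference Jacobian (stand-in residuals).
import numpy as np

def newton(residual, x0, tol=1e-10, max_iter=50, h=1e-6):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        # Jacobian of the residuals with respect to the input parameters
        J = np.column_stack([(residual(x + h * e) - r) / h for e in np.eye(x.size)])
        x = x - np.linalg.solve(J, r)
    return x

# Stand-in "design conditions": drive two coupled nonlinear outputs to target values.
target = np.array([1.5, 0.5])
residual = lambda x: np.array([x[0] ** 2 + x[1], x[0] * x[1]]) - target
print(newton(residual, x0=[1.0, 1.0]))   # converges to approximately [1.0, 0.5]
```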

  4. Automated lower limb prosthesis design

    NASA Astrophysics Data System (ADS)

    Bhatia, Gulab H.; Commean, Paul K.; Smith, Kirk E.; Vannier, Michael W.

    1994-09-01

    The design of lower limb prostheses requires definitive geometric data to customize socket shape. Optical surface imaging and spiral x-ray computed tomography were applied to geometric analysis of limb residua in below knee (BK) amputees. Residua (limb remnants after amputation) of BK amputees were digitized and measured. Surface (optical) and volumetric (CT) data of the residuum were used to generate solid models and specify socket shape in (SDRC I-DEAS) CAD software. Volume measurements on the solid models were found to correspond within 2% of surface models and direct determinations made using Archimedean weighing. Anatomic 3D reconstruction of the residuum by optical surface and spiral x-ray computed tomography imaging are feasible modalities for prosthesis design.

  5. Design Automation for Streaming Systems

    DTIC Science & Technology

    2005-12-16

    RTL design methodologies are struggling to... transistor systems— AMD Athlon 64 X2: 233M transistors [Mitchell, 2005], IBM Cell: 241M transistors [Gschwind et al., 2005], nVidia GeForce 7800 GTX: 302M... buffers may be chosen by a compiler to match the resources of the target device. A small stream buffer might be implemented

  6. Automated design of controlled diffusion blades

    NASA Technical Reports Server (NTRS)

    Sanz, Jose M.

    1987-01-01

    A numerical automation procedure has been developed to be used in conjunction with an inverse hodograph method for the design of controlled diffusion blades. With this procedure a cascade of airfoils with a prescribed solidity, inlet Mach number, inlet air flow angle, and air flow turning can be produced automatically. The trailing edge thickness of the airfoil, an important quantity in inverse methods, is also prescribed. The automation procedure consists of a multidimensional Newton iteration in which the objective design conditions are achieved by acting on the hodograph input parameters of the underlying inverse code. The method, although more general in scope, is applied in this paper to the design of axial flow compressor blade sections, and a wide range of examples is presented.

  7. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  8. Automated design of controlled-diffusion blades

    NASA Technical Reports Server (NTRS)

    Sanz, J. M.

    1988-01-01

    A numerical automation procedure has been developed to be used in conjunction with an inverse hodograph method for the design of controlled diffusion blades. With this procedure a cascade of airfoils with a prescribed solidity, inlet Mach number, inlet air flow angle, and air flow turning can be produced automatically. The trailing edge thickness of the airfoil, an important quantity in inverse methods, is also prescribed. The automation procedure consists of a multidimensional Newton iteration in which the objective design conditions are achieved by acting on the hodograph input parameters of the underlying inverse code. The method, although more general in scope, is applied in this paper to the design of axial flow compressor blade sections, and a wide range of examples is presented.

  9. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  10. Psychophysiological Control of a Cognitive Task Using Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Freeman, Frederick; Pope, Alan T. (Technical Monitor)

    2001-01-01

    The major focus of the present proposal was to examine psychophysiological variables related to hazardous states of awareness induced by monitoring automated systems. With the increased use of automation in today's work environment, people's roles in the work place are being redefined from that of active participant to one of passive monitor. Although the introduction of automated systems has a number of benefits, there are also a number of disadvantages regarding worker performance. Byrne and Parasuraman have argued for the use of psychophysiological measures in the development and the implementation of adaptive automation. While both performance-based and model-based adaptive automation have been studied, the use of psychophysiological measures, especially EEG, offers the advantage of real-time evaluation of the state of the subject. The current study used the closed-loop system, developed at NASA-Langley Research Center, to control the state of awareness of subjects while they performed a cognitive vigilance task. Previous research in our laboratory, supported by NASA, has demonstrated that, in an adaptive automation, closed-loop environment, subjects perform a tracking task better under a negative than a positive feedback condition. In addition, this condition produces less subjective workload and larger P300 event-related potentials to auditory stimuli presented in a concurrent oddball task. We have also recently shown that the closed-loop system used to control the level of automation in a tracking task can also be used to control the event rate of stimuli in a vigilance monitoring task. By changing the event rate based on the subject's index of arousal, we have been able to produce improved monitoring, relative to various control groups. We have demonstrated in our initial closed-loop experiments with the vigilance paradigm that using a negative feedback contingency (i.e. increasing event rates when the EEG index is low and decreasing event rates when

  11. j5 DNA assembly design automation software.

    PubMed

    Hillson, Nathan J; Rosengarten, Rafael D; Keasling, Jay D

    2012-01-20

    Recent advances in Synthetic Biology have yielded standardized and automatable DNA assembly protocols that enable a broad range of biotechnological research and development. Unfortunately, the experimental design required for modern scar-less multipart DNA assembly methods is frequently laborious, time-consuming, and error-prone. Here, we report the development and deployment of a web-based software tool, j5, which automates the design of scar-less multipart DNA assembly protocols including SLIC, Gibson, CPEC, and Golden Gate. The key innovations of the j5 design process include cost optimization, leveraging DNA synthesis when cost-effective to do so, the enforcement of design specification rules, hierarchical assembly strategies to mitigate likely assembly errors, and the instruction of manual or automated construction of scar-less combinatorial DNA libraries. Using a GFP expression testbed, we demonstrate that j5 designs can be executed with the SLIC, Gibson, or CPEC assembly methods, used to build combinatorial libraries with the Golden Gate assembly method, and applied to the preparation of linear gene deletion cassettes for E. coli. The DNA assembly design algorithms reported here are generally applicable to broad classes of DNA construction methodologies and could be implemented to supplement other DNA assembly design tools. Taken together, these innovations save researchers time and effort, reduce the frequency of user design errors and off-target assembly products, decrease research costs, and enable scar-less multipart and combinatorial DNA construction at scales unfeasible without computer-aided design.

  12. Expert System for Automated Design Synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Barthelemy, Jean-Francois M.

    1987-01-01

    Expert-system computer program EXADS developed to aid users of Automated Design Synthesis (ADS) general-purpose optimization program. EXADS aids the engineer in determining the best combination of strategy, optimizer, and one-dimensional-search options, based on knowledge of the specific problem and expert knowledge stored in the knowledge base. Available in two interactive machine versions. IBM PC version (LAR-13687) written in IQ-LISP. DEC VAX version (LAR-13688) written in Franz-LISP.

  13. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  14. Adaptable and Adaptive Automation for Supervisory Control of Multiple Autonomous Vehicles

    DTIC Science & Technology

    2012-10-01

    Adaptable and Adaptive Automation for Supervisory Control of Multiple Autonomous Vehicles Brian Kidwell , 1 Gloria L. Calhoun, 2 Heath A. Ruff...correlated with selection of the high LOA ( r = .789, p < .01), as well as the disuse of the medium LOA ( r = -.823, p < .01). There was not a...AFRL. Brian Kidwell and Raja Parasuraman were supported by Air Force Office of Scientific Research grant FA9550-10-1-0385 and the Center of

  15. Design of the hybrid automated reliability predictor

    NASA Technical Reports Server (NTRS)

    Geist, R.; Trivedi, K.; Dugan, J. B.; Smotherman, M.

    1983-01-01

    The design of the Hybrid Automated Reliability Predictor (HARP), now under development at Duke University, is presented. The HARP approach to reliability prediction is characterized by a decomposition of the overall model into fault-occurrence and fault-handling sub-models. The fault-occurrence model is a non-homogeneous Markov chain which is solved analytically, while the fault-handling model is a Petri Net which is simulated. HARP provides automated analysis of sensitivity to uncertainties in the input parameters and in the initial state specifications. It then produces a predicted reliability band as a function of mission time, as well as estimates of the improvement (narrowing of the band) to be gained by a specified amount of reduction in uncertainty.

  16. Automated design and fabrication of ocular optics

    NASA Astrophysics Data System (ADS)

    Gutin, Mikhail; Gutin, Olga

    2008-08-01

    An automated computer-aided procedure for component selection, optical design, and optimization was developed and used to produce prototype ocular optics of a head-mounted display for biomedical imaging, with the field of view and resolution approaching those of normal human vision. The new display has the potential to dramatically increase the amount and fidelity of real-time visual information presented to the user. The selected approach was based on a tiled configuration and "optically stitched" virtual image, resulting in seamless imagery generated by multiple micro-displays. Several optical configurations were studied in the design stage to arrive at the optimal optical layout. The automated procedure provided for an extensive search of the best candidate stock components out of thousands of candidate lenses offered by different vendors. At each iteration, the candidate lens was "digitally inserted" in the optical layout, its position was optimized, and the achieved merit function characterizing the quality of the stitched image was stored, along with the design prescription. A few best designs were then closely evaluated in a traditional "manual" procedure. The design effort was followed by experimental demonstration and tests of a limited prototype optical system.

  17. Automated design of ligands to polypharmacological profiles

    PubMed Central

    Besnard, Jérémy; Ruda, Gian Filippo; Setola, Vincent; Abecassis, Keren; Rodriguiz, Ramona M.; Huang, Xi-Ping; Norval, Suzanne; Sassano, Maria F.; Shin, Antony I.; Webster, Lauren A.; Simeons, Frederick R.C.; Stojanovski, Laste; Prat, Annik; Seidah, Nabil G.; Constam, Daniel B.; Bickerton, G. Richard; Read, Kevin D.; Wetsel, William C.; Gilbert, Ian H.; Roth, Bryan L.; Hopkins, Andrew L.

    2012-01-01

    The clinical efficacy and safety of a drug is determined by its activity profile across multiple proteins in the proteome. However, designing drugs with a specific multi-target profile is both complex and difficult. Therefore methods to rationally design drugs a priori against profiles of multiple proteins would have immense value in drug discovery. We describe a new approach for the automated design of ligands against profiles of multiple drug targets. The method is demonstrated by the evolution of an approved acetylcholinesterase inhibitor drug into brain penetrable ligands with either specific polypharmacology or exquisite selectivity profiles for G-protein coupled receptors. Overall, 800 ligand-target predictions of prospectively designed ligands were tested experimentally, of which 75% were confirmed correct. We also demonstrate target engagement in vivo. The approach can be a useful source of drug leads where multi-target profiles are required to achieve either selectivity over other drug targets or a desired polypharmacology. PMID:23235874

  18. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  19. Designing Automated Guidance for Concept Diagrams in Inquiry Instruction

    ERIC Educational Resources Information Center

    Ryoo, Kihyun; Linn, Marcia C.

    2016-01-01

    Advances in automated scoring technologies have the potential to support student learning during inquiry instruction by providing timely and adaptive guidance on individual students' responses. To identify which forms of automated guidance can be beneficial for inquiry learning, we compared reflective guidance to directive guidance for…

  20. Automated Procedure for Roll Pass Design

    NASA Astrophysics Data System (ADS)

    Lambiase, F.; Langella, A.

    2009-04-01

    The aim of this work has been to develop an automatic roll pass design method, capable of minimizing the number of roll passes. The adoption of artificial intelligence technologies, particularly expert systems, and a hybrid model for the surface profile evaluation of rolled bars, has allowed us to model the search for the minimal sequence with a tree path search. This approach permitted a geometrical optimization of roll passes while allowing automation of the roll pass design process. Moreover, the heuristic nature of the inferential engine contributes a great deal toward reducing search time, thus allowing such a system to be employed for industrial purposes. Finally, this new approach was compared with other recently developed automatic systems to validate and measure possible improvements among them.
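
    The search for a minimal pass sequence can be pictured with a toy breadth-first tree search, shown below. Each candidate pass is modeled only by a maximum area-reduction ratio; a real roll pass design system would also constrain groove geometry, spread, and the surface profile of the rolled bar. The pass types and reduction values are invented for illustration.

```python
# Toy tree search for a minimal roll-pass sequence.
# Pass types and their reduction ratios are placeholder assumptions.

from collections import deque

PASS_TYPES = {"oval": 0.30, "round": 0.20, "box": 0.25}  # max fractional area reduction per pass

def minimal_pass_sequence(start_area: float, target_area: float):
    """Breadth-first search over pass sequences; BFS returns a minimal-length sequence."""
    queue = deque([(start_area, [])])
    seen = set()
    while queue:
        area, seq = queue.popleft()
        if area <= target_area:
            return seq
        key = round(area, 3)
        if key in seen:
            continue
        seen.add(key)
        for name, reduction in PASS_TYPES.items():
            queue.append((area * (1.0 - reduction), seq + [name]))
    return None

if __name__ == "__main__":
    print(minimal_pass_sequence(start_area=10000.0, target_area=2500.0))
```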

  1. Automating analog design: Taming the shrew

    NASA Technical Reports Server (NTRS)

    Barlow, A.

    1990-01-01

    The pace of progress in the design of integrated circuits continues to amaze observers inside and outside of the industry. Three decades ago, a 50 transistor chip was a technological wonder. Fifteen years later, a 5000 transistor device would 'wow' the crowds. Today, 50,000 transistor chips will earn a 'not too bad' assessment, but it takes 500,000 to really leave an impression. In 1975 a typical ASIC device had 1000 transistors, took one year to first samples (and two years to production), and sold for about 5 cents per transistor. Today's 50,000 transistor gate array takes about 4 months from spec to silicon, works the first time, and sells for about 0.02 cents per transistor. Fifteen years ago, the single most laborious and error-prone step in IC design was the physical layout. Today, most ICs never see the hand of a layout designer: an automatic place-and-route tool converts the engineer's computer-captured schematic to a complete physical design using a gate array or a library of standard cells, also created by software rather than by designers. CAD has also been a generous benefactor to the digital design process. The architect of today's digital systems creates the design using an RTL or other high-level simulator. Then the designer pushes a button to invoke the logic synthesizer-optimizer tool. A fault analyzer checks the result for testability and suggests where scan-based cells will improve test coverage. One obstinate holdout amidst this parade of progress is the automation of analog design and its reduction to semi-custom techniques. This paper investigates the application of CAD techniques to analog design.

  2. Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wilson, Brent G.

    Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisor systems; and (3)…

  3. Automated Antenna Design with Evolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Globus, Al; Linden, Derek S.; Lohn, Jason D.

    2006-01-01

    Current methods of designing and optimizing antennas by hand are time- and labor-intensive, and limit complexity. Evolutionary design techniques can overcome these limitations by searching the design space and automatically finding effective solutions. In recent years, evolutionary algorithms have shown great promise in finding practical solutions in large, poorly understood design spaces. In particular, spacecraft antenna design has proven tractable to evolutionary design techniques. Researchers have been investigating evolutionary antenna design and optimization since the early 1990s, and the field has grown in recent years as computer speed has increased and electromagnetic simulators have improved. Two requirements-compliant antennas, one for ST5 and another for TDRS-C, have been automatically designed by evolutionary algorithms. The ST5 antenna is slated to fly this year, and a TDRS-C phased array element has been fabricated and tested. Such automated evolutionary design is enabled by medium-to-high quality simulators and fast modern computers to evaluate computer-generated designs. Evolutionary algorithms automate cut-and-try engineering, substituting automated search through millions of potential designs for intelligent search by engineers through a much smaller number of designs. For evolutionary design, the engineer chooses the evolutionary technique, parameters, and the basic form of the antenna, e.g., single wire for ST5 and crossed-element Yagi for TDRS-C. Evolutionary algorithms then search for optimal configurations in the space defined by the engineer. NASA's Space Technology 5 (ST5) mission will launch three small spacecraft to test innovative concepts and technologies. Advanced evolutionary algorithms were used to automatically design antennas for ST5. The combination of wide beamwidth for a circularly-polarized wave and wide impedance bandwidth made for a challenging antenna design problem. From past experience in designing wire antennas, we chose to

  4. Automated Design Space Exploration with Aspen

    DOE PAGES

    Spafford, Kyle L.; Vetter, Jeffrey S.

    2015-01-01

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
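
    As an illustration of casting a design-space question as a nonlinear program, the sketch below minimizes a toy analytic runtime model over two bounded parameters subject to a power budget, using an off-the-shelf solver. The runtime and power expressions are made-up stand-ins for an Aspen application/machine model, not formulations from the paper.

```python
# Toy design-space question posed as a nonlinear program (requires scipy).
# The analytic runtime/power model is an invented placeholder.

from scipy.optimize import minimize

def runtime(x):
    cores, cache_mb = x
    return 100.0 / cores + 5.0 / cache_mb          # compute scales with cores, misses with cache

def power(x):
    cores, cache_mb = x
    return 2.0 * cores + 0.5 * cache_mb            # watts, illustrative

if __name__ == "__main__":
    result = minimize(
        runtime,
        x0=[8.0, 4.0],
        bounds=[(1, 64), (1, 32)],                              # parameter ranges
        constraints=[{"type": "ineq", "fun": lambda x: 60.0 - power(x)}],  # power <= 60 W
        method="SLSQP",
    )
    print("cores=%.1f cache=%.1f MB runtime=%.2f" % (result.x[0], result.x[1], result.fun))
```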

  5. FRACSAT: Automated design synthesis for future space architectures

    NASA Astrophysics Data System (ADS)

    Mackey, R.; Uckun, S.; Do, Minh; Shah, J.

    This paper describes the algorithmic basis and development of FRACSAT (FRACtionated Spacecraft Architecture Toolkit), a new approach to conceptual design, cost-benefit analysis, and detailed trade studies for space systems. It provides an automated capability for exploration of candidate spacecraft architectures, leading users to near-optimal solutions with respect to user-defined requirements, risks, and program uncertainties. FRACSAT utilizes a sophisticated planning algorithm (PlanVisioner) to perform a quasi-exhaustive search for candidate architectures, constructing candidates from an extensible model-based representation of space system components and functions. These candidates are then evaluated with emphasis on the business case, computing the expected design utility and system costs as well as risk, presenting the user with a greatly reduced selection of candidates. The user may further refine the search according to cost or benefit uncertainty, adaptability, or other performance metrics as needed.

  6. Designing automation for complex work environments under different levels of stress.

    PubMed

    Sauer, Juergen; Nickel, Peter; Wastell, David

    2013-01-01

    This article examines the effectiveness of different forms of static and adaptable automation under low- and high-stress conditions. Forty participants were randomly assigned to one of four experimental conditions, comparing three levels of static automation (low, medium and high) and one level of adaptable automation, with the environmental stressor (noise) being varied as a within-subjects variable. Participants were trained for 4 h on a simulation of a process control environment, called AutoCAMS, followed by a 2.5-h testing session. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that operators preferred higher levels of automation under noise than under quiet conditions. A number of parameters indicated negative effects of noise exposure, such as performance impairments, physiological stress reactions and higher mental workload. It also emerged that adaptable automation provided advantages over low and intermediate static automation, with regard to mental workload, effort expenditure and diagnostic performance. The article concludes that for the design of automation a wider range of operational scenarios reflecting adverse as well as ideal working conditions needs to be considered.

  7. Automated mechanical ventilation: adapting decision making to different disease states.

    PubMed

    Lozano-Zahonero, S; Gottlieb, D; Haberthür, C; Guttmann, J; Möller, K

    2011-03-01

    The purpose of the present study is to introduce a novel methodology for adapting and upgrading decision-making strategies concerning mechanical ventilation with respect to different disease states into our fuzzy-based expert system, AUTOPILOT-BT. The special features are: (1) Extraction of clinical knowledge in analogy to the daily routine. (2) An automated process to obtain the required information and to create fuzzy sets. (3) The controller employs the derived fuzzy rules to achieve the desired ventilation status. For demonstration, this study focuses exclusively on the control of arterial CO(2) partial pressure (p(a)CO(2)). Clinical knowledge from 61 anesthesiologists was acquired using a questionnaire from which different disease-specific fuzzy sets were generated to control p(a)CO(2). For patients with healthy lungs and for patients with acute respiratory distress syndrome (ARDS), the fuzzy sets show different shapes. The fuzzy set "normal", i.e., "target p(a)CO(2) area", ranges from 35 to 39 mmHg for healthy lungs and from 39 to 43 mmHg for ARDS lungs. With the new fuzzy sets our AUTOPILOT-BT reaches the target p(a)CO(2) within at most three consecutive changes of ventilator settings. Thus, clinical knowledge can be extended and updated, and the resulting mechanical ventilation therapies can be individually adapted, analyzed, and evaluated.
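
    The disease-specific "normal" fuzzy sets described above can be pictured with a small sketch. The 35-39 mmHg (healthy) and 39-43 mmHg (ARDS) cores come from the abstract; the trapezoidal shape and the 3 mmHg shoulders are illustrative assumptions.

```python
# Sketch of disease-specific "normal" fuzzy sets for paCO2.
# Core ranges are from the abstract; shoulders and shape are assumptions.

def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Membership rising from a to b, flat from b to c, falling from c to d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def normal_membership(paco2: float, ards: bool) -> float:
    if ards:
        return trapezoid(paco2, 36.0, 39.0, 43.0, 46.0)   # ARDS target core 39-43 mmHg
    return trapezoid(paco2, 32.0, 35.0, 39.0, 42.0)       # healthy target core 35-39 mmHg

if __name__ == "__main__":
    for value in (34.0, 37.0, 41.0, 45.0):
        print(value, "healthy:", round(normal_membership(value, ards=False), 2),
              "ARDS:", round(normal_membership(value, ards=True), 2))
```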

  8. Design automation for wafer scale integration

    SciTech Connect

    Donlan, B.J.

    1986-01-01

    Wafer scale integration (WSI) is a technique for implementing large digital systems on a single wafer. This thesis describes a system of design automation tools developed to aid in the implementation of wafer scale integrated systems. An overview of wafer scale integration is given, with fabrication details and yield considerations discussed. The Wafer Architectural Design Language (WDL), used to describe and specify a system architecture to the development system, is introduced along with a compiler that translates the high-level WDL description into net lists and other internal databases. Interactive placement tools used to map the system architecture onto the functional die sites on a wafer are described. A very fast line probe router was developed to perform the custom wafer-level routing needed to personalize each wafer. Router data structures, algorithms, techniques, and results are discussed in detail. Sample wafer scale architectures and the results of their WSI implementations are shown. Also presented is the Wafer Transmission Module (WTM), a packaging technology related to wafer scale integration.

  9. Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.

    2002-01-01

    Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early-stage sensory and information acquisition functions. The present research compares the effects of AA across the entire range of information-processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA is dependent on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, as compared to AA applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA, as compared to completely manual control. These results are discussed in terms of implications for AA design for aviation.

  10. Optimized and Automated design of Plasma Diagnostics for Additive Manufacture

    NASA Astrophysics Data System (ADS)

    Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon

    2016-10-01

    Despite having mature designs, diagnostics are usually custom designed for each experiment. Most of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics, and outline the process for automated design optimization that employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to provide a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, a baffle, and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.

  11. A Comparison of a Brain-Based Adaptive System and a Manual Adaptable System for Invoking Automation

    NASA Technical Reports Server (NTRS)

    Bailey, Nathan R.; Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Scott, Lorissa A.

    2004-01-01

    Two experiments are presented that examine alternative methods for invoking automation. In each experiment, participants were asked to perform simultaneously a monitoring task and a resource management task as well as a tracking task that changed between automatic and manual modes. The monitoring task required participants to detect failures of an automated system to correct aberrant conditions under either high or low system reliability. Performance on each task was assessed as well as situation awareness and subjective workload. In the first experiment, half of the participants worked with a brain-based system that used their EEG signals to switch the tracking task between automatic and manual modes. The remaining participants were yoked to participants from the adaptive condition and received the same schedule of mode switches, but their EEG had no effect on the automation. Within each group, half of the participants were assigned to either the low or high reliability monitoring task. In addition, within each combination of automation invocation and system reliability, participants were separated into high and low complacency potential groups. The results revealed no significant effects of automation invocation on the performance measures; however, the high complacency individuals demonstrated better situation awareness when working with the adaptive automation system. The second experiment was the same as the first with one important exception. Automation was invoked manually. Thus, half of the participants pressed a button to invoke automation for 10 s. The remaining participants were yoked to participants from the adaptable condition and received the same schedule of mode switches, but they had no control over the automation. The results showed that participants who could invoke automation performed more poorly on the resource management task and reported higher levels of subjective workload. Further, those who invoked automation more frequently performed

  12. Adaptive clinical trial designs in oncology

    PubMed Central

    Zang, Yong; Lee, J. Jack

    2015-01-01

    Adaptive designs have become popular in clinical trial and drug development. Unlike traditional trial designs, adaptive designs use accumulating data to modify the ongoing trial without undermining the integrity and validity of the trial. As a result, adaptive designs provide a flexible and effective way to conduct clinical trials. The designs have potential advantages of improving the study power, reducing sample size and total cost, treating more patients with more effective treatments, identifying efficacious drugs for specific subgroups of patients based on their biomarker profiles, and shortening the time for drug development. In this article, we review adaptive designs commonly used in clinical trials and investigate several aspects of the designs, including the dose-finding scheme, interim analysis, adaptive randomization, biomarker-guided randomization, and seamless designs. For illustration, we provide examples of real trials conducted with adaptive designs. We also discuss practical issues from the perspective of using adaptive designs in oncology trials. PMID:25811018

  13. Design of automated system for management of arrival traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1989-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. The focus here is on the design of two of these tools: the Descent Advisor, which provides automation tools for managing descent traffic, and the Traffic Management Advisor, which generates optimum landing schedules. The algorithms, automation modes, and graphical interfaces incorporated in the design are described.

  14. Intelligent Adaptive Interface: A Design Tool for Enhancing Human-Machine System Performances

    DTIC Science & Technology

    2009-10-01

    ...technical systems, there is still a lack of well-established design guidelines for these human-machine systems, especially for advanced operator...Additionally, a lack of integration between the Human Factors (HF) and Human Computer Interaction (HCI) domains has increased the tendency for terminology

  15. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. Because of the general purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC. The ADS program is also available from COSMIC.) The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining, that is, it works from hypothesis to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed as the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K of 8 bit bytes. This program was developed in 1986.
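
    The consultation loop can be pictured with a small sketch: each hypothesis (an ADS strategy/optimizer/search combination) accumulates confidence from 0-10 certainty answers and is reported as the best choice once it reaches the 90% level. The two rules and their weights below are invented placeholders, not entries from the roughly 200-rule EXADS knowledge base, and the goal-driven question ordering of the real backward-chaining engine is omitted.

```python
# Minimal sketch of a confidence-threshold consultation loop.
# Hypotheses, rules, and weights are invented placeholders.

HYPOTHESES = {
    "SUMT with BFGS": {"problem is constrained": 1.0, "gradients available": 0.8},
    "Feasible directions": {"problem is constrained": 1.0, "many active constraints": 0.9},
}

def consult(answers: dict) -> None:
    """answers maps a question to a certainty factor 0-10 (10 = definitely yes)."""
    for name, rules in HYPOTHESES.items():
        weight_sum = sum(rules.values())
        # Weighted average of the user's certainties, scaled to 0-1.
        score = sum(w * answers.get(q, 5) / 10.0 for q, w in rules.items()) / weight_sum
        if score >= 0.9:
            print(f"Confirmed ({score:.0%}): {name}")
        else:
            print(f"Not confirmed ({score:.0%}): {name}")

if __name__ == "__main__":
    consult({"problem is constrained": 10, "gradients available": 9, "many active constraints": 3})
```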

  16. An Intelligent Automation Platform for Rapid Bioprocess Design

    PubMed Central

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  17. ERIS adaptive optics system design

    NASA Astrophysics Data System (ADS)

    Marchetti, Enrico; Le Louarn, Miska; Soenke, Christian; Fedrigo, Enrico; Madec, Pierre-Yves; Hubin, Norbert

    2012-07-01

    The Enhanced Resolution Imager and Spectrograph (ERIS) is the next-generation instrument planned for the Very Large Telescope (VLT) and the Adaptive Optics Facility (AOF). It is an AO-assisted instrument that will make use of the Deformable Secondary Mirror and the new Laser Guide Star Facility (4LGSF), and it is planned for the Cassegrain focus of the telescope UT4. The project is currently in its Phase A, awaiting approval to continue to the next phases. The Adaptive Optics system of ERIS will include two wavefront sensors (WFS) to maximize the coverage of the proposed science cases. The first is a high-order 40x40 Pyramid WFS (PWFS) for on-axis Natural Guide Star (NGS) observations. The second is a high-order 40x40 Shack-Hartmann WFS for single Laser Guide Star (LGS) observations. The PWFS, with appropriate sub-aperture binning, will also serve as a low-order NGS WFS in support of the LGS mode, with a field-of-view patrolling capability of 2 arcmin diameter. Both WFSs will be equipped with the very low read-out noise CCD220-based camera developed for the AOF. The real-time reconstruction and control are provided by a SPARTA real-time platform adapted to support both WFS modes. In this paper we present the ERIS AO system in all its main aspects: opto-mechanical design, real-time computer design, control, and calibration strategy. Particular emphasis is given to the system performance obtained via dedicated numerical simulations.

  18. Designing Domain-Specific HUMS Architectures: An Automated Approach

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban

    2004-01-01

    The HUMS automation system automates the design of HUMS architectures. The automated design process involves selection of solutions from a large space of designs as well as pure synthesis of designs. Hence the whole objective is to efficiently search for or synthesize designs, or parts of designs, in the database and to integrate them to form the entire system design. The automation system adopts two approaches in order to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are endowed with a suite of quantitative and qualitative techniques that enable a) the selection of matching component instances, b) the determination of design parameters, c) the evaluation of candidate designs at component level and at system level, d) the performance of cost-benefit analyses, e) the performance of trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design, and operation of HUMS systems in order to economically produce optimal, domain-specific designs.

  19. Achieving runtime adaptability through automated model evolution and variant selection

    NASA Astrophysics Data System (ADS)

    Mosincat, Adina; Binder, Walter; Jazayeri, Mehdi

    2014-01-01

    Dynamically adaptive systems propose adaptation by means of variants that are specified in the system model at design time and allow for a fixed set of different runtime configurations. However, in a dynamic environment, unanticipated changes may result in the inability of the system to meet its quality requirements. To allow the system to react to these changes, this article proposes a solution for automatically evolving the system model by integrating new variants and periodically validating the existing ones based on updated quality parameters. To illustrate this approach, the article presents a BPEL-based framework using a service composition model to represent the functional requirements of the system. The framework estimates quality of service (QoS) values based on information provided by a monitoring mechanism, ensuring that changes in QoS are reflected in the system model. The article shows how the evolved model can be used at runtime to increase the system's autonomic capabilities and delivered QoS.
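
    A hedged sketch of the runtime variant-selection step is shown below: each service-composition variant carries QoS estimates refreshed from monitoring data, and the cheapest variant that still meets the response-time and availability requirements is selected; if none qualifies, the model must be evolved with new variants. Names and numbers are illustrative, not taken from the BPEL-based framework.

```python
# Hedged sketch of QoS-driven variant selection; all values are illustrative.

from dataclasses import dataclass

@dataclass
class Variant:
    name: str
    est_latency_ms: float
    est_availability: float
    cost: float

def select_variant(variants, max_latency_ms=500.0, min_availability=0.99):
    """Pick the cheapest variant meeting the QoS requirements, or None to trigger model evolution."""
    feasible = [v for v in variants
                if v.est_latency_ms <= max_latency_ms
                and v.est_availability >= min_availability]
    if not feasible:
        return None  # no existing variant qualifies: integrate a new one into the model
    return min(feasible, key=lambda v: v.cost)

if __name__ == "__main__":
    catalog = [
        Variant("composition-A", 320.0, 0.995, 1.2),
        Variant("composition-B", 180.0, 0.985, 0.8),   # fast but not available enough
        Variant("composition-C", 450.0, 0.999, 0.9),
    ]
    chosen = select_variant(catalog)
    print("selected:", chosen.name if chosen else "none -> evolve model")
```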

  20. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  1. TARDIS: An Automation Framework for JPL Mission Design and Navigation

    NASA Technical Reports Server (NTRS)

    Roundhill, Ian M.; Kelly, Richard M.

    2014-01-01

    Mission Design and Navigation at the Jet Propulsion Laboratory has implemented an automation framework tool to assist in orbit determination and maneuver design analysis. This paper describes the lessons learned from previous automation tools and how they have been implemented in this tool. In addition this tool has revealed challenges in software implementation, testing, and user education. This paper describes some of these challenges and invites others to share their experiences.

  2. Design Methodology for Automated Construction Machines

    DTIC Science & Technology

    1987-12-11

    positioning of structural members, and tunneling [3,5]. U.S. construction firms typically allocate little or no funding for R & D; progress to date...addition, four general purpose robots for building construction tasks [1] and the automation of sandblasting and concrete formwork cleaning have been

  3. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.
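
    The pause-and-reverse capability described above can be sketched as a history stack of executed assembly operations. The step names and print statements below are illustrative stand-ins for the test-bed hardware commands.

```python
# Sketch of a reversible, pausable assembly sequencer; steps are illustrative.

class AssemblySequencer:
    def __init__(self, steps):
        self.pending = list(steps)   # operations not yet executed
        self.history = []            # executed operations, most recent last
        self.paused = False

    def step_forward(self):
        """Execute the next pending operation unless paused."""
        if self.paused or not self.pending:
            return
        op = self.pending.pop(0)
        print("execute:", op)
        self.history.append(op)

    def reverse_last(self):
        """Undo the most recent operation and put it back on the pending list."""
        if not self.history:
            return
        op = self.history.pop()
        print("undo:", op)
        self.pending.insert(0, op)

if __name__ == "__main__":
    seq = AssemblySequencer(["fetch strut 14", "insert strut 14", "latch node 7"])
    seq.step_forward(); seq.step_forward()
    seq.reverse_last()          # supervisor error-recovery: back out the insertion
    seq.paused = True
    seq.step_forward()          # no effect while paused
```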

  4. A Toolset for Supporting Iterative Human Automation: Interaction in Design

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2010-01-01

    The addition of automation has greatly extended humans' capability to accomplish tasks, including those that are difficult, complex, and safety critical. The majority of Human-Automation Interaction (HAI) results in more efficient and safe operations; however, certain unexpected automation behaviors or "automation surprises" can be frustrating and, in certain safety critical operations (e.g., transportation, manufacturing control, medicine), may result in injuries or the loss of life (Mellor, 1994; Leveson, 1995; FAA, 1995; BASI, 1998; Sheridan, 2002). This paper describes the development of a design tool that enables the rapid development and evaluation of automation prototypes. The ultimate goal of the work is to provide a design platform upon which automation surprise vulnerability analyses can be integrated.

  5. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    SciTech Connect

    Williams, Joshua M.

    2012-06-12

    Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. There are other factors, such as cost, ergonomics, maintenance, and efficiency, that also affect task allocation and other design choices. Handling the tradeoffs among these factors can be complex, and lack of experience can be an issue when trying to determine whether, and which, feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. In regard to concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated for task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements and constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher-level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates.
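
    The concept-generation step can be sketched as matching verb-noun functions against a component database whose entries list the functions they can embody and their environment compatibility. The function basis, component entries, and radiation-tolerance flag below are invented examples, not the paper's manufacturing basis.

```python
# Sketch of matching (verb, noun) functions to candidate components.
# Functions, components, and the hazard flag are invented examples.

FUNCTIONS = [("transfer", "part"), ("position", "part"), ("join", "parts")]

COMPONENTS = [
    {"name": "gantry robot",   "embodies": {("transfer", "part"), ("position", "part")},
     "radiation_tolerant": True},
    {"name": "human operator", "embodies": {("position", "part"), ("join", "parts")},
     "radiation_tolerant": False},
    {"name": "remote welder",  "embodies": {("join", "parts")},
     "radiation_tolerant": True},
]

def candidates(function, hazardous: bool):
    """Return component names that can embody the function and survive the environment."""
    return [c["name"] for c in COMPONENTS
            if function in c["embodies"] and (c["radiation_tolerant"] or not hazardous)]

if __name__ == "__main__":
    for fn in FUNCTIONS:
        print(fn, "->", candidates(fn, hazardous=True) or ["decompose further"])
```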

  6. Designing for Productive Adaptations of Curriculum Interventions

    ERIC Educational Resources Information Center

    Debarger, Angela Haydel; Choppin, Jeffrey; Beauvineau, Yves; Moorthy, Savitha

    2013-01-01

    Productive adaptations at the classroom level are evidence-based curriculum adaptations that are responsive to the demands of a particular classroom context and still consistent with the core design principles and intentions of a curriculum intervention. The model of design-based implementation research (DBIR) offers insights into complexities and…

  7. Automated Assistance for Designing Active Magnetic Bearings

    NASA Technical Reports Server (NTRS)

    Imlach, Joseph

    2008-01-01

    MagBear12 is a computer code that assists in the design of radial, heteropolar active magnetic bearings (AMBs). MagBear12 was developed to help in designing the system described in "Advanced Active-Magnetic-Bearing Thrust-Measurement System". Beyond this initial application, MagBear12 is expected to be useful for designing AMBs for a variety of rotating machinery. This program incorporates design rules and governing equations that are also implemented in other, proprietary design software used by AMB manufacturers. In addition, this program incorporates an advanced unpublished fringing-magnetic-field model that increases accuracy beyond that offered by the other AMB-design software.

  8. A Fully Automated Method for CT-on-Rails-Guided Online Adaptive Planning for Prostate Cancer Intensity Modulated Radiation Therapy

    SciTech Connect

    Li, Xiaoqiang; Quan, Enzhuo M.; Li, Yupeng; Pan, Xiaoning; Zhou, Yin; Wang, Xiaochun; Du, Weiliang; Kudchadker, Rajat J.; Johnson, Jennifer L.; Kuban, Deborah A.; Lee, Andrew K.; Zhang, Xiaodong

    2013-08-01

    Purpose: This study was designed to validate a fully automated adaptive planning (AAP) method which integrates automated recontouring and automated replanning to account for interfractional anatomical changes in prostate cancer patients receiving adaptive intensity modulated radiation therapy (IMRT) based on daily repeated computed tomography (CT)-on-rails images. Methods and Materials: Nine prostate cancer patients treated at our institution were randomly selected. For the AAP method, contours on each repeat CT image were automatically generated by mapping the contours from the simulation CT image using deformable image registration. An in-house automated planning tool incorporated into the Pinnacle treatment planning system was used to generate the original and the adapted IMRT plans. The cumulative dose–volume histograms (DVHs) of the target and critical structures were calculated based on the manual contours for all plans and compared with those of plans generated by the conventional method, that is, shifting the isocenters by aligning the images based on the center of the volume (COV) of the prostate (prostate COV-aligned). Results: The target coverage from our AAP method for every patient was acceptable, while 1 of the 9 patients showed target underdosing from prostate COV-aligned plans. For the AAP method, compared with the values obtained from prostate COV-aligned plans, the normalized volume receiving at least 70 Gy (V70) and the mean dose were reduced by 8.9% and 6.4 Gy for the rectum, and by 4.3% and 5.3 Gy for the bladder, respectively. Conclusions: The AAP method, which is fully automated, is effective for online replanning to compensate for target dose deficits and critical organ overdosing caused by interfractional anatomical changes in prostate cancer.
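
    The plan-comparison metrics quoted above (V70 and mean dose) reduce to simple per-voxel statistics over a contoured structure, as the sketch below shows. The dose values are arbitrary test data, and equal voxel volumes are assumed.

```python
# Minimal V70 and mean-dose computation over a structure's per-voxel doses.
# Dose values are arbitrary test data; equal voxel volumes are assumed.

import numpy as np

def v_dose(dose_gy: np.ndarray, threshold_gy: float = 70.0) -> float:
    """Fraction of structure volume receiving at least `threshold_gy`."""
    return float(np.mean(dose_gy >= threshold_gy))

def mean_dose(dose_gy: np.ndarray) -> float:
    return float(np.mean(dose_gy))

if __name__ == "__main__":
    rectum_dose = np.array([35.0, 52.0, 68.0, 71.0, 74.0, 40.0])   # Gy, illustrative voxels
    print(f"V70 = {v_dose(rectum_dose):.1%}, mean dose = {mean_dose(rectum_dose):.1f} Gy")
```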

  9. Conceptual design of an aircraft automated coating removal system

    SciTech Connect

    Baker, J.E.; Draper, J.V.; Pin, F.G.; Primm, A.H.; Shekhar, S.

    1996-05-01

    Paint stripping of the U.S. Air Force's large transport aircraft is currently a labor-intensive, manual process. Significant reductions in costs, personnel, and turnaround time can be accomplished by the judicious use of automation in some process tasks. This paper presents the conceptual design of a coating removal system for the tail surfaces of the C-5 plane. Emphasis is placed on the technology selection to optimize human-automation synergy with respect to overall costs, throughput, quality, safety, and reliability. Trade-offs between field-proven vs. research-requiring technologies, and between expected gain vs. cost and complexity, have led to a conceptual design which is semi-autonomous (relying on the human for task specification and disturbance handling) yet incorporates sensor-based automation (for sweep path generation and tracking, surface following, stripping quality control and tape/breach handling).

  10. Automating expert role to determine design concept in Kansei Engineering

    NASA Astrophysics Data System (ADS)

    Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd

    2016-02-01

    Affect has become imperative to product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables discovery of consumers' emotions and formulation of guidance for designing products that win consumers in the competitive market. Although it is a powerful technology, there is no rule of thumb for its analysis and interpretation process. KE expertise is required to determine sets of related Kansei and the significant concepts of emotion. Many research endeavors are handicapped by the limited number of available and accessible KE experts. This work simulates the role of experts with the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility in KE. The algorithm is designed to learn the process from training datasets taken from previous KE research. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype for automating the process. The results show that the significant Kansei determined by the automated process is highly similar to that determined by manual KE implementation. KE research advocates will benefit from this system to automatically determine significant design concepts.
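
    A generic stand-in for the expert judgment being automated is sketched below: "significant" Kansei words are selected by correlating each word's specimen ratings with an overall preference rating and keeping those above a threshold. This is not the Natphoric algorithm (which the abstract does not describe); the data and the threshold are invented.

```python
# Generic stand-in for selecting significant Kansei words; not the Natphoric algorithm.
# Ratings, preference scores, and the threshold are invented.

import numpy as np

def significant_kansei(ratings: dict, preference: np.ndarray, threshold: float = 0.6):
    """Keep Kansei words whose per-specimen ratings correlate strongly (|r| >= threshold)
    with the overall preference scores."""
    selected = {}
    for word, values in ratings.items():
        r = np.corrcoef(values, preference)[0, 1]
        if abs(r) >= threshold:
            selected[word] = round(float(r), 2)
    return selected

if __name__ == "__main__":
    preference = np.array([4.2, 2.8, 3.9, 1.7, 4.5])
    ratings = {
        "elegant":  np.array([4.0, 2.5, 4.1, 2.0, 4.6]),
        "rugged":   np.array([2.1, 3.9, 2.4, 4.2, 1.8]),
        "colorful": np.array([3.0, 3.1, 2.9, 3.2, 3.0]),
    }
    print(significant_kansei(ratings, preference))
```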

  11. Design flow automation for variable-shaped beam pattern generators

    NASA Astrophysics Data System (ADS)

    Bloecker, Martin; Ballhorn, Gerd

    2002-07-01

    Raster scan pattern generators have been used in the photomask industry for many years. Methods and software tools for data preparation for these pattern generators are well established and have been integrated into design flows with a high degree of automation. But the growing requirements for pattern fidelity have led to the introduction of 50 kV variable shaped beam pattern generators. Due to their different writing strategy, these tools use proprietary data formats and in turn require an optimized data preparation. As a result, the existing design flow has to be adapted to account for these requirements. Because cycle times have grown severely over recent years, the automation of this adapted design flow will not only enhance design flow quality by avoiding errors during manual operations but will also help to reduce turn-around times. We developed and implemented an automated design flow for a variable shaped beam pattern generator which had to fulfill two conflicting requirements: well-established automated tools originally developed for raster scan pattern generators had to be retained with only slight modifications, to avoid the (re)implementation and concurrent use of two systems, while data generation, especially during fracturing, had to be optimized for a variable shaped beam pattern generator.

  12. Generative Representations for Computer-Automated Design Systems

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design programs is the representation with which they encode designs. If the representation cannot encode a certain design, then the design program cannot produce it. Similarly, a poor representation makes some types of designs extremely unlikely to be created. Here we define generative representations as those representations which can create and reuse organizational units within a design, and argue that reuse is necessary for design systems to scale to more complex and interesting designs. To support our argument we describe GENRE, an evolutionary design program that uses both a generative and a non-generative representation, and compare the results of evolving designs with both types of representations.

  13. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  14. Drought Adaptation Mechanisms Should Guide Experimental Design.

    PubMed

    Gilbert, Matthew E; Medina, Viviana

    2016-08-01

    The mechanism, or hypothesis, of how a plant might be adapted to drought should strongly influence experimental design. For instance, an experiment testing for water conservation should be distinct from a damage-tolerance evaluation. We define here four new, general mechanisms for plant adaptation to drought such that experiments can be more easily designed based upon the definitions. A series of experimental methods are suggested together with appropriate physiological measurements related to the drought adaptation mechanisms. The suggestion is made that the experimental manipulation should match the rate, length, and severity of soil water deficit (SWD) necessary to test the hypothesized type of drought adaptation mechanism.

  15. A Case Study in CAD Design Automation

    ERIC Educational Resources Information Center

    Lowe, Andrew G.; Hartman, Nathan W.

    2011-01-01

    Computer-aided design (CAD) software and other product life-cycle management (PLM) tools have become ubiquitous in industry during the past 20 years. Over this time they have continuously evolved, becoming programs with enormous capabilities, but the companies that use them have not evolved their design practices at the same rate. Due to the…

  16. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.

  17. DESIGN OF SMALL AUTOMATION WORK CELL SYSTEM DEMONSTRATIONS

    SciTech Connect

    C. TURNER; J. PEHL; ET AL

    2000-12-01

    The introduction of automation systems into many of the facilities dealing with the production, use and disposition of nuclear materials has been an ongoing objective. Many previous attempts have been made, using a variety of monolithic and, in some cases, modular technologies. Many of these attempts were less than successful, owing to the difficulty of the problem, the lack of maturity of the technology, and over optimism about the capabilities of a particular system. Consequently, it is not surprising that suggestions that automation can reduce worker Occupational Radiation Exposure (ORE) levels are often met with skepticism and caution. The development of effective demonstrations of these technologies is of vital importance if automation is to become an acceptable option for nuclear material processing environments. The University of Texas Robotics Research Group (UTRRG) has been pursuing the development of technologies to support modular small automation systems (each of less than 5 degrees-of-freedom) and the design of those systems for more than two decades. Properly designed and implemented, these technologies have a potential to reduce the worker ORE associated with work in nuclear materials processing facilities. Successful development of systems for these applications requires the development of technologies that meet the requirements of the applications. These application requirements form a general set of rules that applicable technologies and approaches need to adhere to, but in and of themselves are generally insufficient for the design of a specific automation system. For the design of an appropriate system, the associated task specifications and relationships need to be defined. These task specifications also provide a means by which appropriate technology demonstrations can be defined. Based on the requirements and specifications of the operations of the Advanced Recovery and Integrated Extraction System (ARIES) pilot line at Los Alamos National

  18. Automated Design of the Europa Orbiter Tour

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Strange, Nathan J.; Longuski, James M.; Bonfiglio, Eugene P.; Taylor, Irene (Technical Monitor)

    2000-01-01

    In this paper we investigate tours of the Jovian satellites Europa, Ganymede, and Callisto for the Europa Orbiter Mission. The principal goal of the tour design is to lower the arrival V(sub infinity) for the final Europa encounter while meeting all of the design constraints. Key constraints arise from considering the total time of the tour and the radiation dosage of a tour. These tours may employ 14 or more encounters with the Jovian satellites, hence there is an enormous number of possible sequences of these satellites to investigate. We develop a graphical method that greatly aids the design process.

  19. Automated Design of the Europa Orbiter Tour

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Strange, Nathan J.; Longuski, James M.; Bonfiglio, Eugene P.

    2000-01-01

    In this paper we investigate tours of the Jovian satellites Europa, Ganymede, and Callisto for the Europa Orbiter Mission. The principal goal of the tour design is to lower arrival V(sub infinity) for the final Europa encounter while meeting all of the design constraints. Key constraints arise from considering the total time of the tour and the radiation dosage of a tour. These tours may employ 14 or more encounters with the Jovian satellites, hence there is an enormous number of possible sequences of these satellites to investigate. We develop a graphical method that greatly aids the design process.

  20. Automated radiation hard ASIC design tool

    NASA Technical Reports Server (NTRS)

    White, Mike; Bartholet, Bill; Baze, Mark

    1993-01-01

    A commercially based, foundry-independent compiler design tool (ChipCrafter) with custom radiation-hardened library cells is described. A unique analysis approach allows low hardness risk for Application-Specific ICs (ASICs). Accomplishments, radiation test results, and applications are described.

  1. CMOS-array design-automation techniques

    NASA Technical Reports Server (NTRS)

    Feller, A.; Lombardt, T.

    1979-01-01

    Thirty-four-page report discusses design of 4,096-bit complementary metal oxide semiconductor (CMOS) read-only memory (ROM). CMOS ROM is either mask- or laser-programmable. Report is divided into six sections: section one describes background of ROM chips; section two presents design goals for chip; section three discusses chip implementation and chip statistics; conclusions and recommendations are given in sections four through six.

  2. Automated database design from natural language input

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Segami, Carlos; Delaune, Carl

    1995-01-01

    Users and programmers of small systems typically do not have the skills needed to design a database schema from an English description of a problem. This paper describes a system that automatically designs databases for such small applications from English descriptions provided by end-users. Although the system has been motivated by the space applications at Kennedy Space Center, and portions of it have been designed with that idea in mind, it can be applied to different situations. The system consists of two major components: a natural language understander and a problem-solver. The paper describes briefly the knowledge representation structures constructed by the natural language understander, and, then, explains the problem-solver in detail.

  3. A compendium of controlled diffusion blades generated by an automated inverse design procedure

    NASA Technical Reports Server (NTRS)

    Sanz, Jose M.

    1989-01-01

    A set of sample cases was produced to test an automated design procedure developed at the NASA Lewis Research Center for the design of controlled diffusion blades. The range of application of the automated design procedure is documented. The results presented include characteristic compressor and turbine blade sections produced with the automated design code as well as various other airfoils produced with the base design method prior to the incorporation of the automated procedure.

  4. Towards automation of user interface design

    NASA Technical Reports Server (NTRS)

    Gastner, Rainer; Kraetzschmar, Gerhard K.; Lutz, Ernst

    1992-01-01

    This paper suggests an approach to automatic software design in the domain of graphical user interfaces. There are still some drawbacks in existing user interface management systems (UIMS's) which basically offer only quantitative layout specifications via direct manipulation. Our approach suggests a convenient way to get a default graphical user interface which may be customized and redesigned easily in further prototyping cycles.

  5. Library Automation Design for Visually Impaired People

    ERIC Educational Resources Information Center

    Yurtay, Nilufer; Bicil, Yucel; Celebi, Sait; Cit, Guluzar; Dural, Deniz

    2011-01-01

    Speech synthesis is a technology used in many different areas in computer science. This technology can bring a solution to the reading activity of visually impaired people through its text-to-speech conversion. Based on this problem, in this study, a system is designed that enables a visually impaired person to make use of all the library facilities in…

  6. Automated Design of Complex Dynamic Systems

    PubMed Central

    Hermans, Michiel; Schrauwen, Benjamin; Bienstman, Peter; Dambre, Joni

    2014-01-01

    Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from the complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent non-linear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure as well as its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied to abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of both machine learning and optimal control have a largely unexplored overlapping area which encompasses a novel design methodology of smart and highly complex physical systems. PMID:24497969
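
    As a rough illustration of the idea (not the authors' method or models), the sketch below treats the coefficients of a damped-oscillator ODE as trainable parameters and tunes them with a standard numerical optimizer so that the simulated response matches a target trajectory.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch: treat the parameters of a simple physical ODE (a damped oscillator,
# x'' + c x' + k x = 0) as trainable quantities and optimize them so that the
# simulated response matches a desired target response. This mirrors the idea
# of applying ML-style parameter optimization to differential equations.

def simulate(params, t, x0=1.0, v0=0.0):
    c, k = params
    dt = t[1] - t[0]
    x, v, xs = x0, v0, []
    for _ in t:                      # explicit Euler integration
        xs.append(x)
        a = -c * v - k * x
        x, v = x + dt * v, v + dt * a
    return np.array(xs)

t = np.linspace(0.0, 10.0, 500)
target = simulate((0.8, 4.0), t)     # pretend this is the desired behavior

def loss(params):                    # mean-squared error against the target
    return np.mean((simulate(params, t) - target) ** 2)

result = minimize(loss, x0=[0.2, 1.0], method="Nelder-Mead")
print("recovered damping, stiffness:", result.x)
```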

  7. Design of automation tools for management of descent traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1988-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan view traffic display consisting of a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively in combination with a mouse input device to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system consisting of descent advisor algorithm, a library of aircraft performance models, national airspace system data bases, and interactive display software has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational

  8. Dynamic optimization and adaptive controller design

    NASA Astrophysics Data System (ADS)

    Inamdar, S. R.

    2010-10-01

    In this work I present a new type of adaptive tracking controller that employs dynamic optimization to compute the current value of the control action for temperature control of a nonisothermal continuously stirred tank reactor (CSTR). We begin with a two-state model of the nonisothermal CSTR, consisting of the mass and heat balance equations, and then add cooling-system dynamics to eliminate input multiplicity. The initial design value is obtained using local stability of steady states, where the approach temperature for the cooling action is specified both as a steady state and as a design specification. Later we make a correction in the dynamics, in which the material balance is manipulated to use feed concentration as a system parameter, as an adaptive control measure to avoid actuator saturation in the main control loop. The analysis leading to the design of the dynamic-optimization-based parameter-adaptive controller is presented. An important component of this mathematical framework is reference trajectory generation, which forms the adaptive control measure.
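
    The following toy sketch illustrates only the central idea of optimizing the current control action at each sample; the plant model, bounds, and cost are invented stand-ins, not the paper's two-state CSTR with cooling dynamics.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Sketch of dynamic optimization of the *current* control action: at each
# sample, pick the coolant temperature Tc that minimizes a one-step-ahead
# tracking error for a toy nonlinear reactor-temperature model (not the
# paper's full two-state CSTR with cooling-system dynamics).

def next_temperature(T, Tc, dt=0.1):
    # toy nonlinear plant: heat generation grows with T, cooling pulls T to Tc
    heat_gen = 5.0 * np.exp(-1000.0 / (T + 273.0))
    return T + dt * (heat_gen + 2.0 * (Tc - T))

def optimal_action(T, T_setpoint):
    cost = lambda Tc: (next_temperature(T, Tc) - T_setpoint) ** 2
    return minimize_scalar(cost, bounds=(280.0, 400.0), method="bounded").x

T, setpoint = 350.0, 320.0
for step in range(50):               # closed-loop simulation
    Tc = optimal_action(T, setpoint)
    T = next_temperature(T, Tc)
print("final temperature:", round(T, 2), "K; last coolant setting:", round(Tc, 2))
```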

  9. The Role of Operator State Assessment in Adaptive Automation

    DTIC Science & Technology

    2005-12-01

    rate, and blink rate, respiration rate. Prinzel, Freeman, Scerbo, Mikulka, & Pope (1998) looked at Event-Related Potentials (ERP) as another...psychophysiological measure for adaptive aiding. This is a change in the electroencephalogram (EEG) after a specific event. Several components in the ERP...P300 component and perceptual/cognitive processing demands were on ERPs elicited on the secondary task; a method that is therefore intrusive. The

  10. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  11. Co-Adaptive Aiding and Automation Enhance Operator Performance

    DTIC Science & Technology

    2013-03-01

    Wilson for providing insightful comments on the design and analysis, and Dr. Joel Warm for many comments that have improved this manuscript. Lastly...weapon type was not counted as success. Participants could use the mouse to designate waypoints and direct RPAs away from pre-planned routes but were...lower on Day 3 than that observed with no aiding even though there was no significant difference in performance. The present work was not designed

  12. Automated control of an adaptive bi-hormonal, dual-sensor artificial pancreas and evaluation during inpatient studies

    PubMed Central

    Jacobs, Peter G.; El Youssef, Joseph; Castle, Jessica; Bakhtiani, Parkash; Branigan, Deborah; Breen, Matthew; Bauer, David; Preiser, Nicholas; Leonard, Gerald; Stonex, Tara; Ward, W. Kenneth

    2014-01-01

    Automated control of blood glucose in patients with type 1 diabetes has not yet been fully implemented. The aim of this study was to design and clinically evaluate a system that integrates a control algorithm with off-the-shelf subcutaneous sensors and pumps to automate the delivery of the hormones glucagon and insulin in response to continuous glucose sensor measurements. The automated component of the system runs an adaptive proportional derivative control algorithm which determines hormone delivery rates based on the sensed glucose measurements and the meal announcements by the patient. We provide details about the system design and the control algorithm, which incorporates both a fading memory proportional derivative controller (FMPD) and an adaptive system for estimating changing sensitivity to insulin based on a glucoregulatory model of insulin action. For an inpatient study carried out in eight subjects using Dexcom SEVEN PLUS sensors, pre-study HbA1c averaged 7.6, which translates to an estimated average glucose of 171 mg/dL. In contrast, during use of the automated system, after initial stabilization, glucose averaged 145 mg/dL and subjects were kept within the euglycemic range (between 70 and 180 mg/dL) for 73.1% of the time, indicating improved glycemic control. A further study on five additional subjects in which we used a newer and more reliable glucose sensor (Dexcom G4 PLATINUM) and made improvements to the insulin and glucagon pump communication system resulted in elimination of hypoglycemic events. For this G4 study, the system was able to maintain subjects’ glucose levels within the near-euglycemic range for 71.6% of the study duration and the mean venous glucose level was 151 mg/dL. PMID:24835122
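
    A minimal sketch of a fading-memory proportional-derivative controller is shown below; the gains, weights, and units are illustrative assumptions, not the clinical algorithm's values, and the adaptive insulin-sensitivity estimation and glucagon channel are omitted.

```python
from collections import deque

# Minimal sketch of a fading-memory proportional-derivative (FMPD) controller:
# the proportional and derivative terms are exponentially weighted sums over
# the recent history of glucose error, so older samples "fade" out. Gains,
# weights, and units here are illustrative, not the clinical values.

class FMPDController:
    def __init__(self, kp=0.02, kd=0.3, decay=0.8, target=115.0, history=12):
        self.kp, self.kd, self.decay, self.target = kp, kd, decay, target
        self.errors = deque(maxlen=history)          # most recent sample last

    def update(self, glucose):
        self.errors.append(glucose - self.target)
        # weights grow toward the most recent sample (fading memory)
        weights = [self.decay ** i for i in range(len(self.errors) - 1, -1, -1)]
        wsum = sum(weights)
        p_term = sum(w * e for w, e in zip(weights, self.errors)) / wsum
        diffs = [b - a for a, b in zip(self.errors, list(self.errors)[1:])]
        d_term = (sum(w * d for w, d in zip(weights[1:], diffs)) / sum(weights[1:])
                  if diffs else 0.0)
        insulin_rate = max(0.0, self.kp * p_term + self.kd * d_term)
        return insulin_rate                           # units/hour (illustrative)

controller = FMPDController()
for g in [150, 160, 172, 181, 185, 180, 170, 158]:    # sensed glucose, mg/dL
    print(g, "->", round(controller.update(g), 3))
```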

  13. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  14. Design methodology of an automated scattering measurement facility

    NASA Astrophysics Data System (ADS)

    Mazur, D. G.

    1985-12-01

    This thesis addresses the design methodology surrounding an automated scattering measurement facility. A brief historical survey of radar cross-section (RCS) measurements is presented. The electromagnetic theory associated with a continuous wave (CW) background cancellation technique for measuring RCS is discussed as background. In addition, problems associated with interfacing test equipment, data storage and output are addressed. The facility used as a model for this thesis is located at the Air Force Institute of Technology, WPAFB, OH. The design methodology applies to any automated scattering measurement facility. A software package incorporating features that enhance the operation of AFIT's facility by students is presented. Finally, sample output from the software package illustrates formats for displaying RCS data.

  15. Automated interferometric synthetic aperture microscopy and computational adaptive optics for improved optical coherence tomography.

    PubMed

    Xu, Yang; Liu, Yuan-Zhi; Boppart, Stephen A; Carney, P Scott

    2016-03-10

    In this paper, we introduce an algorithm framework for the automation of interferometric synthetic aperture microscopy (ISAM). Under this framework, common processing steps such as dispersion correction, Fourier domain resampling, and computational adaptive optics aberration correction are carried out as metrics-assisted parameter search problems. We further present the results of this algorithm applied to phantom and biological tissue samples and compare with manually adjusted results. With the automated algorithm, near-optimal ISAM reconstruction can be achieved without manual adjustment. At the same time, the technical barrier for the nonexpert using ISAM imaging is also significantly lowered.

  16. Crew aiding and automation: A system concept for terminal area operations, and guidelines for automation design

    NASA Technical Reports Server (NTRS)

    Dwyer, John P.

    1994-01-01

    This research and development program comprised two efforts: the development of guidelines for the design of automated systems, with particular emphasis on automation design that takes advantage of contextual information, and the concept-level design of a crew aiding system, the Terminal Area Navigation Decision Aiding Mediator (TANDAM). This concept outlines a system capable of organizing navigation and communication information and assisting the crew in executing the operations required in descent and approach. In service of this endeavor, problem definition activities were conducted that identified terminal area navigation and operational familiarization exercises addressing the terminal area navigation problem. Both airborne and ground-based (ATC) elements of aircraft control were extensively researched. The TANDAM system concept was then specified, and the crew interface and associated systems described. Additionally, three descent and approach scenarios were devised in order to illustrate the principal functions of the TANDAM system concept in relation to the crew, the aircraft, and ATC. A plan for the evaluation of the TANDAM system was established. The guidelines were developed based on reviews of relevant literature, and on experience gained in the design effort.

  17. Automated and Adaptive Mission Planning for Orbital Express

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Knight, Russell; Jones, Grailing; Tran, Daniel; Koblick, Darin

    2008-01-01

    The Orbital Express space mission was a Defense Advanced Research Projects Agency (DARPA) led demonstration of on-orbit satellite servicing scenarios, autonomous rendezvous, fluid transfers of hydrazine propellant, and robotic arm transfers of Orbital Replacement Unit (ORU) components. Boeing's Autonomous Space Transport Robotic Operations (ASTRO) vehicle provided the servicing to Ball Aerospace's Next Generation Serviceable Satellite (NextSat) client. For communication opportunities, operations used the high-bandwidth ground-based Air Force Satellite Control Network (AFSCN) along with the relatively low-bandwidth GEO-Synchronous space-borne Tracking and Data Relay Satellite System (TDRSS) network. Mission operations were conducted out of the RDT&E Support Complex (RSC) at the Kirtland Air Force Base in New Mexico. All mission objectives were met successfully: The first of several autonomous rendezvous was demonstrated on May 5, 2007; autonomous free-flyer capture was demonstrated on June 22, 2007; the fluid and ORU transfers throughout the mission were successful. Planning operations for the mission were conducted by a team of personnel including Flight Directors, who were responsible for verifying the steps and contacts within the procedures, the Rendezvous Planners who would compute the locations and visibilities of the spacecraft, the Scenario Resource Planners (SRPs), who were concerned with assignment of communications windows, monitoring of resources, and sending commands to the ASTRO spacecraft, and the Mission planners who would interface with the real-time operations environment, process planning products and coordinate activities with the SRP. The SRP position was staffed by JPL personnel who used the Automated Scheduling and Planning ENvironment (ASPEN) to model and enforce mission and satellite constraints. The lifecycle of a plan began three weeks outside its execution on-board. During the planning timeframe, many aspects could change the plan

  18. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that are guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
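
    A compact sketch of the master-slave (processor-farm) arrangement is shown below: worker processes evaluate fitness in parallel while the master performs selection, recombination, and mutation. The OneMax fitness function is a toy stand-in for the circuit-simulation fitness used in the paper.

```python
import random
from multiprocessing import Pool

# Minimal master-slave GA sketch: worker processes evaluate fitness (the
# expensive step) while the master performs selection, recombination and
# mutation. The OneMax fitness below is a toy stand-in for a circuit
# simulation; it is not the circuit-design fitness used in the paper.

GENOME_LEN, POP_SIZE, GENERATIONS = 64, 40, 30

def fitness(genome):                      # evaluated on slave processes
    return sum(genome)

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.02):
    return [1 - g if random.random() < rate else g for g in genome]

def evolve():
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    with Pool() as pool:                  # the "processor farm"
        for gen in range(GENERATIONS):
            scores = pool.map(fitness, pop)           # parallel evaluation
            ranked = [g for _, g in sorted(zip(scores, pop), reverse=True)]
            parents = ranked[:POP_SIZE // 2]          # truncation selection
            pop = parents + [mutate(crossover(random.choice(parents),
                                              random.choice(parents)))
                             for _ in range(POP_SIZE - len(parents))]
        return max(pool.map(fitness, pop))

if __name__ == "__main__":
    print("best fitness:", evolve())
```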

  19. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing. Curing peanuts improperly can significantly reduce quality, and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  20. An automated quality assessor for Ada object-oriented designs

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.

    1988-01-01

    A tool for evaluating object-oriented designs (OODs) for Ada software is described. The tool assumes a design expressed as a hierarchy of object diagrams. A design of this type identifies the objects of a system, an interface to each object, and the usage relationships between objects. When such a design is implemented in Ada, objects become packages, interfaces become package specifications, and usage relationships become Ada `with' clauses and package references. An automated quality assessor has been developed that is based on flagging undesirable design constructs. For convenience, distinctions are made among three levels of severity: questionable, undesirable, and hazardous. A questionable construct is one that may well be appropriate. An undesirable construct is one that should be changed because it is potentially harmful to the reliability, maintainability, or reusability of the software. A hazardous construct is one that is undesirable and that introduces a high level of risk.

  1. Applying Utility Functions to Adaptation Planning for Home Automation Applications

    NASA Astrophysics Data System (ADS)

    Bratskas, Pyrros; Paspallis, Nearchos; Kakousis, Konstantinos; Papadopoulos, George A.

    A pervasive computing environment typically comprises multiple embedded devices that may interact together and with mobile users. These users are part of the environment, and they experience it through a variety of devices embedded in the environment. This perception involves technologies which may be heterogeneous, pervasive, and dynamic. Due to the highly dynamic properties of such environments, the software systems running on them have to face problems such as user mobility, service failures, or resource and goal changes which may happen in an unpredictable manner. To cope with these problems, such systems must be autonomous and self-managed. In this chapter we deal with a special kind of a ubiquitous environment, a smart home environment, and introduce a user-preference-based model for adaptation planning. The model, which dynamically forms a set of configuration plans for resources, reasons automatically and autonomously, based on utility functions, on which plan is likely to best achieve the user's goals with respect to resource availability and user needs.
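
    A minimal sketch of utility-based plan selection in this spirit might look as follows; the plans, utility dimensions, weights, and resource model are illustrative assumptions rather than the chapter's actual formulation.

```python
# Minimal sketch of utility-based adaptation planning: each candidate
# configuration plan is scored by a weighted utility over user preferences
# and current resource availability, and the highest-utility feasible plan
# is chosen. The plans, dimensions, and weights are illustrative assumptions.

def plan_utility(plan, weights):
    return sum(weights[dim] * score for dim, score in plan["scores"].items())

def choose_plan(plans, weights, available_battery):
    feasible = [p for p in plans if p["battery_cost"] <= available_battery]
    return max(feasible, key=lambda p: plan_utility(p, weights)) if feasible else None

plans = [
    {"name": "stream-to-tv",    "battery_cost": 10,
     "scores": {"comfort": 0.9, "energy": 0.3, "responsiveness": 0.8}},
    {"name": "stream-to-phone", "battery_cost": 30,
     "scores": {"comfort": 0.5, "energy": 0.6, "responsiveness": 0.9}},
    {"name": "pause-playback",  "battery_cost": 1,
     "scores": {"comfort": 0.1, "energy": 1.0, "responsiveness": 0.2}},
]
user_weights = {"comfort": 0.5, "energy": 0.2, "responsiveness": 0.3}

best = choose_plan(plans, user_weights, available_battery=20)
print("selected plan:", best["name"] if best else "none feasible")
```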

  2. The CADSS design automation system. [computerized design language for small digital systems

    NASA Technical Reports Server (NTRS)

    Franke, E. A.

    1973-01-01

    This research was designed to implement and extend a previously defined design automation system for the design of small digital structures. A description is included of the higher level language developed to describe systems as a sequence of register transfer operations. The system simulator which is used to determine if the original description is correct is also discussed. The design automation system produces tables describing the state transitions of the system and the operation of all registers. In addition, all Boolean equations specifying system operation are minimized and converted to NAND gate structures. Suggestions for further extensions to the system are also given.

  3. Adaptive Strategies for Materials Design using Uncertainties.

    PubMed

    Balachandran, Prasanna V; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-21

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young's (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don't. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
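
    The loop can be sketched as follows, with synthetic data and a simple bootstrap ensemble standing in for the DFT data set and the study's regressors; the selector scores unmeasured candidates by combining the predicted value with its uncertainty.

```python
import numpy as np

# Sketch of the adaptive design loop described above: an ensemble regressor
# gives a prediction and an uncertainty for each unmeasured candidate, and the
# selector scores candidates by combining predicted value with uncertainty
# (an expected-improvement-style acquisition). Data here are synthetic; the
# real study used DFT-computed elastic moduli of M2AX compounds.

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 2.5, size=(200, 3))          # stand-in "orbital radii" features
y_true = 50 + 30 * X[:, 0] - 12 * X[:, 1] * X[:, 2] + rng.normal(0, 1.0, 200)

measured = list(range(15))                         # small initial training set
candidates = [i for i in range(200) if i not in measured]

def ensemble_predict(train_idx, query_idx, n_models=25):
    preds = []
    for _ in range(n_models):                      # bootstrap ensemble of linear fits
        boot = rng.choice(train_idx, size=len(train_idx), replace=True)
        A = np.c_[X[boot], np.ones(len(boot))]
        coef, *_ = np.linalg.lstsq(A, y_true[boot], rcond=None)
        preds.append(np.c_[X[query_idx], np.ones(len(query_idx))] @ coef)
    preds = np.array(preds)
    return preds.mean(axis=0), preds.std(axis=0)

for iteration in range(10):                        # adaptive design iterations
    mu, sigma = ensemble_predict(measured, candidates)
    best_so_far = y_true[measured].max()
    score = (mu - best_so_far) + 1.5 * sigma       # exploit + explore
    pick = candidates[int(np.argmax(score))]
    measured.append(pick)                          # "measure" (here: look up) it
    candidates.remove(pick)

print("best modulus found:", round(y_true[measured].max(), 2))
```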

  4. Adaptive control design for hysteretic smart systems

    NASA Astrophysics Data System (ADS)

    Fan, Xiang; Smith, Ralph C.

    2009-03-01

    Ferroelectric and ferromagnetic actuators are being considered for a range of industrial, aerospace, aeronautic and biomedical applications due to their unique transduction capabilities. However, they also exhibit hysteretic and nonlinear behavior that must be accommodated in models and control designs. If uncompensated, these effects can yield reduced system performance and, in the worst case, can produce unpredictable behavior of the control system. One technique for control design is to approximately linearize the actuator dynamics using an adaptive inverse compensator that is also able to accommodate model uncertainties and error introduced by the inverse algorithm. This paper describes the design of an adaptive inverse control technique based on the homogenized energy model for hysteresis. The resulting inverse filter is incorporated into an L1 control framework to provide a robust control algorithm capable of providing high speed, high accuracy tracking in the presence of actuator hysteresis and nonlinearities. Properties of the control design are illustrated through numerical examples.

  5. Flexible receiver adapter formal design review

    SciTech Connect

    Krieg, S.A.

    1995-06-13

    This memo summarizes the results of the Formal (90%) Design Review process and meetings held to evaluate the design of the Flexible Receiver Adapters, support platforms, and associated equipment. The equipment is part of the Flexible Receiver System used to remove, transport, and store long length contaminated equipment and components from both the double and single-shell underground storage tanks at the 200 area tank farms.

  6. Using digital electronic design flow to create a Genetic Design Automation tool.

    PubMed

    Gendrault, Y; Madec, M; Wlotzko, V; Andraud, M; Lallement, C; Haiech, J

    2012-01-01

    Synthetic bio-systems are becoming increasingly complex, and their development is lengthy and expensive. In the same way, in microelectronics, the design process of very complex circuits has benefited from many years of experience. It is now partly automated through Electronic Design Automation tools. Both areas present analogies that can be used to create a Genetic Design Automation tool inspired by the EDA tools used in digital electronics. This tool would allow moving away from a totally manual design of bio-systems to assisted design. This ambitious project is presented in this paper, with a deep focus on the tool that automatically generates models of bio-systems directly usable in electronic simulators.

  7. Adaptive Automation and Cue Invocation: The Effect of Cue Timing on Operator Error

    DTIC Science & Technology

    2013-05-01

    129. 5. Parasuraman, R. (2000). Designing automation for human use: Empirical studies and quantitative models. Ergonomics, 43, 931-951. 6...Prospective memory errors involve memory for intended actions that are planned to be performed at some designated point in the future [20]. In the DMOO...RESCHU) [21] was used in this study. A Navy pilot who is familiar with supervisory control tasks designed the RESCHU task and the task has been

  8. A Model for Designing Adaptive Laboratory Evolution Experiments.

    PubMed

    LaCroix, Ryan A; Palsson, Bernhard O; Feist, Adam M

    2017-04-15

    The occurrence of mutations is a cornerstone of the evolutionary theory of adaptation, capitalizing on the rare chance that a mutation confers a fitness benefit. Natural selection is increasingly being leveraged in laboratory settings for industrial and basic science applications. Despite increasing deployment, there are no standardized procedures available for designing and performing adaptive laboratory evolution (ALE) experiments. Thus, there is a need to optimize the experimental design, specifically for determining when to consider an experiment complete and for balancing outcomes with available resources (i.e., laboratory supplies, personnel, and time). To design and to better understand ALE experiments, a simulator, ALEsim, was developed, validated, and applied to the optimization of ALE experiments. The effects of various passage sizes were experimentally determined and subsequently evaluated with ALEsim, to explain differences in experimental outcomes. Furthermore, a beneficial mutation rate of 10^-6.9 to 10^-8.4 mutations per cell division was derived. A retrospective analysis of ALE experiments revealed that passage sizes typically employed in serial passage batch culture ALE experiments led to inefficient production and fixation of beneficial mutations. ALEsim and the results described here will aid in the design of ALE experiments to fit the exact needs of a project while taking into account the resources required and will lower the barriers to entry for this experimental technique. IMPORTANCE: ALE is a widely used scientific technique to increase scientific understanding, as well as to create industrially relevant organisms. The manner in which ALE experiments are conducted is highly manual and uniform, with little optimization for efficiency. Such inefficiencies result in suboptimal experiments that can take multiple months to complete. With the availability of automation and computer simulations, we can now perform these experiments in an optimized
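
    A toy simulation of the effect being studied (not ALEsim itself) is sketched below: a rare beneficial mutant either survives or is lost at each serial-passage bottleneck, so the passage size controls how efficiently beneficial mutations are retained. All rates, growth factors, and population sizes are illustrative assumptions.

```python
import numpy as np

# Sketch (not ALEsim): simulate how passage size affects whether a rare
# beneficial mutant survives serial-passage bottlenecks. Growth rates,
# starting frequency, and population sizes are illustrative assumptions.

rng = np.random.default_rng(1)

def passage_experiment(passage_size, final_size=1e8, fitness_gain=0.10,
                       mutant_start=1e-6, n_passages=30):
    freq = mutant_start
    for _ in range(n_passages):
        # bottleneck: sample how many mutants make it into the transfer
        mutants = rng.binomial(int(passage_size), freq)
        if mutants == 0:
            return False                      # beneficial lineage lost
        freq = mutants / passage_size
        # regrowth to final_size: mutant grows (1 + fitness_gain)x faster
        generations = np.log2(final_size / passage_size)
        grown = freq * (1 + fitness_gain) ** generations
        freq = grown / (grown + (1 - freq))
    return freq > 0.5                         # did the mutation (nearly) fix?

for size in (1e4, 1e5, 1e6):
    fixed = sum(passage_experiment(size) for _ in range(200)) / 200
    print(f"passage size {size:.0e}: fixation in {fixed:.0%} of runs")
```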

  9. An automated approach to magnetic divertor configuration design

    NASA Astrophysics Data System (ADS)

    Blommaert, M.; Dekeyser, W.; Baelmans, M.; Gauger, N. R.; Reiter, D.

    2015-01-01

    Automated methods based on optimization can greatly assist computational engineering design in many areas. In this paper an optimization approach to the magnetic design of a nuclear fusion reactor divertor is proposed and applied to a tokamak edge magnetic configuration in a first feasibility study. The approach is based on reduced models for magnetic field and plasma edge, which are integrated with a grid generator into one sensitivity code. The design objective chosen here for demonstrative purposes is to spread the divertor target heat load as much as possible over the entire target area. Constraints on the separatrix position are introduced to eliminate physically irrelevant magnetic field configurations during the optimization cycle. A gradient projection method is used to ensure stable cost function evaluations during optimization. The concept is applied to a configuration with typical Joint European Torus (JET) parameters and it automatically provides plausible configurations with reduced heat load.
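
    The gradient projection step can be sketched generically as follows; the objective and box constraints below are simple stand-ins for the divertor heat-load objective and the separatrix-position constraints.

```python
import numpy as np

# Generic sketch of the gradient-projection idea used in the optimization
# cycle: take a gradient step on the design objective, then project the
# design variables back onto the feasible set. Here the objective and the
# box constraints are illustrative stand-ins for the divertor problem.

def objective(x):                      # stand-in cost: spread "heat load" evenly
    return np.sum((x - np.mean(x)) ** 2) + 0.1 * np.sum(x ** 2)

def gradient(x, eps=1e-6):             # finite-difference gradient
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x); d[i] = eps
        g[i] = (objective(x + d) - objective(x - d)) / (2 * eps)
    return g

def project(x, lower=-1.0, upper=1.0): # projection onto box constraints
    return np.clip(x, lower, upper)

x = np.array([0.9, -0.4, 0.7, 0.2])    # initial design variables (e.g. coil currents)
for _ in range(200):
    x = project(x - 0.1 * gradient(x))
print("optimized design variables:", np.round(x, 3))
```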

  10. Design, Development, and Commissioning of a Substation Automation Laboratory to Enhance Learning

    ERIC Educational Resources Information Center

    Thomas, M. S.; Kothari, D. P.; Prakash, A.

    2011-01-01

    Automation of power systems is gaining momentum across the world, and there is a need to expose graduate and undergraduate students to the latest developments in hardware, software, and related protocols for power automation. This paper presents the design, development, and commissioning of an automation lab to facilitate the understanding of…

  11. Automated detection scheme of architectural distortion in mammograms using adaptive Gabor filter

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Ruriha; Teramoto, Atsushi; Matsubara, Tomoko; Fujita, Hiroshi

    2013-03-01

    Breast cancer is a serious health concern for all women. Computer-aided detection for mammography has been used for detecting mass and micro-calcification. However, the automated detection of architectural distortion remains challenging with respect to sensitivity. In this study, we propose a novel automated method for detecting architectural distortion. Our method consists of the analysis of the mammary gland structure, detection of the distorted region, and reduction of false positive results. We developed the adaptive Gabor filter for analyzing the mammary gland structure that decides filter parameters depending on the thickness of the gland structure. As for post-processing, healthy mammary glands that run from the nipple to the chest wall are eliminated by angle analysis. Moreover, background mammary glands are removed based on the intensity output image obtained from the adaptive Gabor filter. The distorted region of the mammary gland is then detected as an initial candidate using a concentration index followed by binarization and labeling. False positives in the initial candidate are eliminated using 23 types of characteristic features and a support vector machine. In the experiments, we compared the automated detection results with interpretations by a radiologist using 50 cases (200 images) from the Digital Database of Screening Mammography (DDSM). As a result, the true positive rate was 82.72%, and the number of false positives per image was 1.39. These results indicate that the proposed method may be useful for detecting architectural distortion in mammograms.
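
    The core of the adaptive Gabor filtering step can be sketched as follows, with the kernel parameters tied to an estimated gland thickness; the parameter choices and synthetic image are illustrative, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import convolve

# Sketch of the "adaptive Gabor" idea: the filter wavelength and envelope are
# tied to an estimated mammary-gland thickness, and the maximum response over
# several orientations highlights line-like structures. Parameter choices and
# the synthetic image below are illustrative, not the paper's settings.

def gabor_kernel(wavelength, theta, sigma, size=21):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

def adaptive_gabor_response(image, estimated_thickness):
    wavelength = 2.0 * estimated_thickness        # adapt to structure width
    sigma = 0.8 * estimated_thickness
    responses = [convolve(image, gabor_kernel(wavelength, theta, sigma))
                 for theta in np.linspace(0, np.pi, 8, endpoint=False)]
    return np.max(responses, axis=0)              # strongest orientation wins

# toy image: a vertical line of "gland" 4 pixels thick on a noisy background
image = np.random.default_rng(0).normal(0, 0.1, (64, 64))
image[:, 30:34] += 1.0
response = adaptive_gabor_response(image, estimated_thickness=4.0)
print("peak response column:", int(np.argmax(response.mean(axis=0))))
```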

  12. Numerical design of an adaptive aileron

    NASA Astrophysics Data System (ADS)

    Amendola, Gianluca; Dimino, Ignazio; Concilio, Antonio; Magnifico, Marco; Pecora, Rosario

    2016-04-01

    The study herein described is aimed at investigating the feasibility of an innovative full-scale camber morphing aileron device. In the framework of the "Adaptive Aileron" project, an international cooperation between Italy and Canada, this goal was carried out with the integration of different morphing concepts in a wing-tip prototype. As widely demonstrated in recent European projects such as Clean Sky JTI and SARISTU, wing trailing edge morphing may lead to significant drag reduction (up to 6%) in off-design flight points by adapting chord-wise camber variations in cruise to compensate for A/C weight reduction as fuel is consumed. That research focused on the flap region as the most immediate solution to implement structural adaptations. However, there is also a growing interest in extending morphing functionalities to the aileron region while preserving its main functionality in controlling aircraft directional stability. In fact, the external region of the wing seems to be the most effective in producing "lift over drag" improvements by morphing. Thus, the objective of the presented research is to achieve a certain drag reduction in off-design flight points by adapting wing shape and lift distribution following static deflections. In perspective, the developed device could also be used as a load alleviation system to reduce gust effects, augmenting its frequency bandwidth. In this paper, the preliminary design of the adaptive aileron is first presented, assessed on the basis of the external aerodynamic loads. The primary structure is made of 5 segmented ribs, distributed along 4 bays, each split into three consecutive parts, connected with spanwise stringers. The aileron shape modification is then implemented by means of an actuation system, based on a classical quick-return mechanism, suitably adapted to the presented application. Finite element analyses were assessed for properly sizing the load-bearing structure and actuation systems and for

  13. The automated strength-aeroelastic design of aerospace structures program

    NASA Technical Reports Server (NTRS)

    Johnson, E. H.; Venkayya, V. B.

    1984-01-01

    An ongoing program whose goal is to develop an automated procedure that can assist in the preliminary design of aircraft and space structures is described. The approach and capabilities that are to be included in the final procedures are discussed. By using proven engineering software as a basis for the project, a reliable and interdisciplinary procedure is developed. The use of a control language for module sequencing and execution permits efficient development of the procedure and gives the user significant flexibility in altering or enhancing the procedure. The data base system provides reliable and efficient access to the large amounts of interrelated data required in an enterprise of this sort. In addition, the data base allows interfacing with existing pre- and post-processors in an almost trivial manner. Altogether, the procedure promises to be of considerable utility to preliminary structural design teams.

  14. Modular implementation of a digital hardware design automation system

    NASA Astrophysics Data System (ADS)

    Masud, M.

    An automation system based on AHPL (A Hardware Programming Language) was developed. The project may be divided into three distinct phases: (1) Upgrading of AHPL to make it more universally applicable; (2) Implementation of a compiler for the language; and (3) Illustration of how the compiler may be used to support several phases of design activities. Several new features were added to AHPL. These include: application-dependent parameters, multiple clocks, asynchronous results, functional registers and primitive functions. The new language, called Universal AHPL, has been defined rigorously. The compiler design is modular. The parsing is done by an automatic parser generated from the SLR(1) BNF grammar of the language. The compiler produces two data bases from the AHPL description of a circuit. The first one is a tabular representation of the circuit, and the second one is a detailed interconnection linked list. The two data bases provide a means to interface the compiler to application-dependent CAD systems.

  15. Design automation for integrated nonlinear logic circuits (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Van Vaerenbergh, Thomas; Pelc, Jason; Santori, Charles; Bose, Ranojoy; Kielpinski, Dave; Beausoleil, Raymond G.

    2016-05-01

    A key enabler of the IT revolution of the late 20th century was the development of electronic design automation (EDA) tools allowing engineers to manage the complexity of electronic circuits with transistor counts now reaching into the billions. Recently, we have been developing large-scale nonlinear photonic integrated logic circuits for next generation all-optical information processing. At this time a sufficiently powerful EDA-style software tool chain to design this type of complex circuit does not yet exist. Here we describe a hierarchical approach to automating the design and validation of photonic integrated circuits, which can scale to several orders of magnitude higher complexity than the state of the art. Most photonic integrated circuits developed today consist of a small number of components, and only limited hierarchy. For example, a simple photonic transceiver may contain on the order of 10 building-block components, consisting of grating couplers for photonic I/O, modulators, and signal splitters/combiners. Because this is relatively easy to lay out by hand (or by a simple script), existing photonic design tools have relatively little automation in comparison to electronics tools. But demonstrating all-optical logic will require significantly more complex photonic circuits containing up to 1,000 components, hence becoming infeasible to design manually. Our design framework is based on Python-based software from Luceda Photonics which provides an environment to describe components, simulate their behavior, and export design files (GDS) to foundries for fabrication. At a fundamental level, a photonic component is described as a parametric cell (PCell) similarly to electronics design. PCells are described by geometric characteristics of their layout. A critical part of the design framework is the implementation of PCells as Python objects. PCell objects can then use inheritance to simplify design, and hierarchical designs can be made by creating composite
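
    The flavor of the PCell approach can be sketched as follows; this is an illustrative stand-in, not the Luceda/IPKISS API, showing a parametric cell as a Python object described by layout parameters, with hierarchy built by composing cells.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only (not the Luceda/IPKISS API): a photonic component
# as a parametric cell (PCell) described by geometric parameters, with
# hierarchy built by composing child PCells into larger circuits.

@dataclass
class PCell:
    name: str
    def elements(self):                 # geometric primitives this cell emits
        return []
    def flatten(self):                  # hierarchical -> flat element list
        return self.elements()

@dataclass
class Waveguide(PCell):
    length_um: float = 10.0
    width_um: float = 0.5
    def elements(self):
        return [("rect", self.length_um, self.width_um)]

@dataclass
class RingResonator(PCell):
    radius_um: float = 5.0
    gap_um: float = 0.2
    def elements(self):
        return [("circle", self.radius_um), ("gap", self.gap_um)]

@dataclass
class Circuit(PCell):
    children: List[PCell] = field(default_factory=list)
    def flatten(self):
        flat = []
        for child in self.children:
            flat.extend(child.flatten())
        return flat

chip = Circuit("logic_stage", children=[
    Waveguide("bus", length_um=120.0),
    RingResonator("nonlinear_element", radius_um=6.0),
    Waveguide("drop_port", length_um=40.0),
])
print(len(chip.flatten()), "primitive elements in the flattened layout")
```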

  16. Automated Microscopy: Macro Language Controlling a Confocal Microscope and its External Illumination: Adaptation for Photosynthetic Organisms.

    PubMed

    Steinbach, Gábor; Kaňa, Radek

    2016-04-01

    Photosynthesis research employs several biophysical methods, including the detection of fluorescence. Even though fluorescence is a key method to detect photosynthetic efficiency, it has not been applied/adapted to single-cell confocal microscopy measurements to examine photosynthetic microorganisms. Experiments with photosynthetic cells may require automation to perform a large number of measurements with different parameters, especially concerning light conditions. However, commercial microscopes support custom protocols only through vendor tools (Time Controller offered by Olympus or Experiment Designer offered by Zeiss), which are often unable to provide special set-ups and connections to external devices (e.g., for irradiation). Our new system combining an Arduino microcontroller with the Cell⊕Finder software was developed for controlling Olympus FV1000 and FV1200 confocal microscopes and the attached hardware modules. Our software/hardware solution offers (1) a text file-based macro language to control the imaging functions of the microscope; (2) programmable control of several external hardware devices (light sources, thermal controllers, actuators) during imaging via the Arduino microcontroller; (3) the Cell⊕Finder software with an ergonomic user environment, a fast selection method for the biologically important cells and a precise positioning feature that reduces unwanted bleaching of the cells by the scanning laser. Cell⊕Finder can be downloaded from http://www.alga.cz/cellfinder. The system was applied to study changes in fluorescence intensity in Synechocystis sp. PCC6803 cells under long-term illumination. Thus, we were able to describe the kinetics of phycobilisome decoupling. Microscopy data showed that phycobilisome decoupling appears slowly after long-term (>1 h) exposure to high light.

  17. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    NASA Technical Reports Server (NTRS)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  18. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.

  19. The design of an automated electrolytic enrichment apparatus for tritium

    SciTech Connect

    Myers, J.L.

    1994-12-01

    The Radiation Analytical Sciences (RAS) Section at Lawrence Livermore National Laboratory performs analysis of low-level tritium concentrations in various natural water samples from the Tri-Valley Area, DOE Nevada Test Site, Site 300 in Tracy, CA, and various other places around the world. Low levels of tritium, a radioactive isotope of hydrogen, are pre-concentrated in the RAS laboratory using an electrolytic enrichment apparatus. Later these enriched waters are analyzed by liquid scintillation counting to determine the activity of tritium. The enrichment procedure and the subsequent purification process by vacuum distillation are currently undertaken manually, hence being highly labor-intensive. The whole process typically takes about 2 to 3 weeks to complete a batch of 30 samples, with dedicated personnel operating the process. The goal is to automate the entire process, specifically having the operation controlled by PC-based LabVIEW(TM) with real-time monitoring capability. My involvement was in the design and fabrication of a prototypical automated electrolytic enrichment cell. Work will be done on optimizing the electrolytic process by assessing the different parameters of the enrichment procedure. Hardware and software development have also been an integral component of this project.

  20. Automated Design of Restraint Layer of an Inflatable Vessel

    NASA Technical Reports Server (NTRS)

    Spexarth, Gary

    2007-01-01

    A Mathcad computer program largely automates the design and analysis of the restraint layer (the primary load-bearing layer) of an inflatable vessel that consists of one or more sections having cylindrical, toroidal, and/or spherical shape(s). A restraint layer typically comprises webbing in the form of multiple straps. The design task includes choosing indexing locations along the straps, computing the load at every location in each strap, computing the resulting stretch at each location, and computing the amount of undersizing required of each strap so that, once the vessel is inflated and the straps thus stretched, the vessel can be expected to assume the desired shape. Prior to the development of this program, the design task was performed by use of a difficult-to-use spreadsheet program that required manual addition of rows and columns depending on the numbers of strap rows and columns of a given design. In contrast, this program is completely parametric and includes logic that automatically adds or deletes rows and columns as needed. With minimal input from the user, this program automatically computes indexing locations, strap lengths, undersizing requirements, and all design data required to produce detailed drawings and assembly procedures. It also generates textual comments that help the user understand the calculations.
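
    The undersizing calculation that the program automates can be illustrated with a small sketch; the strap stiffnesses and loads below are made-up values, not flight data.

```python
# Illustrative calculation of the undersizing step described above (numbers
# and stiffnesses are made up): each strap is fabricated shorter than its
# inflated target length so that, once loaded, its elastic stretch brings it
# to the desired geometry.

def undersized_length(target_length_m, load_N, stiffness_N):
    """stiffness_N is the load that would produce 100% strain (the EA product)."""
    strain = load_N / stiffness_N               # elastic strain under load
    return target_length_m / (1.0 + strain)     # cut length before inflation

straps = [
    {"id": "hoop-01", "target_m": 3.20, "load_N": 18000.0, "EA_N": 9.0e5},
    {"id": "hoop-02", "target_m": 3.20, "load_N": 21000.0, "EA_N": 9.0e5},
    {"id": "long-01", "target_m": 5.75, "load_N": 12000.0, "EA_N": 9.0e5},
]
for s in straps:
    cut = undersized_length(s["target_m"], s["load_N"], s["EA_N"])
    print(f'{s["id"]}: cut to {cut:.3f} m ({(s["target_m"] - cut) * 1000:.1f} mm undersize)')
```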

  1. Expert system approach to design an automated guided vehicle

    NASA Astrophysics Data System (ADS)

    Kumaraguru, Karthikeyan; Hall, Ernest L.

    1998-10-01

    The purpose of this paper is to describe an expert system to design the base of an automated guided vehicle. The components of the expert system include: (1) A user-friendly graphic user interface, where the user can enter specifications--like the environment used, application of the robot, etc.; (2) An engine that converts the managerial requirements into technical parameters and designs the robot--initially assuming some parameters and confirming its assumptions during the course of the design; when unable to do so, it iterates with different assumptions until they are met; the code also selects various materials to be used from a corresponding database; (3) A database of various materials from their manufacturers/suppliers; (4) The output data is interfaced with a CAD engine, which generates a 3D solid model of the vehicle; and (5) A `Bill of Materials' file is generated as the output and suggestions for how to assemble them are given. The method has been tested by designing a small mobile robot. The software provides an excellent tool to develop a mobile robot based on performance specifications. Modeling helps the user understand the constraints on the design of the robot and the bill of materials--along with the vendor address, helps the user buy the components needed to assemble the robot.

  2. Automated optimum design of wing structures. Deterministic and probabilistic approaches

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1982-01-01

    The automated optimum design of airplane wing structures subjected to multiple behavior constraints is described. The structural mass of the wing is considered the objective function. The maximum stress, wing tip deflection, root angle of attack, and flutter velocity during the pull up maneuver (static load), the natural frequencies of the wing structure, and the stresses induced in the wing structure due to landing and gust loads are suitably constrained. Both deterministic and probabilistic approaches are used for finding the stresses induced in the airplane wing structure due to landing and gust loads. A wing design is represented by a uniform beam with a cross section in the form of a hollow symmetric double wedge. The airfoil thickness and chord length are the design variables, and a graphical procedure is used to find the optimum solutions. A supersonic wing design is represented by finite elements. The thicknesses of the skin and the web and the cross sectional areas of the flanges are the design variables, and nonlinear programming techniques are used to find the optimum solution.
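
    A toy version of the deterministic formulation is sketched below: minimize a mass objective over thickness-like design variables subject to stress and deflection constraints using nonlinear programming. The algebraic model and numbers are stand-ins, not the report's beam or finite-element models.

```python
import numpy as np
from scipy.optimize import minimize

# Toy sketch of the deterministic formulation: minimize wing structural mass
# over skin thickness and chord subject to stress and tip-deflection limits,
# using nonlinear programming (SLSQP). The algebraic model below is a
# stand-in, not the report's beam or finite-element model.

LOAD = 5.0e4          # applied load, N (illustrative)
SPAN = 8.0            # semi-span, m
ALLOW_STRESS = 2.5e8  # Pa
MAX_TIP_DEFL = 0.25   # m
E_MOD = 7.0e10        # Pa
DENSITY = 2700.0      # kg/m^3

def mass(x):                       # x = [thickness (m), chord (m)]
    t, c = x
    return DENSITY * SPAN * t * c

def stress(x):
    t, c = x
    section_modulus = c * t ** 2 / 6.0
    return LOAD * SPAN / section_modulus

def tip_deflection(x):
    t, c = x
    inertia = c * t ** 3 / 12.0
    return LOAD * SPAN ** 3 / (3.0 * E_MOD * inertia)

constraints = [
    {"type": "ineq", "fun": lambda x: ALLOW_STRESS - stress(x)},
    {"type": "ineq", "fun": lambda x: MAX_TIP_DEFL - tip_deflection(x)},
]
result = minimize(mass, x0=[0.05, 1.0], method="SLSQP",
                  bounds=[(0.005, 0.3), (0.3, 3.0)], constraints=constraints)
print("thickness, chord:", np.round(result.x, 4), "mass:", round(result.fun, 1), "kg")
```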

  3. Adaptive strategies for materials design using uncertainties

    SciTech Connect

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-21

    Here, we compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.

  4. Adaptive strategies for materials design using uncertainties

    DOE PAGES

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; ...

    2016-01-21

    Here, we compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.

  5. Adaptive Strategies for Materials Design using Uncertainties

    PubMed Central

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-01

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties. PMID:26792532
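
    A minimal sketch of the iterative design loop described in the records above, run on synthetic data rather than the M2AX dataset: a Gaussian-process regressor (standing in for the paper's regressors) predicts the property with an uncertainty, and an expected-improvement rule (one possible uncertainty-aware selector) chooses the next candidate to evaluate. The feature set, kernel, and target function are assumptions.

      # Sketch of a regressor + uncertainty-aware selector loop on synthetic data.
      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(0)
      X_pool = rng.uniform(0.5, 2.0, size=(223, 3))   # stand-in for orbital radii
      def true_property(x):                           # hidden ground truth (toy)
          return 200 * np.exp(-np.sum((x - 1.2) ** 2, axis=1))

      measured = list(rng.choice(len(X_pool), size=10, replace=False))
      for iteration in range(15):
          X_train = X_pool[measured]
          y_train = true_property(X_train)
          gp = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)
          candidates = [i for i in range(len(X_pool)) if i not in measured]
          mu, sigma = gp.predict(X_pool[candidates], return_std=True)
          # Selector: expected improvement over the best property measured so far.
          best = y_train.max()
          z = (mu - best) / np.maximum(sigma, 1e-9)
          ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
          measured.append(candidates[int(np.argmax(ei))])   # next material to compute

      print("best property found:", round(true_property(X_pool[measured]).max(), 2))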

  6. Adaptive Mesh Refinement for Microelectronic Device Design

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Lou, John; Norton, Charles

    1999-01-01

    Finite element and finite volume methods are used in a variety of design simulations when it is necessary to compute fields throughout regions that contain varying materials or geometry. Convergence of the simulation can be assessed by uniformly increasing the mesh density until an observable quantity stabilizes. Depending on the electrical size of the problem, uniform refinement of the mesh may be computationally infeasible due to memory limitations. Similarly, depending on the geometric complexity of the object being modeled, uniform refinement can be inefficient since regions that do not need refinement add to the computational expense. In either case, convergence to the correct (measured) solution is not guaranteed. Adaptive mesh refinement methods attempt to selectively refine the region of the mesh that is estimated to contain proportionally higher solution errors. The refinement may be obtained by decreasing the element size (h-refinement), by increasing the order of the element (p-refinement) or by a combination of the two (h-p refinement). A successful adaptive strategy refines the mesh to produce an accurate solution measured against the correct fields without undue computational expense. This is accomplished by the use of a) reliable a posteriori error estimates, b) hierarchical elements, and c) automatic adaptive mesh generation. Adaptive methods are also useful when problems with multi-scale field variations are encountered. These occur in active electronic devices that have thin doped layers and also when mixed physics is used in the calculation. The mesh needs to be fine at and near the thin layer to capture rapid field or charge variations, but can coarsen away from these layers where field variations smooth out and charge densities are uniform. This poster will present an adaptive mesh refinement package that runs on parallel computers and is applied to specific microelectronic device simulations. Passive sensors that operate in the infrared portion of
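
    A toy one-dimensional illustration of the h-refinement idea described above: cells whose local error indicator is large are split, while the rest of the mesh stays coarse. The sample field, the error indicator, and the refinement threshold are assumptions, not the package's actual estimator.

      # Toy h-refinement: split only the cells whose error indicator exceeds a
      # fraction of the maximum, instead of refining the mesh uniformly.
      import numpy as np

      def field(x):                       # stand-in for a rapidly varying solution
          return np.tanh(50 * (x - 0.3))

      def refine(nodes, frac=0.3, passes=4):
          for _ in range(passes):
              mid = 0.5 * (nodes[:-1] + nodes[1:])
              # Error indicator: midpoint value vs. average of the nodal values.
              err = np.abs(field(mid) - 0.5 * (field(nodes[:-1]) + field(nodes[1:])))
              split = err > frac * err.max()
              nodes = np.sort(np.concatenate([nodes, mid[split]]))
          return nodes

      nodes = refine(np.linspace(0.0, 1.0, 11))
      print(len(nodes), "nodes; smallest cell:", np.diff(nodes).min())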

  7. Automated design of multiphase space missions using hybrid optimal control

    NASA Astrophysics Data System (ADS)

    Chilan, Christian Miguel

    A modern space mission is assembled from multiple phases or events such as impulsive maneuvers, coast arcs, thrust arcs and planetary flybys. Traditionally, a mission planner would resort to intuition and experience to develop a sequence of events for the multiphase mission and to find the space trajectory that minimizes propellant use by solving the associated continuous optimal control problem. This strategy, however, will most likely yield a sub-optimal solution, as the problem is sophisticated for several reasons. For example, the number of events in the optimal mission structure is not known a priori and the system equations of motion change depending on what event is current. In this work a framework for the automated design of multiphase space missions is presented using hybrid optimal control (HOC). The method developed uses two nested loops: an outer-loop that handles the discrete dynamics and finds the optimal mission structure in terms of the categorical variables, and an inner-loop that performs the optimization of the corresponding continuous-time dynamical system and obtains the required control history. Genetic algorithms (GA) and direct transcription with nonlinear programming (NLP) are introduced as methods of solution for the outer-loop and inner-loop problems, respectively. Automation of the inner-loop, continuous optimal control problem solver, required two new technologies. The first is a method for the automated construction of the NLP problems resulting from the use of a direct solver for systems with different structures, including different numbers of categorical events. The method assembles modules, consisting of parameters and constraints appropriate to each event, sequentially according to the given mission structure. The other new technology is for a robust initial guess generator required by the inner-loop NLP problem solver. Two new methods were developed for cases including low-thrust trajectories. The first method, based on GA
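
    The nested-loop structure can be sketched as follows; exhaustive enumeration of short event sequences stands in for the genetic-algorithm outer loop, and a small NLP stands in for the direct-transcription inner loop. The event set, cost model, and bounds are toy assumptions.

      # Schematic of the nested loops: outer search over discrete mission structures,
      # inner continuous optimization (NLP) for each candidate structure.
      import itertools
      import numpy as np
      from scipy.optimize import minimize

      EVENTS = ["impulse", "coast", "thrust_arc"]

      def inner_cost(durations, structure):
          # Toy propellant model: impulses and thrust arcs consume propellant,
          # coasts are free but stretch the total time toward a target value.
          burn = sum(d for d, e in zip(durations, structure) if e != "coast")
          total = sum(durations)
          return burn + 0.1 * (total - 10.0) ** 2

      def solve_inner(structure):
          n = len(structure)
          res = minimize(inner_cost, x0=np.full(n, 2.0), args=(structure,),
                         bounds=[(0.1, 10.0)] * n, method="SLSQP")
          return res.fun, res.x

      best = min(
          ((solve_inner(s)[0], s) for n in (2, 3)
           for s in itertools.product(EVENTS, repeat=n)),
          key=lambda t: t[0],
      )
      print("best structure:", best[1], "cost:", round(best[0], 3))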

  8. A Tutorial on Adaptive Design Optimization

    PubMed Central

    Myung, Jay I.; Cavagnaro, Daniel R.; Pitt, Mark A.

    2013-01-01

    Experimentation is ubiquitous in the field of psychology and fundamental to the advancement of its science, and one of the biggest challenges for researchers is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. The recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial paper introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists to conduct “smart” experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond. PMID:23997275
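
    A toy sketch of the core idea, under assumed models: among candidate designs (here, stimulus intensities), choose the one whose outcome is expected to best discriminate two competing response models, measured by expected information gain. The two models, the prior, and the design grid are illustrative, not from the tutorial.

      # Pick the design that maximizes the expected information gain about which
      # of two competing response models is correct (toy models and prior).
      import numpy as np

      designs = np.linspace(0.0, 5.0, 51)               # candidate stimulus intensities
      def model_a(d): return 1.0 / (1.0 + np.exp(-(d - 2.0)))   # logistic model
      def model_b(d): return 1.0 - np.exp(-0.5 * d)             # exponential model

      prior = np.array([0.5, 0.5])                      # prior over the two models

      def expected_info_gain(d):
          p = np.array([model_a(d), model_b(d)])        # P(correct response | model)
          gain = 0.0
          for py_m in (p, 1.0 - p):                     # marginalize over the outcome
              p_y = np.dot(prior, py_m)
              if p_y <= 0:
                  continue
              post = prior * py_m / p_y
              kl = np.sum(post * np.log(np.maximum(post, 1e-12) / prior))
              gain += p_y * kl
          return gain

      best = designs[np.argmax([expected_info_gain(d) for d in designs])]
      print("most informative design:", round(best, 2))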

  9. On the engineering design for systematic integration of agent-orientation in industrial automation.

    PubMed

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems.

  10. Effects of extended lay-off periods on performance and operator trust under adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-03-01

    Little is known about the long-term effects of system reliability when operators do not use a system during an extended lay-off period. To examine threats to skill maintenance, 28 participants operated a simulation of a complex process control system twice, for 2.5 h each time, with an 8-month retention interval between sessions. Operators were provided with an adaptable support system, which operated at one of the following reliability levels: 60%, 80% or 100%. Results showed that performance, workload, and trust remained stable at the second testing session, but operators lost self-confidence in their system management abilities. Finally, the effects of system reliability observed at the first testing session were largely found again at the second session. The findings overall suggest that adaptable automation may be a promising means to support operators in maintaining their performance at the second testing session.

  11. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

    Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of error states of the filter and tuning of filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
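
    A schematic of the two optimization loops on a deliberately simple one-dimensional tracking problem: the outer loop chooses whether to include a sensor-bias error state, the inner loop sweeps a process-noise parameter, and each candidate is scored by simulation. The truth model, candidate state sets, and scoring are assumptions, not ENFAD's models.

      # Outer loop: error-state selection; inner loop: parameter tuning; each
      # candidate scored by a Monte Carlo style run of a toy 1D Kalman filter.
      import numpy as np

      rng = np.random.default_rng(1)

      def run_filter(include_bias, q, n_steps=200):
          """1D constant-velocity Kalman filter; optionally estimate a sensor bias."""
          n = 3 if include_bias else 2
          F = np.eye(n); F[0, 1] = 1.0                 # position-velocity dynamics
          H = np.zeros((1, n)); H[0, 0] = 1.0
          if include_bias:
              H[0, 2] = 1.0                            # measurement sees position + bias
          Q = q * np.eye(n); R = np.array([[1.0]])
          x, P = np.zeros(n), np.eye(n)
          truth, bias, errs = np.zeros(2), 0.5, []
          for _ in range(n_steps):
              truth = np.array([truth[0] + truth[1],
                                truth[1] + 0.05 * rng.standard_normal()])
              z = truth[0] + bias + rng.standard_normal()
              x = F @ x; P = F @ P @ F.T + Q           # predict
              S = H @ P @ H.T + R                      # update
              K = P @ H.T @ np.linalg.inv(S)
              x = x + (K @ (z - H @ x)).ravel()
              P = (np.eye(n) - K @ H) @ P
              errs.append((x[0] - truth[0]) ** 2)
          return np.sqrt(np.mean(errs))

      best = min(((run_filter(states, q), states, q)
                  for states in (False, True)          # outer loop: error-state selection
                  for q in (1e-4, 1e-3, 1e-2, 1e-1)),  # inner loop: parameter tuning
                 key=lambda t: t[0])
      print("best RMS %.3f with bias state=%s, q=%g" % best)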

  12. Fast automated protein NMR data collection and assignment by ADAPT-NMR on Bruker spectrometers

    NASA Astrophysics Data System (ADS)

    Lee, Woonghee; Hu, Kaifeng; Tonelli, Marco; Bahrami, Arash; Neuhardt, Elizabeth; Glass, Karen C.; Markley, John L.

    2013-11-01

    ADAPT-NMR (Assignment-directed Data collection Algorithm utilizing a Probabilistic Toolkit in NMR) supports automated NMR data collection and backbone and side chain assignment for [U-13C, U-15N]-labeled proteins. Given the sequence of the protein and data for the orthogonal 2D 1H-15N and 1H-13C planes, the algorithm automatically directs the collection of tilted plane data from a variety of triple-resonance experiments so as to follow an efficient pathway toward the probabilistic assignment of 1H, 13C, and 15N signals to specific atoms in the covalent structure of the protein. Data collection and assignment calculations continue until the addition of new data no longer improves the assignment score. ADAPT-NMR was first implemented on Varian (Agilent) spectrometers [A. Bahrami, M. Tonelli, S.C. Sahu, K.K. Singarapu, H.R. Eghbalnia, J.L. Markley, PLoS One 7 (2012) e33173]. Because of broader interest in the approach, we present here a version of ADAPT-NMR for Bruker spectrometers. We have developed two AU console programs (ADAPT_ORTHO_run and ADAPT_NMR_run) that run under TOPSPIN Versions 3.0 and higher. To illustrate the performance of the algorithm on a Bruker spectrometer, we tested one protein, chlorella ubiquitin (76 amino acid residues), that had been used with the Varian version: the Bruker and Varian versions achieved the same level of assignment completeness (98% in 20 h). As a more rigorous evaluation of the Bruker version, we tested a larger protein, BRPF1 bromodomain (114 amino acid residues), which yielded an automated assignment completeness of 86% in 55 h. Both experiments were carried out on a 500 MHz Bruker AVANCE III spectrometer equipped with a z-gradient 5 mm TCI probe. ADAPT-NMR is available at http://pine.nmrfam.wisc.edu/ADAPT-NMR in the form of pulse programs, the two AU programs, and instructions for installation and use.

  13. Fast automated protein NMR data collection and assignment by ADAPT-NMR on Bruker spectrometers.

    PubMed

    Lee, Woonghee; Hu, Kaifeng; Tonelli, Marco; Bahrami, Arash; Neuhardt, Elizabeth; Glass, Karen C; Markley, John L

    2013-11-01

    ADAPT-NMR (Assignment-directed Data collection Algorithm utilizing a Probabilistic Toolkit in NMR) supports automated NMR data collection and backbone and side chain assignment for [U-(13)C, U-(15)N]-labeled proteins. Given the sequence of the protein and data for the orthogonal 2D (1)H-(15)N and (1)H-(13)C planes, the algorithm automatically directs the collection of tilted plane data from a variety of triple-resonance experiments so as to follow an efficient pathway toward the probabilistic assignment of (1)H, (13)C, and (15)N signals to specific atoms in the covalent structure of the protein. Data collection and assignment calculations continue until the addition of new data no longer improves the assignment score. ADAPT-NMR was first implemented on Varian (Agilent) spectrometers [A. Bahrami, M. Tonelli, S.C. Sahu, K.K. Singarapu, H.R. Eghbalnia, J.L. Markley, PLoS One 7 (2012) e33173]. Because of broader interest in the approach, we present here a version of ADAPT-NMR for Bruker spectrometers. We have developed two AU console programs (ADAPT_ORTHO_run and ADAPT_NMR_run) that run under TOPSPIN Versions 3.0 and higher. To illustrate the performance of the algorithm on a Bruker spectrometer, we tested one protein, chlorella ubiquitin (76 amino acid residues), that had been used with the Varian version: the Bruker and Varian versions achieved the same level of assignment completeness (98% in 20 h). As a more rigorous evaluation of the Bruker version, we tested a larger protein, BRPF1 bromodomain (114 amino acid residues), which yielded an automated assignment completeness of 86% in 55 h. Both experiments were carried out on a 500 MHz Bruker AVANCE III spectrometer equipped with a z-gradient 5 mm TCI probe. ADAPT-NMR is available at http://pine.nmrfam.wisc.edu/ADAPT-NMR in the form of pulse programs, the two AU programs, and instructions for installation and use.

  14. Dual adaptive control: Design principles and applications

    NASA Technical Reports Server (NTRS)

    Mookerjee, Purusottam

    1988-01-01

    The design of an actively adaptive dual controller based on an approximation of the stochastic dynamic programming equation for a multi-step horizon is presented. A dual controller that can enhance identification of the system while controlling it at the same time is derived for multi-dimensional problems. This dual controller uses sensitivity functions of the expected future cost with respect to the parameter uncertainties. A passively adaptive cautious controller and the actively adaptive dual controller are examined. In many instances, the cautious controller is seen to turn off while the latter avoids the turn-off of the control and the slow convergence of the parameter estimates, characteristic of the cautious controller. The algorithms have been applied to a multi-variable static model which represents a simplified linear version of the relationship between the vibration output and the higher harmonic control input for a helicopter. Monte Carlo comparisons based on parametric and nonparametric statistical analysis indicate the superiority of the dual controller over the baseline controller.

  15. Adaptive optical antennas: design and evaluation

    NASA Astrophysics Data System (ADS)

    Weyrauch, Thomas; Vorontsov, Mikhail A.; Carhart, Gary W.; Simonova, Galina V.; Beresnev, Leonid A.; Polnau, Ernst E.

    2007-09-01

    We present the design and evaluation of compact adaptive optical antennas with aperture diameters of 16 mm and 100 mm for 5 Gbit/s-class free-space optical communication systems. The antennas provide a bi-directional optically transparent link between fiber-optical wavelength-division multiplex systems and allow for mitigation of atmospheric-turbulence-induced wavefront phase distortions with adaptive optics components. Beam steering is implemented in the antennas either with mirrors on novel tip/tilt platforms or a fiber-tip positioning system, both enabling operation bandwidths of more than 1 kHz. Bimorph piezoelectric actuated deformable mirrors are used for low-order phase-distortion compensation. An imaging system is integrated in the antennas for coarse pointing and tracking. Beam steering and wavefront control are based on blind maximization of the received signal level using a stochastic parallel gradient descent algorithm. The adaptive optics control architecture allowed the use of feedback signals provided locally within each transceiver system and remotely by the opposite transceiver system via an RF link. First atmospheric compensation results from communication experiments over a 250 m near-ground propagation path are presented.
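
    A minimal sketch of the stochastic parallel gradient descent step mentioned above: all control channels are perturbed simultaneously and updated from the change in a single received-signal metric. The metric function, channel count, and gain settings are assumptions.

      # Stochastic parallel gradient descent (SPGD) on a toy received-signal metric.
      import numpy as np

      rng = np.random.default_rng(2)
      N_CHANNELS = 12                      # e.g., deformable-mirror actuator commands
      optimum = rng.uniform(-1, 1, N_CHANNELS)

      def received_signal(u):
          """Stand-in for the fiber-coupled power metric reported by the receiver."""
          return np.exp(-np.sum((u - optimum) ** 2))

      u = np.zeros(N_CHANNELS)
      gain, amplitude = 0.8, 0.05
      for _ in range(500):
          delta = amplitude * rng.choice([-1.0, 1.0], size=N_CHANNELS)  # parallel perturbation
          dJ = received_signal(u + delta) - received_signal(u - delta)  # two-sided metric change
          u += gain * dJ * delta                                        # gradient-estimate update
      print("final metric:", round(received_signal(u), 4))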

  16. Designing effective human-automation-plant interfaces: a control-theoretic perspective.

    PubMed

    Jamieson, Greg A; Vicente, Kim J

    2005-01-01

    In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.

  17. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years have increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of the smart cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  18. Valuation of design adaptability in aerospace systems

    NASA Astrophysics Data System (ADS)

    Fernandez Martin, Ismael

    As more information is brought into early stages of the design, more pressure is put on engineers to produce a reliable, high quality, and financially sustainable product. Unfortunately, requirements established at the beginning of a new project by customers, and the environment that surrounds them, continue to change in some unpredictable ways. The risk of designing a system that may become obsolete during early stages of production is currently tackled by the use of robust design simulation, a method that allows designers to simultaneously explore a plethora of design alternatives and requirements with the intention of accounting for uncertain factors in the future. Whereas this design technique has proven to be quite an improvement in design methods, under certain conditions, it fails to account for the change of uncertainty over time and the intrinsic value embedded in the system when certain design features are activated. This thesis introduces the concepts of adaptability and real options to manage risk foreseen in the face of uncertainty at early design stages. The method described herein allows decision-makers to foresee the financial impact of their decisions at the design level, as well as the final exposure to risk. In this thesis, cash flow models, traditionally used to obtain the forecast of a project's value over the years, were replaced with surrogate models that are capable of showing fluctuations in value every few days. This allowed a better implementation of real options valuation, optimization, and strategy selection. Through the option analysis model, an optimization exercise allows the user to obtain the best implementation strategy in the face of uncertainty as well as the overall value of the design feature. Here implementation strategy refers to the decision to include a new design feature in the system, after the design has been finalized, but before the end of its production life. The ability to do this in a cost-efficient manner after the system

  19. Adaptive enrichment designs for clinical trials.

    PubMed

    Simon, Noah; Simon, Richard

    2013-09-01

    Modern medicine has graduated from broad spectrum treatments to targeted therapeutics. New drugs recognize the recently discovered heterogeneity of many diseases previously considered to be fairly homogeneous. These treatments attack specific genetic pathways which are only dysregulated in some smaller subset of patients with the disease. Often this subset is only rudimentarily understood until well into large-scale clinical trials. As such, standard practice has been to enroll a broad range of patients and run post hoc subset analysis to determine those who may particularly benefit. This unnecessarily exposes many patients to hazardous side effects, and may vastly decrease the efficiency of the trial (especially if only a small subset of patients benefit). In this manuscript, we propose a class of adaptive enrichment designs that allow the eligibility criteria of a trial to be adaptively updated during the trial, restricting entry to patients likely to benefit from the new treatment. We show that our designs both preserve the type 1 error, and in a variety of cases provide a substantial increase in power.
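
    A toy simulation of the enrichment mechanism only (it does not reproduce the authors' testing procedure or their type 1 error control): at an interim look, enrollment is restricted to the biomarker-positive subgroup if the biomarker-negative subgroup shows no apparent benefit. Effect sizes, sample sizes, and the interim rule are illustrative assumptions.

      # Illustrative adaptive-enrichment mechanism: broad stage 1, interim look,
      # restricted stage 2 enrollment. Not a statistically rigorous design.
      import numpy as np

      rng = np.random.default_rng(3)
      EFFECT = {"pos": 0.5, "neg": 0.0}           # assumed true effects by subgroup

      def enroll(n, subgroup, treated):
          mean = EFFECT[subgroup] if treated else 0.0
          return rng.normal(mean, 1.0, n)

      # Stage 1: enroll broadly, half treated, half control, both subgroups.
      stage1 = {g: (enroll(50, g, True), enroll(50, g, False)) for g in ("pos", "neg")}

      # Interim rule: keep the negative subgroup only if its observed effect is > 0.
      keep = ["pos"] + (["neg"] if stage1["neg"][0].mean() - stage1["neg"][1].mean() > 0 else [])

      # Stage 2: restrict enrollment to the retained subgroups.
      stage2 = {g: (enroll(100, g, True), enroll(100, g, False)) for g in keep}

      for g in keep:
          t = np.concatenate([stage1[g][0], stage2[g][0]])
          c = np.concatenate([stage1[g][1], stage2[g][1]])
          print(g, "estimated effect:", round(t.mean() - c.mean(), 3))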

  20. The automated design of materials far from equilibrium

    NASA Astrophysics Data System (ADS)

    Miskin, Marc Z.

    Automated design is emerging as a powerful concept in materials science. By combining computer algorithms, simulations, and experimental data, new techniques are being developed that start with high level functional requirements and identify the ideal materials that achieve them. This represents a radically different picture of how materials become functional in which technological demand drives material discovery, rather than the other way around. At the frontiers of this field, materials systems previously considered too complicated can start to be controlled and understood. Particularly promising are materials far from equilibrium. Material robustness, high strength, self-healing and memory are properties displayed by several materials systems that are intrinsically out of equilibrium. These and other properties could be revolutionary, provided they can first be controlled. This thesis conceptualizes and implements a framework for designing materials that are far from equilibrium. We show how, even in the absence of a complete physical theory, design from the top down is possible and lends itself to producing physical insight. As a prototype system, we work with granular materials: collections of athermal, macroscopic identical objects, since these materials function both as an essential component of industrial processes as well as a model system for many non-equilibrium states of matter. We show that by placing granular materials in the context of design, benefits emerge simultaneously for fundamental and applied interests. As first steps, we use our framework to design granular aggregates with extreme properties like high stiffness, and softness. We demonstrate control over nonlinear effects by producing exotic aggregates that stiffen under compression. Expanding on our framework, we conceptualize new ways of thinking about material design when automatic discovery is possible. We show how to build rules that link particle shapes to arbitrary granular packing

  1. Design of a miniaturized solid state laser for automated assembly

    NASA Astrophysics Data System (ADS)

    Funck, Max C.; Dolkemeyer, Jan; Morasch, Valentin; Loosen, Peter

    2010-05-01

    A miniaturized solid state laser for marking applications has been developed featuring novel assembly strategies to reduce size, cost and assembly effort. Design and setup have been laid out with future automation of the assembly in mind. Using a high-precision robot, the optical components composing the laser system are directly placed on a planar substrate, providing accurate positioning and alignment within a few microns. No adjustable mounts for mirrors and lenses are necessary, greatly simplifying the setup. Consisting of either an Nd:YAG or an Nd:YVO4 crystal pumped with a fiber-coupled diode laser, a Q-switch for pulse generation and a beam expander, the entire assembly is confined within a volume of 100 ml and delivers 4 W of continuous output power at 1.064 μm with an efficiency greater than 40%. Pulse lengths of 10-20 ns and repetition rates of up to 150 kHz have been obtained with an acousto-optic modulator. In addition, a custom-designed electro-optic modulator with integrated high voltage switch has been realized. A supply unit for the entire system, including scanner and water cooling, is integrated in a 19" industrial chassis and can be operated via a graphical user interface on a standard personal computer.

  2. Matters concerned with designing distributed systems for automated control of electrical equipment at power stations

    NASA Astrophysics Data System (ADS)

    Gorozhankin, P. A.; Krasnova, M. E.

    2011-10-01

    Matters concerned with developing the working designs of systems for automated control of electrical equipment are discussed. Basic technical requirements for computerized automation facilities are formulated from the viewpoint of ensuring the required scope of functions and fault tolerance, and proposals for the layout and placement of these facilities are suggested. A special section devoted to protection of automated process control systems from computer viruses is given.

  3. HIV-1 vaccines and adaptive trial designs.

    PubMed

    Corey, Lawrence; Nabel, Gary J; Dieffenbach, Carl; Gilbert, Peter; Haynes, Barton F; Johnston, Margaret; Kublin, James; Lane, H Clifford; Pantaleo, Giuseppe; Picker, Louis J; Fauci, Anthony S

    2011-04-20

    Developing a vaccine against the human immunodeficiency virus (HIV) poses an exceptional challenge. There are no documented cases of immune-mediated clearance of HIV from an infected individual, and no known correlates of immune protection. Although nonhuman primate models of lentivirus infection have provided valuable data about HIV pathogenesis, such models do not predict HIV vaccine efficacy in humans. The combined lack of a predictive animal model and undefined biomarkers of immune protection against HIV necessitate that vaccines to this pathogen be tested directly in clinical trials. Adaptive clinical trial designs can accelerate vaccine development by rapidly screening out poor vaccines while extending the evaluation of efficacious ones, improving the characterization of promising vaccine candidates and the identification of correlates of immune protection.

  4. Automated Design of Noise-Minimal, Safe Rotorcraft Trajectories

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Venable, K. Brent; Lindsay, James

    2012-01-01

    NASA and the international community are investing in the development of a commercial transportation infrastructure that includes the increased use of rotorcraft, specifically helicopters and aircraft such as 40-passenger civil tiltrotors. Rotorcraft have a number of advantages over fixed-wing aircraft, primarily in not requiring direct access to the primary fixed-wing runways. As such they can operate at an airport without directly interfering with major air carrier and commuter aircraft operations. However, there is significant concern over the impact of noise on the communities surrounding the transportation facilities. In this paper we propose to address the rotorcraft noise problem by exploiting powerful search techniques from artificial intelligence, coupled with simulation and field tests, to design trajectories that are expected to improve on the amount of ground noise generated. This paper investigates the use of simulation based on predictive physical models to facilitate the search for low-noise trajectories using a class of automated search algorithms called local search. A novel feature of this approach is the ability to incorporate constraints into the problem formulation that address passenger safety and comfort.
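
    An illustrative local-search loop over a discretized approach profile: one waypoint altitude is perturbed at a time, and a move is accepted only if it lowers a ground-noise cost while respecting a descent-rate constraint. The noise model and constraint values are assumptions, not a validated rotorcraft noise model.

      # Local search (hill climbing) over waypoint altitudes with a toy noise cost.
      import numpy as np

      rng = np.random.default_rng(4)
      N_WAYPOINTS, MAX_DESCENT = 20, 120.0            # max altitude drop per segment, m (assumed)

      def ground_noise(alt):
          # Toy model: ground noise falls off with altitude along the route.
          return np.sum(1.0 / (alt / 100.0) ** 2)

      def feasible(alt):
          return np.all(np.diff(alt) >= -MAX_DESCENT) and np.all(alt >= 150.0)

      alt = np.linspace(1200.0, 300.0, N_WAYPOINTS)   # initial straight-in descent
      cost = ground_noise(alt)
      for _ in range(5000):
          candidate = alt.copy()
          candidate[rng.integers(1, N_WAYPOINTS - 1)] += rng.normal(0.0, 25.0)
          if feasible(candidate) and ground_noise(candidate) < cost:
              alt, cost = candidate, ground_noise(candidate)
      print("ground-noise cost reduced to:", round(cost, 3))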

  5. Design and implementation of an automated compound management system in support of lead optimization.

    PubMed

    Quintero, Catherine; Kariv, Ilona

    2009-06-01

    To meet the needs of the increasingly rapid and parallelized lead optimization process, a fully integrated local compound storage and liquid handling system was designed and implemented to automate the generation of assay-ready plates directly from newly submitted and cherry-picked compounds. A key feature of the system is the ability to create project- or assay-specific compound-handling methods, which provide flexibility for any combination of plate types, layouts, and plate bar-codes. Project-specific workflows can be created by linking methods for processing new and cherry-picked compounds and control additions to produce a complete compound set for both biological testing and local storage in one uninterrupted workflow. A flexible cherry-pick approach allows for multiple, user-defined strategies to select the most appropriate replicate of a compound for retesting. Examples of custom selection parameters include available volume, compound batch, and number of freeze/thaw cycles. This adaptable and integrated combination of software and hardware provides a basis for reducing cycle time, fully automating compound processing, and ultimately increasing the rate at which accurate, biologically relevant results can be produced for compounds of interest in the lead optimization process.

  6. Stable adaptive control using new critic designs

    NASA Astrophysics Data System (ADS)

    Werbos, Paul J.

    1999-03-01

    Classical adaptive control proves total-system stability for control of linear plants, but only for plants meeting very restrictive assumptions. Approximate Dynamic Programming (ADP) has the potential, in principle, to ensure stability without such tight restrictions. It also offers nonlinear and neural extensions for optimal control, with empirically supported links to what is seen in the brain. However, the relevant ADP methods in use today--TD, HDP, DHP, GDHP--and the Galerkin-based versions of these all have serious limitations when used here as parallel distributed real-time learning systems; either they do not possess quadratic unconditional stability (to be defined) or they lead to incorrect results in the stochastic case. (ADAC or Q-learning designs do not help.) After explaining these conclusions, this paper describes new ADP designs which overcome these limitations. It also addresses the Generalized Moving Target problem, a common family of static optimization problems, and describes a way to stabilize large-scale economic equilibrium models, such as the old long-term energy model of DOE.

  7. Effects of a psychophysiological system for adaptive automation on performance, workload, and the event-related potential P300 component

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.
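
    A sketch of an engagement-index-driven mode switch, assuming the commonly used index beta/(alpha + theta) computed from EEG band powers; the signal here is synthetic and the baseline-derived threshold is an assumption, not the study's exact switching logic.

      # Engagement index from band powers and a simple manual/automatic mode rule.
      import numpy as np
      from scipy.signal import welch

      FS = 256                                        # sampling rate, Hz (assumed)
      BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 22)}

      def band_power(x, lo, hi):
          f, pxx = welch(x, fs=FS, nperseg=FS * 2)
          mask = (f >= lo) & (f < hi)
          return pxx[mask].sum()

      def engagement_index(x):
          p = {name: band_power(x, *band) for name, band in BANDS.items()}
          return p["beta"] / (p["alpha"] + p["theta"])

      rng = np.random.default_rng(5)
      baseline = engagement_index(rng.standard_normal(FS * 30))   # resting segment
      threshold = 0.9 * baseline                                   # assumed criterion

      def choose_mode(eeg_segment):
          # Low engagement: give the tracking task back to the operator (manual);
          # otherwise keep the task automated.
          return "manual" if engagement_index(eeg_segment) < threshold else "automatic"

      print(choose_mode(rng.standard_normal(FS * 30)))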

  8. Some challenges with statistical inference in adaptive designs.

    PubMed

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have generated a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of sample size or related statistical information and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, the selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.

  9. Automated systems for creative processes in scientific research, design, and robotics

    SciTech Connect

    Glushkov, V.M.; Stognii, A.A.; Biba, I.G.; Vashchenko, N.D.; Galagan, N.I.; Gladun, V.P.; Rabinovich, Z.L.; Sakunov, I.A.; Khomenko, L.V.

    1981-11-01

    The authors give a general description of software that was developed to automate the creative processes in scientific research, design and robotics. The systems APROS, SSP, Analizator-ES and Analizator are discussed. 12 references.

  10. Designing and Implementing Effective Adapted Physical Education Programs

    ERIC Educational Resources Information Center

    Kelly, Luke E.

    2011-01-01

    "Designing and Implementing Effective Adapted Physical Education Programs" was written to assist adapted and general physical educators who are dedicated to ensuring that the physical and motor needs of all their students are addressed in physical education. While it is anticipated that adapted physical educators, where available, will typically…

  11. Design and Implementation of an Open, Interoperable Automated Demand Response Infrastructure

    SciTech Connect

    Piette, Mary Ann; Kiliccote, Sila; Ghatikar, Girish

    2007-10-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automating demand response (DR). Automating DR allows greater levels of participation and improved reliability and repeatability of the demand response and customer facilities. Automated DR systems have been deployed for critical peak pricing and demand bidding and are being designed for real time pricing. The system is designed to generate, manage, and track DR signals between utilities and Independent System Operators (ISOs) to aggregators and end-use customers and their control systems.

  12. An Automated Reference Frame Selection (ARFS) Algorithm for Cone Imaging with Adaptive Optics Scanning Light Ophthalmoscopy

    PubMed Central

    Salmon, Alexander E.; Cooper, Robert F.; Langlo, Christopher S.; Baghaie, Ahmadreza; Dubra, Alfredo; Carroll, Joseph

    2017-01-01

    Purpose: To develop an automated reference frame selection (ARFS) algorithm to replace the subjective approach of manually selecting reference frames for processing adaptive optics scanning light ophthalmoscope (AOSLO) videos of cone photoreceptors. Methods: Relative distortion was measured within individual frames before conducting image-based motion tracking and sorting of frames into distinct spatial clusters. AOSLO images from nine healthy subjects were processed using ARFS and human-derived reference frames, then aligned to undistorted AO-flood images by nonlinear registration and the registration transformations were compared. The frequency at which humans selected reference frames that were rejected by ARFS was calculated in 35 datasets from healthy subjects, and subjects with achromatopsia, albinism, or retinitis pigmentosa. The level of distortion in this set of human-derived reference frames was assessed. Results: The average transformation vector magnitude required for registration of AOSLO images to AO-flood images was significantly reduced from 3.33 ± 1.61 pixels when using manual reference frame selection to 2.75 ± 1.60 pixels (mean ± SD) when using ARFS (P = 0.0016). Between 5.16% and 39.22% of human-derived frames were rejected by ARFS. Only 2.71% to 7.73% of human-derived frames were ranked in the top 5% of least distorted frames. Conclusion: ARFS outperforms expert observers in selecting minimally distorted reference frames in AOSLO image sequences. The low success rate in human frame choice illustrates the difficulty in subjectively assessing image distortion. Translational Relevance: Manual reference frame selection represented a significant barrier to a fully automated image-processing pipeline (including montaging, cone identification, and metric extraction). The approach presented here will aid in the clinical translation of AOSLO imaging. PMID:28392976

  13. Design for an Adaptive Library Catalog.

    ERIC Educational Resources Information Center

    Buckland, Michael K.; And Others

    1992-01-01

    Describes OASIS, a prototype adaptive online catalog implemented as a front end to the University of California MELVYL catalog. Topics addressed include the concept of adaptive retrieval systems, strategic search commands, feedback, prototyping using a front-end, the problem of excessive retrieval, commands to limit or increase search results, and…

  14. Designing and Generating Educational Adaptive Hypermedia Applications

    ERIC Educational Resources Information Center

    Retalis, Symeon; Papasalouros, Andreas

    2005-01-01

    Educational Adaptive Hypermedia Applications (EAHA) provide personalized views on the learning content to individual learners. They also offer adaptive sequencing (navigation) over the learning content based on rules that stem from the user model requirements and the instructional strategies. EAHA are gaining the focus of the research community as…

  15. Author Support for the Design of Automated Medical Interviews

    PubMed Central

    Maccabe, A.B.; Underwood, W.E.; Brunjes, Shannon

    1979-01-01

    This paper describes a prototype system that provides interactive author support for an automated medical interviewing system. An on-line user's manual enables health care professionals to use the system without prior knowledge or experience. The approach taken was to implement the author support programs as interviews in the underlying interviewing system.

  16. Economics of automation for the design-to-mask interface

    NASA Astrophysics Data System (ADS)

    Erck, Wesley

    2009-04-01

    Mask order automation has increased steadily over the years through a variety of individual mask customer implementations. These have been supported by customer-specific software at the mask suppliers to support the variety of customer output formats. Some customers use the SEMI P10 standard, some use supplier-specific formats, and some use customer-specific formats. Some customers use little automation and depend instead on close customer-supplier relationships. Implementations are varied in quality and effectiveness. A major factor which has prolonged the adoption of more advanced and effective solutions has been a lack of understanding of the economic benefits. Some customers think standardized automation mainly benefits the mask supplier in order entry automation, but this ignores a number of other significant benefits which differ dramatically for each party in the supply chain. This paper discusses the nature of those differing advantages and presents simple models suited to four business cases: integrated device manufacturers (IDM), fabless companies, foundries and mask suppliers. Examples and estimates of the financial advantages for these business types will be shown.

  17. Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2006-01-01

    Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means for obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually non-existent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.

  18. Teacher-Led Design of an Adaptive Learning Environment

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis; Kalles, Dimitris; Gregoriades, Andreas

    2016-01-01

    This paper discusses a requirements engineering process that exemplifies teacher-led design in the case of an envisioned system for adaptive learning. Such a design poses various challenges and still remains an open research issue in the field of adaptive learning. Starting from a scenario-based elicitation method, the whole process was highly…

  19. Robust design of configurations and parameters of adaptable products

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Chen, Yongliang; Xue, Deyi; Gu, Peihua

    2014-03-01

    An adaptable product can satisfy different customer requirements by changing its configuration and parameter values during the operation stage. Design of adaptable products aims at reducing the environmental impact through replacement of multiple different products with single adaptable ones. Due to the complex architecture, multiple functional requirements, and changes of product configurations and parameter values in operation, the impact of uncertainties on the functional performance measures needs to be considered in design of adaptable products. In this paper, a robust design approach is introduced to identify the optimal design configuration and parameters of an adaptable product whose functional performance measures are the least sensitive to uncertainties. An adaptable product in this paper is modeled by both configurations and parameters. At the configuration level, methods to model different product configuration candidates in design and different product configuration states in operation to satisfy design requirements are introduced. At the parameter level, four types of product/operating parameters and relations among these parameters are discussed. A two-level optimization approach is developed to identify the optimal design configuration and parameter values of the adaptable product. A case study is implemented to illustrate the effectiveness of the newly developed robust adaptable design method.

  20. A Testlet Assembly Design for Adaptive Multistage Tests

    ERIC Educational Resources Information Center

    Luecht, Richard; Brumfield, Terry; Breithaupt, Krista

    2006-01-01

    This article describes multistage tests and some practical test development considerations related to the design and implementation of a multistage test, using the Uniform CPA (certified public accountant) Examination as a case study. The article further discusses the use of automated test assembly procedures in an operational context to produce…

  1. The Application of a General Purpose Data Base Management System to Design Automation.

    DTIC Science & Technology

    1983-12-01

    One instance of a CAD system is the Structured Computer Aided Logic Design (SCALD) system, developed out of Lawrence... ("Structured Computer Aided Logic Design," 15th Design Automation Conference Proceedings, 1978, Las Vegas, Nevada).

  2. The use of adaptable automation: Effects of extended skill lay-off and changes in system reliability.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain

    2017-01-01

    This experiment aimed to examine how skill lay-off and system reliability would affect operator behaviour in a simulated work environment under wide-range and large-choice adaptable automation comprising six different levels. Twenty-four participants were tested twice during a 2-hr testing session, with the second session taking place 8 months after the first. In the middle of the second testing session, system reliability changed. The results showed that after the retention interval trust increased and self-confidence decreased. Complacency was unaffected by the lay-off period. Diagnostic speed slowed down after the retention interval but diagnostic accuracy was maintained. No difference between experimental conditions was found for automation management behaviour (i.e. level of automation chosen and frequency of switching between levels). There were few effects of system reliability. Overall, the findings showed that subjective measures were more sensitive to the impact of skill lay-off than objective behavioural measures.

  3. Situation Awareness Implications of Adaptive Automation of Air Traffic Controller Information Processing Functions

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; McClernon, Christopher K.; Perry, Carlene M.; Segall, Noa

    2004-01-01

    The goal of this research was to define a measure of situation awareness (SA) in an air traffic control (ATC) task and to assess the influence of adaptive automation (AA) of various information processing functions on controller perception, comprehension and projection. The measure was also to serve as a basis for defining and developing an approach to triggering dynamic control allocations, as part of AA, based on controller SA. To achieve these objectives, an enhanced version of an ATC simulation (Multitask©) was developed for use in two human factors experiments. The simulation captured the basic functions of Terminal Radar Approach Control (TRACON) and was capable of presenting to operators four different modes of control, including information acquisition, information analysis, decision making and action implementation automation, as well as a completely manual control mode. The SA measure that was developed as part of the research was based on the Situation Awareness Global Assessment Technique (SAGAT), previous goal-directed task analyses of enroute control and TRACON, and a separate cognitive task analysis on the ATC simulation. The results of the analysis on Multitask were used as a basis for formulating SA queries as part of the SAGAT-based approach to measuring controller SA, which was used in the experiments. A total of 16 subjects were recruited for both experiments. Half the subjects were used in Experiment #1, which focused on assessing the sensitivity and reliability of the SA measurement approach in the ATC simulation. Comparisons were made of manual versus automated control. The remaining subjects were used in the second experiment, which was intended to more completely describe the SA implications of AA applied to specific controller information processing functions, and to describe how the measure could ultimately serve as a trigger of dynamic function allocations in the application of AA to ATC. Comparisons were made of the

  4. Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.; Roth, Emilie

    2014-01-01

    Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced a class of failure modes, and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex inter-related factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains, and have resulted in increased efforts to develop training, procedures, regulations and guidance material (CAST, 2008, IAEA, 2001, FAA, 2013, ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being presented. We will describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, and describe some approaches that could be used to address these challenges. We will draw from experience with the aviation, spaceflight and nuclear power domains.

  5. A framework for automated contour quality assurance in radiation therapy including adaptive techniques

    NASA Astrophysics Data System (ADS)

    Altman, M. B.; Kavanaugh, J. A.; Wooten, H. O.; Green, O. L.; DeWees, T. A.; Gay, H.; Thorstad, W. L.; Li, H.; Mutic, S.

    2015-07-01

    Contouring of targets and normal tissues is one of the largest sources of variability in radiation therapy treatment plans. Contours thus require a time-intensive and error-prone quality assurance (QA) evaluation, limitations which also impede adaptive radiotherapy (ART). Here, an automated system for contour QA is developed using historical data (the ‘knowledge base’). A pilot study was performed with a knowledge base derived from 9 contours each from 29 head-and-neck treatment plans. Size, shape, relative position, and other clinically-relevant metrics and heuristically derived rules are determined. Metrics are extracted from input patient data and compared against rules determined from the knowledge base; a computer-learning component allows metrics to evolve with more input data, including patient-specific data for ART. Nine additional plans containing 42 unique contouring errors were analyzed. 40 of the 42 errors were detected, along with 9 false positives. The results of this study imply knowledge-based contour QA could potentially enhance the safety and effectiveness of RT treatment plans as well as increase the efficiency of the treatment planning process, reducing labor and the cost of therapy for patients.
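
    A minimal sketch of the knowledge-based checking idea: per-structure metric ranges are learned from historical contours, and a new contour is flagged when its metrics fall outside them. The metrics (volume, centroid position), the synthetic masks, and the tolerance rule are assumptions, not the paper's metric set.

      # Learn metric ranges from historical contours, flag outliers for review.
      import numpy as np

      rng = np.random.default_rng(6)

      def contour_metrics(mask, voxel_volume=0.001):
          """Metrics from a boolean 3D mask: volume (arbitrary units) and centroid slice."""
          if not mask.any():
              return np.array([0.0, 0.0])
          volume = mask.sum() * voxel_volume
          centroid_z = np.argwhere(mask)[:, 0].mean()
          return np.array([volume, centroid_z])

      # "Knowledge base": metrics from historical, approved contours of one structure.
      historical = np.array([contour_metrics(rng.random((40, 64, 64)) > 0.8)
                             for _ in range(29)])
      mean, std = historical.mean(axis=0), historical.std(axis=0)

      def qa_check(mask, n_sigma=3.0):
          z = np.abs(contour_metrics(mask) - mean) / np.maximum(std, 1e-9)
          return "flag for review" if np.any(z > n_sigma) else "pass"

      print(qa_check(rng.random((40, 64, 64)) > 0.8))        # typical contour
      print(qa_check(np.zeros((40, 64, 64), dtype=bool)))    # empty contour is flagged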

  6. Automated Detection of Microaneurysms Using Scale-Adapted Blob Analysis and Semi-Supervised Learning

    SciTech Connect

    Adal, Kedir M.; Sidebe, Desire; Ali, Sharib; Chaum, Edward; Karnowski, Thomas Paul; Meriaudeau, Fabrice

    2014-01-07

    Despite several attempts, automated detection of microaneurysms (MAs) from digital fundus images remains an open issue. This is due to the subtle nature of MAs against the surrounding tissues. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs from an image and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are then introduced to characterize these blob regions. A semi-supervised learning approach, which requires only a few manually annotated learning examples, is also proposed to train a classifier to detect true MAs. The developed system is built using only a few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the-art techniques as well as the applicability of the proposed features to analyze fundus images.
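
    A sketch of the scale-adapted blob step only (the descriptors and the semi-supervised classifier are omitted): Laplacian-of-Gaussian scale selection localizes small blob-like candidates and returns a per-blob scale. The synthetic image and the parameters are assumptions; a real pipeline would run on preprocessed fundus images.

      # Laplacian-of-Gaussian blob detection with automatic local-scale selection.
      import numpy as np
      from skimage.feature import blob_log

      rng = np.random.default_rng(7)
      image = np.zeros((256, 256))
      yy, xx = np.mgrid[0:256, 0:256]
      for _ in range(6):                                   # plant small Gaussian "MAs"
          cy, cx = rng.integers(20, 236), rng.integers(20, 236)
          s = rng.uniform(1.5, 3.0)
          image += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * s ** 2))
      image += 0.05 * rng.standard_normal(image.shape)     # background noise

      # Candidate MAs with an automatically selected local scale per blob.
      blobs = blob_log(image, min_sigma=1, max_sigma=5, num_sigma=9, threshold=0.1)
      for y, x, sigma in blobs:
          print(f"candidate at ({y:.0f}, {x:.0f}), scale sigma={sigma:.2f}")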

  7. Adaptive Design of Confirmatory Trials: Advances and Challenges

    PubMed Central

    Lai, Tze Leung; Lavori, Philip W.; Tsang, Ka Wai

    2015-01-01

    The past decade witnessed major developments in innovative designs of confirmatory clinical trials, and adaptive designs represent the most active area of these developments. We give an overview of the developments and associated statistical methods in several classes of adaptive designs of confirmatory trials. We also discuss their statistical difficulties and implementation challenges, and show how these problems are connected to other branches of mainstream Statistics, which we then apply to resolve the difficulties and bypass the bottlenecks in the development of adaptive designs for the next decade. PMID:26079372

  8. Design for the automation of composite wind turbine blade manufacture

    NASA Astrophysics Data System (ADS)

    Polcari, M. J.; White, K. D.; Sherwood, J. A.

    2016-10-01

    The majority of large wind turbine blades are manufactured from textile-reinforced resin-infused composites using an open mold. The placement of the textile reinforcements in the mold is traditionally accomplished by a manual process where dozens of workers hand place each dry fabric in the mold. Depending on the level of skill and experience of each worker and the relative complexity of the mold geometry, local areas may exhibit out-of-plane wrinkling and in-plane waviness. Fabric imperfections such as these can adversely impact the strength and stiffness of the blade, thereby compromising its durability in service. In an effort to reduce the variabilities associated with a manual-labor process, an automated piecewise shifting method has been proposed for fabric placement. This automated layup method saves time on the preform process and reduces variability from blade to blade. In the current research the automated shifting layup method is investigated using a robust and easy-to-use finite element modelling approach. User-defined material models utilizing a mesoscopic unit-cell modeling approach are linked with Abaqus to capture the evolution of the fabric shear stiffness and changes in the fiber orientations during the fabric-placement process. The simulation approach is demonstrated for the geometry of the trailing edge of a typical wind turbine blade. The simulation considers the mechanical behavior of the fabric and reliably predicts fabric deformation and failure zones.

  9. Designing of smart home automation system based on Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-03-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are well suited to academic research. This paper proposes an implementation of a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access a graph of total power consumption over time from anywhere through their Dropbox account. An Android application has been developed to handle the monitoring and control of home appliances remotely. The application controls the output pins of the Raspberry Pi, turning any desired appliance on or off at the press of the corresponding key. Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through a cloud-based data storage protocol, reliability, energy efficiency, etc.
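    A minimal sketch of the switching path described above, assuming an appliance relay wired to a Raspberry Pi GPIO pin and the common RPi.GPIO library; the pin number and wiring are hypothetical, and the remote Android/Dropbox layers are omitted.

```python
# Minimal on/off control path, assuming an appliance relay wired to BCM pin 17;
# run on a Raspberry Pi with the RPi.GPIO package installed.
import RPi.GPIO as GPIO

APPLIANCE_PIN = 17          # hypothetical BCM pin driving a relay module

def setup():
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(APPLIANCE_PIN, GPIO.OUT, initial=GPIO.LOW)

def switch_appliance(on: bool):
    """Turn the connected appliance on or off, mirroring the app's key press."""
    GPIO.output(APPLIANCE_PIN, GPIO.HIGH if on else GPIO.LOW)

if __name__ == "__main__":
    setup()
    switch_appliance(True)   # the Android app would trigger this remotely
    GPIO.cleanup()
```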

  10. Design And Implementation Of A Hierarchical Automated Inspection System

    NASA Astrophysics Data System (ADS)

    Tarbox, Glenn H.; Gerhardt, Lester A.

    1990-02-01

    In order to provide a framework for the evaluation of, and need for, sensor information appropriate to real-time manufacturing control, a workcell based on a 5-axis machining center was developed. This workcell defines a problem space within which automated inspection is to be applied. Primarily, we are interested in evaluating the use of machine vision and Coordinate Measuring Machines (CMMs) as means to provide information to an automated workcell controller. This controller will use these sensing technologies in a hierarchical fashion, exploiting the speed vs. accuracy tradeoffs characteristic of tactile and non-tactile coordinate acquisition. We have implemented an Octree solid modeling system which has the capability of model generation from the information provided by the vision system. In addition, the Octree method lends itself to simulating the actual manufacturing process. Our system reads the machine tool G-codes generated by our CAD system and simulates the material removal operation by successively removing intersections between the tool and workpiece. This machined model is then used for automatic inspection sequence generation. This paper describes the framework and architecture of our automated inspection system, as well as specifics relating to the Octree modeling system.
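    The material-removal simulation can be approximated with a dense voxel model rather than an Octree, as in the sketch below; the block dimensions, tool radius, and straight-line tool path stand in for a real G-code program.

```python
import numpy as np

def remove_material(workpiece, tool_center, tool_radius):
    """Subtract a spherical tool footprint from a boolean voxel model of the stock."""
    zz, yy, xx = np.indices(workpiece.shape)
    dist2 = ((zz - tool_center[0])**2 + (yy - tool_center[1])**2 + (xx - tool_center[2])**2)
    workpiece[dist2 <= tool_radius**2] = False
    return workpiece

# Toy usage: a straight cut across the top of a stock block (stand-in for one G-code move).
stock = np.ones((20, 40, 40), dtype=bool)
for x in range(5, 35):
    remove_material(stock, tool_center=(18, 20, x), tool_radius=4)
print("remaining voxels:", int(stock.sum()))
```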

  11. Application of Adaptive Autopilot Designs for an Unmanned Aerial Vehicle

    NASA Technical Reports Server (NTRS)

    Shin, Yoonghyun; Calise, Anthony J.; Motter, Mark A.

    2005-01-01

    This paper summarizes the application of two adaptive approaches to autopilot design, and presents an evaluation and comparison of the two approaches in simulation for an unmanned aerial vehicle. One approach employs two-stage dynamic inversion and the other employs feedback dynamic inversions based on a command augmentation system. Both are augmented with neural network based adaptive elements. The approaches permit adaptation to both parametric uncertainty and unmodeled dynamics, and incorporate a method that permits adaptation during periods of control saturation. Simulation results for an FQM-117B radio controlled miniature aerial vehicle are presented to illustrate the performance of the neural network based adaptation.

  12. Adaptive Automation for Human Supervision of Multiple Uninhabited Vehicles: Effects on Change Detection, Situation Awareness, and Mental Workload

    DTIC Science & Technology

    2009-01-01

    Published as: ‘Adaptive Automation for Human Supervision of Multiple Uninhabited Vehicles: Effects on Change Detection, Situation Awareness, and Mental Workload’, Military Psychology, 21:2, 270-297. Available at http://www.informaworld.com/smpp/title~content=t775653681

  13. Mission Design Evaluation Using Automated Planning for High Resolution Imaging of Dynamic Surface Processes from the ISS

    NASA Technical Reports Server (NTRS)

    Knight, Russell; Donnellan, Andrea; Green, Joseph J.

    2013-01-01

    A challenge for any proposed mission is to demonstrate convincingly that the proposed systems will in fact deliver the science promised. Funding agencies and mission design personnel are becoming ever more skeptical of the abstractions that form the basis of the current state of the practice with respect to approximating science return. To address this, we have been using automated planning and scheduling technology to provide actual coverage campaigns that provide better predictive performance with respect to science return for a given mission design and set of mission objectives given implementation uncertainties. Specifically, we have applied an adaptation of ASPEN and SPICE to the Eagle-Eye domain that demonstrates the performance of the mission design with respect to coverage of science imaging targets that address climate change and disaster response. Eagle-Eye is an Earth-imaging telescope that has been proposed to fly aboard the International Space Station (ISS).

  14. An automated tool for the design and assessment of space systems

    NASA Technical Reports Server (NTRS)

    Dalcambre, Lois M. L.; Landry, Steve P.

    1990-01-01

    Space systems can be characterized as both large and complex but they often rely on reusable subcomponents. One problem in the design of such systems is the representation and validation of the system, particularly at the higher levels of management. An automated tool is described for the representation, refinement, and validation of such complex systems based on a formal design theory, the Theory of Plausible Design. In particular, the steps necessary to automate the tool and make it a competent, usable assistant, are described.

  15. Design and implementation of Ada programs to facilitate automated testing

    NASA Technical Reports Server (NTRS)

    Dean, Jack; Fox, Barry; Oropeza, Michael

    1991-01-01

    An automated method used to test the software components of COMPASS, an interactive computer-aided scheduling system, is presented. Each package of this system introduces a private type and provides functions to construct instances of that type, along with read and write routines for that type. Generic procedures that can generate test drivers for these functions are given, and it is shown how the test drivers can read from a test data file the functions to call, the arguments for those functions, the anticipated result, and whether an exception should be raised for the function given the arguments.

  16. Adaptive design lessons from professional architects

    NASA Astrophysics Data System (ADS)

    Geiger, Ray W.; Snell, J. T.

    1993-09-01

    Psychocybernetic systems engineering design conceptualization is mimicking the evolutionary path of habitable environmental design and the professional practice of building architecture, construction, and facilities management. In pursuing better ways to design cellular automata and qualification classifiers in a design process, we have found surprising success in exploring certain more esoteric approaches, e.g., the vision of interdisciplinary artistic discovery in and around creative problem solving. Our evaluation in research into vision and hybrid sensory systems associated with environmental design and human factors has led us to discover very specific connections between the human spirit and quality design. We would like to share those very qualitative and quantitative parameters of engineering design, particularly as it relates to multi-faceted and future oriented design practice. Discussion covers areas of case-based techniques of cognitive ergonomics, natural modeling sources, and an open architectural process of means/goal satisfaction, qualified by natural repetition, gradation, rhythm, contrast, balance, and integrity of process.

  17. Optimal adaptive sequential designs for crossover bioequivalence studies.

    PubMed

    Xu, Jialin; Audet, Charles; DiLiberti, Charles E; Hauck, Walter W; Montague, Timothy H; Parr, Alan F; Potvin, Diane; Schuirmann, Donald J

    2016-01-01

    In prior works, this group demonstrated the feasibility of valid adaptive sequential designs for crossover bioequivalence studies. In this paper, we extend the prior work to optimize adaptive sequential designs over a range of geometric mean test/reference ratios (GMRs) of 70-143% within each of two ranges of intra-subject coefficient of variation (10-30% and 30-55%). These designs also introduce a futility decision for stopping the study after the first stage if there is sufficiently low likelihood of meeting bioequivalence criteria if the second stage were completed, as well as an upper limit on total study size. The optimized designs exhibited substantially improved performance characteristics over our previous adaptive sequential designs. Even though the optimized designs avoided undue inflation of type I error and maintained power at ≥ 80%, their average sample sizes were similar to or less than those of conventional single stage designs.
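    The stage-wise logic of such designs can be sketched as below, using a simplified per-subject log-ratio analysis and a crude variability-based futility rule; the actual designs use a full crossover analysis, adjusted nominal significance levels across stages, and optimized decision boundaries, none of which are reproduced here.

```python
import numpy as np
from scipy import stats

def tost_be(log_ratios, lower=np.log(0.80), upper=np.log(1.25), alpha=0.05):
    """Simplified two one-sided tests: conclude BE if the 90% CI of the mean
    log(test/reference) ratio lies within the BE limits."""
    n = len(log_ratios)
    mean, se = np.mean(log_ratios), stats.sem(log_ratios)
    t = stats.t.ppf(1 - alpha, df=n - 1)
    return (mean - t * se > lower) and (mean + t * se < upper)

def two_stage_decision(stage1, stage2=None, futility_sd=0.60):
    """Stage 1: stop for success if BE is shown; stop for futility if the observed
    variability is so large that a second stage is unlikely to succeed.
    (Alpha adjustment across stages is deliberately omitted in this toy version.)"""
    if tost_be(stage1):
        return "BE shown at stage 1"
    if np.std(stage1, ddof=1) > futility_sd:
        return "stopped for futility"
    if stage2 is None:
        return "continue to stage 2"
    return "BE shown at stage 2" if tost_be(np.concatenate([stage1, stage2])) else "BE not shown"

rng = np.random.default_rng(1)
print(two_stage_decision(rng.normal(0.02, 0.25, size=12)))
```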

  18. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    PubMed

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

    The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is open-source online software based on existing tools used in microelectronics that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be realized with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. Then, this framework is used for the design of a sequential circuit including a biological state machine.

  19. Designing Microcomputer Networks (and) LANS: A New Technology to Improve Library Automation.

    ERIC Educational Resources Information Center

    Ivie, Evan L.; Farr, Rick C.

    1984-01-01

    Two articles address the design of microcomputer networks and the use of local area computer networks (LAN) to improve library automation. Topics discussed include network design criteria, media for local networks, transmission mode, typical communication protocols, user interface, basic local network architectures, and examples of microcomputer…

  20. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift

    PubMed Central

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  1. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift.

    PubMed

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-12-19

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives.

  2. Design optimization of system level adaptive optical performance

    NASA Astrophysics Data System (ADS)

    Michels, Gregory J.; Genberg, Victor L.; Doyle, Keith B.; Bisson, Gary R.

    2005-09-01

    By linking predictive methods from multiple engineering disciplines, engineers are able to compute more meaningful predictions of a product's performance. By coupling mechanical and optical predictive techniques, mechanical design can be performed to optimize optical performance. This paper demonstrates how mechanical design optimization based on system-level optical performance can be used in the design of a high-precision adaptive optical telescope. While mechanical design parameters are treated as the design variables, the objective function is taken to be the adaptively corrected optical imaging performance of an orbiting two-mirror telescope.

  3. Multidimensional Adaptive Testing with Optimal Design Criteria for Item Selection

    ERIC Educational Resources Information Center

    Mulder, Joris; van der Linden, Wim J.

    2009-01-01

    Several criteria from the optimal design literature are examined for use with item selection in multidimensional adaptive testing. In particular, it is examined what criteria are appropriate for adaptive testing in which all abilities are intentional, some should be considered as a nuisance, or the interest is in the testing of a composite of the…

  4. The design of the automated control system for warehouse equipment under radio-electronic manufacturing

    NASA Astrophysics Data System (ADS)

    Kapulin, D. V.; Chemidov, I. V.; Kazantsev, M. A.

    2017-01-01

    This paper discusses the design, development, and implementation of an automated warehouse control system for the manufacturing process of the radio-electronic enterprise JSC «Radiosvyaz». The architecture of the automated warehouse control system proposed in the paper consists of a server connected to two physically separated information networks: the network with a database server, which stores information about picking orders, and the network with the automated storage and retrieval system. This principle supports differentiation of access and satisfies information safety and security requirements. The efficiency of the developed automated solutions in terms of optimizing the warehouse's logistic characteristics is also evaluated.

  5. Reliability-Based Design of a Safety-Critical Automation System: A Case Study

    NASA Technical Reports Server (NTRS)

    Carroll, Carol W.; Dunn, W.; Doty, L.; Frank, M. V.; Hulet, M.; Alvarez, Teresa (Technical Monitor)

    1994-01-01

    In 1986, NASA funded a project to modernize the NASA Ames Research Center Unitary Plan Wind Tunnels, including the replacement of obsolescent controls with a modern, automated distributed control system (DCS). The project effort on this system included an independent safety analysis (ISA) of the automation system. The purpose of the ISA was to evaluate the completeness of the hazard analyses which had already been performed on the Modernization Project. The ISA approach followed a tailoring of the risk assessment approach widely used on existing nuclear power plants. The tailoring of the nuclear industry oriented risk assessment approach to the automation system and its role in reliability-based design of the automation system is the subject of this paper.

  6. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
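    A fractional factorial layout of the kind used in such DOE-driven optimization can be generated in a few lines. The sketch below builds a standard eight-run 2^(5-2) design with generators D = AB and E = AC; the factor labels are illustrative and do not correspond to the specific variables optimized in the study.

```python
from itertools import product

def fractional_factorial_2_5_minus_2():
    """Eight-run, two-level design for five factors using generators D = AB and E = AC
    (a standard 2^(5-2) resolution III layout)."""
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        runs.append({"A": a, "B": b, "C": c, "D": a * b, "E": a * c})
    return runs

# Illustrative factor assignments only; -1/+1 denote the low and high settings.
factors = {"A": "coat incubation (min)", "B": "block incubation (min)",
           "C": "detect incubation (min)", "D": "conjugate dilution", "E": "substrate volume"}
for run in fractional_factorial_2_5_minus_2():
    print({factors[k]: v for k, v in run.items()})
```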

  7. Clothing adaptations: the occupational therapist and the clothing designer collaborate.

    PubMed

    White, L W; Dallas, M J

    1977-02-01

    An occupational therapist and a clothing designer collaborated in solving the dressing problem of a child with multiple amputations. The dressing problems were identified and solutions for clothing adaptations relating to sleeves, closures, fasteners, fit, and design were incorporated into two test garments. Evaluation of the garments was based on ease in dressing and undressing, the effect on movement and mobility, the construction techniques, and their appearance. A description is given of the pattern adjustments, and considerations for clothing adaptations or selection or both are discussed. These clothing adaptations can be generalized to a wider population of handicapped persons.

  8. Automation and Accountability in Decision Support System Interface Design

    ERIC Educational Resources Information Center

    Cummings, Mary L.

    2006-01-01

    When the human element is introduced into decision support system design, entirely new layers of social and ethical issues emerge but are not always recognized as such. This paper discusses those ethical and social impact issues specific to decision support systems and highlights areas that interface designers should consider during design with an…

  9. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
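    The basic evolutionary loop can be sketched as below for a toy version of the problem: evolving the component values of a fixed RC low-pass topology toward a target cutoff frequency. The real system also evolves circuit topology and evaluates fitness with a circuit simulator, which this sketch does not attempt.

```python
import math
import random

TARGET_FC = 1_000.0   # desired cutoff frequency in Hz (illustrative target)

def cutoff(r, c):
    return 1.0 / (2 * math.pi * r * c)

def fitness(ind):
    r, c = ind
    return -abs(cutoff(r, c) - TARGET_FC)   # closer to the target cutoff is better

def evolve(pop_size=40, generations=60):
    pop = [(random.uniform(1e2, 1e5), random.uniform(1e-9, 1e-6)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]                       # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            (r1, c1), (r2, c2) = random.sample(parents, 2)
            r = random.choice((r1, r2)) * random.uniform(0.9, 1.1)   # crossover + mutation
            c = random.choice((c1, c2)) * random.uniform(0.9, 1.1)
            children.append((r, c))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best R, C:", best, "-> cutoff", round(cutoff(*best), 1), "Hz")
```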

  10. Application of the Modular Automated Reconfigurable Assembly System (MARAS) concept to adaptable vision gauging and parts feeding

    NASA Technical Reports Server (NTRS)

    By, Andre Bernard; Caron, Ken; Rothenberg, Michael; Sales, Vic

    1994-01-01

    This paper presents the first phase results of a collaborative effort between university researchers and a flexible assembly systems integrator to implement a comprehensive modular approach to flexible assembly automation. This approach, named MARAS (Modular Automated Reconfigurable Assembly System), has been structured to support multiple levels of modularity in terms of both physical components and system control functions. The initial focus of the MARAS development has been on parts gauging and feeding operations for cylinder lock assembly. This phase is nearing completion and has resulted in the development of a highly configurable system for vision gauging functions on a wide range of small components (2 mm to 100 mm in size). The reconfigurable concepts implemented in this adaptive Vision Gauging Module (VGM) are now being extended to applicable aspects of the singulating, selecting, and orienting functions required for the flexible feeding of similar mechanical components and assemblies.

  11. Rise of the Machines: Automated Laser Guide Star Adaptive Optics Observations of Thousands of Objects with Robo-AO

    NASA Astrophysics Data System (ADS)

    Riddle, Reed L.; Baranec, C.; Law, N. M.; Tendulkar, S. P.; Ramaprakash, A. N.; Kulkarni, S. R.; Dekany, R.; Bui, K.; Burse, M.; Das, H.; Punnadi, S.; Chordia, P.

    2013-01-01

    Robo-AO is the first fully automated laser guide star adaptive optics instrument. Robo-AO has completed thousands of automated AO observations at the visible diffraction limit for several scientific programs during its first semester of science observations. These programs include: the Ultimate Binarity Survey to examine stellar binarity properties across the main sequence and beyond; a survey of 1,000 Kepler objects of interest; the multiplicity of solar type stars; and several programs for high precision astrometric observations. A new infrared camera is under development for Robo-AO, and a clone of the system is in the planning stages. This presentation will discuss the Robo-AO instrument capabilities, summarize the science programs undertaken, and discuss the future of Robo-AO.

  12. Dynamics of adaptive structures: Design through simulations

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alexander, S.

    1993-01-01

    The use of a helical bi-morph actuator/sensor concept by mimicking the change of helical waveform in bacterial flagella is perhaps the first application of bacterial motions (living species) to longitudinal deployment of space structures. However, no dynamical considerations were analyzed to explain the waveform change mechanisms. The objective is to review various deployment concepts from the dynamics point of view and introduce the dynamical considerations from the outset as part of design considerations. Specifically, the impact of the incorporation of the combined static mechanisms and dynamic design considerations on the deployment performance during the reconfiguration stage is studied in terms of improved controllability, maneuvering duration, and joint singularity index. It is shown that intermediate configurations during articulations play an important role for improved joint mechanisms design and overall structural deployability.

  13. Design of microcontroller based system for automation of streak camera

    SciTech Connect

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-08-15

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomenon. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LABVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  14. Design of microcontroller based system for automation of streak camera.

    PubMed

    Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P

    2010-08-01

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomenon. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LABVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  15. Modeling biology with HDL languages: a first step toward a genetic design automation tool inspired from microelectronics.

    PubMed

    Gendrault, Yves; Madec, Morgan; Lallement, Christophe; Haiech, Jacques

    2014-04-01

    Nowadays, synthetic biology is a hot research topic. Progress is being made each day to increase the complexity of artificial biological functions in order to move toward complex biodevices and biosystems. Up to now, these systems have been handmade by bioengineers, which requires strong technical skills and leads to non-reusable development. Meanwhile, scientific fields that share the same design approach, such as microelectronics, have already overcome several of these issues, and designers succeed in building extremely complex systems with many evolved functions. In systems engineering, and more specifically in microelectronics, the development of the domain has been driven by both the improvement of technological processes and electronic design automation tools. The work presented in this paper paves the way for the adaptation of microelectronics design tools to synthetic biology. Considering the similarities and differences between synthetic biology and microelectronics, the milestones of this adaptation are described. The first one concerns the modeling of biological mechanisms. To do so, a new formalism is proposed, based on an extension of the generalized Kirchhoff laws to biology. In this way, a description of all biological mechanisms can be made with languages widely used in microelectronics. Our approach is successfully validated on specific examples drawn from the literature.

  16. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
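    The two-step idea, a cheap model to prune the design space followed by an expensive model on the survivors, can be sketched generically as below; both model functions and the candidate grid are stand-ins, and the paper's branch-and-bound pruning is reduced here to a simple coarse-score cutoff.

```python
import itertools

def coarse_model(params):
    """Cheap surrogate score (illustrative): fast to evaluate, roughly ranks designs."""
    a, b, c = params
    return a + b + 0.5 * c

def fine_model(params):
    """Expensive, more accurate score (illustrative stand-in for the nonlinear model)."""
    a, b, c = params
    return a * b + 0.4 * b * c - 0.1 * (a - c) ** 2

candidates = list(itertools.product(range(1, 6), repeat=3))          # full design space
pruned = sorted(candidates, key=coarse_model, reverse=True)[:10]     # coarse pass keeps the top 10
best = max(pruned, key=fine_model)                                   # fine pass only on survivors
print("fine evaluations avoided:", len(candidates) - len(pruned), "best design:", best)
```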

  17. Automated drafting and design for the Civil engineer

    NASA Astrophysics Data System (ADS)

    Roberts, M. V.

    1981-09-01

    This thesis presents the results of a survey of twelve interactive graphics systems manufacturers and develops a simple model for economic decision analysis. The survey information includes costs of the systems, training times, and available software. From the survey information and an extensive literature review, an analysis is developed to provide guidelines each design office can use to determine whether it could justify automated drafting. The model is based on: (1) the salaries of design engineers authorized in the design section; (2) the salaries of site developers authorized in the site development section; (3) the amount of time spent on design work and drafting; and (4) the cost of the interactive graphics system used to produce the designs and drawings.

  18. CAMERA: a compact, automated, laser adaptive optics system for small aperture telescopes

    NASA Astrophysics Data System (ADS)

    Britton, Matthew; Velur, Viswa; Law, Nick; Choi, Philip; Penprase, Bryan E.

    2008-07-01

    CAMERA is an autonomous laser guide star adaptive optics system designed for small aperture telescopes. This system is intended to be mounted permanently on such a telescope to provide large amounts of flexibly scheduled observing time, delivering high angular resolution imagery in the visible and near infrared. The design employs a Shack-Hartmann wavefront sensor, a 12x12 actuator MEMS device for high order wavefront compensation, and a solid state 355 nm Nd:YAG laser to generate a guide star. Commercial CCD and InGaAs detectors provide coverage in the visible and near infrared. CAMERA operates by selecting targets from a queue populated by users and executing these observations autonomously. This robotic system is targeted towards applications that are difficult to address using classical observing strategies: surveys of very large target lists, recurrently scheduled observations, and rapid-response follow-up of transient objects. This system has been designed and costed, and a lab testbed has been developed to evaluate key components and validate autonomous operations.

  19. Optimal Design of Item Banks for Computerized Adaptive Tests.

    ERIC Educational Resources Information Center

    Stocking, Martha L.; Swanson, Len

    1998-01-01

    Applied optimal design methods to the item-bank design of adaptive testing for continuous testing situations using a version of the weighted-deviations model (M. Stocking and L. Swanson, 1993) in a simulation. Independent and overlapping item banks used items more efficiently than did a large item bank. (SLD)

  20. Automating checks of plan check automation.

    PubMed

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  1. Design, development, test, and evaluation of an automated analytical electrophoresis apparatus

    NASA Technical Reports Server (NTRS)

    Bartels, P. A.; Bier, M.

    1977-01-01

    An Automated Analytical Electrophoresis Apparatus (AAEA) was designed, developed, assembled, and preliminarily tested. The AAEA was demonstrated to be a feasible apparatus for automatically acquiring, displaying, and storing (and eventually analyzing) electrophoresis mobility data from living blood cells. The apparatus and the operation of its major assemblies are described in detail.

  2. Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu; Campbell, Richard L.

    2014-01-01

    The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.

  3. Designing an Automated Assessment of Public Speaking Skills Using Multimodal Cues

    ERIC Educational Resources Information Center

    Chen, Lei; Feng, Gary; Leong, Chee Wee; Joe, Jilliam; Kitchen, Christopher; Lee, Chong Min

    2016-01-01

    Traditional assessments of public speaking skills rely on human scoring. We report an initial study on the development of an automated scoring model for public speaking performances using multimodal technologies. Task design, rubric development, and human rating were conducted according to standards in educational assessment. An initial corpus of…

  4. Automated design of minimum drag light aircraft fuselages and nacelles

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Fox, S. R.; Karlin, B. E.

    1982-01-01

    The constrained minimization algorithm of Vanderplaats is applied to the problem of designing minimum drag faired bodies such as fuselages and nacelles. Body drag is computed by a variation of the Hess-Smith code. This variation includes a boundary layer computation. The encased payload provides arbitrary geometric constraints, specified a priori by the designer, below which the fairing cannot shrink. The optimization may include engine cooling air flows entering and exhausting through specific port locations on the body.
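    The constrained-minimization formulation can be sketched with a generic optimizer: the body is described by radii at fixed stations, the payload imposes minimum-radius constraints, and an objective is minimized subject to them. The wetted-area objective below is a crude stand-in for the Hess-Smith panel-plus-boundary-layer drag estimate used in the paper, and all dimensions are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

x = np.linspace(0.0, 1.0, 21)                            # body stations (normalized length)
payload = np.where((x > 0.3) & (x < 0.6), 0.08, 0.0)     # minimum radius the fairing must clear

def wetted_area(r):
    """Surface area of the body of revolution -- a crude proxy for the drag objective."""
    dr, dx = np.diff(r), np.diff(x)
    return float(np.sum(2 * np.pi * 0.5 * (r[1:] + r[:-1]) * np.sqrt(dx**2 + dr**2)))

r0 = np.full_like(x, 0.1)
res = minimize(wetted_area, r0,
               constraints=[{"type": "ineq", "fun": lambda r: r - payload},    # fairing encloses payload
                            {"type": "ineq", "fun": lambda r: r - 0.005}],     # keep radii positive
               bounds=[(0.0, 0.5)] * len(x), method="SLSQP")
print("minimum wetted area:", round(res.fun, 4))
```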

  5. A Case Study of Reverse Engineering Integrated in an Automated Design Process

    NASA Astrophysics Data System (ADS)

    Pescaru, R.; Kyratsis, P.; Oancea, G.

    2016-11-01

    This paper presents a design methodology which automates the generation of curves extracted from point clouds obtained by digitizing physical objects. The methodology is demonstrated on a product from the consumer products industry, namely a footwear-type product with a complex, highly curved shape. The final result is the automated generation of wrapping curves, surfaces, and solids according to the characteristics of the customer's foot and to the preferences for the chosen model, which leads to the development of customized products.

  6. Design of a digital adaptive control system for reentry vehicles.

    NASA Technical Reports Server (NTRS)

    Picon-Jimenez, J. L.; Montgomery, R. C.; Grigsby, L. L.

    1972-01-01

    The flying qualities of atmospheric reentry vehicles experience considerable variations due to the wide changes in flight conditions characteristic of reentry trajectories. A digital adaptive control system has been designed to modify the vehicle's dynamic characteristics and to provide desired flying qualities for all flight conditions. This adaptive control system consists of a finite-memory identifier which determines the vehicle's unknown parameters, and a gain computer which calculates feedback gains to satisfy flying quality requirements.

  7. Automation for pattern library creation and in-design optimization

    NASA Astrophysics Data System (ADS)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    Semiconductor manufacturing technologies are becoming increasingly complex with every passing node. Newer technology nodes are pushing the limits of optical lithography and requiring multiple exposures with exotic material stacks for each critical layer. All of this added complexity usually amounts to further restrictions in what can be designed. Furthermore, the designs must be checked against all these restrictions in verification and sign-off stages. Design rules are intended to capture all the manufacturing limitations such that yield can be maximized for any given design adhering to all the rules. Most manufacturing steps employ some sort of model based simulation which characterizes the behavior of each step. The lithography models play a very big part of the overall yield and design restrictions in patterning. However, lithography models are not practical to run during design creation due to their slow and prohibitive run times. Furthermore, the models are not usually given to foundry customers because of the confidential and sensitive nature of every foundry's processes. The design layout locations where a model flags unacceptable simulated results can be used to define pattern rules which can be shared with customers. With advanced technology nodes we see a large growth of pattern based rules. This is due to the fact that pattern matching is very fast and the rules themselves can be very complex to describe in a standard DRC language. Therefore, the patterns are left as either pattern layout clips or abstracted into pattern-like syntax which a pattern matcher can use directly. The patterns themselves can be multi-layered with "fuzzy" designations such that groups of similar patterns can be found using one description. The pattern matcher is often integrated with a DRC tool such that verification and signoff can be done in one step. The patterns can be layout constructs that are "forbidden", "waived", or simply low-yielding in nature. The patterns can also

  8. Design of Low Complexity Model Reference Adaptive Controllers

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Schaefer, Jacob; Johnson, Marcus; Nguyen, Nhan

    2012-01-01

    Flight research experiments have demonstrated that adaptive flight controls can be an effective technology for improving aircraft safety in the event of failures or damage. However, the nonlinear, time-varying nature of adaptive algorithms continues to challenge traditional methods for the verification and validation testing of safety-critical flight control systems. Increasingly complex adaptive control theories and designs are emerging, but they only make testing challenges more difficult. A potential first step toward the acceptance of adaptive flight controllers by aircraft manufacturers, operators, and certification authorities is a very simple design that operates as an augmentation to a non-adaptive baseline controller. Three such controllers were developed as part of a National Aeronautics and Space Administration flight research experiment to determine the appropriate level of complexity required to restore acceptable handling qualities to an aircraft that has suffered failures or damage. The controllers consist of the same basic design but incorporate incrementally increasing levels of complexity. Derivations of the controllers and their adaptive parameter update laws are presented along with details of the controllers' implementations.
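    The flavor of a simple adaptive augmentation can be conveyed with a textbook model reference adaptive controller for a first-order plant, as below; the plant, reference model, adaptation gain, and command signal are all illustrative and unrelated to the flight experiment's actual control laws.

```python
# Plant (unknown to the controller): xdot = a*x + b*u
a, b = 1.0, 3.0
# Reference model (desired behaviour): xm_dot = -4*xm + 4*r
a_m, b_m = -4.0, 4.0
gamma, dt, steps = 2.0, 0.001, 20000

x = xm = 0.0
kx = kr = 0.0                                   # adaptive feedback and feedforward gains
for i in range(steps):
    r = 1.0 if (i * dt) % 4 < 2 else -1.0       # square-wave command for excitation
    u = kx * x + kr * r
    e = x - xm                                  # tracking error
    # Gradient-type update laws (sign of b assumed known and positive)
    kx += -gamma * x * e * dt
    kr += -gamma * r * e * dt
    # Euler integration of plant and reference model
    x += (a * x + b * u) * dt
    xm += (a_m * xm + b_m * r) * dt

print("final gains:", round(kx, 3), round(kr, 3), "ideal:", (a_m - a) / b, b_m / b)
```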

  9. Scar-less multi-part DNA assembly design automation

    DOEpatents

    Hillson, Nathan J.

    2016-06-07

    The present invention provides a method of designing an implementation of a DNA assembly. In an exemplary embodiment, the method includes (1) receiving a list of DNA sequence fragments to be assembled together and an order in which to assemble the DNA sequence fragments, (2) designing DNA oligonucleotides (oligos) for each of the DNA sequence fragments, and (3) creating a plan for adding flanking homology sequences to each of the DNA oligos. In another exemplary embodiment, the method includes (1) receiving a list of DNA sequence fragments to be assembled together and an order in which to assemble the DNA sequence fragments, (2) designing DNA oligonucleotides (oligos) for each of the DNA sequence fragments, and (3) creating a plan for adding optimized overhang sequences to each of the DNA oligos.
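    A toy version of step (3), adding flanking homology to neighbouring fragments, is sketched below; the fragment sequences and homology length are made up, and real assembly design tools also check melting temperature, secondary structure, and primer specificity.

```python
def reverse_complement(seq):
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def design_assembly_oligos(fragments, order, homology=20):
    """For each fragment in assembly order, return forward/reverse oligo sequences that
    carry flanking homology to the neighbouring fragments (Gibson-style, toy version)."""
    oligos = []
    for i, name in enumerate(order):
        seq = fragments[name]
        upstream = fragments[order[i - 1]] if i > 0 else ""
        downstream = fragments[order[i + 1]] if i + 1 < len(order) else ""
        fwd = upstream[-homology:] + seq[:homology]          # 5' tail = end of previous fragment
        rev = reverse_complement(seq[-homology:] + downstream[:homology])
        oligos.append((name, fwd, rev))
    return oligos

frags = {"promoter": "ATGCGT" * 10, "gene": "GGATCC" * 12, "terminator": "TTAACG" * 8}
for name, fwd, rev in design_assembly_oligos(frags, ["promoter", "gene", "terminator"]):
    print(name, fwd[:30], rev[:30])
```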

  10. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The Qualitative Simulation Tool (QST), an expert-system-like model building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components are discussed.

  11. Automated Design and Optimization of Pebble-bed Reactor Cores

    SciTech Connect

    Hans D. Gougar; Abderrafi M. Ougouag; William K. Terry

    2010-07-01

    We present a conceptual design approach for high-temperature gas-cooled reactors using recirculating pebble-bed cores. The design approach employs PEBBED, a reactor physics code specifically designed to solve for and analyze the asymptotic burnup state of pebble-bed reactors, in conjunction with a genetic algorithm to obtain a core that maximizes a fitness value that is a function of user-specified parameters. The uniqueness of the asymptotic core state and the small number of independent parameters that define it suggest that core geometry and fuel cycle can be efficiently optimized toward a specified objective. PEBBED exploits a novel representation of the distribution of pebbles that enables efficient coupling of the burnup and neutron diffusion solvers. With this method, even complex pebble recirculation schemes can be expressed in terms of a few parameters that are amenable to modern optimization techniques. With PEBBED, the user chooses the type and range of core physics parameters that represent the design space. A set of traits, each with acceptable and preferred values expressed by a simple fitness function, is used to evaluate the candidate reactor cores. The stochastic search algorithm automatically drives the generation of core parameters toward the optimal core as defined by the user. The optimized design can then be modeled and analyzed in greater detail using higher resolution and more computationally demanding tools to confirm the desired characteristics. For this study, the design of pebble-bed high temperature reactor concepts subjected to demanding physical constraints demonstrated the efficacy of the PEBBED algorithm.
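    The trait-based fitness idea, acceptable ranges with preferred values combined into one score, can be sketched as below; the trait names, ranges, and linear scoring shape are illustrative assumptions rather than the fitness function actually used with PEBBED.

```python
def trait_score(value, acceptable, preferred):
    """Score one core trait: 0 outside the acceptable range, 1 at the preferred value,
    and a linear ramp in between (illustrative shape)."""
    lo, hi = acceptable
    if not (lo <= value <= hi):
        return 0.0
    span = max(preferred - lo, hi - preferred) or 1.0
    return 1.0 - abs(value - preferred) / span

def core_fitness(candidate):
    # Hypothetical traits and limits; a real study would use its own set and weights.
    traits = {
        "k_eff":            (candidate["k_eff"], (0.98, 1.05), 1.00),
        "max_fuel_temp_C":  (candidate["max_fuel_temp_C"], (0.0, 1250.0), 1000.0),
        "discharge_burnup": (candidate["discharge_burnup"], (80.0, 120.0), 100.0),
    }
    return sum(trait_score(v, acc, pref) for v, acc, pref in traits.values()) / len(traits)

print(core_fitness({"k_eff": 1.01, "max_fuel_temp_C": 1100.0, "discharge_burnup": 95.0}))
```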

  12. Genetic design automation: engineering fantasy or scientific renewal?

    PubMed

    Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean

    2012-02-01

    The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype.

  13. Automated Verification of Design Patterns with LePUS3

    NASA Technical Reports Server (NTRS)

    Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick

    2009-01-01

    Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of the first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.

  14. The Potential of Adaptive Design in Animal Studies.

    PubMed

    Majid, Arshad; Bae, Ok-Nam; Redgrave, Jessica; Teare, Dawn; Ali, Ali; Zemke, Daniel

    2015-10-12

    Clinical trials are the backbone of medical research, and are often the last step in the development of new therapies for use in patients. Prior to human testing, however, preclinical studies using animal subjects are usually performed in order to provide initial data on the safety and effectiveness of prospective treatments. These studies can be costly and time consuming, and may also raise concerns about the ethical treatment of animals when potentially harmful procedures are involved. Adaptive design is a process by which the methods used in a study may be altered while it is being conducted in response to preliminary data or other new information. Adaptive design has been shown to be useful in reducing the time and costs associated with clinical trials, and may provide similar benefits in preclinical animal studies. The purpose of this review is to summarize various aspects of adaptive design and evaluate its potential for use in preclinical research.

  15. Automated Tactical Symbology System (TACSYM): System Design Specifications

    DTIC Science & Technology

    1984-03-01

    This system design specification lists tactical symbol categories (personal demand, microphones, CBR, repair parts, target designator, chemical, wire, visual station, combined arms, among others) and describes database support software that controls the insertion and removal of data from the symbology database.

  16. CMOS array design automation techniques. [metal oxide semiconductors

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.

    1975-01-01

    A low-cost, quick-turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested, and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology, and required performance considerations, particularly high circuit speed.

  17. Intelligent Adaptive Systems: Literature Research of Design Guidance for Intelligent Adaptive Automation and Interfaces

    DTIC Science & Technology

    2007-09-01

    This literature survey groups references by level of experimentation, peer review, and target domain (‘basic’, ‘business’, ‘industrial’, and ‘military’), and summarizes the main conceptual content of each article. The design guidance distinguishes several modelling perspectives, including an Organizational Model (organizational or business processes), a Task Model (high-level tasks and goals of agents in the system), and an Agent Model.

  18. The design of digital-adaptive controllers for VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Stengel, R. F.; Broussard, J. R.; Berry, P. W.

    1976-01-01

    Design procedures for VTOL automatic control systems have been developed and are presented. Using linear-optimal estimation and control techniques as a starting point, digital-adaptive control laws have been designed for the VALT Research Aircraft, a tandem-rotor helicopter which is equipped for fully automatic flight in terminal area operations. These control laws are designed to interface with velocity-command and attitude-command guidance logic, which could be used in short-haul VTOL operations. Developments reported here include new algorithms for designing non-zero-set-point digital regulators, design procedures for rate-limited systems, and algorithms for dynamic control trim setting.

  19. An overview of the adaptive designs accelerating promising trials into treatments (ADAPT-IT) project.

    PubMed

    Meurer, William J; Lewis, Roger J; Tagle, Danilo; Fetters, Michael D; Legocki, Laurie; Berry, Scott; Connor, Jason; Durkalski, Valerie; Elm, Jordan; Zhao, Wenle; Frederiksen, Shirley; Silbergleit, Robert; Palesch, Yuko; Berry, Donald A; Barsan, William G

    2012-10-01

    Randomized clinical trials, which aim to determine the efficacy and safety of drugs and medical devices, are a complex enterprise with myriad challenges, stakeholders, and traditions. Although the primary goal is scientific discovery, clinical trials must also fulfill regulatory, clinical, and ethical requirements. Innovations in clinical trials methodology have the potential to improve the quality of knowledge gained from trials, the protection of human subjects, and the efficiency of clinical research. Adaptive clinical trial methods represent a broad category of innovations intended to address a variety of long-standing challenges faced by investigators, such as sensitivity to previous assumptions and delayed identification of ineffective treatments. The implementation of adaptive clinical trial methods, however, requires greater planning and simulation compared with a more traditional design, along with more advanced administrative infrastructure for trial execution. The value of adaptive clinical trial methods in exploratory phase (phase 2) clinical research is generally well accepted, but the potential value and challenges of applying adaptive clinical trial methods in large confirmatory phase clinical trials are relatively unexplored, particularly in the academic setting. In the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) project, a multidisciplinary team is studying how adaptive clinical trial methods could be implemented in planning actual confirmatory phase trials in an established, National Institutes of Health-funded clinical trials network. The overarching objectives of ADAPT-IT are to identify and quantitatively characterize the adaptive clinical trial methods of greatest potential value in confirmatory phase clinical trials and to elicit and understand the enthusiasms and concerns of key stakeholders that influence their willingness to try these innovative strategies.

  20. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  1. Automated Interior Lighting Design Software for Base Civil Engineers.

    DTIC Science & Technology

    1987-09-01

    The thesis tabulates recommended illumination levels by space type (auditoriums 20, cafeterias 25, computer rooms 50, conference rooms 30, corridors 10, drafting rooms 75, among others) and notes that light influences biological rhythms, regulates hormone production, and affects metabolism in specific areas of the brain. Chapter 11 of the thesis provides a detailed explanation of the lighting design process, with worked examples.

  2. A Multi-Agent Design for Power Distribution Systems Automation

    NASA Astrophysics Data System (ADS)

    Ghorbani, M. Jawad

    A new Multi-Agent System (MAS) design for fault location, isolation, and restoration in power distribution systems is presented. In the proposed approach, when a fault occurs in the Power Distribution System (PDS), the MAS quickly isolates the fault and restores service to fault-free zones. A hierarchical coordination strategy is introduced to manage the agents, integrating the advantages of both centralized and decentralized coordination strategies. In this framework, Zone Agents (ZAs) locate and isolate the fault based on locally available information and assist the Feeder Agent (FA) with reconfiguration and restoration. The FA can solve the restoration problem using existing algorithms for the 0-1 Knapsack problem, as sketched below. A novel Q-learning mechanism is also introduced to support the FAs in decision making for restoration. In addition, a distributed MAS-based Load Shedding (LS) technique is used to supply as many of the higher-priority customers as possible when demand exceeds generation. The design is illustrated by simulation case studies of fault location, isolation, and restoration on the West Virginia Super Circuit (WVSC) and by a hardware implementation of fault location and isolation in a laboratory platform. The results from the case studies demonstrate the performance of the proposed MAS designs.
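
    A minimal sketch (not taken from the paper) of how a feeder agent could cast restoration as a 0-1 knapsack problem, assuming each de-energized zone has an integer load and a priority value and the backup feeder has a fixed spare capacity; the zone data below are hypothetical.

        # Hypothetical illustration: restore de-energized zones via 0-1 knapsack.
        # Each zone has (load_kW, priority_value); capacity is spare feeder capacity.

        def restore_zones(zones, capacity):
            """Dynamic-programming 0-1 knapsack over integer kW loads."""
            n = len(zones)
            best = [[0] * (capacity + 1) for _ in range(n + 1)]
            for i, (load, value) in enumerate(zones, start=1):
                for c in range(capacity + 1):
                    best[i][c] = best[i - 1][c]                      # skip zone i
                    if load <= c:                                    # or restore it
                        best[i][c] = max(best[i][c], best[i - 1][c - load] + value)
            # Backtrack to recover the chosen zones.
            chosen, c = [], capacity
            for i in range(n, 0, -1):
                if best[i][c] != best[i - 1][c]:
                    chosen.append(i - 1)
                    c -= zones[i - 1][0]
            return best[n][capacity], sorted(chosen)

        zones = [(120, 5), (80, 9), (200, 7), (60, 4)]   # (load in kW, priority value)
        print(restore_zones(zones, capacity=300))        # -> (18, [0, 1, 3]) on this toy data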

  3. Automated design of image operators that detect interest points.

    PubMed

    Trujillo, Leonardo; Olague, Gustavo

    2008-01-01

    This work describes how evolutionary computation can be used to synthesize low-level image operators that detect interesting points on digital images. Interest point detection is an essential part of many modern computer vision systems that solve tasks such as object recognition, stereo correspondence, and image indexing, to name but a few. The design of the specialized operators is posed as an optimization/search problem that is solved with genetic programming (GP), a strategy still mostly unexplored by the computer vision community. The proposed approach automatically synthesizes operators that are competitive with state-of-the-art designs, taking into account an operator's geometric stability and the global separability of detected points during fitness evaluation. The GP search space is defined using simple primitive operations that are commonly found in point detectors proposed by the vision community. The experiments described in this paper extend previous results (Trujillo and Olague, 2006a,b) by presenting 15 new operators that were synthesized through the GP-based search. Some of the synthesized operators can be regarded as improved manmade designs because they employ well-known image processing techniques and achieve highly competitive performance. On the other hand, since the GP search also generates what can be considered as unconventional operators for point detection, these results provide a new perspective to feature extraction research.
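
    As a rough illustration (not the authors' code), an evolved operator in this setting is simply a composition of low-level image primitives. The sketch below evaluates one hand-written candidate "tree" built from primitives of the kind mentioned above (derivatives, Gaussian smoothing, products, arithmetic), yielding a Harris-like interest response; the image and parameter values are placeholders, and a GP search would mutate and recombine such trees rather than evaluate a single fixed one.

        import numpy as np
        from scipy.ndimage import gaussian_filter, sobel

        def candidate_operator(img, sigma=1.5, k=0.05):
            """One example operator tree built from primitive ops: derivatives,
            Gaussian blur, products, and arithmetic (a Harris-like response)."""
            ix, iy = sobel(img, axis=1), sobel(img, axis=0)      # image derivatives
            sxx = gaussian_filter(ix * ix, sigma)                # smoothed products
            syy = gaussian_filter(iy * iy, sigma)
            sxy = gaussian_filter(ix * iy, sigma)
            return sxx * syy - sxy ** 2 - k * (sxx + syy) ** 2   # interest response

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))                               # placeholder image
        response = candidate_operator(img)
        # A GP search would score each such tree on geometric stability
        # (repeatability) and the separability of the detected points.
        print(response.shape)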

  4. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    NASA Astrophysics Data System (ADS)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

    A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian-type evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated in the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that distinguishes between high- and low-fitness areas in the design space. The learning process can detect the right directions of the evolution and lead to large steps in the evolution of the individuals. The learning phase shortens the evolution process and yields a remarkable reduction in the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process in order to reduce the design space as well as the design time. Circuit evaluation is performed by the HSPICE simulator. In order to improve the design accuracy, the BSIM3v3 CMOS transistor model is adopted in the proposed method. The method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with the evolutionary strategy algorithm and other similar methods.
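
    A minimal sketch, not tied to the authors' implementation, of the LEM idea described above: fit a decision tree that separates high-fitness from low-fitness design points, then sample new candidates and keep only those the tree classifies as promising. A toy two-parameter objective stands in for the HSPICE-evaluated circuit, and the population sizes are arbitrary.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(1)

        def fitness(x):                       # toy stand-in for a simulated circuit metric
            return -np.sum((x - 0.7) ** 2, axis=1)

        pop = rng.random((60, 2))             # initial random designs in [0, 1]^2
        for gen in range(5):
            f = fitness(pop)
            labels = (f >= np.median(f)).astype(int)          # 1 = high-fitness region
            tree = DecisionTreeClassifier(max_depth=3).fit(pop, labels)
            cand = rng.random((600, 2))                       # many cheap candidate designs
            promising = cand[tree.predict(cand) == 1]         # learning phase filters them
            pop = promising[np.argsort(fitness(promising))[-60:]]  # next population
        print(pop.mean(axis=0))               # drifts toward the optimum near (0.7, 0.7)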

  5. Using Adaptive Automation to Increase Operator Performance and Decrease Stress in a Satellite Operations Environment

    ERIC Educational Resources Information Center

    Klein, David C.

    2014-01-01

    As advancements in automation continue to alter the systemic behavior of computer systems in a wide variety of industrial applications, human-machine interactions are increasingly becoming supervisory in nature, with less hands-on human involvement. This maturing of the human role within the human-computer relationship is relegating operations…

  6. DESIGN AND PRELIMINARY VALIDATION OF A RAPID AUTOMATED BIODOSIMETRY TOOL FOR HIGH THROUGHPUT RADIOLOGICAL TRIAGE

    PubMed Central

    Chen, Youhua; Zhang, Jian; Wang, Hongliang; Garty, Guy; Xu, Yanping; Lyulko, Oleksandra V.; Turner, Helen C.; Randers-Pehrson, Gerhard; Simaan, Nabil; Yao, Y. Lawrence; Brenner, D. J.

    2010-01-01

    This paper presents the design, hardware, software, and parameter optimization for a novel robotic automation system. RABiT is a Rapid Automated Biodosimetry Tool for high-throughput radiological triage. The design considerations guiding the hardware and software architecture are presented, with a focus on methods of communication, ease of implementation, and the need for real-time control versus soft-time control cycles. The design and parameter determination for a non-contact PVC capillary laser cutting system is presented. A novel approach for lymphocyte concentration estimation based on computer vision is reported. Experimental evaluations of the system components validate the success of our prototype system in achieving a throughput of 6,000 samples in a period of 18 hours. PMID:21258614

  7. Bayesian response-adaptive designs for basket trials.

    PubMed

    Ventz, Steffen; Barry, William T; Parmigiani, Giovanni; Trippa, Lorenzo

    2017-02-17

    We develop a general class of response-adaptive Bayesian designs using hierarchical models, and provide open source software to implement them. Our work is motivated by recent master protocols in oncology, where several treatments are investigated simultaneously in one or multiple disease types, and treatment efficacy is expected to vary across biomarker-defined subpopulations. Adaptive trials such as I-SPY-2 (Barker et al., 2009) and BATTLE (Zhou et al., 2008) are special cases within our framework. We discuss the application of our adaptive scheme to two distinct research goals. The first is to identify a biomarker subpopulation for which a therapy shows evidence of treatment efficacy, and to exclude other subpopulations for which such evidence does not exist. This leads to a subpopulation-finding design. The second is to identify, within biomarker-defined subpopulations, a set of cancer types for which an experimental therapy is superior to the standard-of-care. This goal leads to a subpopulation-stratified design. Using simulations constructed to faithfully represent ongoing cancer sequencing projects, we quantify the potential gains of our proposed designs relative to conventional non-adaptive designs.
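
    A schematic example (not the authors' software) of the response-adaptive idea: with binary responses modeled as beta-binomial, the allocation probability within a subpopulation is updated from the posterior probability that the experimental arm is best, estimated here by simple posterior sampling. The arm response rates are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        true_rates = [0.25, 0.45]                 # hypothetical control vs experimental arm
        succ, fail = np.ones(2), np.ones(2)       # Beta(1, 1) priors for each arm

        alloc_history = []
        for patient in range(200):
            draws = rng.beta(succ, fail, size=(1000, 2))       # posterior samples
            p_best = (draws.argmax(axis=1) == 1).mean()        # P(arm 1 is best | data)
            p_alloc = np.clip(p_best, 0.1, 0.9)                # adaptive allocation, capped
            arm = rng.random() < p_alloc
            outcome = rng.random() < true_rates[int(arm)]
            succ[int(arm)] += outcome
            fail[int(arm)] += 1 - outcome
            alloc_history.append(p_alloc)

        # Allocation drifts toward the better-performing arm as data accrue.
        print(round(alloc_history[-1], 2), succ / (succ + fail))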

  8. Design principles and algorithms for automated air traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1995-01-01

    This paper presents design principles and algorithms for building a real-time scheduler. The primary objective of the scheduler is to assign arriving aircraft to a favorable landing runway and schedule them to land at times that minimize delays. A further objective of the scheduler is to allocate delays between high-altitude airspace far from the airport and low-altitude airspace near the airport. A method of delay allocation is described that minimizes the average operating cost in the presence of errors in controlling aircraft to a specified landing time.
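
    As a hypothetical sketch only (this is not Erzberger's algorithm), the scheduling problem can be illustrated by greedily assigning each arrival, in order of estimated time of arrival, to whichever runway gives the earliest feasible landing time subject to a minimum separation; the ETAs and separation value are placeholders.

        def schedule_arrivals(etas, n_runways=2, separation=90.0):
            """Greedy sketch: assign each aircraft (ETA in seconds) to the runway
            giving the earliest feasible slot; return (runway, landing_time, delay)."""
            next_free = [0.0] * n_runways
            plan = []
            for eta in sorted(etas):
                runway = min(range(n_runways), key=lambda r: max(eta, next_free[r]))
                landing = max(eta, next_free[runway])
                next_free[runway] = landing + separation
                plan.append((runway, landing, landing - eta))
            return plan

        etas = [0.0, 20.0, 35.0, 100.0, 110.0]        # hypothetical ETAs (s)
        for runway, t, delay in schedule_arrivals(etas):
            print(f"runway {runway}: land at {t:5.1f} s, delay {delay:5.1f} s")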

  9. Designing Adaptable Ships: Modularity and Flexibility in Future Ship Designs

    DTIC Science & Technology

    2016-01-01

    ...integrated into the new design while reducing the construction cost of the ship. Recommendations: We offer both short-term, ship-specific recommendations and

  10. Automated design of two-zero rational Chebychev filters

    NASA Astrophysics Data System (ADS)

    Le, K. H.

    1981-10-01

    The Rational Chebychev function is used to design elliptic-characteristic filters. The filter has N/2 ripples in the passband but only one ripple in the stopband for all orders. As N increases from three, the result is a substantial saving in the number of capacitors in the passive ladder realization of this function compared with traditional elliptic filters of the same order N. The ladder's element values can be expressed as explicit expressions involving only the coefficients of the transfer function. These expressions can also be used for other types of filters. Numerically, the design can be carried out by a Fortran program or a set of programs on a programmable calculator. The user needs to supply only three specifications: the filter order N, the stopband zeros Z, and the passband ripple amount Rp. The program automatically selects the starting point for the given case and proceeds. The numerical results of these programs over a range of specifications relate the specifications to the minimum stopband attenuation.

  11. Automated recycling of chemistry for virtual screening and library design.

    PubMed

    Vainio, Mikko J; Kogej, Thierry; Raubacher, Florian

    2012-07-23

    An early stage drug discovery project needs to identify a number of chemically diverse and attractive compounds. These hit compounds are typically found through high-throughput screening campaigns. The diversity of the chemical libraries used in screening is therefore important. In this study, we describe a virtual high-throughput screening system called Virtual Library. The system automatically "recycles" validated synthetic protocols and available starting materials to generate a large number of virtual compound libraries, and allows for fast searches in the generated libraries using a 2D fingerprint based screening method. Virtual Library links the returned virtual hit compounds back to experimental protocols to quickly assess the synthetic accessibility of the hits. The system can be used as an idea generator for library design to enrich the screening collection and to explore the structure-activity landscape around a specific active compound.
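
    A small sketch of the 2D-fingerprint screening step described above, assuming RDKit is available; the SMILES strings and similarity cutoff are placeholders, and the linking of hits back to experimental synthetic protocols performed by Virtual Library is not shown.

        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        def screen(query_smiles, library_smiles, cutoff=0.4):
            """Rank virtual-library members by Tanimoto similarity to a query."""
            query_fp = AllChem.GetMorganFingerprintAsBitVect(
                Chem.MolFromSmiles(query_smiles), 2, nBits=2048)
            hits = []
            for smi in library_smiles:
                mol = Chem.MolFromSmiles(smi)
                if mol is None:
                    continue                                   # skip unparsable entries
                fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
                sim = DataStructs.TanimotoSimilarity(query_fp, fp)
                if sim >= cutoff:
                    hits.append((sim, smi))
            return sorted(hits, reverse=True)

        library = ["c1ccccc1O", "CCOC(=O)c1ccccc1", "CCN"]      # placeholder virtual library
        print(screen("c1ccccc1C(=O)O", library))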

  12. Experimental Design to Evaluate Directed Adaptive Mutation in Mammalian Cells

    PubMed Central

    Chiaro, Christopher R; May, Tobias

    2014-01-01

    Background We describe the experimental design for a methodological approach to determine whether directed adaptive mutation occurs in mammalian cells. Identification of directed adaptive mutation would have profound practical significance for a wide variety of biomedical problems, including disease development and resistance to treatment. In adaptive mutation, the genetic or epigenetic change is not random; instead, the presence and type of selection influences the frequency and character of the mutation event. Adaptive mutation can contribute to the evolution of microbial pathogenesis, cancer, and drug resistance, and may become a focus of novel therapeutic interventions. Objective Our experimental approach was designed to distinguish between 3 types of mutation: (1) random mutations that are independent of selective pressure, (2) undirected adaptive mutations that arise when selective pressure induces a general increase in the mutation rate, and (3) directed adaptive mutations that arise when selective pressure induces targeted mutations that specifically influence the adaptive response. The purpose of this report is to introduce an experimental design and describe limited pilot experiment data (not to describe a complete set of experiments); hence, it is an early report. Methods An experimental design based on immortalization of mouse embryonic fibroblast cells is presented that links clonal cell growth to reversal of an inactivating polyadenylation site mutation. Thus, cells exhibit growth only in the presence of both the countermutation and an inducing agent (doxycycline). The type and frequency of mutation in the presence or absence of doxycycline will be evaluated. Additional experimental approaches would determine whether the cells exhibit a generalized increase in mutation rate and/or whether the cells show altered expression of error-prone DNA polymerases or of mismatch repair proteins. Results We performed the initial stages of characterizing our system

  13. Adaptive multibeam phased array design for a Spacelab experiment

    NASA Technical Reports Server (NTRS)

    Noji, T. T.; Fass, S.; Fuoco, A. M.; Wang, C. D.

    1977-01-01

    The parametric tradeoff analyses and design for an Adaptive Multibeam Phased Array (AMPA) for a Spacelab experiment are described. This AMPA Experiment System was designed with particular emphasis to maximize channel capacity and minimize implementation and cost impacts for future austere maritime and aeronautical users, operating with a low gain hemispherical coverage antenna element, low effective radiated power, and low antenna gain-to-system noise temperature ratio.

  14. Adaptive Control Law Design for Model Uncertainty Compensation

    DTIC Science & Technology

    1989-06-14

    AD-A211 712, WRDC-TR-89-3061. Adaptive Control Law Design for Model Uncertainty Compensation, J. E. Sorrells, Dynetics, Inc., 1000 Explorer Blvd.; monitoring organization: Wright Research and Development Center, Flight Dynamics Laboratory, AFSC. ...controllers designed using Dynetics' innovative approach were able to equal or surpass the STR and MRAC controllers in terms of performance robustness

  15. Examining Teacher Thinking: Constructing a Process to Design Curricular Adaptations.

    ERIC Educational Resources Information Center

    Udvari-Solner, Alice

    1996-01-01

    This description of a curricular adaptation decision-making process focuses on tenets of reflective practice as teachers design instruction for students in heterogeneous classrooms. A case example illustrates how an elementary teaching team transformed lessons to accommodate a wide range of learners in a multiage first- and second-grade classroom.…

  16. Instructional Design and Adaptation Issues in Distance Learning Via Satellite.

    ERIC Educational Resources Information Center

    Thach, Liz

    1995-01-01

    Discusses a qualitative research study conducted in a distance-learning environment using satellite delivery. Describes the instructional design changes and adaptation strategies that faculty and professionals involved in satellite-delivered learning used to be successful. (Author/AEF)

  17. Adaptive Liver Stereotactic Body Radiation Therapy: Automated Daily Plan Reoptimization Prevents Dose Delivery Degradation Caused by Anatomy Deformations

    SciTech Connect

    Leinders, Suzanne M.; Breedveld, Sebastiaan; Méndez Romero, Alejandra; Schaart, Dennis; Seppenwoolde, Yvette; Heijmen, Ben J.M.

    2013-12-01

    Purpose: To investigate how dose distributions for liver stereotactic body radiation therapy (SBRT) can be improved by using automated, daily plan reoptimization to account for anatomy deformations, compared with setup corrections only. Methods and Materials: For 12 tumors, 3 strategies for dose delivery were simulated. In the first strategy, computed tomography scans made before each treatment fraction were used only for patient repositioning before dose delivery, to correct detected tumor setup errors. In the adaptive second and third strategies, in addition to the isocenter shift, intensity modulated radiation therapy beam profiles were reoptimized, or both intensity profiles and beam orientations were reoptimized, respectively. All optimizations were performed with a recently published algorithm for automated, multicriteria optimization of both beam profiles and beam angles. Results: In 6 of 12 cases, violations of organ-at-risk (i.e., heart, stomach, kidney) constraints of 1 to 6 Gy in single fractions occurred with tumor repositioning only. By using the adaptive strategies, these could be avoided (<1 Gy). For 1 case, this required slightly underdosing the planning target volume. For 2 cases with tumor dose restricted in the planning phase to avoid organ-at-risk constraint violations, fraction doses could be increased by 1 and 2 Gy because of more favorable anatomy. Daily reoptimization of both beam profiles and beam angles (third strategy) performed slightly better than reoptimization of profiles only, but the latter required only a few minutes of computation time, whereas full reoptimization took several hours. Conclusions: This simulation study demonstrated that replanning based on daily acquired computed tomography scans can improve liver SBRT dose delivery.

  18. Automated design optimization of supersonic airplane wing structures under dynamic constraints.

    NASA Technical Reports Server (NTRS)

    Fox, R. L.; Miura, H.; Rao, S. S.

    1972-01-01

    The problems of the preliminary and first-level detail design of supersonic aircraft wings are stated as mathematical programs and solved using automated optimum design techniques. The problem is approached in two phases: the first uses a simplified equivalent plate model in which the envelope, planform, and structural parameters are varied to produce a design; the second uses a finite element model with fixed configuration in which the material distribution is varied. Constraints include flutter, aeroelastically computed stresses and deflections, natural frequency, and a variety of geometric limitations. The Phase I objective is a combination of weight and drag, while Phase II is a weight minimization.

  19. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  20. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; ...

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  1. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  2. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)

    2001-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, i.e., with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  3. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    SciTech Connect

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; Bi, Changhao; Elsbree, Nick; Jiao, Hong; Kim, Jungkyu; Mathies, Richard; Keasling, Jay D.; Hillson, Nathan J.

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  4. Individually designed PALs vs. power optimized PALs adaptation comparison.

    PubMed

    Muždalo, Nataša Vujko; Mihelčič, Matjaž

    2015-03-01

    Practice shows that in everyday life there is an ever-growing demand for better visual acuity at all viewing distances. The presbyopic population needs correction for far, near, and intermediate distances with different dioptric powers, and PAL lenses seem to be a comfortable solution. The object of the present study is the analysis of the factors determining adaptation to progressive addition lenses (PAL) by first-time users. Only novice test persons were chosen in order to avoid the bias of a previously worn particular lens design. For optimal results with this type of lens, several individual parameters must be considered: correct refraction, precise ocular and facial measurements, and proper mounting of the lenses into the frame. Nevertheless, first-time wearers encounter various difficulties in the process of adapting to this type of glasses, and adaptation time differs greatly between individual users. The question that arises is how much the individual parameters really affect the ease of adaptation and comfort when wearing progressive glasses. To clarify this, in the present study, individually designed PAL lenses--Rodenstock's Impression FreeSign (with inclusion of all parameters related to the user's eye and spectacle frame: prescription, pupillary distance, fitting height, back vertex distance, pantoscopic angle, and curvature of the frame)--were compared with power-optimized PALs--Rodenstock's Multigressiv MyView (respecting only prescription power and pupillary distance). The adaptation process was monitored over a period of four weeks. The collected results represent scores of users' subjective impressions, where the users themselves rated their adaptation to the new progressive glasses and the degree of subjective visual impression. The results show that adaptation to fully individually fitted PALs is easier and quicker. The information obtained from users is valuable in everyday optometry practice because along with the manufacturer's specifications, the user's experience can

  5. Frequency Adaptability and Waveform Design for OFDM Radar Space-Time Adaptive Processing

    SciTech Connect

    Sen, Satyabrata; Glover, Charles Wayne

    2012-01-01

    We propose an adaptive waveform design technique for an orthogonal frequency division multiplexing (OFDM) radar signal employing a space-time adaptive processing (STAP) technique. We observe that there are inherent variabilities of the target and interference responses in the frequency domain. Therefore, the use of an OFDM signal can not only increase the frequency diversity of our system, but also improve the target detectability by adaptively modifying the OFDM coefficients in order to exploit the frequency-variabilities of the scenario. First, we formulate a realistic OFDM-STAP measurement model considering the sparse nature of the target and interference spectra in the spatio-temporal domain. Then, we show that the optimal STAP-filter weight-vector is equal to the generalized eigenvector corresponding to the minimum generalized eigenvalue of the interference and target covariance matrices. With numerical examples we demonstrate that the resultant OFDM-STAP filter-weights are adaptable to the frequency-variabilities of the target and interference responses, in addition to the spatio-temporal variabilities. Hence, by better utilizing the frequency variabilities, we propose an adaptive OFDM-waveform design technique, and consequently gain a significant amount of STAP-performance improvement.
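
    A compact numerical sketch of the relationship stated above, using made-up covariance matrices: solving the generalized eigenproblem R_I w = lambda R_T w and taking the eigenvector of the smallest eigenvalue gives the STAP weight vector. The matrix dimension and random covariances are placeholders, not values from the paper.

        import numpy as np
        from scipy.linalg import eigh

        rng = np.random.default_rng(3)
        n = 8                                              # spatio-temporal DoF (placeholder)

        def random_covariance(n):
            a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
            return a @ a.conj().T + n * np.eye(n)          # Hermitian positive definite

        R_I = random_covariance(n)                         # interference covariance (made up)
        R_T = random_covariance(n)                         # target covariance (made up)

        # Generalized eigenproblem R_I w = lam * R_T w; eigh returns ascending eigenvalues,
        # so the first eigenvector is the weight vector for the minimum eigenvalue.
        eigvals, eigvecs = eigh(R_I, R_T)
        w = eigvecs[:, 0]
        print(eigvals[0], np.linalg.norm(R_I @ w - eigvals[0] * (R_T @ w)))   # ~0 residual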

  6. Design of suboptimal adaptive filter for stochastic systems

    NASA Astrophysics Data System (ADS)

    Ahn, Jun Il; Shin, Vladimir

    2005-12-01

    In this paper, the problem of estimating the system state for linear discrete-time systems with uncertainties is considered. In [1], [2], we proposed the fusion formula (FF) for an arbitrary number of correlated and uncorrelated estimates. The FF is applied to the detection and filtering problem. A new suboptimal adaptive filter with a parallel structure is proposed herein. As a consequence of the parallel structure of the proposed filter, parallel computers can be used for its design. A lower computational complexity and lower memory demand are achieved with the proposed filter than with the optimal adaptive Lainiotis-Kalman filter. An example demonstrates the accuracy of the new filter.
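
    As a hedged illustration of the fusion-formula idea (the FF of [1], [2] also handles correlated estimates, which is omitted here), two uncorrelated local estimates can be combined by weighting each with the inverse of its error covariance; the numerical values are hypothetical.

        import numpy as np

        def fuse_uncorrelated(estimates, covariances):
            """Fuse local estimates x_i with error covariances P_i (assumed uncorrelated):
            P = (sum P_i^-1)^-1,  x = P * sum(P_i^-1 x_i)."""
            info = sum(np.linalg.inv(P) for P in covariances)
            P = np.linalg.inv(info)
            x = P @ sum(np.linalg.inv(Pi) @ xi for xi, Pi in zip(estimates, covariances))
            return x, P

        x1, P1 = np.array([1.0, 0.0]), np.diag([0.5, 2.0])     # local estimate 1 (hypothetical)
        x2, P2 = np.array([1.4, 0.3]), np.diag([1.0, 0.4])     # local estimate 2 (hypothetical)
        x, P = fuse_uncorrelated([x1, x2], [P1, P2])
        print(x, np.diag(P))    # fused estimate has smaller variance than either input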

  7. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs: Part II

    NASA Astrophysics Data System (ADS)

    Straub, J. A.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Gladhill, R.; Nolke, S.; Riddick, J.

    2006-10-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. In the previous paper, it was shown how photomask MRC is used to uncover data related problems prior to automated defect inspection. It was demonstrated how jobs which are likely to have problems at inspection could be identified and separated from those which are not. The use of photomask MRC in production was shown to reduce time lost to aborted runs and troubleshooting due to data issues. In this paper, the effectiveness of this photomask MRC program in a high volume photomask factory over the course of a year as applied to more than ten thousand jobs will be shown. Statistics on the results of the MRC runs will be presented along with the associated impact to the automated defect inspection process. Common design problems will be shown as well as their impact to mask manufacturing throughput and productivity. Finally, solutions to the most common and most severe problems will be offered and discussed.

  8. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    NASA Astrophysics Data System (ADS)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature, and these routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software responsible for aiding different engineering activities. In this paper, original software developed in order to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears; moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of the model generated in such a way is its better representation of the involute curve in comparison to those drawn in standard CAD system tools. This comes from the fact that usually in CAD systems an involute curve is drawn through 3 points that correspond to points located on the addendum circle, the reference diameter of the gear, and the base circle, respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research, an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are shown in detail.
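
    A short sketch of the geometric idea behind the 11-point involute: points on an involute of a base circle of radius r_b are given by x = r_b(cos t + t sin t), y = r_b(sin t - t cos t), and sampling the roll angle t between the base and addendum circles gives the more accurate profile described above. The gear parameters and 20-degree pressure angle below are arbitrary examples, not values from the paper.

        import numpy as np

        def involute_points(module, teeth, n_points=11):
            """Sample an involute tooth flank between the base and addendum circles."""
            r_pitch = module * teeth / 2.0
            r_base = r_pitch * np.cos(np.radians(20.0))          # 20 deg pressure angle
            r_add = r_pitch + module                             # addendum circle radius
            t_max = np.sqrt((r_add / r_base) ** 2 - 1.0)         # roll angle at addendum
            t = np.linspace(0.0, t_max, n_points)
            x = r_base * (np.cos(t) + t * np.sin(t))
            y = r_base * (np.sin(t) - t * np.cos(t))
            return np.column_stack([x, y])

        print(involute_points(module=2.0, teeth=20))             # 11 (x, y) profile points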

  9. Design of Adaptive Policy Pathways under Deep Uncertainties

    NASA Astrophysics Data System (ADS)

    Babovic, Vladan

    2013-04-01

    The design of large-scale engineering and infrastructural systems today is growing in complexity. Designers need to consider sociotechnical uncertainties, intricacies, and processes in the long-term strategic deployment and operation of these systems. In this context, water and spatial management is increasingly challenged not only by climate-associated changes such as sea level rise and increased spatio-temporal variability of precipitation, but also by pressures due to population growth and a particularly accelerating rate of urbanisation. Furthermore, high investment costs and the long-term nature of water-related infrastructure projects require a long-term planning perspective, sometimes extending over many decades. Adaptation to such changes is not only determined by what is known or anticipated at present, but also by what will be experienced and learned as the future unfolds, as well as by policy responses to social and water events. As a result, a pathway emerges. Instead of responding to 'surprises' and making decisions on an ad hoc basis, exploring adaptation pathways into the future provides indispensable support in water management decision-making. In this contribution, a structured approach for designing a dynamic adaptive policy based on the concepts of adaptive policy making and adaptation pathways is introduced. Such an approach provides flexibility which allows change over time in response to how the future unfolds, what is learned about the system, and changes in societal preferences. The introduced flexibility provides means for dealing with the complexities of adaptation under deep uncertainties. It enables engineering systems to change in the face of uncertainty to reduce impacts from downside scenarios while capitalizing on upside opportunities. This contribution presents a comprehensive framework for development and deployment of the adaptive policy pathway framework, and demonstrates its performance under deep uncertainties on a case study related to urban

  10. Adaptation of an automated assay for determination of beta-hydroxybutyrate in dogs using a random access analyzer.

    PubMed

    Christopher, Mary M.; Pereira, Jacqueline L.; Brigmon, Robin L.

    1992-01-01

    An automated method for measuring beta-hydroxybutyrate was adapted to the Ciba-Corning 550 Express random access analyzer. The assay was based on a kinetic reaction utilizing hydroxybutyrate dehydrogenase. Beta-hydroxybutyrate concentration (mmol/L) was calculated ratiometrically using a 1.0 mmol/L standard. Canine serum, plasma, and urine were used without prior deproteinization, and only a 30-microliter sample was required. The method demonstrated good linearity between 0 and 2 mmol/L of beta-hydroxybutyrate. Analytical recovery (accuracy) within these concentrations ranged from 85.8 to 113.3%. Both within-run and day-to-day precision were determined, as was specificity of the assay in the presence of a variety of interfering substances. The automated assay was rapid and economical, with reagent stability maintained for at least 2 weeks at 4 degrees C. This assay can readily be applied toward the assessment of ketoacidosis in dogs and, with further validation, other species.

  11. Design automation techniques for high-resolution current folding and interpolating CMOS A/D converters

    NASA Astrophysics Data System (ADS)

    Gevaert, D.

    2007-05-01

    The design and testing of a 12-bit analog-to-digital (A/D) converter in current mode, arranged in an 8-bit LSB and a 4-bit MSB architecture, together with the integration of specialized test building blocks on chip, allows the setup of a design automation technique for current folding and interpolation CMOS A/D converter architectures. The presented design methodology focuses on automation of the CMOS A/D building blocks in a flexible target current folding and interpolating architecture, for a downscaled technology and for different quality specifications. A comprehensive understanding of all sources of mismatch in the crucial building blocks, the use of physics-based mismatch modeling to predict mismatch errors, and more adequate and realistic sizing of all transistors result in an overall area reduction of the A/D converter. In this design the folding degree is 16, the number of folders is 64, and the interpolation level is 4. The number of folders is reduced by creating intermediate folding signals with a 4-level interpolator based on current division techniques. Current comparators detect the zero crossings between the differential folder output currents, and the outputs of the comparators deliver a cyclic thermometer code. The digital synthesis of the decoding and error correction building blocks is a standardized digital standard-cell design. The basic building blocks in the target architecture were designed in 0.35 μm CMOS technology; they are suitable for topological reuse and are downscaled in an automated way into a 0.18 μm CMOS technology.

  12. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also promote the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  13. The design of service-adaptive engine for robot middleware

    NASA Astrophysics Data System (ADS)

    Baek, BumHyeon; Choi, YongSoon; Park, Hong Seong

    2007-12-01

    In this paper, we propose a design of a Service-Adaptive Engine for robot middleware. This middleware, called KOMoR (Korea Object-oriented Middleware of Robot), is a middleware for robots composed of three layers (Service Layer, Network Adaptation Layer, Network Interface Layer). In particular, the Service-Adaptive Engine in the Service Layer is responsible for communication between distributed applications and provides a set of features that support the development of realistic distributed applications for a robot. It also avoids unnecessary complexity, making the middleware easy to learn and to use. For writing applications, both client and server consist of a mixture of application code, library code, and code generated from IDL definitions called MIDL (Module Interface Definition Language). The Service-Adaptive Engine in the SL contains the client- and server-side run-time support for remote communication. The generic part of the Service-Adaptive Engine (that is, the part that is independent of the specific types you have defined in MIDL) is accessed through the SL API. The proxy code is generated from MIDL definitions and is therefore specific to the types of objects and data you have defined in MIDL.

  14. Adaptive Designs for Randomized Trials in Public Health

    PubMed Central

    Brown, C. Hendricks; Ten Have, Thomas R.; Jo, Booil; Dagne, Getachew; Wyman, Peter A.; Muthén, Bengt; Gibbons, Robert D.

    2009-01-01

    In this article, we present a discussion of two general ways in which the traditional randomized trial can be modified or adapted in response to the data being collected. We use the term adaptive design to refer to a trial in which characteristics of the study itself, such as the proportion assigned to active intervention versus control, change during the trial in response to data being collected. The term adaptive sequence of trials refers to a decision-making process that fundamentally informs the conceptualization and conduct of each new trial with the results of previous trials. Our discussion below investigates the utility of these two types of adaptations for public health evaluations. Examples are provided to illustrate how adaptation can be used in practice. From these case studies, we discuss whether such evaluations can or should be analyzed as if they were formal randomized trials, and we discuss practical as well as ethical issues arising in the conduct of these new-generation trials. PMID:19296774

  15. Integrated System Design: Promoting the Capacity of Sociotechnical Systems for Adaptation through Extensions of Cognitive Work Analysis.

    PubMed

    Naikar, Neelam; Elix, Ben

    2016-01-01

    This paper proposes an approach for integrated system design, which has the intent of facilitating high levels of effectiveness in sociotechnical systems by promoting their capacity for adaptation. Building on earlier ideas and empirical observations, this approach recognizes that to create adaptive systems it is necessary to integrate the design of all of the system elements, including the interfaces, teams, training, and automation, such that workers are supported in adapting their behavior as well as their structure, or organization, in a coherent manner. Current approaches for work analysis and design are limited in regard to this fundamental objective, especially in cases when workers are confronted with unforeseen events. A suitable starting point is offered by cognitive work analysis (CWA), but while this framework can support actors in adapting their behavior, it does not necessarily accommodate adaptations in their structure. Moreover, associated design approaches generally focus on individual system elements, and those that consider multiple elements appear limited in their ability to facilitate integration, especially in the manner intended here. The proposed approach puts forward the set of possibilities for work organization in a system as the central mechanism for binding the design of its various elements, so that actors can adapt their structure as well as their behavior-in a unified fashion-to handle both familiar and novel conditions. Accordingly, this paper demonstrates how the set of possibilities for work organization in a system may be demarcated independently of the situation, through extensions of CWA, and how it may be utilized in design. This lynchpin, conceptualized in the form of a diagram of work organization possibilities (WOP), is important for preserving a system's inherent capacity for adaptation. Future research should focus on validating these concepts and establishing the feasibility of implementing them in industrial contexts.

  16. Integrated System Design: Promoting the Capacity of Sociotechnical Systems for Adaptation through Extensions of Cognitive Work Analysis

    PubMed Central

    Naikar, Neelam; Elix, Ben

    2016-01-01

    This paper proposes an approach for integrated system design, which has the intent of facilitating high levels of effectiveness in sociotechnical systems by promoting their capacity for adaptation. Building on earlier ideas and empirical observations, this approach recognizes that to create adaptive systems it is necessary to integrate the design of all of the system elements, including the interfaces, teams, training, and automation, such that workers are supported in adapting their behavior as well as their structure, or organization, in a coherent manner. Current approaches for work analysis and design are limited in regard to this fundamental objective, especially in cases when workers are confronted with unforeseen events. A suitable starting point is offered by cognitive work analysis (CWA), but while this framework can support actors in adapting their behavior, it does not necessarily accommodate adaptations in their structure. Moreover, associated design approaches generally focus on individual system elements, and those that consider multiple elements appear limited in their ability to facilitate integration, especially in the manner intended here. The proposed approach puts forward the set of possibilities for work organization in a system as the central mechanism for binding the design of its various elements, so that actors can adapt their structure as well as their behavior—in a unified fashion—to handle both familiar and novel conditions. Accordingly, this paper demonstrates how the set of possibilities for work organization in a system may be demarcated independently of the situation, through extensions of CWA, and how it may be utilized in design. This lynchpin, conceptualized in the form of a diagram of work organization possibilities (WOP), is important for preserving a system's inherent capacity for adaptation. Future research should focus on validating these concepts and establishing the feasibility of implementing them in industrial

  17. An Adaptive Staggered Dose Design for a Normal Endpoint.

    PubMed

    Wu, Joseph; Menon, Sandeep; Chang, Mark

    2015-01-01

    In a clinical trial where several doses are compared to a control, a multi-stage design that combines both the selection of the best dose and the confirmation of this selected dose is desirable. An example is the two-stage drop-the-losers or pick-the-winner design, where inferior doses are dropped after an interim analysis. Selection of target dose(s) can be based on ranking of observed effects, hypothesis testing with adjustment for multiplicity, or other criteria at interim stages. A number of methods have been proposed and have made significant gains in trial efficiency. However, many of these designs start off with equal allocation to all doses and do not consider prioritizing the doses using existing dose-response information. We propose an adaptive staggered dose procedure that allows explicit prioritization of doses and applies an error-spending scheme that favors doses with assumed better responses. This design starts off with only a subset of the doses and adaptively adds new doses depending on interim results. Using simulation, we show that this design performs better in terms of statistical power than the drop-the-losers design when strong prior information on the dose response is available.

  18. A System Approach to Adaptive Multi-Modal Sensor Designs

    DTIC Science & Technology

    2010-02-01

    [Report documentation excerpt: "A System Approach to Adaptive Multi-Modal Sensor Designs," grant FA9550-08-1-0199; program managers Dr. Douglas Cochran and Dr. Kitt C. Reinhardt (AFOSR); performing organization: Department of Computer Science, School of Engineering, Convent Ave & 138th St, New York, NY 10031. Approved for public release.]

  19. Optimal Bayesian Adaptive Design for Test-Item Calibration.

    PubMed

    van der Linden, Wim J; Ren, Hao

    2015-06-01

    An optimal adaptive design for test-item calibration based on Bayesian optimality criteria is presented. The design adapts the choice of field-test items to the examinees taking an operational adaptive test using both the information in the posterior distributions of their ability parameters and the current posterior distributions of the field-test parameters. Different criteria of optimality based on the two types of posterior distributions are possible. The design can be implemented using an MCMC scheme with alternating stages of sampling from the posterior distributions of the test takers' ability parameters and the parameters of the field-test items while reusing samples from earlier posterior distributions of the other parameters. Results from a simulation study demonstrated the feasibility of the proposed MCMC implementation for operational item calibration. A comparison of performances for different optimality criteria showed faster calibration of substantial numbers of items for the criterion of D-optimality relative to A-optimality, a special case of c-optimality, and random assignment of items to the test takers.
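
    A simplified numeric sketch of the D-optimality criterion mentioned above for a 2PL field-test item: with posterior draws of an examinee's ability and of each item's parameters, the item whose expected Fisher information matrix has the largest determinant is assigned. The posterior draws below are simulated placeholders rather than MCMC output, and the accumulation of information across previously assigned examinees is omitted.

        import numpy as np

        rng = np.random.default_rng(4)

        def item_information(theta, a, b):
            """Fisher information matrix for the (a, b) parameters of a 2PL item."""
            p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
            w = p * (1.0 - p)
            return w * np.array([[(theta - b) ** 2, -a * (theta - b)],
                                 [-a * (theta - b), a ** 2]])

        def d_optimal_item(theta_draws, item_param_draws):
            """Pick the field-test item maximizing E[det(information)] over posterior draws."""
            scores = []
            for a_draws, b_draws in item_param_draws:
                infos = [item_information(th, a, b)
                         for th, a, b in zip(theta_draws, a_draws, b_draws)]
                scores.append(np.linalg.det(np.mean(infos, axis=0)))
            return int(np.argmax(scores)), scores

        theta_draws = rng.normal(0.5, 0.3, size=200)                       # examinee ability posterior
        items = [(rng.normal(1.0, 0.2, 200), rng.normal(0.0, 0.5, 200)),   # item 0: (a, b) draws
                 (rng.normal(1.2, 0.2, 200), rng.normal(1.5, 0.5, 200))]   # item 1: (a, b) draws
        print(d_optimal_item(theta_draws, items))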

  20. Launch vehicle payload adapter design with vibration isolation features

    NASA Astrophysics Data System (ADS)

    Thomas, Gareth R.; Fadick, Cynthia M.; Fram, Bryan J.

    2005-05-01

    Payloads, such as satellites or spacecraft, which are mounted on launch vehicles, are subject to severe vibrations during flight. These vibrations are induced by multiple sources that occur between liftoff and the instant of final separation from the launch vehicle. A direct result of the severe vibrations is that fatigue damage and failure can be incurred by sensitive payload components. For this reason a payload adapter has been designed with special emphasis on its vibration isolation characteristics. The design consists of an annular plate that has top and bottom face sheets separated by radial ribs and close-out rings. These components are manufactured from graphite epoxy composites to ensure a high stiffness to weight ratio. The design is tuned to keep the frequency of the axial mode of vibration of the payload on the flexibility of the adapter to a low value. This is the main strategy adopted for isolating the payload from damaging vibrations in the intermediate to higher frequency range (45Hz-200Hz). A design challenge for this type of adapter is to keep the pitch frequency of the payload above a critical value in order to avoid dynamic interactions with the launch vehicle control system. This high frequency requirement conflicts with the low axial mode frequency requirement and this problem is overcome by innovative tuning of the directional stiffnesses of the composite parts. A second design strategy that is utilized to achieve good isolation characteristics is the use of constrained layer damping. This feature is particularly effective at keeping the responses to a minimum for one of the most important dynamic loading mechanisms. This mechanism consists of the almost-tonal vibratory load associated with the resonant burn condition present in any stage powered by a solid rocket motor. The frequency of such a load typically falls in the 45-75Hz range and this phenomenon drives the low frequency design of the adapter. Detailed finite element analysis is

  1. Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration

    NASA Technical Reports Server (NTRS)

    Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin

    2011-01-01

    The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and

  2. Effects of an Advanced Reactor’s Design, Use of Automation, and Mission on Human Operators

    SciTech Connect

    Jeffrey C. Joe; Johanna H. Oxstrand

    2014-06-01

    The roles, functions, and tasks of the human operator in existing light water nuclear power plants (NPPs) are based on sound nuclear and human factors engineering (HFE) principles, are well defined by the plant’s conduct of operations, and have been validated by years of operating experience. However, advanced NPPs whose engineering designs differ from existing light-water reactors (LWRs) will impose changes on the roles, functions, and tasks of the human operators. The plans to increase the use of automation, reduce staffing levels, and add to the mission of these advanced NPPs will also affect the operator’s roles, functions, and tasks. We assert that these factors, which do not appear to have received a lot of attention by the design engineers of advanced NPPs relative to the attention given to conceptual design of these reactors, can have significant risk implications for the operators and overall plant safety if not mitigated appropriately. This paper presents a high-level analysis of a specific advanced NPP and how its engineered design, its plan to use greater levels of automation, and its expanded mission have risk significant implications on operator performance and overall plant safety.

  3. Design of an adaptive controller for a telerobot manipulator

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.; Zhou, Zhen-Lei

    1989-01-01

    The design of a joint-space adaptive control scheme is presented for controlling the slave arm motion of a dual-arm telerobot system developed at Goddard Space Flight Center (GSFC) to study telerobotic operations in space. Each slave arm of the dual-arm system is a kinematically redundant manipulator with 7 degrees of freedom (DOF). Using the concept of model reference adaptive control (MRAC) and Lyapunov's direct method, an adaptation algorithm is derived which adjusts the PD controller gains of the control scheme. The development of the adaptive control scheme assumes that the slave arm motion is non-compliant and slowly varying. The implementation of the derived control scheme does not require computation of the manipulator dynamics, which makes the control scheme sufficiently fast for real-time applications. A computer simulation study performed for the 7-DOF slave arm shows that the developed control scheme can efficiently adapt to sudden changes in payload while tracking various test trajectories, such as ramps or sinusoids, with negligible position errors.
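
    An illustrative single-joint sketch of the MRAC-style gain adjustment described above, not the authors' scheme: the plant is a simple joint with unknown inertia, and the PD gains are adapted from the tracking error using a gradient-type (MIT-rule-like) update rather than the paper's Lyapunov-derived law. All numerical values are arbitrary.

        import numpy as np

        # Hypothetical single-joint sketch: plant  q_ddot = u / m  with unknown inertia m.
        # PD gains are adapted from the tracking error (gradient/MIT-rule-like update),
        # a stand-in for the Lyapunov-derived law of the paper.
        dt, gamma = 0.001, 50.0
        m_true = 3.0                                  # unknown payload-dependent inertia
        kp, kd = 20.0, 5.0                            # initial PD gains
        q, qd = 0.0, 0.0

        log = []
        for k in range(20000):
            t = k * dt
            q_ref, qd_ref = np.sin(t), np.cos(t)      # desired trajectory and rate
            e, ed = q_ref - q, qd_ref - qd            # tracking errors
            u = kp * e + kd * ed                      # PD control law
            qdd = u / m_true                          # plant response
            qd += qdd * dt
            q += qd * dt
            kp += gamma * e * e * dt                  # gain adaptation driven by error
            kd += gamma * e * ed * dt
            log.append(abs(e))

        print(kp, kd, max(log[-1000:]))               # error shrinks as the gains adapt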

  4. Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Li, Wu; Robinson, Jay

    2016-01-01

    This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.

  5. Design of a new automated multi-step outflow test apparatus

    NASA Astrophysics Data System (ADS)

    Figueras, J.; Gribb, M. M.; McNamara, J. P.

    2006-12-01

    Modeling flow and transport in the vadose zone requires knowledge of the soil hydraulic properties. Laboratory studies involving vadose zone soils typically include use of the multistep outflow method (MSO), which can provide information about wetting and drying soil-moisture and hydraulic conductivity curves from a single test. However, manual MSO testing is time-consuming and measurement errors can be easily introduced. A computer-automated system has been designed to allow convenient measurement of soil-water characteristic curves. Computer-controlled solenoid valves are used to regulate the pressure inside Tempe cells to drain soil samples, and outflow volumes are measured with a pressure transducer. The electronic components of the system are controlled using LabVIEW software. This system has been optimized for undisturbed core samples. System performance has been evaluated by comparing results from undisturbed samples subjected first to manual MSO testing and then automated testing. The automated and manual MSO tests yielded similar drying soil-water characteristic curves. These curves are further compared to in-situ measurements and those obtained using pedotransfer functions for a semi-arid watershed.

  6. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    ERIC Educational Resources Information Center

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strength and weakness in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  7. Adaptive Optics System Design and Operation at Lick Observatory

    NASA Astrophysics Data System (ADS)

    Olivier, S. S.; Max, C. E.; Avicola, K.; Bissinger, H. D.; Brase, J. M.; Friedman, H. W.; Gavel, D. T.; Salmon, J. T.; Waltjen, K. E.

    1993-12-01

    An adaptive optics system developed for the 40 inch Nickel and 120 inch Shane telescopes at Lick Observatory is described. The adaptive optics system design is based on a 69 actuator continuous-surface deformable mirror and a Hartmann wavefront sensor equipped with a commercial intensified CCD fast-framing camera. The system has been tested at the Cassegrain focus of the 40 inch Nickel telescope where the subaperture diameter is 12 cm. The subaperture slope and mirror control calculations are performed on a four processor single board computer controlled by a Unix workstation. This configuration is capable of frame rates up to 1 kHz. The optical configuration of the system and its interface to the telescope are described. Details of the control system design, operation, and user interface are given. Initial test results emphasizing control system operations of this adaptive optics system using natural reference stars on the 40 inch Nickel telescope are presented. The initial test results are compared to predictions from analyses and simulations. Work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
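
    The closed-loop update implied by the abstract (slopes in, actuator commands out at up to 1 kHz) can be sketched as below; the reconstructor matrix, loop gain, and slope statistics are placeholder assumptions, not the Lick system's actual calibration.

    import numpy as np

    # Hedged sketch of one adaptive-optics control step: Hartmann slopes are mapped
    # to deformable-mirror commands through a reconstructor matrix and accumulated
    # with a leaky integrator; all numbers below are placeholders.
    rng = np.random.default_rng(0)
    n_slopes, n_actuators = 2 * 40, 69                     # x/y slopes, 69-actuator DM
    R = rng.normal(size=(n_actuators, n_slopes)) * 0.01    # placeholder reconstructor
    commands = np.zeros(n_actuators)
    gain, leak = 0.3, 0.99                                 # loop gain and integrator leak

    def control_step(slopes, commands):
        """Update DM commands from one frame of wavefront-sensor slopes."""
        return leak * commands - gain * (R @ slopes)

    for frame in range(1000):                              # e.g. one second at 1 kHz
        slopes = rng.normal(scale=0.1, size=n_slopes)      # measured slope vector (synthetic)
        commands = control_step(slopes, commands)
    print(commands[:5])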

  8. The VIADUC project: innovation in climate adaptation through service design

    NASA Astrophysics Data System (ADS)

    Corre, L.; Dandin, P.; L'Hôte, D.; Besson, F.

    2015-07-01

    From the French National Adaptation to Climate Change Plan, the "Drias, les futurs du climat" service has been developed to provide easy access to French regional climate projections. This is a major step for the implementation of French Climate Services. The usefulness of this service for the end-users and decision makers involved with adaptation planning at a local scale is investigated. The VIADUC project therefore aims to evaluate and enhance Drias, and to imagine future developments in support of adaptation. Climate scientists work together with end-users and a service designer. The designer's role is to propose an innovative approach based on the interaction between scientists and citizens. The chosen end-users are three Natural Regional Parks located in the South West of France. These parks are administrative entities that group municipalities sharing a common natural and cultural heritage. They are also rural areas in which specific economic activities take place, and they are therefore concerned with, and involved in, both protecting their environment and setting up sustainable economic development. The first year of the project was dedicated to investigation, including consultation of relevant representatives. Three key local economic sectors were selected: forestry, pastoral farming and building activities. Working groups were composed of technicians, administrative and maintenance staff, policy makers and climate researchers. The sectors' needs for climate information have been assessed. The lessons learned led to actions which are presented hereinafter.

  9. Adaptive Automation for Human-Robot Teaming in Future Command and Control Systems

    DTIC Science & Technology

    2007-01-01

    Some variant of an adaptive system would therefore be particularly well suited to these situations because of the uneven workload and the...

  10. Reflections on the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) Process—Findings from a Qualitative Study

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Legocki, Laurie J.; Mawocha, Samkeliso; Barsan, William G.; Lewis, Roger J.; Berry, Donald A.; Meurer, William J.

    2015-01-01

    Context The context for this study was the Adaptive Designs Advancing Promising Treatments Into Trials (ADAPT-IT) project, which aimed to incorporate flexible adaptive designs into pivotal clinical trials and to conduct an assessment of the trial development process. Little research provides guidance to academic institutions in planning adaptive trials. Objectives The purpose of this qualitative study was to explore the perspectives and experiences of stakeholders as they reflected back about the interactive ADAPT-IT adaptive design development process, and to understand their perspectives regarding lessons learned about the design of the trials and trial development. Materials and methods We conducted semi-structured interviews with ten key stakeholders and observations of the process. We employed qualitative thematic text data analysis to reduce the data into themes about the ADAPT-IT project and adaptive clinical trials. Results The qualitative analysis revealed four themes: education of the project participants, how the process evolved with participant feedback, procedures that could enhance the development of other trials, and education of the broader research community. Discussion and conclusions While participants became more likely to consider flexible adaptive designs, additional education is needed to both understand the adaptive methodology and articulate it when planning trials. PMID:26622163

  11. Optimal design of an unsupervised adaptive classifier with unknown priors

    NASA Technical Reports Server (NTRS)

    Kazakos, D.

    1974-01-01

    An adaptive detection scheme for M hypotheses was analyzed. It was assumed that the probability density function under each hypothesis was known, and that the prior probabilities of the M hypotheses were unknown and sequentially estimated. Each observation vector was classified using the current estimate of the prior probabilities. Using a set of nonlinear transformations, and applying stochastic approximation theory, an optimally converging adaptive detection and estimation scheme was designed. The optimality of the scheme lies in the fact that convergence to the true prior probabilities is ensured, and that the asymptotic error variance is minimum, for the class of nonlinear transformations considered. An expression for the asymptotic mean square error variance of the scheme was also obtained.
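
    A much-simplified illustration of the idea (not the paper's specific nonlinear transformations) is sketched below: with known class-conditional densities, the unknown priors are updated by a stochastic-approximation recursion while each observation is classified with the current estimate; the densities and all values are assumptions.

    import numpy as np
    from scipy.stats import norm

    # Simplified, hypothetical two-hypothesis illustration: known Gaussian densities,
    # unknown priors estimated recursively, classification done with the current estimate.
    rng = np.random.default_rng(1)
    true_priors = np.array([0.7, 0.3])        # unknown to the detector
    means = np.array([0.0, 2.0])              # known densities: N(0,1) and N(2,1)
    p_hat = np.array([0.5, 0.5])              # initial prior estimate

    for n in range(1, 5001):
        k = rng.choice(2, p=true_priors)      # nature draws the true hypothesis
        x = rng.normal(means[k], 1.0)         # observation (scalar here)
        post = p_hat * norm.pdf(x, means, 1.0)
        post /= post.sum()
        decision = int(np.argmax(post))       # classify with current prior estimate
        p_hat += (post - p_hat) / n           # stochastic-approximation prior update

    print(p_hat)                               # approaches the true priors [0.7, 0.3]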

  12. Adaptive designs undertaken in clinical research: a review of registered clinical trials.

    PubMed

    Hatfield, Isabella; Allison, Annabel; Flight, Laura; Julious, Steven A; Dimairo, Munyaradzi

    2016-03-19

    Adaptive designs have the potential to improve efficiency in the evaluation of new medical treatments in comparison to traditional fixed sample size designs. However, they are still not widely used in practice in clinical research. Little research has been conducted to investigate what adaptive designs are being undertaken. This review highlights the current state of registered adaptive designs and their characteristics. The review looked at phase II, II/III and III trials registered on ClinicalTrials.gov from 29 February 2000 to 1 June 2014, supplemented with trials from the National Institute for Health Research register and known adaptive trials. A range of adaptive design search terms were applied to the trials extracted from each database. Characteristics of the adaptive designs were then recorded including funder, therapeutic area and type of adaptation. The results in the paper suggest that the use of adaptive designs has increased. They seem to be most often used in phase II trials and in oncology. In phase III trials, the most popular form of adaptation is the group sequential design. The review failed to capture all trials with adaptive designs, which suggests that the reporting of adaptive designs, such as in clinical trials registers, needs much improving. We recommend that clinical trial registers should contain sections dedicated to the type and scope of the adaptation and that the term 'adaptive design' should be included in the trial title or at least in the brief summary or design sections.

  13. Relocatable, Automated Cost-Benefit Analysis for Marine Sensor Network Design

    PubMed Central

    D’Este, Claire; de Souza, Paulo; Sharman, Chris; Allen, Simon

    2012-01-01

    When designing sensor networks, we need to ensure they produce representative and relevant data, but this must be offset by the financial cost of placing sensors. We describe a novel automated method for generating and combining cost and benefit values to decide on the best sensor locations using information about the specific constraints available in most coastal locations. Costs in maintenance, negotiation, equipment, exposure and communication are estimated using hydrodynamic models and Electronic Navigation Charts. Benefits in maximum coverage and reducing overall error are also determined using model output. This method demonstrates equivalent accuracy at predicting the whole system to expert-chosen locations, whilst significantly reducing the estimated costs. PMID:22736982
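
    One hedged way to read "generating and combining cost and benefit values" is the toy scoring sketch below, in which several normalised cost components are traded off against a modelled coverage benefit to rank candidate sites; the component names, values, and weights are placeholders, not the paper's estimators.

    import numpy as np

    # Hypothetical site-scoring sketch: normalise each cost component and the benefit
    # to [0, 1], then combine them so the highest-scoring candidate sites rank first.
    def normalise(x):
        x = np.asarray(x, dtype=float)
        return (x - x.min()) / (x.max() - x.min() + 1e-12)

    def site_scores(costs, benefit, cost_weight=0.5):
        total_cost = normalise(sum(normalise(c) for c in costs.values()))
        return (1.0 - cost_weight) * normalise(benefit) - cost_weight * total_cost

    costs = {"maintenance": [3, 1, 4, 2],              # placeholder component values
             "exposure": [0.2, 0.9, 0.4, 0.1],
             "communication": [5, 2, 8, 1]}
    benefit = [0.7, 0.3, 0.9, 0.5]                     # e.g. coverage gain from a model
    print(np.argsort(site_scores(costs, benefit))[::-1])   # sites ranked best-first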

  14. Adapting the γ-H2AX Assay for Automated Processing in Human Lymphocytes. 1. Technological Aspects

    PubMed Central

    Turner, Helen C.; Brenner, David J.; Chen, Youhua; Bertucci, Antonella; Zhang, Jian; Wang, Hongliang; Lyulko, Oleksandra V.; Xu, Yanping; Shuryak, Igor; Schaefer, Julia; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Amundson, Sally A.; Garty, Guy

    2011-01-01

    The immunofluorescence-based detection of γ-H2AX is a reliable and sensitive method for quantitatively measuring DNA double-strand breaks (DSBs) in irradiated samples. Since H2AX phosphorylation is highly linear with radiation dose, this well-established biomarker is in current use in radiation biodosimetry. At the Center for High-Throughput Minimally Invasive Radiation Biodosimetry, we have developed a fully automated high-throughput system, the RABIT (Rapid Automated Biodosimetry Tool), that can be used to measure γ-H2AX yields from fingerstick-derived samples of blood. The RABIT workstation has been designed to fully automate the γ-H2AX immunocytochemical protocol, from the isolation of human blood lymphocytes in heparin-coated PVC capillaries to the immunolabeling of γ-H2AX protein and image acquisition to determine fluorescence yield. High throughput is achieved through the use of purpose-built robotics, lymphocyte handling in 96-well filter-bottomed plates, and high-speed imaging. The goal of the present study was to optimize and validate the performance of the RABIT system for the reproducible and quantitative detection of γ-H2AX total fluorescence in lymphocytes in a multiwell format. Validation of our biodosimetry platform was achieved by the linear detection of a dose-dependent increase in γ-H2AX fluorescence in peripheral blood samples irradiated ex vivo with γ rays over the range 0 to 8 Gy. This study demonstrates for the first time the optimization and use of our robotically based biodosimetry workstation to successfully quantify γ-H2AX total fluorescence in irradiated peripheral lymphocytes. PMID:21388271
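
    Since the abstract relies on the linearity of γ-H2AX fluorescence with dose over 0 to 8 Gy, a hedged sketch of the implied calibration step is shown below: fit a straight line to calibration data and invert it to estimate dose; the numbers are illustrative placeholders, not RABIT data.

    import numpy as np

    # Hypothetical calibration sketch: linear fit of total fluorescence versus dose
    # over 0-8 Gy, inverted to estimate the dose of a new sample. Values are placeholders.
    doses = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])          # Gy
    fluorescence = np.array([1.0, 2.1, 3.0, 5.2, 7.1, 8.9])   # arbitrary units
    slope, intercept = np.polyfit(doses, fluorescence, 1)

    def estimate_dose(signal):
        return (signal - intercept) / slope

    print(round(float(estimate_dose(4.0)), 2), "Gy")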

  15. Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.

    PubMed

    Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher

    2011-01-01

    Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume.
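
    The linear and exponential feed profiles mentioned can be sketched as simple setpoint generators, as below; the rates, growth constant, and units are placeholder assumptions rather than the published process parameters.

    import numpy as np

    # Hypothetical setpoint generators for the two feed modes described: a linear ramp
    # and an exponential feed F(t) = F0 * exp(mu * t), evaluated on a 1-minute grid.
    def linear_feed(t_h, f_start=2.0, slope=0.5):
        """Feed rate (mL/h, placeholder units) after t_h hours of induction."""
        return f_start + slope * t_h

    def exponential_feed(t_h, f_start=2.0, mu=0.05):
        """Exponential feed rate with growth-rate constant mu (1/h, placeholder)."""
        return f_start * np.exp(mu * t_h)

    t = np.arange(0.0, 24.0, 1.0 / 60.0)             # 24 h induction, 1-minute resolution
    setpoints = exponential_feed(t)
    total_volume = float(setpoints.sum()) / 60.0      # mL fed, e.g. for a 1% target check
    print(setpoints[:3], round(total_volume, 1))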

  16. An Overview of the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) Project

    PubMed Central

    Meurer, William J.; Lewis, Roger J.; Tagle, Danilo; Fetters, Michael D; Legocki, Laurie; Berry, Scott; Connor, Jason; Durkalski, Valerie; Elm, Jordan; Zhao, Wenle; Frederiksen, Shirley; Silbergleit, Robert; Palesch, Yuko; Berry, Donald A.; Barsan, William G.

    2013-01-01

    Randomized clinical trials, which aim to determine the efficacy and safety of drugs and medical devices, are a complex enterprise with myriad challenges, stakeholders, and traditions. While the primary goal is scientific discovery, clinical trials must also fulfill regulatory, clinical, and ethical requirements. Innovations in clinical trials methodology have the potential to improve the quality of knowledge gained from trials, the protection of human subjects, and the efficiency of clinical research. Adaptive clinical trial (ACT) methods represent a broad category of innovations intended to address a variety of long-standing challenges faced by investigators, such as sensitivity to prior assumptions and delayed identification of ineffective treatments. The implementation of ACT methods, however, requires greater planning and simulation compared to a more traditional design, along with more advanced administrative infrastructure for trial execution. The value of ACT methods in exploratory phase (phase II) clinical research is generally well accepted, but the potential value and challenges of applying ACT methods in large confirmatory phase clinical trials is relatively unexplored, particularly in the academic setting. In the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) project, a multidisciplinary team is studying how ACT methods could be implemented in planning actual confirmatory phase trials in an established, NIH funded clinical trials network. The overarching objectives of ADAPT-IT are to identify and quantitatively characterize the ACT methods of greatest potential value in confirmatory phase clinical trials, and to elicit and understand the enthusiasms and concerns of key stakeholders that influence their willingness to try these innovative strategies. PMID:22424650

  17. An automated multi-modal object analysis approach to coronary calcium scoring of adaptive heart isolated MSCT images

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-02-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where in clinical practice the observer must identify first the presence, and then the location, of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. This can be challenging for a human observer, as it is difficult to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. The inclusion of false-positive or exclusion of true-positive calcified plaques will alter the patient's calcium score incorrectly, thus leading to the possibility of incorrect treatment prescription. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold. The patient is thus exposed to a lower radiation dose, which is beneficial for a progressive disease such as atherosclerosis, where multiple scans may be required. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the Volume scoring method. Elimination of the unwanted regions of the cardiac image slices, such as lungs, ribs, and vertebrae, is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of a similar intensity, and their removal will aid detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground-truth scores averaged over three expert observers. The results presented here are intended to show the requirement and
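
    The two scores named in the abstract can be illustrated with the conventional Agatston weighting (area of voxels at or above 130 HU times a peak-density factor of 1-4) and a simple volume score; the sketch below follows that standard convention rather than the authors' specific implementation, and the pixel sizes are placeholders.

    import numpy as np

    # Hedged sketch of conventional Agatston and volume scoring for one lesion on one
    # axial slice; lesion_hu is a 2-D array of CT numbers, pixel_area in mm^2,
    # slice_thickness in mm (both placeholders).
    def density_factor(peak_hu):
        if peak_hu >= 400: return 4
        if peak_hu >= 300: return 3
        if peak_hu >= 200: return 2
        if peak_hu >= 130: return 1
        return 0

    def agatston_and_volume(lesion_hu, pixel_area=0.25, slice_thickness=3.0):
        calcified = lesion_hu >= 130                      # standard calcium threshold (HU)
        area_mm2 = float(calcified.sum()) * pixel_area
        score = area_mm2 * density_factor(lesion_hu[calcified].max()) if calcified.any() else 0.0
        return score, area_mm2 * slice_thickness          # (Agatston, volume in mm^3)

    plaque = np.array([[90, 150, 410], [120, 260, 330]])  # toy lesion
    print(agatston_and_volume(plaque))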

  18. Automated design and optimization of flexible booster autopilots via linear programming, volume 1

    NASA Technical Reports Server (NTRS)

    Hauser, F. D.

    1972-01-01

    A nonlinear programming technique was developed for the automated design and optimization of autopilots for large flexible launch vehicles. This technique, which resulted in the COEBRA program, uses the iterative application of linear programming. The method deals directly with the three main requirements of booster autopilot design: to provide (1) good response to guidance commands; (2) response to external disturbances (e.g. wind) to minimize structural bending moment loads and trajectory dispersions; and (3) stability with specified tolerances on the vehicle and flight control system parameters. The method is applicable to very high order systems (30th and greater per flight condition). Examples are provided that demonstrate the successful application of the employed algorithm to the design of autopilots for both single and multiple flight conditions.

  19. Design of Sequentially Randomized Trials for Testing Adaptive Treatment Strategies

    PubMed Central

    Ogbagaber, Semhar B.; Karp, Jordan; Wahed, Abdus S.

    2016-01-01

    An adaptive treatment strategy (ATS) is an outcome-guided algorithm that allows personalized treatment of complex diseases based on patients’ disease status and treatment history. Conditions such as AIDS, depression, and cancer usually require several stages of treatment due to the chronic, multifactorial nature of illness progression and management. Sequential multiple assignment randomized (SMAR) designs permit simultaneous inference about multiple ATSs, where patients are sequentially randomized to treatments at different stages depending upon response status. The purpose of the article is to develop a sample size formula to ensure adequate power for comparing two or more ATSs. Based on a Wald-type statistic for comparing multiple ATSs with a continuous endpoint, we develop a sample size formula and test it through simulation studies. We show via simulation that the proposed sample size formula maintains the nominal power. The proposed sample size formula is not applicable to designs with time-to-event endpoints but the formula will be useful for practitioners while designing SMAR trials to compare adaptive treatment strategies. PMID:26412033
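
    The paper's SMAR-specific formula is not reproduced in the abstract, but the two-arm normal-endpoint building block it generalises can be sketched as below (standard Wald-test power calculation); the effect size, level, and power are placeholder inputs.

    from scipy.stats import norm

    # Generic two-arm building block (not the paper's SMAR formula): sample size per
    # group to detect a mean difference delta with common SD sigma, two-sided level
    # alpha and power 1 - beta, based on the usual normal (Wald) approximation.
    def n_per_group(delta, sigma, alpha=0.05, power=0.9):
        z_a = norm.ppf(1.0 - alpha / 2.0)
        z_b = norm.ppf(power)
        return 2.0 * ((z_a + z_b) * sigma / delta) ** 2

    print(round(n_per_group(delta=0.5, sigma=1.0)))   # about 84 per group (round up in practice)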

  20. Optimal PID Controller Design Using Adaptive VURPSO Algorithm

    NASA Astrophysics Data System (ADS)

    Zirkohi, Majid Moradi

    2015-04-01

    The purpose of this paper is to improve the Velocity Update Relaxation Particle Swarm Optimization (VURPSO) algorithm. The improved algorithm is called the Adaptive VURPSO (AVURPSO) algorithm. Then, an optimal design of a Proportional-Integral-Derivative (PID) controller is obtained using the AVURPSO algorithm. An adaptive momentum factor is used to regulate a trade-off between the global and the local exploration abilities in the proposed algorithm. This operation helps the system to reach the optimal solution quickly and saves computation time. Comparisons on the optimal PID controller design confirm the superiority of the AVURPSO algorithm over the other optimization algorithms considered in this paper, namely the VURPSO algorithm, the Ant Colony algorithm, and the conventional approach. Comparisons on the speed of convergence confirm that the proposed algorithm converges faster, with less computation time, to a global optimum value. The proposed AVURPSO can be used in diverse areas of optimization problems such as industrial planning, resource allocation, scheduling, decision making, pattern recognition and machine learning. The proposed AVURPSO algorithm is efficiently used to design an optimal PID controller.
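
    A hedged, minimal particle-swarm sketch of the underlying idea is given below: PID gains are tuned against an integral-absolute-error cost for a toy first-order plant, with a linearly decreasing momentum (inertia) factor standing in for the adaptive factor; this is not the AVURPSO update rule itself, and all constants are assumptions.

    import numpy as np

    # Minimal PSO sketch (not the exact VURPSO/AVURPSO rule) tuning PID gains
    # [Kp, Ki, Kd] against an integral-absolute-error cost for a first-order plant.
    rng = np.random.default_rng(0)

    def cost(gains, dt=0.01, steps=500):
        kp, ki, kd = gains
        y = integ = prev_e = 0.0
        iae = 0.0
        for _ in range(steps):
            e = 1.0 - y                              # unit-step setpoint error
            integ += e * dt
            u = kp * e + ki * integ + kd * (e - prev_e) / dt
            y += dt * (-y + u) / 2.0                 # toy plant: 2*dy/dt + y = u
            prev_e = e
            iae += abs(e) * dt
            if abs(y) > 1e6:                         # penalise unstable gain combinations
                return 1e9
        return iae

    x = rng.uniform(0.1, 10.0, (20, 3))              # 20 particles, 3 gains each
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)]
    for it in range(50):
        w = 0.9 - 0.5 * it / 50.0                    # decreasing momentum factor
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (gbest - x)
        x = np.clip(x + v, 0.01, 20.0)
        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)]
    print(gbest, pbest_f.min())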

  1. Optimizing RF gun cavity geometry within an automated injector design system

    SciTech Connect

    Alicia Hofler; Pavel Evtushenko

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized cw superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search in parallel a large parameter space (often non-linear) and in a relatively short time identify promising regions of the space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper will describe an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization and provide examples of its application to existing RF and SRF gun designs.

  2. Automated registration of large deformations for adaptive radiation therapy of prostate cancer

    SciTech Connect

    Godley, Andrew; Ahunbay, Ergun; Peng Cheng; Li, X. Allen

    2009-04-15

    Available deformable registration methods are often inaccurate for the large organ variations encountered, for example, in the rectum and bladder. The authors developed a novel approach to accurately and effectively register large deformations in the prostate region for adaptive radiation therapy. A software tool combining a fast symmetric demons algorithm and the use of masks was developed in C++ based on ITK libraries to register CT images acquired at planning and before treatment fractions. The deformation field determined was subsequently used to deform the delivered dose to match the anatomy of the planning CT. The large deformations involved required that the bladder and rectum volumes be masked with uniform intensities of -1000 and 1000 HU, respectively, in both the planning and treatment CTs. The tool was tested for five prostate IGRT patients. The average rectum planning-to-treatment contour overlap improved from 67% to 93%; the lowest initial overlap was 43%. The average bladder overlap improved from 83% to 98%, with a lowest initial overlap of 60%. Registration regions were set to include a volume receiving 4% of the maximum dose. The average region was 320 x 210 x 63 voxels, taking approximately 9 min to register on a dual 2.8 GHz Linux system. The prostate and seminal vesicles were correctly placed even though they are not masked. The accumulated doses for multiple fractions with large deformation were computed and verified. The tool developed can effectively supply the previously delivered dose for adaptive planning to correct for interfractional changes.
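
    The masking step described (filling the bladder and rectum with uniform intensities before running the fast symmetric demons registration) can be sketched in a few lines; the contour masks, array shapes, and the downstream registration call are assumptions, since only the masking values are given in the abstract.

    import numpy as np

    # Sketch of the masking step: the bladder and rectum volumes (boolean contour
    # masks, assumed here) are filled with uniform intensities of -1000 and 1000 HU
    # in a CT volume before a fast symmetric demons registration is run on the
    # masked planning and treatment images.
    def mask_organs(ct_hu, bladder_mask, rectum_mask):
        masked = ct_hu.astype(np.float32).copy()
        masked[bladder_mask] = -1000.0
        masked[rectum_mask] = 1000.0
        return masked

    planning_ct = np.zeros((16, 64, 64), dtype=np.float32)      # placeholder volumes
    treatment_ct = np.zeros_like(planning_ct)
    bladder = np.zeros(planning_ct.shape, dtype=bool); bladder[4:8, 20:30, 20:30] = True
    rectum = np.zeros_like(bladder); rectum[4:8, 35:45, 20:30] = True

    fixed = mask_organs(planning_ct, bladder, rectum)
    moving = mask_organs(treatment_ct, bladder, rectum)
    # fixed/moving would then be registered; the resulting deformation field is used
    # to map the delivered dose back onto the planning CT anatomy.
    print(fixed.min(), fixed.max())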

  3. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  4. An Adaptive Hybrid Genetic Algorithm for Improved Groundwater Remediation Design

    NASA Astrophysics Data System (ADS)

    Espinoza, F. P.; Minsker, B. S.; Goldberg, D. E.

    2001-12-01

    Identifying optimal designs for a groundwater remediation system is computationally intensive, especially for complex, nonlinear problems such as enhanced in situ bioremediation technology. To improve performance, we apply a hybrid genetic algorithm (HGA), which is a two-step solution method: a genetic algorithm (GA) for global search using the entire population and then a local search (LS) to improve search speed for only a few individuals in the population. We implement two types of HGAs: a non-adaptive HGA (NAHGA), whose operations are invariant throughout the run, and a self-adaptive HGA (SAHGA), whose operations adapt to the performance of the algorithm. The best settings of the two HGAs for optimal performance are then investigated for a groundwater remediation problem. The settings include the frequency of LS with respect to the normal GA evaluation, probability of individual selection for LS, evolution criterion for LS (Lamarckian or Baldwinian), and number of local search iterations. A comparison of the algorithms' performance under different settings will be presented.

  5. An adaptive multimeme algorithm for designing HIV multidrug therapies.

    PubMed

    Neri, Ferrante; Toivanen, Jari; Cascella, Giuseppe Leonardo; Ong, Yew-Soon

    2007-01-01

    This paper proposes a period representation for modeling the multidrug HIV therapies and an Adaptive Multimeme Algorithm (AMmA) for designing the optimal therapy. The period representation offers benefits in terms of flexibility and reduction in dimensionality compared to the binary representation. The AMmA is a memetic algorithm which employs a list of three local searchers adaptively activated by an evolutionary framework. These local searchers, having different features according to the exploration logic and the pivot rule, have the role of exploring the decision space from different and complementary perspectives and, thus, assisting the standard evolutionary operators in the optimization process. Furthermore, the AMmA makes use of an adaptation which dynamically sets the algorithmic parameters in order to prevent stagnation and premature convergence. The numerical results demonstrate that the application of the proposed algorithm leads to very efficient medication schedules which quickly stimulate a strong immune response to HIV. The earlier termination of the medication schedule leads to lesser unpleasant side effects for the patient due to strong antiretroviral therapy. A numerical comparison shows that the AMmA is more efficient than three popular metaheuristics. Finally, a statistical test based on the calculation of the tolerance interval confirms the superiority of the AMmA compared to the other methods for the problem under study.

  6. Automated object extraction from remote sensor image based on adaptive thresholding technique

    NASA Astrophysics Data System (ADS)

    Zhao, Tongzhou; Ma, Shuaijun; Li, Jin; Ming, Hui; Luo, Xiaobo

    2009-10-01

    Detection and extraction of dim, small moving objects in infrared image sequences is an interesting research area. A system for detecting dim, small moving targets in IR image sequences is presented, and a new high-performance algorithm for extracting small moving targets from infrared image sequences containing cloud clutter is proposed. This method achieves better detection precision than several other methods, and the computation can be carried out by two independent units. The novelty of the algorithm is that it applies adaptive thresholding to the small moving targets in both the spatial and temporal domains. Experimental results show that the proposed algorithm achieves high detection precision.
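
    A hedged reading of adaptive thresholding in both the spatial and temporal domains is sketched below: a pixel is a spatial candidate when it exceeds its local mean by several local standard deviations, and a detection when it persists across consecutive frames; the window size, k, and persistence count are placeholder assumptions.

    import numpy as np
    from scipy.ndimage import uniform_filter

    # Hypothetical spatial-plus-temporal adaptive thresholding for dim small targets.
    def spatial_candidates(frame, win=15, k=4.0):
        mean = uniform_filter(frame, win)                    # local mean
        sq_mean = uniform_filter(frame * frame, win)
        std = np.sqrt(np.maximum(sq_mean - mean * mean, 1e-6))
        return frame > mean + k * std                        # spatial-domain threshold

    def detect(frames, persistence=3):
        votes = sum(spatial_candidates(f).astype(np.int32) for f in frames)
        return votes >= persistence                          # temporal-domain confirmation

    rng = np.random.default_rng(0)
    frames = [rng.normal(100.0, 5.0, (128, 128)) for _ in range(5)]
    for f in frames:
        f[64, 64] += 40.0                                    # persistent dim point target
    print(np.argwhere(detect(frames)))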

  7. Optical Design for Extremely Large Telescope Adaptive Optics Systems

    SciTech Connect

    Bauman, Brian J.

    2003-01-01

    Designing an adaptive optics (AO) system for extremely large telescopes (ELT's) will present new optical engineering challenges. Several of these challenges are addressed in this work, including first-order design of multi-conjugate adaptive optics (MCAO) systems, pyramid wavefront sensors (PWFS's), and laser guide star (LGS) spot elongation. MCAO systems need to be designed in consideration of various constraints, including deformable mirror size and correction height. The y-ȳ method of first-order optical design is a graphical technique that uses a plot with marginal and chief ray heights as coordinates; the optical system is represented as a segmented line. This method is shown to be a powerful tool in designing MCAO systems. From these analyses, important conclusions about configurations are derived. PWFS's, which offer an alternative to Shack-Hartmann (SH) wavefront sensors (WFS's), are envisioned as the workhorse of layer-oriented adaptive optics. Current approaches use a 4-faceted glass pyramid to create a WFS analogous to a quad-cell SH WFS. PWFS's and SH WFS's are compared and some newly-considered similarities and PWFS advantages are presented. Techniques to extend PWFS's are offered: First, PWFS's can be extended to more pixels in the image by tiling pyramids contiguously. Second, pyramids, which are difficult to manufacture, can be replaced by less expensive lenslet arrays. An approach is outlined to convert existing SH WFS's to PWFS's for easy evaluation of PWFS's. Also, a demonstration of PWFS's in sensing varying amounts of an aberration is presented. For ELT's, the finite altitude and finite thickness of LGS's means that the LGS will appear elongated from the viewpoint of subapertures not directly under the telescope. Two techniques for dealing with LGS spot elongation in SH WFS's are presented. One method assumes that the laser will be pulsed and uses a segmented micro-electromechanical system (MEMS) to track the LGS light subaperture by

  8. Optical design of the adaptive optics laser guide star system

    SciTech Connect

    Bissinger, H.

    1994-11-15

    The design of an adaptive optics package for the 3 meter Lick telescope is presented. This instrument package includes a 69 actuator deformable mirror and a Hartmann-type wavefront sensor operating in the visible wavelength; a quadrant detector for the tip-tilt sensor and a tip-tilt mirror to stabilize atmospheric first-order tip-tilt errors. A high speed computer drives the deformable mirror to achieve near diffraction limited imagery. The different optical components and their individual design constraints are described. Motorized stages and diagnostic tools are used to operate and maintain alignment throughout observation time from a remote control room. The expected performance is summarized and actual results on astronomical sources are presented.

  9. Design of infrasound-detection system via adaptive LMSTDE algorithm

    NASA Technical Reports Server (NTRS)

    Khalaf, C. S.; Stoughton, J. W.

    1984-01-01

    A proposed solution to an aviation safety problem is based on passive detection of turbulent weather phenomena through their infrasonic emission. This thesis describes a system design that is adequate for detection and bearing evaluation of infrasounds. An array of four sensors, with the appropriate hardware, is used for the detection part. Bearing evaluation is based on estimates of time delays between sensor outputs. The generalized cross correlation (GCC), as the conventional time-delay estimation (TDE) method, is first reviewed. An adaptive TDE approach, using the least mean square (LMS) algorithm, is then discussed. A comparison between the two techniques is made and the advantages of the adaptive approach are listed. The behavior of the GCC, as a Roth processor, is examined for the anticipated signals. It is shown that the Roth processor has the desired effect of sharpening the peak of the correlation function. It is also shown that the LMSTDE technique is an equivalent implementation of the Roth processor in the time domain. A LMSTDE lead-lag model, with a variable stability coefficient and a convergence criterion, is designed.
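
    The LMS time-delay estimation idea can be sketched as below: an adaptive FIR filter is trained to predict one sensor's output from another's, and the lag of the dominant filter weight estimates the inter-sensor delay; the signal model, delay, tap count, and step size are placeholder assumptions rather than the thesis' design values.

    import numpy as np

    # Hedged sketch of LMS time-delay estimation between two sensor channels.
    rng = np.random.default_rng(0)
    n, true_delay, taps, mu = 4000, 7, 32, 0.01

    s = rng.normal(size=n)                                   # broadband source signal
    a = s + 0.1 * rng.normal(size=n)                         # sensor A
    b = np.roll(s, true_delay) + 0.1 * rng.normal(size=n)    # sensor B, delayed copy

    w = np.zeros(taps)                                       # adaptive FIR weights
    for k in range(taps, n):
        x = a[k - taps + 1:k + 1][::-1]                      # latest samples of sensor A
        e = b[k] - w @ x                                     # prediction error
        w += 2.0 * mu * e * x                                # LMS weight update

    print("estimated delay:", int(np.argmax(np.abs(w))))     # close to true_delay = 7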

  10. Adaptive strategies in designing the simultaneous global drug development program.

    PubMed

    Yuan, Zhilong; Chen, Gang; Huang, Qin

    2016-01-01

    Many methods have been proposed to account for the potential impact of ethnic/regional factors when extrapolating results from multiregional clinical trials (MRCTs) to targeted ethnic (TE) patients, i.e., "bridging." Most of them either focused on TE patients in the MRCT (i.e., internal bridging) or a separate local clinical trial (LCT) (i.e., external bridging). Huang et al. (2012) integrated both bridging concepts in their method for the Simultaneous Global Drug Development Program (SGDDP) which designs both the MRCT and the LCT prospectively and combines patients in both trials by ethnic origin, i.e., TE vs. non-TE (NTE). The weighted Z test was used to combine information from TE and NTE patients to test with statistical rigor whether a new treatment is effective in the TE population. Practically, the MRCT is often completed before the LCT. Thus to increase the power for the SGDDP and/or obtain more informative data in TE patients, we may use the final results from the MRCT to re-evaluate initial assumptions (e.g., effect sizes, variances, weight), and modify the LCT accordingly. We discuss various adaptive strategies for the LCT such as sample size reassessment, population enrichment, endpoint change, and dose adjustment. As an example, we extend a popular adaptive design method to re-estimate the sample size for the LCT, and illustrate it for a normally distributed endpoint.
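
    The weighted Z test used to combine the two patient groups can be sketched as below, with Z statistics from the TE and NTE subgroups combined as sqrt(w)*Z_TE + sqrt(1-w)*Z_NTE; the weight, example Z values, and one-sided level are placeholder assumptions.

    from math import sqrt
    from scipy.stats import norm

    # Hedged sketch of the weighted Z combination of targeted-ethnic (TE) and
    # non-targeted-ethnic (NTE) evidence; w is the pre-specified TE weight.
    def weighted_z(z_te, z_nte, w):
        return sqrt(w) * z_te + sqrt(1.0 - w) * z_nte

    z_comb = weighted_z(z_te=1.8, z_nte=2.4, w=0.5)    # placeholder subgroup statistics
    p_value = 1.0 - norm.cdf(z_comb)                   # one-sided p-value
    print(round(z_comb, 3), round(p_value, 4), p_value < 0.025)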

  11. Rapid prototyping of an automated video surveillance system: a hardware-software co-design approach

    NASA Astrophysics Data System (ADS)

    Ngo, Hau T.; Rakvic, Ryan N.; Broussard, Randy P.; Ives, Robert W.

    2011-06-01

    FPGA devices with embedded DSP and memory blocks, and high-speed interfaces are ideal for real-time video processing applications. In this work, a hardware-software co-design approach is proposed to effectively utilize FPGA features for a prototype of an automated video surveillance system. Time-critical steps of the video surveillance algorithm are designed and implemented in the FPGA's logic elements to maximize parallel processing. Other non-time-critical tasks are achieved by executing a high-level language program on an embedded Nios-II processor. Pre-tested and verified video and interface functions from a standard video framework are utilized to significantly reduce development and verification time. Custom and parallel processing modules are integrated into the video processing chain by Altera's Avalon Streaming video protocol. Other data control interfaces are achieved by connecting hardware controllers to a Nios-II processor using Altera's Avalon Memory Mapped protocol.

  12. SBROME: a scalable optimization and module matching framework for automated biosystems design.

    PubMed

    Huynh, Linh; Tsoukalas, Athanasios; Köppe, Matthias; Tagkopoulos, Ilias

    2013-05-17

    The development of a scalable framework for biodesign automation is a formidable challenge given the expected increase in part availability and the ever-growing complexity of synthetic circuits. To allow for (a) the use of previously constructed and characterized circuits or modules and (b) the implementation of designs that can scale up to hundreds of nodes, we here propose a divide-and-conquer Synthetic Biology Reusable Optimization Methodology (SBROME). An abstract user-defined circuit is first transformed and matched against a module database that incorporates circuits that have previously been experimentally characterized. Then the resulting circuit is decomposed to subcircuits that are populated with the set of parts that best approximate the desired function. Finally, all subcircuits are subsequently characterized and deposited back to the module database for future reuse. We successfully applied SBROME toward two alternative designs of a modular 3-input multiplexer that utilize pre-existing logic gates and characterized biological parts.

  13. ADS: A FORTRAN program for automated design synthesis: Version 1.10

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1985-01-01

    A new general-purpose optimization program for engineering design is described. ADS (Automated Design Synthesis - Version 1.10) is a FORTRAN program for solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level, several options are available so that a total of over 100 possible combinations can be created. Examples of available strategies are sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming. Available optimizers include variable metric methods and the Method of Feasible Directions as examples, and one-dimensional search options include polynomial interpolation and the Golden Section method as examples. Emphasis is placed on ease of use of the program. All information is transferred via a single parameter list. Default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to over-ride these, if desired.

  14. Feasibility of Automated Adaptive GCA (Ground Controlled Approach) Controller Training System.

    ERIC Educational Resources Information Center

    Feuge, Robert L.; And Others

    An analysis of the conceptual feasibility of using automatic speech recognition and understanding technology in the design of an advanced training system was conducted. The analysis specifically explored application to Ground Controlled Approach (GCA) controller training. A systems engineering approach was followed to determine the feasibility of…

  15. Robotic automation for space: planetary surface exploration, terrain-adaptive mobility, and multirobot cooperative tasks

    NASA Astrophysics Data System (ADS)

    Schenker, Paul S.; Huntsberger, Terrance L.; Pirjanian, Paolo; Baumgartner, Eric T.; Aghazarian, Hrand; Trebi-Ollennu, Ashitey; Leger, Patrick C.; Cheng, Yang; Backes, Paul G.; Tunstel, Edward; Dubowsky, Steven; Iagnemma, Karl D.; McKee, Gerard T.

    2001-10-01

    During the last decade, there has been significant progress toward a supervised autonomous robotic capability for remotely controlled scientific exploration of planetary surfaces. While planetary exploration potentially encompasses many elements ranging from orbital remote sensing to subsurface drilling, the surface robotics element is particularly important to advancing in situ science objectives. Surface activities include a direct characterization of geology, mineralogy, atmosphere and other descriptors of current and historical planetary processes-and ultimately-the return of pristine samples to Earth for detailed analysis. Toward these ends, we have conducted a broad program of research on robotic systems for scientific exploration of the Mars surface, with minimal remote intervention. The goal is to enable high productivity semi-autonomous science operations where available mission time is concentrated on robotic operations, rather than up-and-down-link delays. Results of our work include prototypes for landed manipulators, long-ranging science rovers, sampling/sample return mobility systems, and more recently, terrain-adaptive reconfigurable/modular robots and closely cooperating multiple rover systems. The last of these are intended to facilitate deployment of planetary robotic outposts for an eventual human-robot sustained scientific presence. We overview our progress in these related areas of planetary robotics R&D, spanning 1995-to-present.

  16. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-04-29

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  17. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2013-01-08

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  18. Design of the Dual Conjugate Adaptive Optics Test-bed

    NASA Astrophysics Data System (ADS)

    Sharf, Inna; Bell, K.; Crampton, D.; Fitzsimmons, J.; Herriot, Glen; Jolissaint, Laurent; Lee, B.; Richardson, H.; van der Kamp, D.; Veran, Jean-Pierre

    In this paper, we describe the Multi-Conjugate Adaptive Optics laboratory test-bed presently under construction at the University of Victoria, Canada. The test-bench will be used to support research in the performance of multi-conjugate adaptive optics, turbulence simulators, laser guide stars and miniaturizing adaptive optics. The main components of the test-bed include two micro-machined deformable mirrors, a tip-tilt mirror, four wavefront sensors, a source simulator, a dual-layer turbulence simulator, as well as computational and control hardware. The paper will describe in detail the opto-mechanical design of the adaptive optics module, the design of the hot-air turbulence generator and the configuration chosen for the source simulator. Below, we present a summary of these aspects of the bench. The optical and mechanical design of the test-bed has been largely driven by the particular choice of the deformable mirrors. These are continuous micro-machined mirrors manufactured by Boston Micromachines Corporation. They have a clear aperture of 3.3 mm and are deformed with 140 actuators arranged in a square grid. Although the mirrors have an open-loop bandwidth of 6.6 KHz, their shape can be updated at a sampling rate of 100 Hz. In our optical design, the mirrors are conjugated at 0km and 10 km in the atmosphere. A planar optical layout was achieved by using four off-axis paraboloids and several folding mirrors. These optics will be mounted on two solid blocks which can be aligned with respect to each other. The wavefront path design accommodates 3 monochromatic guide stars that can be placed at either 90 km or at infinity. The design relies on the natural separation of the beam into 3 parts because of differences in locations of the guide stars in the field of view. In total four wavefront sensors will be procured from Adaptive Optics Associates (AOA) or built in-house: three for the guide stars and the fourth to collect data from the science source output in

  19. Application of the H-Mode, a Design and Interaction Concept for Highly Automated Vehicles, to Aircraft

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Flemisch, Frank O.; Schutte, Paul C.; Williams, Ralph A.

    2006-01-01

    Driven by increased safety, efficiency, and airspace capacity, automation is playing an increasing role in aircraft operations. As aircraft become increasingly able to autonomously respond to a range of situations with performance surpassing human operators, we are compelled to look for new methods that help us understand their use and guide their design using new forms of automation and interaction. We propose a novel design metaphor to aid the conceptualization, design, and operation of highly-automated aircraft. Design metaphors transfer meaning from common experiences to less familiar applications or functions. A notable example is the "Desktop metaphor" for manipulating files on a computer. This paper describes a metaphor for highly automated vehicles known as the H-metaphor and a specific embodiment of the metaphor known as the H-mode as applied to aircraft. The fundamentals of the H-metaphor are reviewed followed by an overview of an exploratory usability study investigating human-automation interaction issues for a simple H-mode implementation. The envisioned application of the H-mode concept to aircraft is then described as are two planned evaluations.

  20. Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate

    NASA Astrophysics Data System (ADS)

    Samaras, C.; Cook, L.

    2015-12-01

    Infrastructure systems are expected to be functional, durable and safe over long service lives - 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, and institutions and regulations will need to be adaptable for a range of future conditions (conditions of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: Involve all stakeholders (owners, financers, insurance, regulators, affected public, climate/weather scientists, etc.) in key decisions; Use low regret, adaptive strategies, such as robust decision making and the observational method, comply with relevant standards and regulations, and exceed their requirements where appropriate; Publish design studies and performance/failure investigations to extend the body of knowledge for advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities and account rationally for climate change in revised engineering standards and codes. This presentation presents initial research on decisionmaking under uncertainty for climate resilient infrastructure design.

  1. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  2. Models of an Integrated Design Data Base in Support of a Design Automation System.

    DTIC Science & Technology

    1982-12-01

    ...independence is to clearly differentiate between the logical and physical aspects of data base management. These differences include data base design... applications so that the complexity of the design and verification tasks is reduced to a manageable level. Large amounts of data and a variety of design

  3. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

    Photo enforcement devices for traffic rules such as red lights, toll, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classifying these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced reference image and video quality solutions to detect and classify such faults.
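
    Two of the listed fault classes lend themselves to simple no-reference checks, sketched below: exposure problems flagged from the mean gray level, and focus drift flagged when a gradient-energy sharpness score falls well below a per-camera baseline; the thresholds, baseline, and ordering (exposure checked before focus, echoing the sequencing point above) are placeholder assumptions.

    import numpy as np

    # Hypothetical no-reference checks for two fault classes: exposure (mean gray
    # level outside bounds) and focus drift (gradient energy far below a baseline).
    def exposure_ok(img, low=40.0, high=215.0):
        return low <= float(img.mean()) <= high

    def sharpness(img):
        gy, gx = np.gradient(img.astype(np.float64))
        return float(np.mean(gx * gx + gy * gy))       # gradient energy as a focus proxy

    def diagnose(img, sharpness_baseline=150.0, drift_ratio=0.4):
        faults = []
        if not exposure_ok(img):
            faults.append("exposure")
        elif sharpness(img) < drift_ratio * sharpness_baseline:
            faults.append("focus_drift")               # only judged when exposure is usable
        return faults or ["ok"]

    frame = (np.random.default_rng(0).random((480, 640)) * 255).astype(np.uint8)
    print(diagnose(frame))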

  4. Designs and Concept-Reliance of a Fully Automated High Content Screening Platform

    PubMed Central

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2013-01-01

    High content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit, hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run; and the IN Cell Analyzer 2000 (INCA2000), where up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 meter linear track system harboring both microscopes, a plate washer, bulk dispensers, and a high-capacity incubator, allowing us to perform both live and fixed cell based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the New Year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world. PMID:22797489

  5. Using dual-energy x-ray imaging to enhance automated lung tumor tracking during real-time adaptive radiotherapy

    SciTech Connect

    Menten, Martin J. Fast, Martin F.; Nill, Simeon; Oelfke, Uwe

    2015-12-15

    Purpose: Real-time, markerless localization of lung tumors with kV imaging is often inhibited by ribs obscuring the tumor and poor soft-tissue contrast. This study investigates the use of dual-energy imaging, which can generate radiographs with reduced bone visibility, to enhance automated lung tumor tracking for real-time adaptive radiotherapy. Methods: kV images of an anthropomorphic breathing chest phantom were experimentally acquired and radiographs of actual lung cancer patients were Monte-Carlo-simulated at three imaging settings: low-energy (70 kVp, 1.5 mAs), high-energy (140 kVp, 2.5 mAs, 1 mm additional tin filtration), and clinical (120 kVp, 0.25 mAs). Regular dual-energy images were calculated by weighted logarithmic subtraction of high- and low-energy images and filter-free dual-energy images were generated from clinical and low-energy radiographs. The weighting factor to calculate the dual-energy images was determined by means of a novel objective score. The usefulness of dual-energy imaging for real-time tracking with an automated template matching algorithm was investigated. Results: Regular dual-energy imaging was able to increase tracking accuracy in left–right images of the anthropomorphic phantom as well as in 7 out of 24 investigated patient cases. Tracking accuracy remained comparable in three cases and decreased in five cases. Filter-free dual-energy imaging was only able to increase accuracy in 2 out of 24 cases. In four cases no change in accuracy was observed and tracking accuracy worsened in nine cases. In 9 out of 24 cases, it was not possible to define a tracking template due to poor soft-tissue contrast regardless of input images. The mean localization errors using clinical, regular dual-energy, and filter-free dual-energy radiographs were 3.85, 3.32, and 5.24 mm, respectively. Tracking success was dependent on tumor position, tumor size, imaging beam angle, and patient size. Conclusions: This study has highlighted the influence of
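
    The weighted logarithmic subtraction mentioned above is compact enough to sketch. The snippet below is illustrative only: the weighting factor is left as a free parameter rather than being chosen by the paper's objective score, and the input arrays are stand-ins for acquired radiographs.

        # Sketch of weighted logarithmic subtraction for dual-energy imaging.
        import numpy as np

        def dual_energy(high_kv, low_kv, w):
            """Suppress bone by subtracting w * log(low-energy) from log(high-energy)."""
            eps = 1e-6                                   # avoid log(0) in dark pixels
            de = np.log(high_kv + eps) - w * np.log(low_kv + eps)
            de -= de.min()                               # rescale toward [0, 1] for display
            return de / max(float(de.max()), eps)

        high = np.random.uniform(0.1, 1.0, (256, 256))   # stand-ins for radiographs
        low = np.random.uniform(0.1, 1.0, (256, 256))
        soft_tissue = dual_energy(high, low, w=0.55)     # w would be tuned per imaging setting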

  6. An Automated Tool for Developing Experimental Designs: The Computer-Aided Design Reference for Experiments (CADRE)

    DTIC Science & Technology

    2009-01-01

    ...survey procedures, and cognitive task analysis), system design methods (e.g., focus groups, design guidelines, specifications, and requirements), and...

  7. Design of high temperature adaptability cassegrain collimation system

    NASA Astrophysics Data System (ADS)

    Chen, Zhibin; Song, Yan; Liu, Xianhong; Xiao, Wenjian

    2014-12-01

    A collimation system is an indispensable part of photoelectric detection equipment. To meet the demands of field on-line detection of photoelectric systems, such a system must satisfy strict requirements on volume and mass and must withstand interference from all sorts of complex weather conditions. To address this problem, this paper designs a reflective Cassegrain collimation system with high temperature adaptability. The technical specifications of the system were first established according to the requirements of practical application; the initial structural parameters were then calculated by Gaussian optics and optimized in Zemax. The simulation results showed that the MTF of the system was close to the diffraction limit, indicating good image quality. The structural tube of the system is made of hard steel, and the primary and secondary mirrors use microcrystalline glass with a low coefficient of thermal expansion, which effectively reduces deformation due to temperature differences while keeping mass and volume nearly unchanged. Experimental results in high- and low-temperature environments also showed that the collimation system kept the beam divergence angle within 30 arcseconds, demonstrating good temperature adaptability, so that it can be used in the field under complex, harsh conditions.
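
    As a hedged illustration of the kind of first-order Gaussian calculation that yields initial structural parameters for a two-mirror Cassegrain, the sketch below computes the mirror separation and secondary focal length from an assumed system focal length, primary focal length, and back focal distance; the numbers are arbitrary examples, not the design described in the paper.

        # First-order layout of a classical Cassegrain (example numbers only).
        def cassegrain_layout(F, f1, b):
            m = F / f1                               # secondary magnification
            d = (F - b) / (m + 1)                    # primary-to-secondary separation
            s_obj = f1 - d                           # secondary to primary focus (virtual object)
            s_img = d + b                            # secondary to final focus
            f2 = 1.0 / (1.0 / s_img - 1.0 / s_obj)   # secondary focal length (negative: convex)
            return {"magnification": m, "separation": d, "secondary_focal_length": f2}

        print(cassegrain_layout(F=2000.0, f1=400.0, b=100.0))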

  8. Adaptive antenna arrays for satellite communications: Design and testing

    NASA Technical Reports Server (NTRS)

    Gupta, I. J.; Swarner, W. G.; Walton, E. K.

    1985-01-01

    When two separate antennas are used with each feedback loop to decorrelate noise, the antennas should be located such that the phase of the interfering signal in the two antennas is the same while the noise in them is uncorrelated. Thus, the antenna patterns and spatial distribution of the auxiliary antennas are quite important and should be carefully selected. The selection and spatial distribution of auxiliary elements is discussed when the main antenna is a center fed reflector antenna. It is shown that offset feeds of the reflector antenna can be used as auxiliary elements of an adaptive array to suppress weak interfering signals. An experimental system is designed to verify the theoretical analysis. The details of the experimental systems are presented.

  9. Wireless thermal sensor network with adaptive low power design.

    PubMed

    Lee, Ho-Yin; Chen, Shih-Lun; Chen, Chiung-An; Huang, Hong-Yi; Luo, Ching-Hsing

    2007-01-01

    There is an increasing need to develop flexible, reconfigurable, and intelligent low-power wireless sensor network (WSN) systems for healthcare applications. Technical advancements in micro-sensors, MEMS devices, low-power electronics, and radio frequency circuits have enabled the design and development of such highly integrated systems. In this paper, we present our proposed wireless thermal sensor network system, which is separated into control and data paths, each with its own transmission frequency. The control path sends power and function commands from the computer to each sensor element over 2.4 GHz RF circuits, and the data path transmits measured data at 2.4 GHz in the sensor layer and at 60 GHz in higher layers. This hierarchical architecture makes reconfigurable mapping and pipelined applications on the WSN possible, and the average power consumption can be efficiently reduced by about 60% by using the adaptive technique.

  10. Impacting patient outcomes through design: acuity adaptable care/universal room design.

    PubMed

    Brown, Katherine Kay; Gallant, Dennis

    2006-01-01

    To succeed in today's challenging healthcare environment, hospitals must examine their impact on customers--patients and families--as well as on staff and physicians. By using competitive facility design and incorporating evidence-based concepts such as the acuity adaptable care delivery model and the universal room, the hospital will realize an impact on patient satisfaction that will enhance market share, on physician satisfaction that will foster loyalty, and on staff satisfaction that will decrease turnover. At the same time, clinical outcomes such as a reduction in mortality and complications, and efficiencies such as a reduction in length of stay and minimization of hospital costs through the elimination of transfers, can be gained. The results achieved depend on the principles used in designing the patient room, which should focus on maximizing patient safety and improving healing. This article reviews key design elements that support the success of an acuity adaptable unit, such as private rooms with zones dedicated to patients, families, and staff; a healing environment; technology; and decentralized nursing stations. Outcomes of institutions currently utilizing the acuity adaptable concept are also reviewed.

  11. Analysis and design of an adaptive lightweight satellite mirror

    NASA Astrophysics Data System (ADS)

    Duerr, Johannes K.; Honke, Robert; Alberti, Mathias V.; Sippel, Rudolf

    2002-07-01

    Future scientific space missions based on interferometric optical and infrared astronomical instruments are currently under development in the United States as well as in Europe. These instruments require optical path length accuracy on the order of a few nanometers across structural dimensions of several meters. This puts extreme demands on static and dynamic structural stability. It is expected that actively controlled, adaptive structures will increasingly have to be used for these satellite applications to overcome the limits of passive structural accuracy. Based on the evaluation of different piezo-active concepts presented two years ago, the analysis and design of an adaptive lightweight satellite mirror, primarily made of carbon-fiber-reinforced plastic with embedded piezoceramic actuators for shape control, is described. Simulation of global mirror performance takes different wavefront sensors and controls for several cases of loading into account. In addition, extensive finite-element optimization of various structural details has been performed. Local material properties of sub-assemblies and geometry effects at the edges of the structure are investigated with respect to their impact on mirror performance. One important result of the analysis was the layout of actuator arrays consisting of specifically designed and custom-made piezoceramic actuators. Prototype manufacturing and testing of active sub-components is described in detail. The results obtained served as a basis for a final update of the finite-element models. The paper concludes with an outline of the manufacturing, testing, and space qualification of the prototype demonstrator of an actively controllable lightweight satellite mirror currently under way. The research work presented in this paper is part of the German industrial research project 'ADAPTRONIK'.

  12. Design, realization and structural testing of a compliant adaptable wing

    NASA Astrophysics Data System (ADS)

    Molinari, G.; Quack, M.; Arrieta, A. F.; Morari, M.; Ermanni, P.

    2015-10-01

    This paper presents the design, optimization, realization and testing of a novel wing morphing concept, based on distributed compliance structures, and actuated by piezoelectric elements. The adaptive wing features ribs with a selectively compliant inner structure, numerically optimized to achieve aerodynamically efficient shape changes while simultaneously withstanding aeroelastic loads. The static and dynamic aeroelastic behavior of the wing, and the effect of activating the actuators, is assessed by means of coupled 3D aerodynamic and structural simulations. To demonstrate the capabilities of the proposed morphing concept and optimization procedure, the wings of a model airplane are designed and manufactured according to the presented approach. The goal is to replace conventional ailerons, thus achieving controllability in roll purely by morphing. The mechanical properties of the manufactured components are characterized experimentally, and used to create a refined and correlated finite element model. The overall stiffness, strength, and actuation capabilities are experimentally tested and successfully compared with the numerical prediction. To counteract the nonlinear hysteretic behavior of the piezoelectric actuators, a closed-loop controller is implemented, and its capability of accurately achieving the desired shape adaptation is evaluated experimentally. Using the correlated finite element model, the aeroelastic behavior of the manufactured wing is simulated, showing that the morphing concept can provide sufficient roll authority to allow controllability of the flight. The additional degrees of freedom offered by morphing can also be used to vary the plane's lift coefficient, similarly to conventional flaps. The efficiency improvements offered by this technique are evaluated numerically, and compared to the performance of a rigid wing.
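
    The closed-loop hysteresis compensation mentioned above can be illustrated with a generic discrete-time PI tracking loop; the gains, the toy first-order actuator model, and the setpoint below are assumptions for illustration and are not the controller implemented in the paper.

        # Generic PI tracking loop for one piezo-actuated shape degree of freedom.
        kp, ki, dt = 0.8, 4.0, 0.001
        target = 1.0                              # desired normalized deflection
        y, integral = 0.0, 0.0                    # toy first-order actuator state

        for step in range(2000):
            error = target - y
            integral += error * dt
            u = kp * error + ki * integral        # commanded actuator voltage (normalized)
            y += dt * (-5.0 * y + 5.0 * 0.9 * u)  # toy plant: time constant 0.2 s, gain 0.9

        print(f"final deflection: {y:.3f} (target {target})")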

  13. Design and implementation of an automated secondary cooling system for the continuous casting of billets.

    PubMed

    Chaudhuri, Subhasis; Singh, Rajeev Kumar; Patwari, Kuntal; Majumdar, Susanta; Ray, Asim Kumar; Singh, Arun Kumar Prasad; Neogi, Nirbhar

    2010-01-01

    This paper describes a heat-transfer-model-based automatic secondary cooling system for a billet caster. The model aims to minimize the variation in surface temperature and excessive reheating of the billet strands. It is also used to avoid the low ductility trough of the solidifying steel, which aggravates the tendency of steel to crack. The system has been designed and implemented in an integrated steel plant. A Programmable Logic Controller (PLC)-based automation system has been developed to control the water flow in the secondary cooling zones of the strand. The results obtained through field trials have shown complete elimination of internal and off-corner cracks for the fifty billet samples that were monitored.
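
    As a hedged sketch of how a model-based setpoint might be trimmed by feedback, the snippet below adjusts a nominal per-zone water flow in proportion to the deviation of the measured strand surface temperature from its target; the gain, limits, and temperatures are hypothetical, and this is not the plant's PLC logic.

        # Simplified secondary-cooling flow correction for one spray zone.
        def zone_flow(nominal_flow_lpm, target_temp_c, measured_temp_c,
                      gain_lpm_per_degc=0.05, flow_min=10.0, flow_max=120.0):
            correction = gain_lpm_per_degc * (measured_temp_c - target_temp_c)
            return min(flow_max, max(flow_min, nominal_flow_lpm + correction))

        print(zone_flow(nominal_flow_lpm=60.0, target_temp_c=1000.0, measured_temp_c=1035.0))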

  14. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  15. Research Initiatives and Preliminary Results In Automation Design In Airspace Management in Free Flight

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    The NASA and the FAA have entered into a joint venture to explore, define, design and implement a new airspace management operating concept. The fundamental premise of that concept is that technologies and procedures need to be developed for flight deck and ground operations to improve the efficiency, the predictability, the flexibility and the safety of airspace management and operations. To that end NASA Ames has undertaken an initial development and exploration of "key concepts" in the free flight airspace management technology development. Human factors issues in automation aiding design, coupled aiding systems between air and ground, communication protocols in distributed decision making, and analytic techniques for defining concepts of airspace density and operator cognitive load have been examined. This paper reports the progress of these efforts, which are not intended to definitively solve the many evolving issues of design for future ATM systems, but to provide preliminary results to chart the parameters of performance and the topology of the analytic effort required. The preliminary research in provision of cockpit display of traffic information, dynamic density definition, distributed decision making, situation awareness models and human performance models is discussed as it focuses on the theme of "design requirements".

  16. Automated fit quantification of tibial nail designs during the insertion using computer three-dimensional modelling.

    PubMed

    Amarathunga, Jayani P; Schuetz, Michael A; Yarlagadda, Prasad Kvd; Schmutz, Beat

    2014-12-01

    Intramedullary nailing is the standard fixation method for displaced diaphyseal fractures of the tibia. An optimal nail design should both facilitate insertion and anatomically fit the bone geometry at its final position in order to reduce the risk of stress fractures and malalignments. Due to the nonexistence of suitable commercial software, we developed a software tool for the automated fit assessment of nail designs. Furthermore, we demonstrated that an optimised nail, which fits better at the final position, is also easier to insert. Three-dimensional models of two nail designs and 20 tibiae were used. The fitting was quantified in terms of surface area, maximum distance, sum of surface areas and sum of maximum distances by which the nail was protruding into the cortex. The software was programmed to insert the nail into the bone model and to quantify the fit at defined increment levels. On average, the misfit during the insertion in terms of the four fitting parameters was smaller for the Expert Tibial Nail Proximal bend (476.3 mm(2), 1.5 mm, 2029.8 mm(2), 6.5 mm) than the Expert Tibial Nail (736.7 mm(2), 2.2 mm, 2491.4 mm(2), 8.0 mm). The differences were statistically significant (p ≤ 0.05). The software could be used by nail implant manufacturers for the purpose of implant design validation.
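
    The fitting parameters reported above reduce to simple aggregations once a signed penetration depth is known for each element of the nail surface mesh. The sketch below assumes that upstream geometry code has already computed those depths (positive meaning the nail protrudes into the cortex) and only illustrates the bookkeeping; the toy mesh data are placeholders.

        # Aggregating fit metrics at one insertion increment.
        import numpy as np

        def fit_metrics(tri_areas_mm2, penetration_mm):
            protruding = penetration_mm > 0.0
            surface_area = float(tri_areas_mm2[protruding].sum())      # mm^2 outside the canal
            max_distance = max(float(penetration_mm.max()), 0.0)       # worst protrusion, mm
            return surface_area, max_distance

        areas = np.full(1000, 0.5)                      # toy mesh: 1000 triangles of 0.5 mm^2
        depths = np.random.normal(-1.0, 0.8, 1000)      # mostly inside the medullary canal
        print(fit_metrics(areas, depths))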

  17. Pilot opinions on high level flight deck automation issues: Toward the development of a design philosophy

    NASA Technical Reports Server (NTRS)

    Tenney, Yvette J.; Rogers, William H.; Pew, Richard W.

    1995-01-01

    There has been much concern in recent years about the rapid increase in automation on commercial flight decks. To inform the development of a design philosophy, a survey of pilots was conducted; it was composed of three major sections. The first section asked pilots to rate different automation components that exist on the latest commercial aircraft regarding their obtrusiveness and the attention and effort required in using them. The second section addressed general 'automation philosophy' issues. The third section focused on issues related to levels and amount of automation. The results indicate that pilots of advanced aircraft like their automation, use it, and would welcome more automation. However, they also believe that automation has many disadvantages, especially fully autonomous automation. They want their automation to be simple and reliable and to produce predictable results. The biggest needs for higher levels of automation were in pre-flight, communication, systems management, and task management functions, planning as well as response tasks, and high workload situations. There is an irony and a challenge in the implications of these findings. On the one hand pilots would like new automation to be simple and reliable, but they need it to support the most complex part of the job--managing and planning tasks in high workload situations.

  18. Automated Work Package: Initial Wireless Communication Platform Design, Development, and Evaluation

    SciTech Connect

    Al Rashdan, Ahmad Yahya Mohammad; Agarwal, Vivek

    2016-03-01

    The Department of Energy’s Light Water Reactor Sustainability Program is developing the scientific basis to ensure long-term reliability, productivity, safety, and security of the nuclear power industry in the United States. The Instrumentation, Information, and Control (II&C) pathway of the program aims to increase the role of advanced II&C technologies to achieve this objective. One of the pathway efforts at Idaho National Laboratory (INL) is to improve the work package execution process by replacing the expensive, inefficient, bulky, complex, and error-prone paper-based work orders with automated work packages (AWPs). An AWP is an automated and dynamic presentation of the work package designed to guide the user through the work process. It is loaded on a mobile device, such as a tablet, and is capable of communicating with plant equipment and systems to acquire plant and procedure states. The AWP replaces those functions where a computer is more efficient and reliable than a human. To enable the automatic acquisition of plant data, it is necessary to design and develop a prototype platform for data exchange between the field instruments and the AWP mobile devices. The development of the platform aims to reveal issues and solutions generalizable to large-scale implementation of a similar system. Topics such as bandwidth, robustness, response time, interference, and security are usually associated with wireless communication. These concerns, along with other requirements, are listed in an earlier INL report. Specifically, the targeted issues and performance aspects in this work are relevant to the communication infrastructure from the perspective of promptness, robustness, expandability, and interoperability with different technologies.

  19. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework built on the Adaptive Modeling Language (AML) and supporting configuration design and parametric CFD grid generation. This report focuses on the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  20. De novo automated design of small RNA circuits for engineering synthetic riboregulation in living cells

    PubMed Central

    Rodrigo, Guillermo; Landrain, Thomas E.; Jaramillo, Alfonso

    2012-01-01

    A grand challenge in synthetic biology is to use our current knowledge of RNA science to perform the automatic engineering of completely synthetic sequences encoding functional RNAs in living cells. We report here a fully automated design methodology and experimental validation of synthetic RNA interaction circuits working in a cellular environment. The computational algorithm, based on a physicochemical model, produces novel RNA sequences by exploring the space of possible sequences compatible with predefined structures. We tested our methodology in Escherichia coli by designing several positive riboregulators with diverse structures and interaction models, suggesting that only the energy of formation and the activation energy (free energy barrier to overcome for initiating the hybridization reaction) are sufficient criteria to engineer RNA interaction and regulation in bacteria. The designed sequences exhibit nonsignificant similarity to any known noncoding RNA sequence. Our riboregulatory devices work independently and in combination with transcription regulation to create complex logic circuits. Our results demonstrate that a computational methodology based on first-principles can be used to engineer interacting RNAs with allosteric behavior in living cells. PMID:22949707

  1. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    NASA Astrophysics Data System (ADS)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  2. Using Formal Modeling With an Automated Analysis Tool to Design and Parametrically Analyze a Multirobot Coordination Protocol: A Case Study

    DTIC Science & Technology

    2007-05-01

    mode with predictable performance. However, requiring stable equilibria to lie in the given regions is difficult in all but the simplest topological spaces. A game-theoretic approach to designing controllers for hybrid systems with a hierarchical structure is shown to be applicable to automated

  3. Beyond the Design of Automated Writing Evaluation: Pedagogical Practices and Perceived Learning Effectiveness in EFL Writing Classes

    ERIC Educational Resources Information Center

    Chen, Chi-Fen Emily; Cheng, Wei-Yuan Eugene

    2008-01-01

    Automated writing evaluation (AWE) software is designed to provide instant computer-generated scores for a submitted essay along with diagnostic feedback. Most studies on AWE have been conducted on psychometric evaluations of its validity; however, studies on how effectively AWE is used in writing classes as a pedagogical tool are limited. This…

  4. System Design and Development of a Robotic Device for Automated Venipuncture and Diagnostic Blood Cell Analysis

    PubMed Central

    Balter, Max L.; Chen, Alvin I.; Fromholtz, Alex; Gorshkov, Alex; Maguire, Tim J.; Yarmush, Martin L.

    2016-01-01

    Diagnostic blood testing is the most prevalent medical procedure performed in the world and forms the cornerstone of modern health care delivery. Yet blood tests are still predominantly carried out in centralized labs using large-volume samples acquired by manual venipuncture, and no end-to-end solution from blood draw to sample analysis exists today. Our group is developing a platform device that merges robotic phlebotomy with automated diagnostics to rapidly deliver patient information at the site of the blood draw. The system couples an image-guided venipuncture robot, designed to address the challenges of routine venous access, with a centrifuge-based blood analyzer to obtain quantitative measurements of hematology. In this paper, we first present the system design and architecture of the integrated device. We then perform a series of in vitro experiments to evaluate the cannulation accuracy of the system on blood vessel phantoms. Next, we assess the effects of vessel diameter, needle gauge, flow rate, and viscosity on the rate of sample collection. Finally, we demonstrate proof-of-concept of a white cell assay on the blood analyzer using in vitro human samples spiked with fluorescently labeled microbeads. PMID:28239509

  5. Design of a Tool Integrating Force Sensing With Automated Insertion in Cochlear Implantation.

    PubMed

    Schurzig, Daniel; Labadie, Robert F; Hussong, Andreas; Rau, Thomas S; Webster, Robert J

    2012-04-01

    The quality of hearing restored to a deaf patient by a cochlear implant in hearing preservation cochlear implant surgery (and possibly also in routine cochlear implant surgery) is believed to depend on preserving delicate cochlear membranes while accurately inserting an electrode array deep into the spiral cochlea. Membrane rupture forces, and possibly other indicators of suboptimal placement, are below the threshold detectable by human hands, motivating a force-sensing insertion tool. Furthermore, recent studies have shown significant variability in manual insertion forces and velocities that may explain some instances of imperfect placement. Toward addressing this, an automated insertion tool was recently developed by Hussong et al. By following the same insertion tool concept, in this paper, we present mechanical enhancements that improve the surgeon's interface with the device and make it smaller and lighter. We also present the electromechanical design of new components enabling integrated force sensing. The tool is designed to be sufficiently compact and light that it can be mounted to a microstereotactic frame for accurate image-guided preinsertion positioning. The new integrated force sensing system is capable of resolving forces as small as 0.005 N, and we provide experimental illustration of using forces to detect errors in electrode insertion.

  6. System Design and Development of a Robotic Device for Automated Venipuncture and Diagnostic Blood Cell Analysis.

    PubMed

    Balter, Max L; Chen, Alvin I; Fromholtz, Alex; Gorshkov, Alex; Maguire, Tim J; Yarmush, Martin L

    2016-10-01

    Diagnostic blood testing is the most prevalent medical procedure performed in the world and forms the cornerstone of modern health care delivery. Yet blood tests are still predominantly carried out in centralized labs using large-volume samples acquired by manual venipuncture, and no end-to-end solution from blood draw to sample analysis exists today. Our group is developing a platform device that merges robotic phlebotomy with automated diagnostics to rapidly deliver patient information at the site of the blood draw. The system couples an image-guided venipuncture robot, designed to address the challenges of routine venous access, with a centrifuge-based blood analyzer to obtain quantitative measurements of hematology. In this paper, we first present the system design and architecture of the integrated device. We then perform a series of in vitro experiments to evaluate the cannulation accuracy of the system on blood vessel phantoms. Next, we assess the effects of vessel diameter, needle gauge, flow rate, and viscosity on the rate of sample collection. Finally, we demonstrate proof-of-concept of a white cell assay on the blood analyzer using in vitro human samples spiked with fluorescently labeled microbeads.

  7. Current Practice in Designing Training for Complex Skills: Implications for Design and Evaluation of ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Schuver-van Blanken, Marian J.; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project coordinated by the Dutch National Aerospace Laboratory. The aim of ADAPT[IT] is to create and validate an effective training design methodology, based on cognitive science and leading to the integration of advanced technologies, so that the…

  8. A Practical Approach for Integrating Automatically Designed Fixtures with Automated Assembly Planning

    SciTech Connect

    Calton, Terri L.; Peters, Ralph R.

    1999-07-20

    This paper presents a practical approach for integrating automatically designed fixtures with automated assembly planning. Product assembly problems vary widely; here the focus is on assemblies that are characterized by a single base part to which a number of smaller parts and subassemblies are attached. This method starts with three-dimensional CAD descriptions of an assembly whose assembly tasks require a fixture to hold the base part. It then combines algorithms that automatically design assembly pallets to hold the base part with algorithms that automatically generate assembly sequences. The designed fixtures rigidly constrain and locate the part, obey task constraints, are robust to part shape variations, are easy to load, and are economical to produce. The algorithm is guaranteed to find the global optimum solution that satisfies these and other pragmatic conditions. The assembly planner consists of four main elements: a user interface, a constraint system, a search engine, and an animation module. The planner expresses all constraints at a sequencing level, specifying orders and conditions on part mating operations in a number of ways. Fast replanning enables an interactive plan-view-constrain-replan cycle that aids in constraint discovery and documentation. The combined algorithms guarantee that the fixture will hold the base part without interfering with any of the assembly operations. This paper presents an overview of the planners, the integration approach, and the results of the integrated algorithms applied to several practical manufacturing problems. For these problems, initial high-quality fixture designs and assembly sequences are generated in a matter of minutes with global optimum solutions identified in just over an hour.

  9. Adaptive filter design using recurrent cerebellar model articulation controller.

    PubMed

    Lin, Chih-Min; Chen, Li-Yang; Yeung, Daniel S

    2010-07-01

    A novel adaptive filter is proposed using a recurrent cerebellar-model-articulation-controller (CMAC). The proposed locally recurrent globally feedforward recurrent CMAC (RCMAC) has favorable properties of small size, good generalization, rapid learning, and dynamic response, thus it is more suitable for high-speed signal processing. To provide fast training, an efficient parameter learning algorithm based on the normalized gradient descent method is presented, in which the learning rates are on-line adapted. Then the Lyapunov function is utilized to derive the conditions of the adaptive learning rates, so the stability of the filtering error can be guaranteed. To demonstrate the performance of the proposed adaptive RCMAC filter, it is applied to a nonlinear channel equalization system and an adaptive noise cancelation system. The advantages of the proposed filter over other adaptive filters are verified through simulations.
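
    For readers unfamiliar with the baseline that such filters are typically judged against, the sketch below shows a conventional normalized-LMS adaptive noise canceller with a normalized gradient-descent update. It is a plain linear filter used for comparison only, not the recurrent CMAC proposed in the paper, and the signal, noise path, and step size are made up.

        # Baseline normalized-LMS adaptive noise canceller (comparison only, not RCMAC).
        import numpy as np

        rng = np.random.default_rng(0)
        n, taps, mu = 5000, 8, 0.5
        noise = rng.standard_normal(n)                       # reference noise input
        signal = np.sin(2 * np.pi * 0.01 * np.arange(n))     # desired clean signal
        path = np.array([0.6, -0.3, 0.1])                    # unknown causal noise path
        primary = signal + np.convolve(noise, path)[:n]      # corrupted primary sensor

        w = np.zeros(taps)
        cleaned = np.zeros(n)
        for k in range(taps, n):
            x = noise[k - taps + 1:k + 1][::-1]              # current and recent reference samples
            y = w @ x                                        # estimate of the noise component
            e = primary[k] - y                               # error signal = cleaned output
            w += mu * e * x / (x @ x + 1e-8)                 # normalized gradient-descent update
            cleaned[k] = e

        print("residual error power:", float(np.mean((cleaned[taps:] - signal[taps:]) ** 2)))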

  10. Accelerated search for materials with targeted properties by adaptive design

    PubMed Central

    Xue, Dezhen; Balachandran, Prasanna V.; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-01-01

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ∼800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set. PMID:27079901
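
    The exploit/explore selection step at the heart of such a loop can be sketched generically. The snippet below fits a Gaussian-process surrogate to a handful of measured points and scores candidates by expected improvement for a minimization target; the one-dimensional toy property stands in for the measured hysteresis and is not the authors' model.

        # One iteration of a generic adaptive-design loop (toy property, illustrative only).
        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(1)

        def toy_property(x):                              # stand-in for a measured property
            return (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

        X = rng.uniform(0, 1, (8, 1))                     # candidates measured so far
        y = toy_property(X).ravel()
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

        candidates = np.linspace(0, 1, 200).reshape(-1, 1)
        mu, sigma = gp.predict(candidates, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sigma, 1e-9)
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
        ei[sigma < 1e-9] = 0.0
        print("next candidate to synthesize:", float(candidates[np.argmax(ei)]))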

  11. Designing for Change: Interoperability in a scaling and adapting environment

    NASA Astrophysics Data System (ADS)

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.

  12. Optical Design and Optimization of Translational Reflective Adaptive Optics Ophthalmoscopes

    NASA Astrophysics Data System (ADS)

    Sulai, Yusufu N. B.

    The retina serves as the primary detector for the biological camera that is the eye. It is composed of numerous classes of neurons and support cells that work together to capture and process an image formed by the eye's optics, which is then transmitted to the brain. Loss of sight due to retinal or neuro-ophthalmic disease can prove devastating to one's quality of life, and the ability to examine the retina in vivo is invaluable in the early detection and monitoring of such diseases. Adaptive optics (AO) ophthalmoscopy is a promising diagnostic tool in early stages of development, still facing significant challenges before it can become a clinical tool. The work in this thesis is a collection of projects with the overarching goal of broadening the scope and applicability of this technology. We begin by providing an optical design approach for AO ophthalmoscopes that reduces the aberrations that degrade the performance of the AO correction. Next, we demonstrate how to further improve image resolution through the use of amplitude pupil apodization and non-common path aberration correction. This is followed by the development of a viewfinder which provides a larger field of view for retinal navigation. Finally, we conclude with the development of an innovative non-confocal light detection scheme which improves the non-invasive visualization of retinal vasculature and reveals the cone photoreceptor inner segments in healthy and diseased eyes.

  13. Designing Adaptive Instructional Environments: Insights from Empirical Evidence

    DTIC Science & Technology

    2011-10-01

    Similarly, Shute and Zapata-Rivera (2008) define adaptivity as the capability of a system to alter its behavior according to learner needs and...60(2), 265-306. Landsberg, C. R., Van Buskirk, W. L., Astwood Jr., R. A., Mercado, A. D., & Aakre, A. J. (2010). Adaptive training considerations...22, 77-92. Shute, V. J., & Zapata-Rivera, D. (2008). Adaptive technologies. In J. M. Spector, M. D. Merrill, J. J. G. van Merriënboer, & M. Driscoll

  14. Case Studies in Computer Adaptive Test Design through Simulation.

    ERIC Educational Resources Information Center

    Eignor, Daniel R.; And Others

    The extensive computer simulation work done in developing the computer adaptive versions of the Graduate Record Examinations (GRE) Board General Test and the College Board Admissions Testing Program (ATP) Scholastic Aptitude Test (SAT) is described in this report. Both the GRE General and SAT computer adaptive tests (CATs), which are fixed length…

  15. Program user's manual for optimizing the design of a liquid or gaseous propellant rocket engine with the automated combustor design code AUTOCOM

    NASA Technical Reports Server (NTRS)

    Reichel, R. H.; Hague, D. S.; Jones, R. T.; Glatt, C. R.

    1973-01-01

    This computer program manual describes in two parts the automated combustor design optimization code AUTOCOM. The program code is written in the FORTRAN 4 language. The input data setup and the program outputs are described, and a sample engine case is discussed. The program structure and programming techniques are also described, along with AUTOCOM program analysis.

  16. Integrated development process for automated or driver-assist vehicles

    NASA Astrophysics Data System (ADS)

    Borges de Sousa, Joao; Misener, James A.; Sengupta, Raja

    2002-07-01

    We describe the methodology, tools and technologies for designing and implementing communication and control systems for networked automated or driver assist vehicles. In addressing design, we discuss enabling methodologies and our suite of enabling computational tools for formal modeling, simulation, and implementation. We illustrate our description with design, development and implementation work we have performed for Automated Highway Systems, Autonomous Underwater Vehicles, Mobile Offshore Base, Unmanned Air Vehicles, and Cooperative Adaptive Cruise Control. We conclude with the assertion - borne from our experience - that ground vehicle systems with any degree of automated operation could benefit from the type of integrated development process that we describe.
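
    Among the applications listed is Cooperative Adaptive Cruise Control, whose core control law is compact enough to sketch. The snippet below is a common textbook constant-time-gap spacing law with feedforward of the preceding vehicle's acceleration received over the communication link; the gains and numbers are illustrative assumptions, not the authors' design.

        # Illustrative constant-time-gap CACC control law (generic form).
        def cacc_accel(gap_m, ego_speed, lead_speed, lead_accel,
                       time_gap_s=0.6, standstill_m=2.0, kp=0.45, kd=0.25, kff=1.0):
            desired_gap = standstill_m + time_gap_s * ego_speed
            gap_error = gap_m - desired_gap          # positive when following too far back
            speed_error = lead_speed - ego_speed
            return kp * gap_error + kd * speed_error + kff * lead_accel

        print(cacc_accel(gap_m=18.0, ego_speed=25.0, lead_speed=24.0, lead_accel=-0.5))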

  17. Time domain and frequency domain design techniques for model reference adaptive control systems

    NASA Technical Reports Server (NTRS)

    Boland, J. S., III

    1971-01-01

    Some problems associated with the design of model-reference adaptive control systems are considered and solutions to these problems are advanced. The stability of the adapted system is a primary consideration in the development of both the time-domain and the frequency-domain design techniques. Consequently, the use of Liapunov's direct method forms an integral part of the derivation of the design procedures. The application of sensitivity coefficients to the design of model-reference adaptive control systems is considered. An application of the design techniques is also presented.
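
    A minimal time-domain example of the Lyapunov-based approach is a first-order plant tracking a first-order reference model, with the standard textbook adaptation laws derived from a quadratic Lyapunov function; the plant parameters, adaptation gain, and reference signal below are illustrative and are not taken from the report.

        # Minimal Lyapunov-based model-reference adaptive control of a first-order plant.
        a, b = 1.0, 2.0            # plant (treated as unknown): dy/dt = -a*y + b*u
        am, bm = 4.0, 4.0          # reference model: dym/dt = -am*ym + bm*r
        gamma, dt = 2.0, 0.001
        y = ym = th1 = th2 = 0.0

        for k in range(20000):
            r = 1.0 if (k * dt) % 4 < 2 else -1.0   # square-wave reference input
            u = th1 * r - th2 * y                   # adjustable controller
            e = y - ym                              # tracking error
            th1 += dt * (-gamma * e * r)            # Lyapunov-derived adaptation laws
            th2 += dt * (gamma * e * y)
            y += dt * (-a * y + b * u)
            ym += dt * (-am * ym + bm * r)

        print(f"tracking error magnitude after 20 s: {abs(e):.4f}")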

  18. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  19. A novel automated instrument designed to determine photosensitivity thresholds (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Aguilar, Mariela C.; Gonzalez, Alex; Rowaan, Cornelis; De Freitas, Carolina; Rosa, Potyra R.; Alawa, Karam; Lam, Byron L.; Parel, Jean-Marie A.

    2016-03-01

    As there is no clinically available instrument to systematically and reliably determine the photosensitivity thresholds of patients with dry eyes, blepharospasms, migraines, traumatic brain injuries, and genetic disorders such as Achromatopsia, retinitis pigmentosa and other retinal dysfunctions, a computer-controlled optoelectronics system was designed. The BPEI Photosensitivity System provides light stimuli emitted from a concave bi-cupola array of 210 white LEDs, with intensity ranging from 1 to 32,000 lux. The system can utilize either a normal or an enhanced testing mode for subjects with low light tolerance. The automated instrument adjusts the intensity of each light stimulus. The subject is instructed to indicate discomfort by pressing a hand-held button. Reliability of the responses is tracked during the test. The photosensitivity threshold is then calculated after 10 response reversals. In a preliminary study, we demonstrated that subjects suffering from Achromatopsia experienced lower photosensitivity thresholds than normal subjects. Hence, the system can safely and reliably determine the photosensitivity thresholds of healthy and light-sensitive subjects by detecting and quantifying the individual differences. Future studies will be performed with this system to determine the photosensitivity threshold differences between normal subjects and subjects suffering from other conditions that affect light sensitivity.
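
    The reversal-based threshold estimate described above follows the familiar adaptive-staircase pattern. The sketch below is a generic staircase with a simulated observer, multiplicative step sizes, and a 32,000 lux ceiling; the observer model, step factors, and stopping details are assumptions, not the instrument's implementation.

        # Generic adaptive staircase: the threshold is the mean of 10 reversal intensities.
        import random

        def simulated_discomfort(lux, true_threshold=5000.0):
            # Placeholder observer: reports discomfort near a noisy internal threshold.
            return lux >= true_threshold * random.uniform(0.85, 1.15)

        lux, prev_direction, reversals = 1.0, None, []
        while len(reversals) < 10 and lux <= 32000:
            direction = -1 if simulated_discomfort(lux) else +1      # -1: step down, +1: step up
            if prev_direction is not None and direction != prev_direction:
                reversals.append(lux)                                # response direction flipped
            prev_direction = direction
            lux = lux * 2.0 if direction > 0 else lux * 0.7          # multiplicative steps

        threshold = sum(reversals) / max(len(reversals), 1)
        print(f"estimated photosensitivity threshold: {threshold:.0f} lux")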

  20. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers

    PubMed Central

    Espah Borujeni, Amin; Mishler, Dennis M.; Wang, Jingzhi; Huso, Walker; Salis, Howard M.

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription–translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications. PMID:26621913

  1. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers.

    PubMed

    Espah Borujeni, Amin; Mishler, Dennis M; Wang, Jingzhi; Huso, Walker; Salis, Howard M

    2016-01-08

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription-translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications.

  2. Design and implementation of a highly integrated and automated in situ bioremediation system for petroleum hydrocarbons

    SciTech Connect

    Dey, J.C.; Rosenwinkel, P.; Norris, R.D.

    1996-12-31

    The proposed sale of an industrial property required that an environmental investigation be conducted as part of the property transfer agreement. The investigation revealed petroleum hydrocarbon compounds (PHCs) in the subsurface. The light nonaqueous phase liquids (LNAPLs) varsol (a gasoline-like solvent), gasoline, and fuel oil were found across a three (3) acre area and were present as liquid phase PHCs, as dissolved phase PHCs, and as adsorbed phase PHCs in both saturated and unsaturated soils. Fuel oil was largely present in the unsaturated soils. Varsol represented the majority of the PHCs present. The presence of liquid phase PHCs suggested that any remedial action incorporate free phase recovery. The volatility of varsol and gasoline and the biodegradability of the PHCs present in the subsurface suggested that bioremediation, air sparging, and soil vapor extraction/bioventing were appropriate technologies for incorporation in a remedy. The imminent conversion of the impacted area to a retail facility required that any long term remedy be unobtrusive and require minimum activity across much of the impacted area. In the following sections the site investigation, selection and testing of remedial technologies, and design and implementation of an integrated and automated remedial system are discussed.

  3. An adaptive optics imaging system designed for clinical use.

    PubMed

    Zhang, Jie; Yang, Qiang; Saito, Kenichi; Nozato, Koji; Williams, David R; Rossi, Ethan A

    2015-06-01

    Here we demonstrate a new imaging system that addresses several major problems limiting the clinical utility of conventional adaptive optics scanning light ophthalmoscopy (AOSLO), including its small field of view (FOV), reliance on patient fixation for targeting imaging, and substantial post-processing time. We previously showed an efficient image based eye tracking method for real-time optical stabilization and image registration in AOSLO. However, in patients with poor fixation, eye motion causes the FOV to drift substantially, causing this approach to fail. We solve that problem here by tracking eye motion at multiple spatial scales simultaneously by optically and electronically integrating a wide FOV SLO (WFSLO) with an AOSLO. This multi-scale approach, implemented with fast tip/tilt mirrors, has a large stabilization range of ± 5.6°. Our method consists of three stages implemented in parallel: 1) coarse optical stabilization driven by a WFSLO image, 2) fine optical stabilization driven by an AOSLO image, and 3) sub-pixel digital registration of the AOSLO image. We evaluated system performance in normal eyes and diseased eyes with poor fixation. Residual image motion with incremental compensation after each stage was: 1) ~2-3 arc minutes, (arcmin) 2) ~0.5-0.8 arcmin and, 3) ~0.05-0.07 arcmin, for normal eyes. Performance in eyes with poor fixation was: 1) ~3-5 arcmin, 2) ~0.7-1.1 arcmin and 3) ~0.07-0.14 arcmin. We demonstrate that this system is capable of reducing image motion by a factor of ~400, on average. This new optical design provides additional benefits for clinical imaging, including a steering subsystem for AOSLO that can be guided by the WFSLO to target specific regions of interest such as retinal pathology and real-time averaging of registered images to eliminate image post-processing.

  4. An adaptive optics imaging system designed for clinical use

    PubMed Central

    Zhang, Jie; Yang, Qiang; Saito, Kenichi; Nozato, Koji; Williams, David R.; Rossi, Ethan A.

    2015-01-01

    Here we demonstrate a new imaging system that addresses several major problems limiting the clinical utility of conventional adaptive optics scanning light ophthalmoscopy (AOSLO), including its small field of view (FOV), reliance on patient fixation for targeting imaging, and substantial post-processing time. We previously showed an efficient image based eye tracking method for real-time optical stabilization and image registration in AOSLO. However, in patients with poor fixation, eye motion causes the FOV to drift substantially, causing this approach to fail. We solve that problem here by tracking eye motion at multiple spatial scales simultaneously by optically and electronically integrating a wide FOV SLO (WFSLO) with an AOSLO. This multi-scale approach, implemented with fast tip/tilt mirrors, has a large stabilization range of ± 5.6°. Our method consists of three stages implemented in parallel: 1) coarse optical stabilization driven by a WFSLO image, 2) fine optical stabilization driven by an AOSLO image, and 3) sub-pixel digital registration of the AOSLO image. We evaluated system performance in normal eyes and diseased eyes with poor fixation. Residual image motion with incremental compensation after each stage was: 1) ~2–3 arc minutes, (arcmin) 2) ~0.5–0.8 arcmin and, 3) ~0.05–0.07 arcmin, for normal eyes. Performance in eyes with poor fixation was: 1) ~3–5 arcmin, 2) ~0.7–1.1 arcmin and 3) ~0.07–0.14 arcmin. We demonstrate that this system is capable of reducing image motion by a factor of ~400, on average. This new optical design provides additional benefits for clinical imaging, including a steering subsystem for AOSLO that can be guided by the WFSLO to target specific regions of interest such as retinal pathology and real-time averaging of registered images to eliminate image post-processing. PMID:26114033

  5. Adaptive design clinical trials and trial logistics models in CNS drug development.

    PubMed

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2011-02-01

    In central nervous system therapeutic areas, there are general concerns with establishing efficacy that are thought to be sources of the high attrition rate in drug development. For instance, efficacy endpoints are often subjective and highly variable. There is a lack of robust or operational biomarkers to substitute for soft endpoints. In addition, animal models are generally poor, unreliable or unpredictive. To increase the probability of success in a central nervous system drug development program, adaptive design has been considered as an alternative that provides flexibility relative to the conventional fixed designs and has been viewed as having the potential to improve the efficiency of drug development processes. In addition, successful implementation of an adaptive design trial relies on establishment of a trustworthy logistics model that ensures integrity of the trial conduct. In accordance with the spirit of the U.S. Food and Drug Administration adaptive design draft guidance document recently released, this paper outlines the critical considerations from both methodological aspects and regulatory aspects in reviewing an adaptive design proposal and discusses two general types of adaptations, sample size planning and re-estimation, and two-stage adaptive design. Literature examples of adaptive designs in central nervous system are used to highlight the principles laid out in the U.S. FDA draft guidance. Four logistics models seen in regulatory adaptive design applications are introduced. In general, complex adaptive designs require simulation studies to assess the design performance. For an adequate and well-controlled clinical trial, if a Learn-and-Confirm adaptive selection approach is considered, the study-wise type I error rate should be adhered to. However, it is controversial to use the simulated type I error rate to address a strong control of the study-wise type I error rate.
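
    Simulation is the usual way such designs are checked for control of the study-wise type I error. The sketch below is a toy Monte-Carlo experiment, not an example from the guidance: a two-stage design enlarges the second stage when the unblinded interim effect looks small and then applies a naive pooled z-test, and the simulated rejection rate under the null can be compared with the nominal level.

        # Toy Monte-Carlo check of the type I error of a naive two-stage adaptive design.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)
        n1, n2, n2_max, sims, alpha = 50, 50, 450, 20000, 0.025
        rejections = 0
        for _ in range(sims):
            stage1 = rng.standard_normal(n1)                  # data under H0: mean 0, sd 1
            n2_used = n2_max if stage1.mean() < 0.2 else n2   # unblinded sample-size re-estimation
            stage2 = rng.standard_normal(n2_used)
            pooled = np.concatenate([stage1, stage2])
            z = pooled.mean() * np.sqrt(len(pooled))          # naive z-test ignoring the adaptation
            rejections += z > norm.ppf(1 - alpha)

        print(f"simulated type I error: {rejections / sims:.4f} (nominal {alpha})")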

  6. Design of an Adaptive Human-Machine System Based on Dynamical Pattern Recognition of Cognitive Task-Load.

    PubMed

    Zhang, Jianhua; Yin, Zhong; Wang, Rubin

    2017-01-01

    This paper developed a cognitive task-load (CTL) classification algorithm and allocation strategy to sustain the optimal operator CTL levels over time in safety-critical human-machine integrated systems. An adaptive human-machine system is designed based on a non-linear dynamic CTL classifier, which maps a set of electroencephalogram (EEG) and electrocardiogram (ECG) related features to a few CTL classes. The least-squares support vector machine (LSSVM) is used as dynamic pattern classifier. A series of electrophysiological and performance data acquisition experiments were performed on seven volunteer participants under a simulated process control task environment. The participant-specific dynamic LSSVM model is constructed to classify the instantaneous CTL into five classes at each time instant. The initial feature set, comprising 56 EEG and ECG related features, is reduced to a set of 12 salient features (including 11 EEG-related features) by using the locality preserving projection (LPP) technique. An overall correct classification rate of about 80% is achieved for the 5-class CTL classification problem. Then the predicted CTL is used to adaptively allocate the number of process control tasks between operator and computer-based controller. Simulation results showed that the overall performance of the human-machine system can be improved by using the adaptive automation strategy proposed.
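
    The classifier family named above, a least-squares support vector machine, reduces training to a single linear system. The following minimal two-class RBF-kernel LS-SVM in NumPy is a sketch of that family only; the synthetic 12-dimensional features standing in for the reduced EEG/ECG feature set, and all parameter values, are assumptions rather than the authors' participant-specific model.

```python
# Minimal two-class least-squares SVM (LS-SVM) with an RBF kernel.
# Training reduces to solving one linear system; this sketches the classifier
# family named in the abstract, not the authors' CTL model.
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=0.5):
    """X: (n, d) features; y: labels in {-1, +1}."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    # KKT system:  [0   y^T        ] [b    ]   [0]
    #              [y   Omega + I/C] [alpha] = [1]
    Omega = (y[:, None] * y[None, :]) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]            # bias b, dual weights alpha

def lssvm_predict(X_train, y_train, b, alpha, X_new, gamma=0.5):
    K = rbf_kernel(X_new, X_train, gamma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy stand-in for two cognitive-load classes in a 12-dimensional feature space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (40, 12)), rng.normal(2, 1, (40, 12))])
y = np.concatenate([-np.ones(40), np.ones(40)])
b, alpha = lssvm_train(X, y)
print("training accuracy:", (lssvm_predict(X, y, b, alpha, X) == y).mean())
```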

  7. Design of an Adaptive Human-Machine System Based on Dynamical Pattern Recognition of Cognitive Task-Load

    PubMed Central

    Zhang, Jianhua; Yin, Zhong; Wang, Rubin

    2017-01-01

    This paper developed a cognitive task-load (CTL) classification algorithm and allocation strategy to sustain the optimal operator CTL levels over time in safety-critical human-machine integrated systems. An adaptive human-machine system is designed based on a non-linear dynamic CTL classifier, which maps a set of electroencephalogram (EEG) and electrocardiogram (ECG) related features to a few CTL classes. The least-squares support vector machine (LSSVM) is used as dynamic pattern classifier. A series of electrophysiological and performance data acquisition experiments were performed on seven volunteer participants under a simulated process control task environment. The participant-specific dynamic LSSVM model is constructed to classify the instantaneous CTL into five classes at each time instant. The initial feature set, comprising 56 EEG and ECG related features, is reduced to a set of 12 salient features (including 11 EEG-related features) by using the locality preserving projection (LPP) technique. An overall correct classification rate of about 80% is achieved for the 5-class CTL classification problem. Then the predicted CTL is used to adaptively allocate the number of process control tasks between operator and computer-based controller. Simulation results showed that the overall performance of the human-machine system can be improved by using the adaptive automation strategy proposed. PMID:28367110

  8. Design of an ultra-portable field transfer radiometer supporting automated vicarious calibration

    NASA Astrophysics Data System (ADS)

    Anderson, Nikolaus; Thome, Kurtis; Czapla-Myers, Jeffrey; Biggar, Stuart

    2015-09-01

    The University of Arizona Remote Sensing Group (RSG) began outfitting the radiometric calibration test site (RadCaTS) at Railroad Valley, Nevada, in 2004 for automated vicarious calibration of Earth-observing sensors. RadCaTS was upgraded beginning in 2011 to use RSG custom 8-band ground-viewing radiometers (GVRs), and currently four GVRs are deployed, providing an average reflectance for the test site. This measurement of ground reflectance is the most critical component of vicarious calibration using the reflectance-based method. In order to ensure the quality of these measurements, RSG has been exploring more efficient and accurate methods of on-site calibration evaluation. This work describes the design of, and initial results from, a small portable transfer radiometer for the purpose of on-site GVR calibration validation. Prior to deployment, RSG uses high-accuracy laboratory calibration methods in order to provide radiance calibrations with low uncertainties for each GVR. After deployment, a solar-radiation-based calibration has typically been used. That method is highly dependent on a clear, stable atmosphere, requires at least two people to perform, is time consuming in post-processing, and depends on several large pieces of equipment. In order to provide more regular and more accurate calibration monitoring, the small portable transfer radiometer is designed for quick, one-person operation and on-site field calibration comparisons. The radiometer is also suited for laboratory calibration use and thus could be used as a transfer radiometer calibration standard for the ground-viewing radiometers of a RadCalNet site.

  9. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    SciTech Connect

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for, and lessons from, the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and a description of the demand response automation server (DRAS), the client/server middleware used to automate the interactions between utilities, or any DR-serving entity, and their customers for DR programs. Use-case diagrams are presented to show the role of the DRAS between the utility/ISO and the clients at the facilities.
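
    As a purely hypothetical sketch of the facility side of such an architecture, the snippet below polls an invented DRAS-style endpoint for pending DR events and maps the signal level to a pre-programmed load-shed action; the URL, JSON fields, and action table are illustrative assumptions, not the DRAS API.

```python
# Hypothetical facility-side DR client polling a DRAS-style server for pending
# demand-response events. The endpoint URL and JSON fields are invented for
# illustration; they are not the DRAS API described in the paper.
import json
import urllib.request

DRAS_URL = "https://dras.example.org/api/dr-events?facility=plant-42"  # hypothetical

def apply_shed_strategy(level):
    # Map a DR signal level to a pre-programmed load-shed action (stub table).
    actions = {"normal": "no action",
               "moderate": "raise cooling setpoint 2 degrees",
               "high": "raise setpoint 4 degrees and dim lighting"}
    print("DR level:", level, "->", actions.get(level, "no action"))

def poll_dr_events(url=DRAS_URL):
    # Fetch pending events (hypothetical payload: list of {"id", "level", "start", "end"}).
    with urllib.request.urlopen(url, timeout=10) as resp:
        events = json.loads(resp.read().decode("utf-8"))
    for ev in events:
        apply_shed_strategy(ev.get("level", "normal"))

if __name__ == "__main__":
    apply_shed_strategy("moderate")   # offline demo of the signal-to-action mapping
```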

  10. Issues and Challenges in the Design of Culturally Adapted Evidence-Based Interventions

    PubMed Central

    Castro, Felipe González; Barrera, Manuel; Holleran Steiker, Lori K.

    2014-01-01

    This article examines issues and challenges in the design of cultural adaptations that are developed from an original evidence-based intervention (EBI). Recently emerging multistep frameworks or stage models are examined, as these can systematically guide the development of culturally adapted EBIs. Critical issues are also presented regarding whether and how such adaptations may be conducted, and empirical evidence is presented regarding the effectiveness of such cultural adaptations. Recent evidence suggests that these cultural adaptations are effective when applied with certain subcultural groups, although they are less effective when applied with other subcultural groups. Generally, current evidence regarding the effectiveness of cultural adaptations is promising but mixed. Further research is needed to obtain more definitive conclusions regarding the efficacy and effectiveness of culturally adapted EBIs. Directions for future research and recommendations are presented to guide the development of a new generation of culturally adapted EBIs. PMID:20192800

  11. Bayesian optimal response-adaptive design for binary responses using stopping rule.

    PubMed

    Komaki, Fumiyasu; Biswas, Atanu

    2016-05-02

    Response-adaptive designs are used in phase III clinical trials to allocate a larger number of patients to the better treatment arm. Optimal designs have been explored in recent years in the context of response-adaptive designs, but from the frequentist viewpoint only. In the present paper, we propose some response-adaptive designs for two treatments based on Bayesian prediction for phase III clinical trials. Some properties are studied and numerically compared with existing competitors. A real data set is used to illustrate the applicability of the proposed methodology, where we redesign the experiment using parameters derived from the data set.
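
    A generic Bayesian response-adaptive allocation with a simple posterior-probability stopping rule conveys the flavor of such designs; the sketch below is not the authors' prediction-based rule, and the response rates, priors, and stopping threshold are arbitrary assumptions.

```python
# Generic Bayesian response-adaptive allocation for two binary-response arms
# (a simplified illustration of the design family discussed above, not the
# authors' rule). Allocation probability follows the posterior probability
# that arm A is better; the trial stops once that probability is extreme.
import numpy as np

def simulate_trial(p_a=0.65, p_b=0.45, n_max=200, stop=0.99, seed=0):
    rng = np.random.default_rng(seed)
    s = {"A": [1, 1], "B": [1, 1]}              # Beta(1, 1) priors: [successes+1, failures+1]
    for n in range(1, n_max + 1):
        # Posterior P(p_A > p_B) by Monte Carlo from the two Beta posteriors.
        draws_a = rng.beta(*s["A"], 5000)
        draws_b = rng.beta(*s["B"], 5000)
        prob_a_better = (draws_a > draws_b).mean()
        if prob_a_better > stop or prob_a_better < 1 - stop:
            return n, prob_a_better, s
        arm = "A" if rng.random() < prob_a_better else "B"      # adaptive allocation
        response = rng.random() < (p_a if arm == "A" else p_b)  # simulated outcome
        s[arm][0 if response else 1] += 1
    return n_max, prob_a_better, s

n, prob, counts = simulate_trial()
print(f"stopped after {n} patients, P(A better) = {prob:.3f}, posterior counts = {counts}")
```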

  12. Design, Operation, and Maintenance of the Automated Rotation Control System for the 2.5-Meter Observa-Dome

    DTIC Science & Technology

    2013-02-01

    AFRL-RH-WP-TP-2013-0017: Design, Operation, and Maintenance of the Automated Rotation Control System for the 2.5-Meter Observa-Dome (Judson...).

  13. Adapting Dam and Reservoir Design and Operations to Climate Change

    NASA Astrophysics Data System (ADS)

    Roy, René; Braun, Marco; Chaumont, Diane

    2013-04-01

    In order to identify the potential initiatives that dam, reservoir and water resources system owners and operators may undertake to cope with climate change, it is essential to determine the current state of knowledge of its impacts on hydrological variables at regional and local scales. Future climate scenarios derived from climate model simulations can be combined with operational hydrological modeling tools and historical observations to evaluate realistic pathways of future hydrological conditions for specific drainage basins. In the case of hydropower production, those changes in hydrological conditions may have significant economic impacts. For over a decade the state-owned hydropower producer Hydro-Québec has been exploring the physical impacts on its watersheds by relying on climate services in collaboration with Ouranos, a consortium on regional climatology and adaptation to climate change. Previous climate change impact analyses had included different sources of climate simulation data, explored different post-processing approaches and used hydrological impact models. At a new stage of this collaboration, the operational management of Hydro-Québec aspired to carry out a cost-benefit analysis of considering climate change in the refactoring of hydropower installations. In the course of the project, not only did a set of scenarios of future runoff regimes have to be defined to support the long-term planning decisions of a dam and reservoir operator, but the significance of uncertainties also needed to be communicated and made understood. We provide insight into a case study that took some unexpected turns and leaps by bringing together climate scientists, hydrologists and hydropower operation managers. The study includes the selection of appropriate climate scenarios, the correction of biases, the application of hydrological models and the assessment of uncertainties. However, it turned out that communicating the science properly and

  14. Application of Hybrid Real-Time Power System Simulator for Designing and Researching of Relay Protection and Automation

    NASA Astrophysics Data System (ADS)

    Borovikov, Yu S.; Sulaymanov, A. O.; Andreev, M. V.

    2015-10-01

    Development, research and operation of smart grids (SG) with active-adaptive networks (AAS) are pressing tasks today. The planned integration of high-speed FACTS devices greatly complicates the dynamic properties of power systems, and as a result the operating conditions of power system equipment change significantly. This situation creates a new problem: developing and studying relay protection and automation (RPA) that can operate adequately in SGs and adapt to their regimes. The effectiveness of solving this problem depends on the available tools - simulators of electric power systems. Analysis of the best-known and most widely used simulators led to the conclusion that they cannot be used to solve this problem. Tomsk Polytechnic University has developed a prototype hybrid multiprocessor software and hardware system - the Hybrid Real-Time Power System Simulator (HRTSim). Because of its unique features, this simulator can be used for these tasks. This article introduces the concept of developing and researching relay protection and automation using HRTSim.

  15. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

    Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among and within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.

  16. Advanced Air Traffic Management Research (Human Factors and Automation): NASA Research Initiatives in Human-Centered Automation Design in Airspace Management

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Condon, Gregory W. (Technical Monitor)

    1996-01-01

    NASA has initiated a significant thrust of research and development focused on providing the flight crew and air traffic managers automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. The core processes of control and the distribution of decision making in that control are undergoing extensive analysis. From our perspective, the human operators and the procedures by which they interact are the fundamental determinants of the safe, efficient, and flexible operation of the system. In that perspective, we have begun to explore what our experience has taught us will be the most challenging aspects of designing and integrating human-centered automation in the advanced system. We have performed a full mission simulation examining the shift to self-separation on board the aircraft, with the rules of the air guiding behavior, and the provision of a cockpit display of traffic information and an on-board traffic alert system that integrates seamlessly into TCAS operations. We have performed an initial investigation of the operational impact of "Dynamic Density" metrics on controllers relinquishing and reestablishing full separation authority. (We follow the assumption that responsibility at all times resides with the controller.) This presentation will describe those efforts as well as the process by which we will guide the development of error-tolerant systems that are sensitive to shifts in operator workload levels and dynamic shifts in the operating point of air traffic management.

  17. An Automatic Online Calibration Design in Adaptive Testing

    ERIC Educational Resources Information Center

    Makransky, Guido; Glas, Cees A. W.

    2010-01-01

    An accurately calibrated item bank is essential for a valid computerized adaptive test. However, in some settings, such as occupational testing, there is limited access to test takers for calibration. As a result of the limited access to possible test takers, collecting data to accurately calibrate an item bank in an occupational setting is…

  18. Evolving RBF neural networks for adaptive soft-sensor design.

    PubMed

    Alexandridis, Alex

    2013-12-01

    This work presents an adaptive framework for building soft-sensors based on radial basis function (RBF) neural network models. The adaptive fuzzy means algorithm is utilized in order to evolve an RBF network, which approximates the unknown system based on input-output data from it. The methodology gradually builds the RBF network model, based on two separate levels of adaptation: On the first level, the structure of the hidden layer is modified by adding or deleting RBF centers, while on the second level, the synaptic weights are adjusted with the recursive least squares with exponential forgetting algorithm. The proposed approach is tested on two different systems, namely a simulated nonlinear DC Motor and a real industrial reactor. The results show that the produced soft-sensors can be successfully applied to model the two nonlinear systems. A comparison with two different adaptive modeling techniques, namely a dynamic evolving neural-fuzzy inference system (DENFIS) and neural networks trained with online backpropagation, highlights the advantages of the proposed methodology.
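
    The second adaptation level described above, adjusting output weights by recursive least squares with exponential forgetting, can be sketched compactly. In the illustration below the fuzzy-means center placement is replaced by a naive fixed grid of centers, so only the weight-update mechanism is shown; the drifting test function and all constants are assumptions.

```python
# Sketch of the second adaptation level: updating the output weights of an RBF
# model with recursive least squares (RLS) and exponential forgetting. Center
# placement (the fuzzy-means step) is replaced by a naive fixed grid.
import numpy as np

def rbf_features(x, centers, width=0.5):
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

class RLSForgetting:
    def __init__(self, n_params, lam=0.98, delta=100.0):
        self.w = np.zeros(n_params)
        self.P = delta * np.eye(n_params)   # inverse-covariance estimate
        self.lam = lam                      # forgetting factor (< 1)

    def update(self, phi, y):
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)          # gain vector
        err = y - self.w @ phi
        self.w += k * err                           # weight correction
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return err

# Track a slowly drifting nonlinear map y = sin(a * x), with a drifting over time.
centers = np.linspace(-3, 3, 15)
rls = RLSForgetting(len(centers))
rng = np.random.default_rng(2)
for t in range(2000):
    a = 1.0 + 0.3 * np.sin(2 * np.pi * t / 1500)    # slow process drift
    x = rng.uniform(-3, 3)
    y = np.sin(a * x) + 0.02 * rng.normal()
    rls.update(rbf_features(x, centers), y)
print("final weights (first five):", np.round(rls.w[:5], 3))
```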

  19. Design and implementation of adaptive PI control schemes for web tension control in roll-to-roll (R2R) manufacturing.

    PubMed

    Raul, Pramod R; Pagilla, Prabhakar R

    2015-05-01

    In this paper, two adaptive Proportional-Integral (PI) control schemes are designed and discussed for control of web tension in Roll-to-Roll (R2R) manufacturing systems. R2R systems are used to transport continuous materials (called webs) on rollers from the unwind roll to the rewind roll. Maintaining web tension at the desired value is critical to many R2R processes such as printing, coating, and lamination. Existing fixed-gain PI tension control schemes currently used in industrial practice require extensive tuning and do not provide the desired performance for changing operating conditions and material properties. The first adaptive PI scheme utilizes the model reference approach, where the controller gains are estimated based on matching of the actual closed-loop tension control system with an appropriately chosen reference model. The second adaptive PI scheme utilizes the indirect adaptive control approach together with the relay feedback technique to automatically initialize the adaptive PI gains. These adaptive tension control schemes can be implemented on any R2R manufacturing system. The key features of the two adaptive schemes are that their designs are simple for practicing engineers, easy to implement in real time, and automate the tuning process. Extensive experiments are conducted on a large experimental R2R machine which mimics many features of an industrial R2R machine. These experiments include trials with two different polymer webs and a variety of operating conditions. Implementation guidelines are provided for both adaptive schemes. Experimental results comparing the two adaptive schemes and a fixed-gain PI tension control scheme used in industrial practice are provided and discussed.
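
    The model-reference idea behind the first scheme can be illustrated with the classic MIT-rule example: a single adjustable gain is adapted so that a first-order plant tracks a reference model. This is a sketch of the adaptation principle only, not the paper's web-tension PI controller; the plant, model, and adaptation gain are assumptions.

```python
# Minimal model-reference adaptation (MIT rule) on a first-order plant: an
# adjustable gain theta is tuned so the closed-loop output tracks a reference
# model. Illustrates the model-reference idea only, not the paper's controller.
import numpy as np

dt, T = 0.01, 40.0
a, k_plant, k_model = 1.0, 2.0, 1.0      # plant/model dynamics: dy = -a*y + k*u
gamma = 0.5                              # adaptation gain
y = y_m = theta = 0.0
errors = []

for i in range(int(T / dt)):
    t = i * dt
    u_c = 1.0 if (t // 10) % 2 == 0 else -1.0    # square-wave command
    u = theta * u_c                              # adjustable feedforward gain
    y += dt * (-a * y + k_plant * u)             # plant state update (Euler)
    y_m += dt * (-a * y_m + k_model * u_c)       # reference-model update
    e = y - y_m
    theta += dt * (-gamma * e * y_m)             # MIT-rule gain update
    errors.append(e)

print(f"adapted gain theta = {theta:.3f} (ideal k_model/k_plant = {k_model / k_plant:.3f})")
print(f"final |tracking error| = {abs(errors[-1]):.4f}")
```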

  20. Automation of a primer design and evaluation pipeline for subsequent sequencing of the coding regions of all human Refseq genes

    PubMed Central

    Lai, Daniel; Love, Donald R

    2012-01-01

    Screening for mutations in human disease-causing genes in a molecular diagnostic environment demands simplicity with a view to allowing high throughput approaches. In order to advance these requirements, we have developed and applied a primer design program, termed BatchPD, to achieve the PCR amplification of coding exons of all known human Refseq genes. Primer design, in silico PCR checks and formatted primer information for subsequent web-based interrogation are queried from existing online tools. BatchPD acts as an intermediate to automate queries and results processing and provides exon-specific information that is summarised in a spreadsheet format. PMID:22570517

  1. Context-Adaptive Learning Designs by Using Semantic Web Services

    ERIC Educational Resources Information Center

    Dietze, Stefan; Gugliotta, Alessio; Domingue, John

    2007-01-01

    IMS Learning Design (IMS-LD) is a promising technology aimed at supporting learning processes. IMS-LD packages contain the learning process metadata as well as the learning resources. However, the allocation of resources--whether data or services--within the learning design is done manually at design-time on the basis of the subjective appraisals…

  2. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis

    PubMed Central

    Vasdev, Neil; Collier, Thomas Lee

    2016-01-01

    Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer. PMID:27548189

  3. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis.

    PubMed

    Vasdev, Neil; Collier, Thomas Lee

    2016-08-17

    Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer.

  4. Equipment development for automated assembly of solar modules

    NASA Technical Reports Server (NTRS)

    Hagerty, J. J.

    1982-01-01

    Prototype equipment was developed which allows for totally automated assembly in the three major areas of module manufacture: cell stringing, encapsulant layup and cure and edge sealing. The equipment is designed to be used in conjunction with a standard Unimate 2000B industrial robot although the design is adaptable to other transport systems.

  5. Decentralized adaptive control of robot manipulators with robust stabilization design

    NASA Technical Reports Server (NTRS)

    Yuan, Bau-San; Book, Wayne J.

    1988-01-01

    Due to geometric nonlinearities and complex dynamics, a decentralized technique for adaptive control of multilink robot arms is attractive. Lyapunov-function theory for stability analysis provides an approach to robust stabilization. Each joint of the arm is treated as a component subsystem. The adaptive controller is made locally stable with servo signals including proportional and integral gains, which results in a bound on the dynamical interactions with other subsystems. A nonlinear controller which stabilizes the system with uniform boundedness is used to improve the robustness properties of the overall system. As a result, the robot tracks the reference trajectories with convergence. This strategy keeps computation simple and therefore facilitates real-time implementation.

  6. A Web-Based Adaptive Tutor to Teach PCR Primer Design

    ERIC Educational Resources Information Center

    van Seters, Janneke R.; Wellink, Joan; Tramper, Johannes; Goedhart, Martin J.; Ossevoort, Miriam A.

    2012-01-01

    When students have varying prior knowledge, personalized instruction is desirable. One way to personalize instruction is by using adaptive e-learning to offer training of varying complexity. In this study, we developed a web-based adaptive tutor to teach PCR primer design: the PCR Tutor. We used part of the Taxonomy of Educational Objectives (the…

  7. Towards Individualized Online Learning: The Design and Development of an Adaptive Web Based Learning Environment

    ERIC Educational Resources Information Center

    Inan, Fethi A.; Flores, Raymond; Ari, Fatih; Arslan-Ari, Ismahan

    2011-01-01

    The purpose of this study was to document the design and development of an adaptive system which individualizes instruction such as content, interfaces, instructional strategies, and resources dependent on two factors, namely student motivation and prior knowledge levels. Combining adaptive hypermedia methods with strategies proposed by…

  8. Adaptation.

    PubMed

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms varies. Adaptive characters of organisms, including adaptive behaviours, increase fitness so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationships to welfare. In complex animals, feed forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  9. ADAPT: A knowledge-based synthesis tool for digital signal processing system design

    SciTech Connect

    Cooley, E.S.

    1988-01-01

    A computer aided synthesis tool for expansion, compression, and filtration of digital images is described. ADAPT, the Autonomous Digital Array Programming Tool, uses an extensive design knowledge base to synthesize a digital signal processing (DSP) system. Input to ADAPT can be either a behavioral description in English, or a block level specification via Petri Nets. The output from ADAPT comprises code to implement the DSP system on an array of processors. ADAPT is constructed using C, Prolog, and X Windows on a SUN 3/280 workstation. ADAPT knowledge encompasses DSP component information and the design algorithms and heuristics of a competent DSP designer. The knowledge is used to form queries for design capture, to generate design constraints from the user's responses, and to examine the design constraints. These constraints direct the search for possible DSP components and target architectures. Constraints are also used for partitioning the target systems into less complex subsystems. The subsystems correspond to architectural building blocks of the DSP design. These subsystems inherit design constraints and DSP characteristics from their parent blocks. Thus, a DSP subsystem or parent block, as designed by ADAPT, must meet the user's design constraints. Design solutions are sought by searching the Components section of the design knowledge base. Component behavior which matches or is similar to that required by the DSP subsystems is sought. Each match, which corresponds to a design alternative, is evaluated in terms of its behavior. When a design is sufficiently close to the behavior required by the user, detailed mathematical simulations may be performed to accurately determine exact behavior.

  10. Response-Adaptive Decision-Theoretic Trial Design: Operating Characteristics and Ethics

    PubMed Central

    Lipsky, Ari M.; Lewis, Roger J.

    2013-01-01

    Adaptive randomization is used in clinical trials to increase statistical efficiency. In addition, some clinicians and researchers believe that using adaptive randomization leads necessarily to more ethical treatment of subjects in a trial. We develop Bayesian, decision-theoretic, clinical trial designs with response-adaptive randomization and a primary goal of estimating treatment effect, and then contrast these designs with designs that also include in their loss function a cost for poor subject outcome. When the loss function did not incorporate a cost for poor subject outcome, the gains in efficiency from response-adaptive randomization were accompanied by ethically concerning subject allocations. Conversely, including a cost for poor subject outcome demonstrated a more acceptable balance between the competing needs in the trial. A subsequent, parallel set of trials designed to control explicitly type I and II error rates showed that much of the improvement achieved through modification of the loss function was essentially negated. Therefore, gains in efficiency from the use of a decision-theoretic, response-adaptive design using adaptive randomization may only be assumed to apply to those goals which are explicitly included in the loss function. Trial goals, including ethical ones, which do not appear in the loss function are ignored and may even be compromised; it is thus inappropriate to assume that all adaptive trials are necessarily more ethical. Controlling type I and II error rates largely negates the benefit of including competing needs in favor of the goal of parameter estimation. PMID:23558674

  11. Design, Development, and Testing of Software for Automation of a Naval Tactical Aviation Squadron.

    DTIC Science & Technology

    1986-09-01

    The Squadron Information Management System (SIMS) is flexible, supportable, and transportable. The SIMS will help slay the paper dragon in the TACAIR community, letting the pilots... automate the Operations department of an A-7 or F/A-18 squadron. Subsequent additions may extend to other departments.

  12. Design of an Automated Essay Grading (AEG) System in Indian Context

    ERIC Educational Resources Information Center

    Ghosh, Siddhartha; Fatima, Sameen S.

    2007-01-01

    Automated essay grading or scoring systems are no more a myth, but they are a reality. As of today, the human written (not hand written) essays are corrected not only by examiners/teachers but also by machines. The TOEFL exam is one of the best examples of this application. The students' essays are evaluated both by human and web based automated…

  13. Design of fuzzy system by NNs and realization of adaptability

    NASA Technical Reports Server (NTRS)

    Takagi, Hideyuki

    1993-01-01

    The issue of designing and tuning fuzzy membership functions by neural networks (NN's) was started by NN-driven Fuzzy Reasoning in 1988. NN-driven fuzzy reasoning involves a NN embedded in the fuzzy system which generates membership values. In conventional fuzzy system design, the membership functions are hand-crafted by trial and error for each input variable. In contrast, NN-driven fuzzy reasoning considers several variables simultaneously and can design a multidimensional, nonlinear membership function for the entire subspace.

  14. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental) or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This begs the question: how does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges? The answer is rapid and flexible conceptual design parametric sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations that meet combinations of mission and technology. This research contributes to the state-of-the-art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composites, natural laminar flow, thrust-vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to

  15. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer

    NASA Astrophysics Data System (ADS)

    Luiza Bondar, M.; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-01

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.
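
    The accuracy metric quoted above, the Dice similarity coefficient, is simple to compute for two binary masks, as in the following sketch (the toy discs stand in for automated and manual contours).

```python
# Dice similarity coefficient used above to report segmentation accuracy:
# DSC = 2|A ∩ B| / (|A| + |B|) for two binary masks.
import numpy as np

def dice(mask_a, mask_b):
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Tiny example: two overlapping discs standing in for auto vs. manual contours.
yy, xx = np.mgrid[:100, :100]
auto = (yy - 50) ** 2 + (xx - 50) ** 2 < 30 ** 2
manual = (yy - 52) ** 2 + (xx - 48) ** 2 < 30 ** 2
print(f"Dice = {dice(auto, manual):.3f}")
```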

  16. Load-Adapted Design of Generative Manufactured Lattice Structures

    NASA Astrophysics Data System (ADS)

    Reinhart, Gunther; Teufelhart, Stefan

    Additive layer manufacturing offers many opportunities for the production of lightweight components because of the high geometrical freedom that can be realized in comparison to conventional manufacturing processes. This potential is demonstrated using the example of a bending beam, for which a topology optimization is performed as well as the use of periodically arranged lattice structures. The latter have the drawback that shear forces in the struts reduce the stiffness of the lattice. To avoid this, the structure has to be adapted to the flux of force. This thesis is supported by studies on a torque-loaded shaft.

  17. Adapting Cognitive Walkthrough to Support Game Based Learning Design

    ERIC Educational Resources Information Center

    Farrell, David; Moffat, David C.

    2014-01-01

    For any given Game Based Learning (GBL) project to be successful, the player must learn something. Designers may base their work on pedagogical research, but actual game design is still largely driven by intuition. People are famously poor at unsupported methodical thinking and relying so much on instinct is an obvious weak point in GBL design…

  18. Adapting the Mathematical Task Framework to Design Online Didactic Objects

    ERIC Educational Resources Information Center

    Bowers, Janet; Bezuk, Nadine; Aguilar, Karen

    2011-01-01

    Designing didactic objects involves imagining how students can conceive of specific mathematical topics and then imagining what types of classroom discussions could support these mental constructions. This study investigated whether it was possible to design Java applets that might serve as didactic objects to support online learning where…

  19. Accuracy of software designed for automated localization of the inferior alveolar nerve canal on cone beam CT images

    PubMed Central

    Zamani, Ali; Kashkouli, Sadegh; Soltanimehr, Elham; Ghofrani Jahromi, Mohsen; Sanaeian Pourshirazi, Zahra

    2016-01-01

    Objectives: The aim of this study was to design and evaluate a new method for automated localization of the inferior alveolar nerve canal on CBCT images. Methods: The proposed method is based on traversing both panoramic and cross-sectional slices. For the panoramic slices, morphological skeletonization is imposed, and a modified Hough transform is used while traversing the cross-sectional slices. A total of 40 CBCT images were randomly selected. Two experts twice located the inferior alveolar nerve canal during two examinations set 6 weeks apart. Agreement between experts was achieved, and the result of this manual technique was considered the gold standard for our study. The distances for the automated method and those determined using the gold standard method were calculated and recorded. The mean time required for the automated detection was also recorded. Results: The average mean distance error from the baseline was 0.75 ± 0.34 mm. In all, 86% of the detected points had a mean error of <1 mm compared with those determined by the manual gold standard method. Conclusions: The proposed method is far more accurate and faster than previous methods. It also provides more accuracy than human annotation within a shorter time. PMID:26652929
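
    The two building blocks named above, morphological skeletonization and a Hough transform, can be demonstrated on a synthetic binary image with scikit-image. The sketch below uses the standard Hough line transform rather than the authors' modified version, and the synthetic "canal-like" track is an assumption; it is not the published pipeline.

```python
# Morphological skeletonization followed by a (standard, not the authors'
# modified) Hough line transform on a synthetic binary image.
import numpy as np
from skimage.draw import line
from skimage.morphology import skeletonize
from skimage.transform import hough_line, hough_line_peaks

img = np.zeros((120, 120), dtype=bool)
rr, cc = line(10, 15, 100, 95)                       # a thin "canal-like" track
img[rr, cc] = True
img = np.logical_or(img, np.roll(img, 1, axis=1))    # thicken to ~2 px

skeleton = skeletonize(img)                          # reduce the track to 1-px width
h, angles, dists = hough_line(skeleton)              # accumulate line candidates
_, best_angles, best_dists = hough_line_peaks(h, angles, dists, num_peaks=1)
print("dominant line: angle = %.2f rad, distance = %.1f px"
      % (best_angles[0], best_dists[0]))
```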

  20. Using engineering control principles to inform the design of adaptive interventions: a conceptual introduction.

    PubMed

    Rivera, Daniel E; Pew, Michael D; Collins, Linda M

    2007-05-01

    The goal of this paper is to describe the role that control engineering principles can play in developing and improving the efficacy of adaptive, time-varying interventions. It is demonstrated that adaptive interventions constitute a form of feedback control system in the context of behavioral health. Consequently, drawing from ideas in control engineering has the potential to significantly inform the analysis, design, and implementation of adaptive interventions, leading to improved adherence, better management of limited resources, a reduction of negative effects, and overall more effective interventions. This article illustrates how to express an adaptive intervention in control engineering terms, and how to use this framework in a computer simulation to investigate the anticipated impact of intervention design choices on efficacy. The potential benefits of operationalizing decision rules based on control engineering principles are particularly significant for adaptive interventions that involve multiple components or address co-morbidities, situations that pose significant challenges to conventional clinical practice.
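
    The paper's framing of an adaptive intervention as a feedback controller can be conveyed with a toy discrete-time simulation in which a PI-style decision rule adjusts weekly intervention dosage toward a target outcome. The behavioral response model, gains, and numbers below are invented for illustration and are not taken from the article.

```python
# Toy simulation of an adaptive intervention viewed as a feedback controller
# (in the spirit of the paper): a PI decision rule adjusts weekly intervention
# "dosage" to drive an outcome toward its goal. Model and values are invented.
import numpy as np

goal = 10.0                      # target outcome (e.g., target behavior score)
outcome, dosage = 4.0, 0.0
kp, ki, integral = 0.6, 0.15, 0.0
rng = np.random.default_rng(3)

for week in range(1, 13):
    error = goal - outcome
    integral += error
    dosage = max(0.0, kp * error + ki * integral)       # decision rule (no negative dose)
    # Simple response model: outcome rises with dosage, decays toward 3 without it.
    outcome += 0.5 * dosage - 0.2 * (outcome - 3.0) + rng.normal(0, 0.3)
    print(f"week {week:2d}: dosage = {dosage:4.1f}, outcome = {outcome:5.2f}")
```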

  1. Design of smart composite platforms for adaptive thrust vector control and adaptive laser telescope for satellite applications

    NASA Astrophysics Data System (ADS)

    Ghasemi-Nejhad, Mehrdad N.

    2013-04-01

    This paper presents the design of smart composite platforms for adaptive thrust vector control (TVC) and an adaptive laser telescope for satellite applications. To eliminate disturbances, the proposed adaptive TVC and telescope systems will be mounted on two analogous smart composite platforms with simultaneous precision positioning (pointing) and vibration suppression (stabilizing), SPPVS, with micro-radian pointing resolution, and then mounted on a satellite in two different locations. The adaptive TVC system provides SPPVS with large tip-tilt to potentially eliminate the gimbal systems. The smart composite telescope will be mounted on a smart composite platform with SPPVS and then mounted on a satellite. The laser communication is intended for geosynchronous orbit. The high degree of directionality increases the security of the laser communication signal (as opposed to a diffused RF signal), but also requires sophisticated subsystems for transmission and acquisition. The shorter wavelength of the optical spectrum increases the data transmission rates, but laser systems require large amounts of power, which increases the mass and complexity of the supporting systems. In addition, laser communication in geosynchronous orbit requires an accurate platform with SPPVS capabilities. Therefore, this work also addresses the design of an active composite platform to be used to simultaneously point and stabilize an intersatellite laser communication telescope with micro-radian pointing resolution. The telescope is a Cassegrain receiver that employs two mirrors, one concave (primary) and the other convex (secondary). The distance, as well as the horizontal and axial alignment of the mirrors, must be precisely maintained or else the optical properties of the system will be severely degraded. The alignment will also have to be maintained during thruster firings, which will require vibration suppression capabilities of the system as well. The innovative platform has been

  2. Designing Forest Adaptation Experiments through Manager-Scientist Partnerships

    NASA Astrophysics Data System (ADS)

    Nagel, L. M.; Swanston, C.; Janowiak, M.

    2014-12-01

    Three common forest adaptation options discussed in the context of an uncertain future climate are: creating resistance, promoting resilience, and enabling forests to respond to change. Though there is consensus on the broad management goals addressed by each of these options, translating these concepts into management plans specific to individual forest types that vary in structure, composition, and function remains a challenge. We will describe a decision-making framework that we employed within a manager-scientist partnership to develop a suite of adaptation treatments for two contrasting forest types as part of a long-term forest management experiment. The first, in northern Minnesota, is a red pine-dominated forest with components of white pine, aspen, paper birch, and northern red oak, with a hazel understory. The second, in southwest Colorado, is a warm-dry mixed conifer forest dominated by ponderosa pine, white fir, and Douglas-fir, with scattered aspen and an understory of Gambel oak. The current conditions at both sites are characterized by overstocking with moderate-to-high fuel loading and vulnerability to numerous forest health threats, and are generally uncharacteristic of historic structure and composition. The desired future condition articulated by managers for each site included elements of historic structure and natural range of variability, but was greatly tempered by known vulnerabilities and projected changes to climate and disturbance patterns. The resultant range of treatments we developed is distinct for each forest type and addresses a wide range of management objectives.

  3. Adaptive optoelectronic camouflage systems with designs inspired by cephalopod skins.

    PubMed

    Yu, Cunjiang; Li, Yuhang; Zhang, Xun; Huang, Xian; Malyarchuk, Viktor; Wang, Shuodao; Shi, Yan; Gao, Li; Su, Yewang; Zhang, Yihui; Xu, Hangxun; Hanlon, Roger T; Huang, Yonggang; Rogers, John A

    2014-09-09

    Octopus, squid, cuttlefish, and other cephalopods exhibit exceptional capabilities for visually adapting to or differentiating from the coloration and texture of their surroundings, for the purpose of concealment, communication, predation, and reproduction. Long-standing interest in and emerging understanding of the underlying ultrastructure, physiological control, and photonic interactions has recently led to efforts in the construction of artificial systems that have key attributes found in the skins of these organisms. Despite several promising options in active materials for mimicking biological color tuning, existing routes to integrated systems do not include critical capabilities in distributed sensing and actuation. Research described here represents progress in this direction, demonstrated through the construction, experimental study, and computational modeling of materials, device elements, and integration schemes for cephalopod-inspired flexible sheets that can autonomously sense and adapt to the coloration of their surroundings. These systems combine high-performance, multiplexed arrays of actuators and photodetectors in laminated, multilayer configurations on flexible substrates, with overlaid arrangements of pixelated, color-changing elements. The concepts provide realistic routes to thin sheets that can be conformally wrapped onto solid objects to modulate their visual appearance, with potential relevance to consumer, industrial, and military applications.

  4. Adaptive optoelectronic camouflage systems with designs inspired by cephalopod skins

    PubMed Central

    Yu, Cunjiang; Li, Yuhang; Zhang, Xun; Huang, Xian; Malyarchuk, Viktor; Wang, Shuodao; Shi, Yan; Gao, Li; Su, Yewang; Zhang, Yihui; Xu, Hangxun; Hanlon, Roger T.; Huang, Yonggang; Rogers, John A.

    2014-01-01

    Octopus, squid, cuttlefish, and other cephalopods exhibit exceptional capabilities for visually adapting to or differentiating from the coloration and texture of their surroundings, for the purpose of concealment, communication, predation, and reproduction. Long-standing interest in and emerging understanding of the underlying ultrastructure, physiological control, and photonic interactions has recently led to efforts in the construction of artificial systems that have key attributes found in the skins of these organisms. Despite several promising options in active materials for mimicking biological color tuning, existing routes to integrated systems do not include critical capabilities in distributed sensing and actuation. Research described here represents progress in this direction, demonstrated through the construction, experimental study, and computational modeling of materials, device elements, and integration schemes for cephalopod-inspired flexible sheets that can autonomously sense and adapt to the coloration of their surroundings. These systems combine high-performance, multiplexed arrays of actuators and photodetectors in laminated, multilayer configurations on flexible substrates, with overlaid arrangements of pixelated, color-changing elements. The concepts provide realistic routes to thin sheets that can be conformally wrapped onto solid objects to modulate their visual appearance, with potential relevance to consumer, industrial, and military applications. PMID:25136094

  5. Twenty-five years of confirmatory adaptive designs: opportunities and pitfalls.

    PubMed

    Bauer, Peter; Bretz, Frank; Dragalin, Vladimir; König, Franz; Wassmer, Gernot

    2016-02-10

    'Multistage testing with adaptive designs' was the title of an article by Peter Bauer that appeared 1989 in the German journal Biometrie und Informatik in Medizin und Biologie. The journal does not exist anymore but the methodology found widespread interest in the scientific community over the past 25 years. The use of such multistage adaptive designs raised many controversial discussions from the beginning on, especially after the publication by Bauer and Köhne 1994 in Biometrics: Broad enthusiasm about potential applications of such designs faced critical positions regarding their statistical efficiency. Despite, or possibly because of, this controversy, the methodology and its areas of applications grew steadily over the years, with significant contributions from statisticians working in academia, industry and agencies around the world. In the meantime, such type of adaptive designs have become the subject of two major regulatory guidance documents in the US and Europe and the field is still evolving. Developments are particularly noteworthy in the most important applications of adaptive designs, including sample size reassessment, treatment selection procedures, and population enrichment designs. In this article, we summarize the developments over the past 25 years from different perspectives. We provide a historical overview of the early days, review the key methodological concepts and summarize regulatory and industry perspectives on such designs. Then, we illustrate the application of adaptive designs with three case studies, including unblinded sample size reassessment, adaptive treatment selection, and adaptive endpoint selection. We also discuss the availability of software for evaluating and performing such designs. We conclude with a critical review of how expectations from the beginning were fulfilled, and - if not - discuss potential reasons why this did not happen.

  6. Studying the neural bases of prism adaptation using fMRI: A technical and design challenge.

    PubMed

    Bultitude, Janet H; Farnè, Alessandro; Salemme, Romeo; Ibarrola, Danielle; Urquizar, Christian; O'Shea, Jacinta; Luauté, Jacques

    2016-12-30

    Prism adaptation induces rapid recalibration of visuomotor coordination. The neural mechanisms of prism adaptation have come under scrutiny since the observations that the technique can alleviate hemispatial neglect following stroke, and can alter spatial cognition in healthy controls. Relative to non-imaging behavioral studies, fMRI investigations of prism adaptation face several challenges arising from the confined physical environment of the scanner and the supine position of the participants. Any researcher who wishes to administer prism adaptation in an fMRI environment must adjust their procedures enough to enable the experiment to be performed, but not so much that the behavioral task departs too much from true prism adaptation. Furthermore, the specific temporal dynamics of behavioral components of prism adaptation present additional challenges for measuring their neural correlates. We developed a system for measuring the key features of prism adaptation behavior within an fMRI environment. To validate our configuration, we present behavioral (pointing) and head movement data from 11 right-hemisphere lesioned patients and 17 older controls who underwent sham and real prism adaptation in an MRI scanner. Most participants could adapt to prismatic displacement with minimal head movements, and the procedure was well tolerated. We propose recommendations for fMRI studies of prism adaptation based on the design-specific constraints and our results.

  7. Water Infrastructure Adaptation in New Urban Design: Possibilities and Constraints

    EPA Science Inventory

    Natural constraints, including climate change and dynamic socioeconomic development, can significantly impact the way we plan, design, and operate water infrastructure, thus its sustainability to deliver reliable quality water supplies and comply with environmental regulations. ...

  8. An adaptive two-stage sequential design for sampling rare and clustered populations

    USGS Publications Warehouse

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
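
    A simplified version of the allocation idea, directing extra second-stage effort to primary units whose first-stage counts exceed a threshold, is sketched below; the clustered population, threshold, and budget are invented, and the snippet does not reproduce the authors' estimator or exact allocation rule.

```python
# Simplified illustration of adaptive two-stage allocation: primary units whose
# first-stage counts exceed a threshold receive proportionally more second-stage
# effort. Population, threshold, and budget are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)
n_units, quadrats_per_unit = 20, 50
# Rare, clustered population: most units nearly empty, a few with high density.
density = np.where(rng.random(n_units) < 0.2, rng.uniform(2, 5, n_units), 0.02)
counts = rng.poisson(density[:, None], size=(n_units, quadrats_per_unit))

stage1 = counts[:, :5].sum(axis=1)                 # 5 quadrats per unit in stage 1
threshold, budget = 1, 100                         # extra stage-2 quadrats available
weights = np.where(stage1 > threshold, stage1, 0).astype(float)
if weights.sum() > 0:
    extra = np.floor(budget * weights / weights.sum()).astype(int)
else:
    extra = np.zeros(n_units, dtype=int)
extra = np.minimum(extra, quadrats_per_unit - 5)   # cannot exceed remaining quadrats

for u in np.flatnonzero(extra):
    print(f"unit {u:2d}: stage-1 count = {stage1[u]:3d}, extra stage-2 quadrats = {extra[u]}")
```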

  9. Automated Multiple-Sample Tray Manipulation Designed and Fabricated for Atomic Oxygen Facility

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Stueber, Thomas J.; Dever, Joyce A.; Banks, Bruce A.; Rutledge, Sharon K.

    2000-01-01

    Extensive improvements to increase testing capacity and flexibility and to automate the in situ Reflectance Measurement System (RMS) are in progress at the Electro-Physics Branch's Atomic Oxygen (AO) beam facility of the NASA Glenn Research Center at Lewis Field. These improvements will triple the system's capacity while placing a significant portion of the testing cycle under computer control for added reliability, repeatability, and ease of use.

  10. Regulatory perspectives on multiplicity in adaptive design clinical trials throughout a drug development program.

    PubMed

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2011-07-01

    A clinical research program for drug development often consists of a sequence of clinical trials that may begin with uncontrolled and nonrandomized trials, followed by randomized trials or randomized controlled trials. Adaptive designs are not infrequently proposed for use. In the regulatory setting, the success of a drug development program can be defined to be that the experimental treatment at a specific dose level including regimen and frequency is approved based on replicated evidence from at least two confirmatory trials. In the early stage of clinical research, multiplicity issues are very broad. What is the maximum tolerable dose in an adaptive dose escalation trial? What should the dose range be to consider in an adaptive dose-ranging trial? What is the minimum effective dose in an adaptive dose-response study given the tolerability and the toxicity observable in short term or premarketing trials? Is establishing the dose-response relationship important or the ability to select a superior treatment with high probability more important? In the later stage of clinical research, multiplicity problems can be formulated with better focus, depending on whether the study is for exploration to estimate or select design elements or for labeling consideration. What is the study objective for an early-phase versus a later phase adaptive clinical trial? How many doses are to be studied in the early exploratory adaptive trial versus in the confirmatory adaptive trial? Is the intended patient population well defined or is the applicable patient population yet to be adaptively selected in the trial due to the potential patient and/or disease heterogeneity? Is the primary efficacy endpoint well defined or still under discussion providing room for adaptation? What are the potential treatment indications that may adaptively lead to an intended-to-treat patient population and the primary efficacy endpoint? In this work we stipulate the multiplicity issues with adaptive

  11. Adaptive urn designs for estimating several percentiles of a dose--response curve.

    PubMed

    Mugno, Raymond; Zhus, Wei; Rosenberger, William F

    2004-07-15

    Dose--response experiments are crucial in biomedical studies. There are usually multiple objectives in such experiments and among the goals is the estimation of several percentiles on the dose--response curve. Here we present the first non-parametric adaptive design approach to estimate several percentiles simultaneously via generalized Pólya urns. Theoretical properties of these designs are investigated and their performance is gaged by the locally compound optimal designs. As an example, we re-investigated a psychophysical experiment where one of the goals was to estimate the three quartiles. We show that these multiple-objective adaptive designs are more efficient than the original single-objective adaptive design targeting the median only. We also show that urn designs which target the optimal designs are slightly more efficient than those which target the desired percentiles directly. Guidelines are given as to when to use which type of design. Overall we are pleased with the efficiency results and hope compound adaptive designs proposed in this work or their variants may prove to be a viable non-parametric alternative in multiple-objective dose--response studies.
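
    The sketch below is not the urn scheme itself; it only illustrates, under an assumed logistic dose-response curve F(d) = 1/(1 + exp(-(d - mu)/s)), what "several percentiles of a dose-response curve" means: the dose giving response probability p is d_p = mu + s ln(p/(1 - p)). The curve parameters are hypothetical.

      # Not the urn design itself -- just a helper showing the dose corresponding to a
      # response percentile p for an assumed logistic dose-response curve.
      import math

      def logistic_dose_for_percentile(p, mu, s):
          """Inverse of F(d) = 1/(1 + exp(-(d - mu)/s)) at probability p."""
          return mu + s * math.log(p / (1 - p))

      if __name__ == "__main__":
          mu, s = 5.0, 1.5                              # hypothetical curve parameters
          for p in (0.25, 0.50, 0.75):                  # the three quartiles
              print(f"p={p:.2f} -> dose {logistic_dose_for_percentile(p, mu, s):.2f}")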

  12. Optimal adaptive two-stage designs for early phase II clinical trials.

    PubMed

    Shan, Guogen; Wilding, Gregory E; Hutson, Alan D; Gerstenberger, Shawn

    2016-04-15

    Simon's optimal two-stage design has been widely used in early phase clinical trials for oncology and AIDS studies with binary endpoints. With this approach, the second-stage sample size is fixed when the trial passes the first stage with sufficient activity. Adaptive designs, such as those due to Banerjee and Tsiatis (2006) and Englert and Kieser (2013), are flexible in the sense that the second-stage sample size depends on the response from the first stage, and these designs are often seen to reduce the expected sample size under the null hypothesis as compared with Simon's approach. An unappealing trait of the existing designs is that the second-stage sample size is not a non-increasing function of the first-stage response rate. In this paper, an efficient intelligent process, the branch-and-bound algorithm, is used to search extensively for the optimal adaptive design with the smallest expected sample size under the null, while the type I and II error rates are maintained and the aforementioned monotonicity characteristic is respected. The proposed optimal design is observed to have smaller expected sample sizes compared to Simon's optimal design, and the maximum total sample size of the proposed adaptive design is very close to that from Simon's method. The proposed optimal adaptive two-stage design is recommended for use in practice to improve the flexibility and efficiency of early phase therapeutic development.
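
    The standard bookkeeping behind such comparisons can be sketched as follows: for a two-stage design with stage sizes n1 and n2 and first-stage futility bound r1, the probability of early termination under the null is PET = P(X1 <= r1 | p0) with X1 ~ Binomial(n1, p0), and the expected sample size is EN = n1 + (1 - PET) n2. The sketch below computes these quantities only; it is not the branch-and-bound search proposed in the paper, and the design parameters shown are hypothetical.

      # Minimal helper for Simon-type two-stage designs: probability of early
      # termination (PET) after stage 1 and expected sample size under the null.
      from math import comb

      def binom_cdf(k, n, p):
          return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

      def expected_sample_size(n1, n2, r1, p0):
          pet = binom_cdf(r1, n1, p0)        # stop after stage 1 if responses <= r1
          return n1 + (1 - pet) * n2, pet

      if __name__ == "__main__":
          en, pet = expected_sample_size(n1=13, n2=30, r1=1, p0=0.10)
          print(f"PET under p0: {pet:.3f}, expected N under p0: {en:.1f}")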

  13. A modified varying-stage adaptive phase II/III clinical trial design.

    PubMed

    Dong, Gaohong; Vandemeulebroecke, Marc

    2016-07-01

    Conventionally, adaptive phase II/III clinical trials are carried out with a strict two-stage design. Recently, a varying-stage adaptive phase II/III clinical trial design has been developed. In this design, following the first stage, an intermediate stage can be adaptively added to obtain more data, so that a more informative decision can be made. Therefore, the number of further investigational stages is determined based upon data accumulated to the interim analysis. This design considers two plausible study endpoints, with one of them initially designated as the primary endpoint. Based on interim results, another endpoint can be switched as the primary endpoint. However, in many therapeutic areas, the primary study endpoint is well established. Therefore, we modify this design to consider one study endpoint only so that it may be more readily applicable in real clinical trial designs. Our simulations show that, the same as the original design, this modified design controls the Type I error rate, and the design parameters such as the threshold probability for the two-stage setting and the alpha allocation ratio in the two-stage setting versus the three-stage setting have a great impact on the design characteristics. However, this modified design requires a larger sample size for the initial stage, and the probability of futility becomes much higher when the threshold probability for the two-stage setting gets smaller. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Adaptive Pareto Set Estimation for Stochastic Mixed Variable Design Problems

    DTIC Science & Technology

    2009-03-01


  15. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  16. Adapting Wood Technology to Teach Design and Engineering

    ERIC Educational Resources Information Center

    Rummel, Robert A.

    2012-01-01

    Technology education has changed dramatically over the last few years. The transition of industrial arts to technology education and more recently the pursuit of design and engineering has resulted in technology education teachers often needing to change their curriculum and course activities to meet the demands of a rapidly changing profession.…

  17. MURI: Adaptive Waveform Design for Full Spectral Dominance

    DTIC Science & Technology

    2011-03-11

    Representative publications listed in the report include: Glaser, Steffen J.; Luy, Burkhard, "Exploring the limits of broadband excitation and inversion: II. Rf-power optimized pulses," Journal of Magnetic Resonance; and Luy, Burkhard; Glaser, Steffen J., "Linear phase slope in pulse design: application to coherence transfer," Journal of Magnetic Resonance 2008; 192(2):235.

  18. Design and analysis of closed-loop decoder adaptation algorithms for brain-machine interfaces.

    PubMed

    Dangi, Siddharth; Orsborn, Amy L; Moorman, Helene G; Carmena, Jose M

    2013-07-01

    Closed-loop decoder adaptation (CLDA) is an emerging paradigm for achieving rapid performance improvements in online brain-machine interface (BMI) operation. Designing an effective CLDA algorithm requires making multiple important decisions, including choosing the timescale of adaptation, selecting which decoder parameters to adapt, crafting the corresponding update rules, and designing CLDA parameters. These design choices, combined with the specific settings of CLDA parameters, will directly affect the algorithm's ability to make decoder parameters converge to values that optimize performance. In this article, we present a general framework for the design and analysis of CLDA algorithms and support our results with experimental data of two monkeys performing a BMI task. First, we analyze and compare existing CLDA algorithms to highlight the importance of four critical design elements: the adaptation timescale, selective parameter adaptation, smooth decoder updates, and intuitive CLDA parameters. Second, we introduce mathematical convergence analysis using measures such as mean-squared error and KL divergence as a useful paradigm for evaluating the convergence properties of a prototype CLDA algorithm before experimental testing. By applying these measures to an existing CLDA algorithm, we demonstrate that our convergence analysis is an effective analytical tool that can ultimately inform and improve the design of CLDA algorithms.
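
    A minimal sketch of two convergence measures in the spirit of the analysis described above: mean-squared error between current and target decoder weights, and KL divergence between Gaussian approximations of two decoders. The Gaussian parameterization and the numbers are assumptions for illustration, not the paper's decoder model.

      # Hedged sketch of convergence measures: weight MSE and KL divergence between
      # Gaussian approximations of a current and a target decoder.
      import numpy as np

      def weight_mse(w_current, w_target):
          return float(np.mean((np.asarray(w_current) - np.asarray(w_target)) ** 2))

      def gaussian_kl(mu0, cov0, mu1, cov1):
          """KL( N(mu0, cov0) || N(mu1, cov1) ) for multivariate normals."""
          mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
          k = mu0.size
          cov1_inv = np.linalg.inv(cov1)
          diff = mu1 - mu0
          return 0.5 * (np.trace(cov1_inv @ cov0)
                        + diff @ cov1_inv @ diff
                        - k
                        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

      if __name__ == "__main__":
          print(weight_mse([0.9, 1.1], [1.0, 1.0]))
          print(gaussian_kl([0, 0], np.eye(2), [0.5, 0], 2 * np.eye(2)))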

  19. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed to enable distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI) (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach) and involves student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA (Visual Basic for Applications) scripts, are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  20. Development and Design of a User Interface for a Computer Automated Heating, Ventilation, and Air Conditioning System

    SciTech Connect

    Anderson, B.; /Fermilab

    1999-10-08

    A user interface is created to monitor and operate the heating, ventilation, and air conditioning system. The interface is networked to the system's programmable logic controller. The controller maintains automated control of the system. Through the interface, the user is able to see the status of the system and override or adjust the automatic control features. The interface is programmed to show digital readouts of system equipment as well as visual cues of system operational statuses. It also provides information for system design and component interaction. The interface is made easier to read by simple designs, color coordination, and graphics. Fermi National Accelerator Laboratory (Fermilab) conducts high energy particle physics research. Part of this research involves collision experiments with protons and anti-protons. These interactions are contained within one of two massive detectors along Fermilab's largest particle accelerator, the Tevatron. The D-Zero Assembly Building houses one of these detectors. At this time detector systems are being upgraded for a second experiment run, titled Run II. Unlike the previous run, systems at D-Zero must be computer automated so operators do not have to continually monitor and adjust these systems during the run. Human intervention should only be necessary for system start up and shut down, and equipment failure. Part of this upgrade includes the heating, ventilation, and air conditioning (HVAC) system. The HVAC system is responsible for controlling two subsystems: the air temperatures of the D-Zero Assembly Building and associated collision hall, as well as six separate water systems used in the heating and cooling of the air and detector components. The HVAC system is automated by a programmable logic controller. In order to provide system monitoring and operator control, a user interface is required. This paper addresses the methods and strategies used to design and implement an effective user interface.

  1. Implementation of an Automated Grading System with an Adaptive Learning Component to Affect Student Feedback and Response Time

    ERIC Educational Resources Information Center

    Matthews, Kevin; Janicki, Thomas; He, Ling; Patterson, Laurie

    2012-01-01

    This research focuses on the development and implementation of an adaptive learning and grading system with a goal to increase the effectiveness and quality of feedback to students. By utilizing various concepts from established learning theories, the goal of this research is to improve the quantity, quality, and speed of feedback as it pertains…

  2. Group-Work in the Design of Complex Adaptive Learning Strategies

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis

    2013-01-01

    This paper presents a case study where twelve graduate students undertook the demanding role of the adaptive e-course developer and worked collaboratively on an authentic and complex design task in the context of open and distance tertiary education. The students had to work in groups in order to conceptualise and design a learning scenario for…

  3. Incorporation of Content Balancing Requirements in Stratification Designs for Computerized Adaptive Testing.

    ERIC Educational Resources Information Center

    Leung, Chi-Keung; Chang, Hua-Hua; Hau, Kit-Tai

    2003-01-01

    Studied three stratification designs for computerized adaptive testing in conjunction with three well-developed content balancing methods. Simulation study results show substantial differences in item overlap rate and pool utilization among different methods. Recommends an optimal combination of stratification design and content balancing method.…

  4. A Framework for Adaptive Learning Design in a Web-Conferencing Environment

    ERIC Educational Resources Information Center

    Bower, Matt

    2016-01-01

    Many recent technologies provide the ability to dynamically adjust the interface depending on the emerging cognitive and collaborative needs of the learning episode. This means that educators can adaptively re-design the learning environment during the lesson, rather than purely relying on preemptive learning design thinking. Based on a…

  5. An adaptive Simon Two-Stage Design for Phase 2 studies of targeted therapies.

    PubMed

    Jones, Cheryl L; Holmgren, Eric

    2007-09-01

    The field of specialized medicine and clinical development programs for targeted cancer therapies are rapidly expanding. The proposed Phase 2 design allows for preliminary determination of efficacy that may be restricted to a particular sub-population defined by biomarker status (presence/absence). The design is an adaptation of the Simon Two-Stage Design. We provide examples where the adaptation can result in substantial savings in sample size and thus potentially lead to greater efficiency in decision making during the drug development process.

  6. A new approach for designing self-organizing systems and application to adaptive control

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.; Zhang, Shi; Lin, Yueqing; Huang, Song

    1993-01-01

    There is tremendous interest in the design of intelligent machines capable of autonomous learning and skillful performance under complex environments. A major task in designing such systems is to make the system plastic and adaptive when presented with new and useful information and stable in response to irrelevant events. A great body of knowledge, based on neuro-physiological concepts, has evolved as a possible solution to this problem. Adaptive resonance theory (ART) is a classical example under this category. The system dynamics of an ART network is described by a set of differential equations with nonlinear functions. An approach for designing self-organizing networks characterized by nonlinear differential equations is proposed.

  7. Simple adaptive control system design for a quadrotor with an internal PFC

    SciTech Connect

    Mizumoto, Ikuro; Nakamura, Takuto; Kumon, Makoto; Takagi, Taro

    2014-12-10

    The paper deals with an adaptive control system design problem for a four-rotor helicopter, or quadrotor. A simple adaptive control design scheme with a parallel feedforward compensator (PFC) in the internal loop of the considered quadrotor will be proposed based on the backstepping strategy. As is well known, the backstepping control strategy is one of the advanced control strategies for nonlinear systems. However, the control algorithm will become complex if the system has higher order relative degrees. We will show that one can skip some design steps of the backstepping method by introducing a PFC in the inner loop of the considered quadrotor, so that the structure of the obtained controller will be simplified and a high-gain based adaptive feedback control system will be designed. The effectiveness of the proposed method will be confirmed through numerical simulations.

  8. Adaptive Clinical Trial Designs for Simultaneous Testing of Matched Diagnostics and Therapeutics

    PubMed Central

    Scher, Howard I.; Nasso, Shelley Fuld; Rubin, Eric H.; Simon, Richard

    2013-01-01

    A critical challenge in the development of new molecularly targeted anticancer drugs is the identification of predictive biomarkers and the concurrent development of diagnostics for these biomarkers. Developing matched diagnostics and therapeutics will require new clinical trial designs and methods of data analysis. The use of adaptive design in phase III trials may offer new opportunities for matched diagnosis and treatment because the size of the trial can allow for subpopulation analysis. We present an adaptive phase III trial design that can identify a suitable target population during the early course of the trial, enabling the efficacy of an experimental therapeutic to be evaluated within the target population as a later part of the same trial. The use of such an adaptive approach to clinical trial design has the potential to greatly improve the field of oncology and facilitate the development of personalized medicine. PMID:22046024

  9. On Adaptive Extended Compatibility Changing Type of Product Design Strategy

    NASA Astrophysics Data System (ADS)

    Wenwen, Jiang; Zhibin, Xie

    The article examines companies' product design and development strategies through the lens of enterprise localization and each enterprise's development history. It shows that, at different stages of development, different kinds of enterprises adopt different modes of product design and development policy, and that a close causal relationship exists between a company's development history and its core technology and products. The results indicate that enterprises leading in market share, technology, and brand adopt a pioneering product research and development strategy; enterprises that rely on large-scale leading enterprises to provide complete supporting services adopt a passive, duplicating product research and development tactic; enterprises with partial advantages in technology, market, management, or brand adopt a follow-up product research and development strategy; and enterprises with a relative positional advantage in brand culture and market service adopt an applied-technology strategy centred on optimizing services in product research and development.

  10. The Study and Design of Adaptive Learning System Based on Fuzzy Set Theory

    NASA Astrophysics Data System (ADS)

    Jia, Bing; Zhong, Shaochun; Zheng, Tianyang; Liu, Zhiyong

    Adaptive learning is an effective way to improve learning outcomes; that is, the selection and presentation of learning content should be adapted to each learner's learning context, learning level, and learning ability. An Adaptive Learning System (ALS) can provide effective support for adaptive learning. This paper proposes a new ALS based on fuzzy set theory. It can effectively estimate the learner's knowledge level by testing according to the learner's target, and then takes the learner's cognitive ability and preferences into consideration to achieve self-organization and a push plan for knowledge. This paper focuses on the design and implementation of the domain model and user model in the ALS. Experiments confirmed that the system, by providing adaptive content, can effectively help learners to memorize the content and improve their comprehension.
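
    A minimal sketch, assuming triangular membership functions, of how a fuzzy estimate of a learner's knowledge level from a test score might drive content selection; the level names, breakpoints, and defuzzification rule are illustrative and are not taken from the paper.

      # Hypothetical fuzzy estimate of knowledge level from a 0-100 test score.
      def triangular(x, a, b, c):
          """Triangular membership function with peak at b and support [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def knowledge_memberships(score):
          return {
              "novice":       triangular(score, -1, 0, 50),
              "intermediate": triangular(score, 25, 50, 75),
              "advanced":     triangular(score, 50, 100, 101),
          }

      if __name__ == "__main__":
          m = knowledge_memberships(62)
          level = max(m, key=m.get)          # defuzzify by the strongest membership
          print(m, "-> recommend content for:", level)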

  11. Design and Preliminary Testing of the International Docking Adapter's Peripheral Docking Target

    NASA Technical Reports Server (NTRS)

    Foster, Christopher W.; Blaschak, Johnathan; Eldridge, Erin A.; Brazzel, Jack P.; Spehar, Peter T.

    2015-01-01

    The International Docking Adapter's Peripheral Docking Target (PDT) was designed to allow a docking spacecraft to judge its alignment relative to the docking system. The PDT was designed to be compatible with relative sensors using visible cameras, thermal imagers, or Light Detection and Ranging (LIDAR) technologies. The conceptual design team tested prototype designs and materials to determine the contrast requirements for the features. This paper will discuss the design of the PDT, the methodology and results of the tests, and the conclusions pertaining to PDT design that were drawn from testing.

  12. Adaptive fuzzy switched control design for uncertain nonholonomic systems with input nonsmooth constraint

    NASA Astrophysics Data System (ADS)

    Li, Yongming; Tong, Shaocheng

    2016-10-01

    In this paper, a fuzzy adaptive switched control approach is proposed for a class of uncertain nonholonomic chained systems with an input nonsmooth constraint. In the control design, an auxiliary dynamic system is designed to address the input nonsmooth constraint, and an adaptive switched control strategy is constructed to overcome the uncontrollability problem associated with x0(t0) = 0. By using fuzzy logic systems to tackle unknown nonlinear functions, a fuzzy adaptive control approach is explored based on the adaptive backstepping technique. By constructing the combination approximation technique and using Young's inequality scaling technique, the number of online learning parameters is reduced to n and the 'explosion of complexity' problem is avoided. It is proved that the proposed method can guarantee that all variables of the closed-loop system converge to a small neighbourhood of zero. Two simulation examples are provided to illustrate the effectiveness of the proposed control approach.

  13. Bilateral Automated Continuous Distraction Osteogenesis in a Design Model: Proof of Principle

    PubMed Central

    Peacock, Zachary S.; Tricomi, Brad J.; Faquin, William C.; Magill, John C.; Murphy, Brian A.; Kaban, Leonard B.; Troulis, Maria J.

    2015-01-01

    The purpose of this study was to demonstrate that automated, continuous, curvilinear distraction osteogenesis (DO) in a minipig model is effective when performed bilaterally, at rates up to 3mm/day, to achieve clinically relevant lengthening. A Yucatan minipig in the mixed dentition phase, underwent bilateral, continuous DO at a rate of 2 mm/day at the center of rotation; 1.0 and 3.0 mm/day at the superior and inferior regions, respectively. The distraction period was 13 days with no latency period. Vector and rate of distraction were remotely monitored without radiographs, using the device sensor. After fixation and euthanasia, the mandible and digastric muscles were harvested. The ex-vivo appearance, stability, and radiodensity of the regenerate were evaluated using a semi-quantitative scale. Percent surface area (PSA) occupied by bone, fibrous tissue, cartilage, and hematoma were calculated using histomorphometrics. The effects of DO on the digastric muscles and mandibular condyles were assessed via microscopy and degenerative changes were quantified. The animal was distracted to 21 mm and 24 mm on the right and left sides, respectively. Clinical appearance, stability, and radiodensity were scored as ‘3’ bilaterally indicating osseous union. The total PSA occupied by bone (right = 75.53±2.19%; left PSA = 73.11±2.18%) approached that of an unoperated mandible (84.67±0.86%). Digastric muscles and condyles showed negligible degenerative or abnormal histologic changes. This proof of principle study is the first report of osseous healing with no ill-effect on associated soft tissue and the mandibular condyle using bilateral, automated, continuous, curvilinear DO at rates up to 3 mm/day. The model approximates potential human application of continuous automated distraction with a semiburied device. PMID:26594967

  14. Adaptive critic autopilot design of bank-to-turn missiles using fuzzy basis function networks.

    PubMed

    Lin, Chuan-Kai

    2005-04-01

    A new adaptive critic autopilot design for bank-to-turn missiles is presented. In this paper, the architecture of the adaptive critic learning scheme contains a fuzzy-basis-function-network based associative search element (ASE), which is employed to approximate the nonlinear and complex functions of bank-to-turn missiles, and an adaptive critic element (ACE) generating the reinforcement signal to tune the associative search element. In the design of the adaptive critic autopilot, the control law receives signals from a fixed-gain controller, an ASE, and an adaptive robust element, which can eliminate approximation errors and disturbances. Traditional adaptive critic reinforcement learning addresses the problem faced by an agent that must learn behavior through trial-and-error interactions with a dynamic environment; however, the proposed tuning algorithm can significantly shorten the learning time by online tuning of all parameters of the fuzzy basis functions and the weights of the ASE and ACE. Moreover, the weight updating law derived from Lyapunov stability theory is capable of guaranteeing both tracking performance and stability. Computer simulation results confirm the effectiveness of the proposed adaptive critic autopilot.

  15. Frequency-based design of Adaptive Optics systems

    NASA Astrophysics Data System (ADS)

    Agapito, Guido; Battistelli, Giorgio; Mari, Daniele; Selvi, Daniela; Tesi, Alberto; Tesi, Pietro

    2013-12-01

    The problem of reducing the effects of wavefront distortion and structural vibrations in ground-based telescopes is addressed within a modal-control framework. The proposed approach aims at optimizing the parameters of a given modal stabilizing controller with respect to a performance criterion which reflects the residual phase variance and is defined on a sampled frequency domain. This framework makes it possible to account for turbulence and vibration profiles of arbitrary complexity (even empirical power spectral densities from data), while the controller order can be kept at a desired value. Moreover, it is possible to take into account additional requirements, such as robustness in the presence of disturbances whose intensity and frequency profile vary with time. The proposed design procedure results in solving a min-max problem and can be converted into a linear programming problem with quadratic constraints, for which there exist several standard optimization techniques. The optimization starts from a given stabilizing controller which can be either a non-model-based controller (in this case no identification effort is required), or a model-based controller synthesized by means of turbulence and vibration models of limited complexity. In this sense the approach can be viewed not only as an alternative to, but also as cooperative with, other control design approaches. The results obtained by means of an End-to-End simulator are shown to emphasize the power of the proposed method.

  16. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    SciTech Connect

    Oxstrand, Johanna Helene; Ahmad Al Rashdan; Le Blanc, Katya Lee; Bly, Aaron Douglas; Agarwal, Vivek

    2015-07-01

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background to the need for AWP research is provided, then two human factors field evaluation studies are described. These studies focus on the user experience of conducting a task (in this case a preventive maintenance and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  17. Low Level Waste Conceptual Design Adaption to Poor Geological Conditions

    SciTech Connect

    Bell, J.; Drimmer, D.; Giovannini, A.; Manfroy, P.; Maquet, F.; Schittekat, J.; Van Cotthem, A.; Van Echelpoel, E.

    2002-02-26

    Since the early eighties, several studies have been carried out in Belgium with respect to a repository for the final disposal of low-level radioactive waste (LLW). In 1998, the Belgian Government decided to restrict future investigations to the four existing nuclear sites in Belgium or sites that might show interest. So far, only two existing nuclear sites have been thoroughly investigated from a geological and hydrogeological point of view. These sites are located in the North-East (Mol-Dessel) and in the middle part (Fleurus-Farciennes) of the country. Both sites have the disadvantage of presenting poor geological and hydrogeological conditions, which are rather unfavorable for accommodating a surface disposal facility for LLW. The underground of the Mol-Dessel site consists of Neogene sand layers about 180 m thick which cover a clay layer 100 meters thick. These Neogene sands contain, at 20 m depth, a thin clayey layer. The groundwater level is quite close to the surface (0-2 m) and, finally, the topography is almost totally flat. The upper layer of the Fleurus-Farciennes site consists of 10 m of silt with poor geomechanical characteristics, overlying sands (only a few meters thick) and Westphalian shales between 15 and 20 m depth. The Westphalian shales are tectonized and strongly weathered. In the past, coal seams were mined out. This activity locally induced important surface subsidence. For both nuclear sites that were investigated, a conceptual design was made that could allow any unfavorable geological or hydrogeological conditions of the site to be overcome. In Fleurus-Farciennes, for instance, the proposed conceptual design of the repository is quite original. It is composed of a shallow, buried concrete cylinder, surrounded by an accessible concrete ring, which allows permanent inspection and control during the whole lifetime of the repository. Stability and drainage systems should be independent of potential differential settlements and subsidences.

  18. GASICA: generic automated stress induction and control application design of an application for controlling the stress state

    PubMed Central

    van der Vijgh, Benny; Beun, Robbert J.; van Rood, Maarten; Werkhoven, Peter

    2014-01-01

    In a multitude of research and therapy paradigms it is relevant to know, and desirably to control, the stress state of a patient or participant. Examples include research paradigms in which the stress state is the dependent or independent variable, or therapy paradigms where this state indicates the boundaries of the therapy. To our knowledge, no application currently exists that focuses specifically on the automated control of the stress state while at the same time being generic enough to be used for various therapy and research purposes. Therefore, we introduce GASICA, an application aimed at the automated control of the stress state in a multitude of therapy and research paradigms. The application consists of three components: a digital stressor game, a set of measurement devices, and a feedback model. These three components form a closed loop (called a biocybernetic loop by Pope et al. (1995) and Fairclough (2009)) that continuously presents an acute psychological stressor, measures several physiological responses to this stressor, and adjusts the stressor intensity based on these measurements by means of the feedback model, thereby aiming to control the stress state. In this manner GASICA presents multidimensional and ecologically valid stressors, whilst remaining continuously in control of the form and intensity of the presented stressors, aiming at the automated control of the stress state. Furthermore, the application is designed as a modular open-source application to easily implement different therapy and research tasks using a high-level programming interface and configuration file, and allows for the addition of (existing) measurement equipment, making it usable for various paradigms. PMID:25538554
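
    A hedged sketch of the closed-loop idea: stressor intensity is nudged toward a target stress level from a measured arousal proxy. The proportional update, the gain, and the noisy arousal stand-in below are assumptions, not GASICA's actual feedback model.

      # Illustrative biocybernetic-loop style update of stressor intensity.
      import random

      def update_intensity(intensity, measured_arousal, target_arousal, gain=0.5):
          """Raise intensity if the participant is under the target, lower it if over."""
          intensity += gain * (target_arousal - measured_arousal)
          return min(max(intensity, 0.0), 1.0)   # clamp to a normalized range

      if __name__ == "__main__":
          random.seed(0)
          intensity, target = 0.3, 0.6
          for step in range(5):
              # Stand-in for a physiological measurement: arousal tracks intensity with noise.
              arousal = min(max(intensity + random.gauss(0, 0.05), 0.0), 1.0)
              intensity = update_intensity(intensity, arousal, target)
              print(f"step {step}: arousal={arousal:.2f}, next intensity={intensity:.2f}")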

  19. Adaptive Automation Triggered by EEG-Based Mental Workload Index: A Passive Brain-Computer Interface Application in Realistic Air Traffic Control Environment.

    PubMed

    Aricò, Pietro; Borghini, Gianluca; Di Flumeri, Gianluca; Colosimo, Alfredo; Bonelli, Stefano; Golfetti, Alessia; Pozzi, Simone; Imbert, Jean-Paul; Granger, Géraud; Benhacene, Raïlane; Babiloni, Fabio

    2016-01-01

    Adaptive Automation (AA) is a promising approach to keep the task workload demand within appropriate levels in order to avoid both the under- and over-load conditions, hence enhancing the overall performance and safety of the human-machine system. The main issue on the use of AA is how to trigger the AA solutions without affecting the operative task. In this regard, passive Brain-Computer Interface (pBCI) systems are a good candidate to activate automation, since they are able to gather information about the covert behavior (e.g., mental workload) of a subject by analyzing its neurophysiological signals (i.e., brain activity), and without interfering with the ongoing operational activity. We proposed a pBCI system able to trigger AA solutions integrated in a realistic Air Traffic Management (ATM) research simulator developed and hosted at ENAC (École Nationale de l'Aviation Civile of Toulouse, France). Twelve Air Traffic Controller (ATCO) students have been involved in the experiment and they have been asked to perform ATM scenarios with and without the support of the AA solutions. Results demonstrated the effectiveness of the proposed pBCI system, since it enabled the AA mostly during the high-demanding conditions (i.e., overload situations) inducing a reduction of the mental workload under which the ATCOs were operating. On the contrary, as desired, the AA was not activated when workload level was under the threshold, to prevent too low demanding conditions that could bring the operator's workload level toward potentially dangerous conditions of underload.
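
    A simple sketch of threshold-based triggering of adaptive automation from a workload index, with hysteresis so the automation does not switch on and off repeatedly around the threshold; the thresholds and the index values are illustrative, not the study's pBCI output.

      # Hypothetical workload-triggered automation with hysteresis.
      def aa_trigger(workload_stream, on_threshold=0.7, off_threshold=0.5):
          """Yield (workload, automation_active) pairs for a stream of workload scores."""
          active = False
          for w in workload_stream:
              if not active and w >= on_threshold:
                  active = True                 # engage automation under high workload
              elif active and w <= off_threshold:
                  active = False                # release it once workload has recovered
              yield w, active

      if __name__ == "__main__":
          stream = [0.4, 0.6, 0.75, 0.8, 0.65, 0.55, 0.45, 0.3]
          for w, on in aa_trigger(stream):
              print(f"workload={w:.2f} -> automation {'ON' if on else 'off'}")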

  20. Adaptive Automation Triggered by EEG-Based Mental Workload Index: A Passive Brain-Computer Interface Application in Realistic Air Traffic Control Environment

    PubMed Central

    Aricò, Pietro; Borghini, Gianluca; Di Flumeri, Gianluca; Colosimo, Alfredo; Bonelli, Stefano; Golfetti, Alessia; Pozzi, Simone; Imbert, Jean-Paul; Granger, Géraud; Benhacene, Raïlane; Babiloni, Fabio

    2016-01-01

    Adaptive Automation (AA) is a promising approach to keep the task workload demand within appropriate levels in order to avoid both the under- and over-load conditions, hence enhancing the overall performance and safety of the human-machine system. The main issue on the use of AA is how to trigger the AA solutions without affecting the operative task. In this regard, passive Brain-Computer Interface (pBCI) systems are a good candidate to activate automation, since they are able to gather information about the covert behavior (e.g., mental workload) of a subject by analyzing its neurophysiological signals (i.e., brain activity), and without interfering with the ongoing operational activity. We proposed a pBCI system able to trigger AA solutions integrated in a realistic Air Traffic Management (ATM) research simulator developed and hosted at ENAC (École Nationale de l'Aviation Civile of Toulouse, France). Twelve Air Traffic Controller (ATCO) students have been involved in the experiment and they have been asked to perform ATM scenarios with and without the support of the AA solutions. Results demonstrated the effectiveness of the proposed pBCI system, since it enabled the AA mostly during the high-demanding conditions (i.e., overload situations) inducing a reduction of the mental workload under which the ATCOs were operating. On the contrary, as desired, the AA was not activated when workload level was under the threshold, to prevent too low demanding conditions that could bring the operator's workload level toward potentially dangerous conditions of underload. PMID:27833542

  1. IMPACT OF CANAL DESIGN LIMITATIONS ON WATER DELIVERY OPERATIONS AND AUTOMATION

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Irrigation canals are often designed for water transmission. The design engineer simply ensures that the canal will pass the maximum design discharge. However, irrigation canals are frequently operated far below design capacity. Because demands and the distribution of flow at bifurcations (branch points...

  2. Automated ultrareliability models - A review

    NASA Technical Reports Server (NTRS)

    Bridgman, M. S.; Ness, W. G.

    1984-01-01

    Analytic models are required to assess the reliability of systems designed to ultrareliability requirements. This paper reviews the capabilities and limitations of five currently available automated reliability models which are applicable to fault-tolerant flight control systems. 'System' includes sensors, computers, and actuators. A set of review criteria including validation, configuration adaptability, and resource requirements for model evaluation are described. Five models, ARIES, CARE II, CARE III, CARSRA, and CAST, are assessed against the criteria, thereby characterizing their capabilities and limitations. This review should be helpful to potential users of the models.

  3. First-order design of off-axis reflective ophthalmic adaptive optics systems using afocal telescopes.

    PubMed

    Gómez-Vieyra, Armando; Dubra, Alfredo; Malacara-Hernández, Daniel; Williams, David R

    2009-10-12

    Expressions for minimal astigmatism in image and pupil planes in off-axis afocal reflective telescopes formed by pairs of spherical mirrors are presented. These formulae, which are derived from the marginal ray fan equation, can be used for designing laser cavities, spectrographs, and adaptive optics retinal imaging systems. The use, range, and validity of these formulae are limited by spherical aberration and coma for small and large angles, respectively. This is discussed using examples from adaptive optics retinal imaging systems. The performance of the resulting optical designs is evaluated and compared against the configurations with minimal wavefront RMS, using the defocus-corrected wavefront RMS as a metric.

  4. Design and demonstration of automated data analysis algorithms for ultrasonic inspection of complex composite panels with bonds

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Forsyth, David S.; Welter, John T.

    2016-02-01

    To address the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms have been developed to make calls on indications that satisfy the detection criteria and minimize false calls. The original design followed standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. However, certain complex panels with varying shape, ply drops and the presence of bonds can complicate this interpretation process. In this paper, enhancements to the automated data analysis algorithms are introduced to address these challenges. To estimate the thickness of the part and presence of bonds without prior information, an algorithm tracks potential backwall or bond-line signals, and evaluates a combination of spatial, amplitude, and time-of-flight metrics to identify bonded sections. Once part boundaries, thickness transitions and bonded regions are identified, feature extraction algorithms are applied to multiple sets of through-thickness and backwall C-scan images, for evaluation of both first layer through thickness and layers under bonds. ADA processing results are presented for a variety of complex test specimens with inserted materials and other test discontinuities. Lastly, enhancements to the ADA software interface are presented, which improve the software usability for final data review by the inspectors and support the certification process.
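
    One of the steps described above, flagging backwall amplitude dropout, can be sketched as follows: pixels of a backwall-amplitude C-scan falling below a fraction of the scan median are grouped into connected regions, and regions above a minimum size become calls. The thresholds, the median reference, and the 4-connectivity grouping are assumptions for the example, not the published algorithm.

      # Illustrative backwall-amplitude dropout detection on a C-scan array.
      import numpy as np

      def backwall_dropout_calls(cscan, drop_fraction=0.5, min_pixels=4):
          cscan = np.asarray(cscan, float)
          mask = cscan < drop_fraction * np.median(cscan)
          calls, seen = [], np.zeros_like(mask, bool)
          for i in range(mask.shape[0]):                 # simple flood fill per seed pixel
              for j in range(mask.shape[1]):
                  if mask[i, j] and not seen[i, j]:
                      stack, region = [(i, j)], []
                      seen[i, j] = True
                      while stack:
                          r, c = stack.pop()
                          region.append((r, c))
                          for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                              rr, cc = r + dr, c + dc
                              if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                      and mask[rr, cc] and not seen[rr, cc]):
                                  seen[rr, cc] = True
                                  stack.append((rr, cc))
                      if len(region) >= min_pixels:
                          calls.append(region)
          return calls

      if __name__ == "__main__":
          scan = np.full((10, 10), 100.0)
          scan[3:6, 4:7] = 20.0                          # simulated dropout region
          print(f"{len(backwall_dropout_calls(scan))} indication(s) flagged")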

  5. Tools for Designing, Evaluating, and Certifying NextGen Technologies and Procedures: Automation Roles and Responsibilities

    NASA Technical Reports Server (NTRS)

    Kanki, Barbara G.

    2011-01-01

    Barbara Kanki from NASA Ames Research Center will discuss research that focuses on the collaborations between pilots, air traffic controllers and dispatchers that will change in NextGen systems as automation increases and roles and responsibilities change. The approach taken by this NASA Ames team is to build a collaborative systems assessment template (CSAT) based on detailed task descriptions within each system to establish a baseline of the current operations. The collaborative content and context are delineated through the review of regulatory and advisory materials, policies, procedures and documented practices as augmented by field observations and interviews. The CSAT is developed to aid the assessment of key human factors and performance tradeoffs that result from considering different collaborative arrangements under NextGen system changes. In theory, the CSAT product may be applied to any NextGen application (such as Trajectory Based Operations) with specified ground and aircraft capabilities.

  6. Automated information extraction of key trial design elements from clinical trial publications.

    PubMed

    de Bruijn, Berry; Carini, Simona; Kiritchenko, Svetlana; Martin, Joel; Sim, Ida

    2008-11-06

    Clinical trials are one of the most valuable sources of scientific evidence for improving the practice of medicine. The Trial Bank project aims to improve structured access to trial findings by including formalized trial information into a knowledge base. Manually extracting trial information from published articles is costly, but automated information extraction techniques can assist. The current study highlights a single architecture to extract a wide array of information elements from full-text publications of randomized clinical trials (RCTs). This architecture combines a text classifier with a weak regular expression matcher. We tested this two-stage architecture on 88 RCT reports from 5 leading medical journals, extracting 23 elements of key trial information such as eligibility rules, sample size, intervention, and outcome names. Results prove this to be a promising avenue to help critical appraisers, systematic reviewers, and curators quickly identify key information elements in published RCT articles.
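
    A hedged sketch of the two-stage idea: a trivial keyword filter stands in for the trained text classifier to select candidate sentences, and a weak regular expression then pulls out one element, the sample size. The keywords and the pattern are assumptions, not the published system's rules.

      # Two-stage extraction sketch: candidate-sentence selection, then a weak regex.
      import re

      CANDIDATE_KEYWORDS = ("randomized", "randomised", "enrolled", "patients", "participants")
      SAMPLE_SIZE_RE = re.compile(r"\b(\d{2,5})\s+(?:patients|participants|subjects)\b", re.I)

      def extract_sample_sizes(sentences):
          hits = []
          for s in sentences:
              if any(k in s.lower() for k in CANDIDATE_KEYWORDS):   # stage 1: select
                  m = SAMPLE_SIZE_RE.search(s)                      # stage 2: extract
                  if m:
                      hits.append(int(m.group(1)))
          return hits

      if __name__ == "__main__":
          text = ["A total of 248 patients were randomized to treatment or placebo.",
                  "The primary outcome was overall survival."]
          print(extract_sample_sizes(text))   # -> [248]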

  7. Design of an LVDS to USB3.0 adapter and application

    NASA Astrophysics Data System (ADS)

    Qiu, Xiaohan; Wang, Yu; Zhao, Xin; Chang, Zhen; Zhang, Quan; Tian, Yuze; Zhang, Yunyi; Lin, Fang; Liu, Wenqing

    2016-10-01

    The USB 3.0 specification was published in 2008. With the development of technology, USB 3.0 is becoming popular. The LVDS (Low Voltage Differential Signaling) to USB 3.0 adapter connects the communication port of a spectrometer device to the USB 3.0 port of a computer and converts the LVDS data output of the spectrometer device to USB. In order to keep pace with changing and developing technology, the LVDS to USB 3.0 adapter was designed and developed on the basis of the LVDS to USB 2.0 adapter. The CYUSB3014, a new generation of USB bus interface chip produced by Cypress and conforming to the USB 3.0 communication protocol, utilizes GPIF-II (GPIF, general programmable interface) to connect the FPGA and increases the effective communication speed to 2 Gbps. Therefore, the adapter, based on USB 3.0 technology, is able to connect more spectrometers to a single computer and provides a technical basis for the development of higher speed industrial cameras. This article describes the design and development process of the LVDS to USB 3.0 adapter.

  8. Design of a Model Reference Adaptive Controller for an Unmanned Air Vehicle

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Matsutani, Megumi; Annaswamy, Anuradha M.

    2010-01-01

    This paper presents the "Adaptive Control Technology for Safe Flight (ACTS)" architecture, which consists of a non-adaptive controller that provides satisfactory performance under nominal flying conditions, and an adaptive controller that provides robustness under off-nominal ones. The design and implementation procedures of both controllers are presented. The aim of these procedures, which encompass both theoretical and practical considerations, is to develop a controller suitable for flight. The ACTS architecture is applied to the Generic Transport Model developed by NASA-Langley Research Center. The GTM is a dynamically scaled test model of a transport aircraft for which a flight-test article and a high-fidelity simulation are available. The nominal controller at the core of the ACTS architecture has a multivariable LQR-PI structure while the adaptive one has a direct, model reference structure. The main control surfaces as well as the throttles are used as control inputs. The inclusion of the latter alleviates the pilot's workload by eliminating the need for cancelling the pitch coupling generated by changes in thrust. Furthermore, the independent usage of the throttles by the adaptive controller enables their use for attitude control. Advantages and potential drawbacks of adaptation are demonstrated by performing high fidelity simulations of a flight-validated controller and of its adaptive augmentation.
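
    A minimal scalar model-reference adaptive control sketch, included only to illustrate the "direct, model reference" structure mentioned above; it is not the ACTS flight controller. The plant, reference model, adaptation gain, and reference signal are made up for the example.

      # Scalar direct MRAC sketch: adapt feedback/feedforward gains so the plant
      # tracks a stable reference model (sign of the plant input gain assumed known).
      import math

      def simulate_mrac(steps=20000, dt=0.001, gamma=5.0):
          a, b = 1.0, 2.0            # unknown unstable plant: x' = a*x + b*u
          am, bm = -4.0, 4.0         # stable reference model: xm' = am*xm + bm*r
          x = xm = 0.0
          kx = kr = 0.0              # adaptive feedback / feedforward gains
          for k in range(steps):
              r = 1.0 if math.sin(2 * math.pi * 0.25 * k * dt) >= 0 else -1.0
              u = kx * x + kr * r
              e = x - xm                        # tracking error
              kx -= gamma * e * x * dt          # gradient-type adaptation laws
              kr -= gamma * e * r * dt
              x += (a * x + b * u) * dt         # Euler integration of plant and model
              xm += (am * xm + bm * r) * dt
          return e, kx, kr

      if __name__ == "__main__":
          e, kx, kr = simulate_mrac()
          print(f"final tracking error {e:.4f}, gains kx={kx:.2f}, kr={kr:.2f}")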

  9. Backstepping Design of Adaptive Neural Fault-Tolerant Control for MIMO Nonlinear Systems.

    PubMed

    Gao, Hui; Song, Yongduan; Wen, Changyun

    2016-08-24

    In this paper, an adaptive controller is developed for a class of multi-input and multi-output nonlinear systems with neural networks (NNs) used as a modeling tool. It is shown that all the signals in the closed-loop system with the proposed adaptive neural controller are globally uniformly bounded for any external input in L∞[0, ∞). In our control design, the upper bound of the NN modeling error and the gains of the external disturbance are characterized by unknown upper bounds, which is more rational for establishing stability in adaptive NN control. Filter-based modification terms are used in the update laws of unknown parameters to improve the transient performance. Finally, fault-tolerant control is developed to accommodate actuator failure. An illustrative example applying the adaptive controller to control a rigid robot arm shows the validity of the proposed controller.

  10. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    SciTech Connect

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-09-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed.

  11. Evaluation of green infrastructure designs using the Automated Geospatial Watershed Assessment Tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In arid and semi-arid regions, green infrastructure (GI) designs can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwater, addressi...

  12. 75 FR 8968 - Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ..., statistical, and regulatory aspects of a wide range of adaptive design clinical studies that can be proposed... design clinical trials (i.e., clinical, statistical, regulatory) call for special consideration, when to interact with FDA while planning and conducting adaptive design studies, what information to include in...

  13. Conceptual design for a user-friendly adaptive optics system at Lick Observatory

    SciTech Connect

    Bissinger, H.D.; Olivier, S.; Max, C.

    1996-03-08

    In this paper, we present a conceptual design for a general-purpose adaptive optics system, usable with all Cassegrain facility instruments on the 3 meter Shane telescope at the University of California's Lick Observatory located on Mt. Hamilton near San Jose, California. The overall design goal for this system is to take the sodium-layer laser guide star adaptive optics technology out of the demonstration stage and to build a user-friendly astronomical tool. The emphasis will be on ease of calibration, improved stability, and operational simplicity in order to allow the system to be run routinely by observatory staff. A prototype adaptive optics system and a 20 watt sodium-layer laser guide star system have already been built at Lawrence Livermore National Laboratory for use at Lick Observatory. The design presented in this paper is for a next-generation adaptive optics system that extends the capabilities of the prototype system into the visible with more degrees of freedom. When coupled with a laser guide star system that is upgraded to a power matching the new adaptive optics system, the combined system will produce diffraction-limited images for near-IR cameras. Atmospheric correction at wavelengths of 0.6-1 micron will significantly increase the throughput of the most heavily used facility instrument at Lick, the Kast Spectrograph, and will allow it to operate with smaller slit widths and deeper limiting magnitudes. 8 refs., 2 figs.

  14. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    PubMed

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds.
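
    A toy simulation in the spirit of the strategy described above: survey effort is shifted into years that a monitored reference site flags as good, on the assumption that detectability is regionally correlated. All probabilities and the allocation rule are invented for the sketch and are not the paper's values.

      # Hypothetical comparison of fixed versus temporally adaptive survey effort.
      import random

      def simulate(strategy, n_years=10, surveys_per_year=20, trials=2000):
          """Average detections over a simulated multi-year monitoring programme."""
          total_detections = 0
          for _ in range(trials):
              for _ in range(n_years):
                  good_year = random.random() < 0.4            # regional detectability state
                  p_detect = 0.30 if good_year else 0.05       # per-survey detection probability
                  flag = good_year if random.random() < 0.9 else not good_year
                  if strategy == "adaptive":
                      effort = 2 * surveys_per_year if flag else 0   # shift effort into flagged years
                  else:
                      effort = surveys_per_year
                  total_detections += sum(random.random() < p_detect for _ in range(effort))
          return total_detections / trials

      if __name__ == "__main__":
          random.seed(42)
          print("fixed-effort detections:   ", simulate("fixed"))
          print("adaptive-effort detections:", simulate("adaptive"))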

  15. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  16. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  17. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibration and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following in the NRU reactor inspection at Chalk River.

  18. Achievements and challenges in automated parameter, shape and topology optimization for divertor design

    NASA Astrophysics Data System (ADS)

    Baelmans, M.; Blommaert, M.; Dekeyser, W.; Van Oevelen, T.

    2017-03-01

    Plasma edge transport codes play a key role in the design of future divertor concepts. Their long simulation times in combination with a large number of control parameters turn the design into a challenging task. In aerodynamics and structural mechanics, adjoint-based optimization techniques have proven successful to tackle similar design challenges. This paper provides an overview of achievements and remaining challenges with these techniques for complex divertor design. It is shown how these developments pave the way for fast sensitivity analysis and improved design from different perspectives.

  19. Design and construction of a medium-scale automated direct measurement respirometric system to assess aerobic biodegradation of polymers

    NASA Astrophysics Data System (ADS)

    Castro Aguirre, Edgar

    A medium-scale automated direct measurement respirometric (DMR) system was designed and built to assess the aerobic biodegradation of up to 30 materials in triplicate simultaneously. Likewise, a computer application was developed for rapid analysis of the data generated. The developed DMR system was able to simulate different testing conditions by varying temperature and relative humidity, which are the major exposure conditions affecting biodegradation. Two complete tests for determining the aerobic biodegradation of polymers under composting conditions were performed to show the efficacy and efficiency of both the DMR system and the DMR data analyzer. In both cases, cellulose reached 70% mineralization at 139 and 45 days. The difference in time for cellulose to reach 70% mineralization was attributed to the composition of the compost and water availability, which highly affect the biodegradation rate. Finally, among the tested materials, at least 60% of the organic carbon content of the biodegradable polymers was converted into carbon dioxide by the end of the test.

  20. An expert system for choosing the best combination of options in a general purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Barthelemy, J.-F. M.

    1986-01-01

    An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose, and it is difficult for a nonexpert to make this choice. The expert system aids the user in choosing the best combination of options based on the user's knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories: constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The inference engine and rules are written in LISP; the knowledge base contains about 200 rules, and the system executes on DEC-VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.
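
    A toy illustration of how such a rule base maps problem characteristics onto an option set is sketched below; the rules and option names are only loosely patterned on an ADS-style option list and are not taken from the EXADS knowledge base.

```python
# Hypothetical rules for choosing strategy/optimizer/1-D search options,
# loosely patterned on an ADS-style option set; not the EXADS rule base.
def choose_options(constrained: bool, smooth: bool, n_vars: int) -> dict:
    if not constrained:
        strategy = "none"
        optimizer = "variable metric (BFGS)" if smooth else "Nelder-Mead simplex"
    elif n_vars > 50:
        strategy = "sequential linear programming"
        optimizer = "method of feasible directions"
    else:
        strategy = "augmented Lagrange multiplier"
        optimizer = "Davidon-Fletcher-Powell"
    search = "polynomial interpolation" if smooth else "golden section"
    return {"strategy": strategy, "optimizer": optimizer, "1-D search": search}

print(choose_options(constrained=True, smooth=True, n_vars=12))
```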

  1. Design of a FAT-Based Adaptive Visual Servoing for Robots with Time-Varying Uncertainties

    NASA Astrophysics Data System (ADS)

    Chien, Ming-Chih; Huang, An-Chyau

    2010-05-01

    Most existing adaptive control strategies for visual servoing of robots have assumed that the unknown camera parameters, kinematics, and dynamics of the visual servoing system can be linearly parameterized in regressor matrix form. This follows from a limitation of the traditional adaptive design, in which the uncertainties must be time-invariant, so that all time-varying terms in the visual servoing system are collected inside the regressor matrix. However, derivation of the regressor matrix is tedious. In this article, a FAT (function approximation technique) based adaptive controller is designed for visual servo robots without the need for the regressor matrix. A Lyapunov-like analysis is used to justify closed-loop stability and the boundedness of internal signals. Moreover, the upper bounds of the tracking errors in the transient state are also derived. Computer simulation results are presented to demonstrate the usefulness of the proposed scheme.
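
    A minimal numerical sketch of the idea behind FAT-based adaptation is given below for a scalar first-order plant with an unknown time-varying disturbance; the disturbance is expanded in a truncated Fourier basis and the weights follow a Lyapunov-style update law. The plant, basis size and gains are illustrative and far simpler than the visual servoing system treated in the paper.

```python
import numpy as np

# Plant: xdot = f(t) + u, with f(t) unknown to the controller.
dt, T = 1e-3, 10.0
k, gamma, n_basis = 5.0, 50.0, 7

def phi(t):
    """Truncated Fourier basis [1, sin t, cos t, sin 2t, cos 2t, sin 3t, cos 3t]."""
    terms = [1.0]
    for j in range(1, (n_basis - 1) // 2 + 1):
        terms += [np.sin(j * t), np.cos(j * t)]
    return np.array(terms)

f_true = lambda t: 2.0 * np.sin(t) + 0.5 * np.cos(3 * t)  # unknown disturbance
xd = lambda t: np.sin(0.5 * t)                             # desired trajectory
xd_dot = lambda t: 0.5 * np.cos(0.5 * t)

x, w = 0.0, np.zeros(n_basis)
for i in range(int(T / dt)):
    t = i * dt
    e = x - xd(t)
    f_hat = w @ phi(t)                 # FAT estimate of the uncertainty
    u = xd_dot(t) - k * e - f_hat      # certainty-equivalence control law
    w = w + dt * gamma * phi(t) * e    # Lyapunov-based adaptive weight update
    x = x + dt * (f_true(t) + u)       # Euler integration of the plant

print(f"final tracking error: {abs(x - xd(T)):.4f}")
```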

  2. Robust adaptive self-structuring fuzzy control design for nonaffine, nonlinear systems

    NASA Astrophysics Data System (ADS)

    Chen, Pin-Cheng; Wang, Chi-Hsu; Lee, Tsu-Tian

    2011-01-01

    In this article, a robust adaptive self-structuring fuzzy control (RASFC) scheme for uncertain or ill-defined nonlinear, nonaffine systems is proposed. The RASFC scheme is composed of a robust adaptive controller and a self-structuring fuzzy controller. In the self-structuring fuzzy controller design, a novel self-structuring fuzzy system (SFS) is used to approximate the unknown plant nonlinearity, and the SFS can automatically grow and prune fuzzy rules to realise a compact fuzzy rule base. The robust adaptive controller is designed to achieve an L2 tracking performance to stabilise the closed-loop system. This L2 tracking performance provides a clear expression of the tracking error in terms of the sum of lumped uncertainty and external disturbance, which has not been shown in previous works. Finally, five examples are presented to show that the proposed RASFC scheme can achieve favourable tracking performance while relieving the heavy computational burden.

  3. Automating a complex computer-aided design concept using macros programming

    NASA Astrophysics Data System (ADS)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md

    2013-12-01

    Changing a complex computer-aided design profile, such as car and aircraft surfaces, has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple CAD design configuration can be modified without hassle, but this is not the case with complex design configurations. Design changes help users to test and explore various configurations of the design concept before a model is produced. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method in which the design configuration is changed by recording a script of commands, editing the data values and adding new command lines to create the elements of a parametric design. The steps and procedure for creating a macro program are discussed, along with some difficulties encountered during its creation and the advantages of its use. Generally, the advantages of macros programming as a method of parametric design are that it allows flexibility for design exploration, increases the usability of the design solution, allows proper containment of elements within the model while restricting others, and provides real-time feedback on changes.

  4. An adaptive two-stage dose-response design method for establishing Proof of Concept

    PubMed Central

    Franchetti, Yoko; Anderson, Stewart J.; Sampson, Allan R.

    2013-01-01

    We propose an adaptive two-stage dose-response design where a pre-specified adaptation rule is used to add and/or drop treatment arms between the stages. We extend the multiple comparison procedures-modeling (MCP-Mod) approach into a two-stage design. In each stage, we use the same set of candidate dose-response models and test for a dose-response relationship or proof of concept (PoC) via model-associated statistics. The stage-wise test results are then combined to establish ‘global’ PoC using a conditional error function. Our simulation studies showed good and more robust power in our design method compared to conventional and fixed designs. PMID:23957520
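
    The stage-wise combination step can be sketched with a weighted inverse-normal combination function, a standard alternative to the conditional error function named in the abstract; the weights and p-values below are purely illustrative.

```python
import numpy as np
from scipy.stats import norm

def combined_p(p1: float, p2: float, w1: float = np.sqrt(0.5)) -> float:
    """Weighted inverse-normal combination of two stage-wise p-values."""
    w2 = np.sqrt(1.0 - w1 ** 2)
    z = w1 * norm.isf(p1) + w2 * norm.isf(p2)   # isf = inverse survival function
    return float(norm.sf(z))

# Stage-wise PoC tests yield p1 and p2; 'global' PoC is declared when the
# combined p-value falls below the overall one-sided alpha.
p1, p2, alpha = 0.08, 0.03, 0.025
print(combined_p(p1, p2), combined_p(p1, p2) < alpha)
```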

  5. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today, semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process; in fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used, but in the coming years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing, while the integration of various equipment in a production plant requires unified handling of data flow and interfaces. Only agile vision systems can meet these conflicting demands: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computationally intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection, performed in 3D and including horizontal and coplanarity inspection. The accompanying software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, such as Trim&Form, Tape&Reel and Pick&Place machines.

  6. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    NASA Astrophysics Data System (ADS)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study aims to propose a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, once the system is set up properly, the CAD model of the design, including all subsystems, is automatically updated as soon as a new part or assembly is added to the design, or whenever an analysis or optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding and removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features; however, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, the Latin Hypercube Sampling method is used

  7. Transient analysis of an adaptive system for optimization of design parameters

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.

    1992-01-01

    Averaging methods are applied to analyzing and optimizing the transient response associated with the direct adaptive control of an oscillatory second-order minimum-phase system. The analytical design methods developed for a second-order plant can be applied with some approximation to a MIMO flexible structure having a single dominant mode.

  8. Development of an Assistance Environment for Tutors Based on a Co-Adaptive Design Approach

    ERIC Educational Resources Information Center

    Lavoue, Elise; George, Sebastien; Prevot, Patrick

    2012-01-01

    In this article, we present a co-adaptive design approach named TE-Cap (Tutoring Experience Capitalisation) that we applied for the development of an assistance environment for tutors. Since tasks assigned to tutors in educational contexts are not well defined, we are developing an environment which responds to needs which are not precisely…

  9. Can Approaches to Research in Art and Design Be Beneficially Adapted for Research into Higher Education?

    ERIC Educational Resources Information Center

    Trowler, Paul

    2013-01-01

    This paper examines the research practices in Art and Design that are distinctively different from those common in research into higher education outside those fields. It considers whether and what benefit could be derived from their adaptation by the latter. The paper also examines the factors that are conducive and obstructive to adaptive…

  10. Direct and Inverse Problems of Item Pool Design for Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Belov, Dmitry I.; Armstrong, Ronald D.

    2009-01-01

    The recent literature on computerized adaptive testing (CAT) has developed methods for creating CAT item pools from a large master pool. Each CAT pool is designed as a set of nonoverlapping forms reflecting the skill levels of an assumed population of test takers. This article presents a Monte Carlo method to obtain these CAT pools and discusses…

  11. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    ERIC Educational Resources Information Center

    Burston, Jack; Neophytou, Maro

    2014-01-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2…

  12. Automated divertor target design by adjoint shape sensitivity analysis and a one-shot method

    SciTech Connect

    Dekeyser, W.; Reiter, D.; Baelmans, M.

    2014-12-01

    As magnetic confinement fusion progresses towards the development of first reactor-scale devices, computational tokamak divertor design is a topic of high priority. Presently, edge plasma codes are used in a forward approach, where magnetic field and divertor geometry are manually adjusted to meet design requirements. Due to the complex edge plasma flows and large number of design variables, this method is computationally very demanding. On the other hand, efficient optimization-based design strategies have been developed in computational aerodynamics and fluid mechanics. Such an optimization approach to divertor target shape design is elaborated in the present paper. A general formulation of the design problems is given, and conditions characterizing the optimal designs are formulated. Using a continuous adjoint framework, design sensitivities can be computed at a cost of only two edge plasma simulations, independent of the number of design variables. Furthermore, by using a one-shot method the entire optimization problem can be solved at an equivalent cost of only a few forward simulations. The methodology is applied to target shape design for uniform power load, in simplified edge plasma geometry.
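
    The key property exploited here, that an adjoint solve delivers the gradient with respect to all design variables at the cost of roughly one extra simulation, can be sketched on a toy discrete model (a 1-D diffusion equation with a parameterised source; all matrices and targets below are made up for illustration).

```python
import numpy as np

n, n_design = 50, 8
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))                 # discrete Laplacian (state operator)
B = np.random.default_rng(1).random((n, n_design))  # source sensitivity db/dq
u_target = np.linspace(0.0, 1.0, n)                 # desired state profile

def objective_and_gradient(q):
    u = np.linalg.solve(A, B @ q)                   # forward solve: A u = b(q)
    r = u - u_target
    J = 0.5 * r @ r
    lam = np.linalg.solve(A.T, r)                   # adjoint solve: A^T lam = dJ/du
    return J, B.T @ lam                             # gradient for all q at once

q0 = np.ones(n_design)
J0, g_adj = objective_and_gradient(q0)
eps = 1e-6                                          # finite-difference check
g_fd = np.array([(objective_and_gradient(q0 + eps * np.eye(n_design)[i])[0] - J0) / eps
                 for i in range(n_design)])
print("max relative gradient error:", np.max(np.abs(g_adj - g_fd) / np.abs(g_adj)))
```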

  13. Automated detection of fovea in fundus images based on vessel-free zone and adaptive Gaussian template.

    PubMed

    Kao, E-Fong; Lin, Pi-Chen; Chou, Ming-Chung; Jaw, Twei-Shiun; Liu, Gin-Chung

    2014-11-01

    This study developed a computerised method for fovea centre detection in fundus images. In the method, the centre of the optic disc was localised first by the template matching method, the disc-fovea axis (a line connecting the optic disc centre and the fovea) was then determined by searching the vessel-free region, and finally the fovea centre was detected by matching the fovea template around the centre of the axis. Adaptive Gaussian templates were used to localise the centres of the optic disc and fovea for the images with different resolutions. The proposed method was evaluated using three publicly available databases (DIARETDB0, DIARETDB1 and MESSIDOR), which consisted of a total of 1419 fundus images with different resolutions. The proposed method obtained the fovea detection accuracies of 93.1%, 92.1% and 97.8% for the DIARETDB0, DIARETDB1 and MESSIDOR databases, respectively. The overall accuracy of the proposed method was 97.0% in this study.
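
    The final template-matching step can be sketched as follows on a synthetic image: a Gaussian template whose size and width stand in for the resolution-adapted template of the paper is matched by normalised cross-correlation, and the most strongly anti-correlated (darkest) location is taken as the fovea centre.

```python
import numpy as np

def gaussian_template(size: int, sigma: float) -> np.ndarray:
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))

def match_dark_blob(image: np.ndarray, template: np.ndarray):
    """Brute-force normalised cross-correlation; returns centre of the darkest blob."""
    th, tw = template.shape
    t = (template - template.mean()) / template.std()
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(image.shape[0] - th):
        for c in range(image.shape[1] - tw):
            patch = image[r:r + th, c:c + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = -(p * t).mean()      # minus sign: the fovea is darker than background
            if score > best_score:
                best_score, best_rc = score, (r + th // 2, c + tw // 2)
    return best_rc

# Synthetic fundus-like patch with a dark fovea centred at row 60, column 80.
rng = np.random.default_rng(2)
yy, xx = np.mgrid[0:120, 0:160]
img = 0.8 + 0.02 * rng.standard_normal((120, 160))
img -= 0.3 * np.exp(-((yy - 60) ** 2 + (xx - 80) ** 2) / (2 * 8.0 ** 2))

print(match_dark_blob(img, gaussian_template(25, 8.0)))   # expected near (60, 80)
```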

  14. The User-Assisted Automated Experimental (TEST) Design Program (AED): Version II.

    DTIC Science & Technology

    1983-01-01

    The program provides procedures that maximize information return while minimizing the number of observations (tests) required. Acknowledgments are given to E. Taylor, SDC Colorado Springs, CO, for his work on the Central Composite Design, and to Mr. Edwin G. Meyer, who developed many of the algorithms.

  15. Design of an Automated-Counting System of Cell Micronuclei in Micrographs

    NASA Astrophysics Data System (ADS)

    Lozano, A. V.; Márquez, J. A.; Buenfil, A. E.; Gonsebatt, M. E.

    2004-09-01

    We developed and tested a system for the automatic analysis of cell images in order to identify and count micronuclei configurations in digital images from a microscope. The presence of micronuclei has been used as an indicator of DNA damage, and a fast automated analysis may be suitable for early cancer detection, for example. We describe in this work the image-processing protocol and system comprising: acquisition, color normalization, contrast enhancement, color-background removing, color segmentation, mathematical-morphology filtering and restoration, morphometry, feature extraction and analysis, and counting. Among the morphological features used for discriminating micronuclei configurations, we tested compactness, area ratios and separation of selected features (ideally, a nucleus and one or more micronuclei). Among the mathematical-morphology techniques, we used for instance a modified watershed segmentation algorithm for separating touching features. We present our results on several images, the evaluation and performance of each step, and the main problems we solved by using several techniques from mathematical morphology and color-image processing.
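
    The step of separating touching features can be sketched with the standard distance-transform watershed, here applied to two overlapping synthetic disks; scikit-image and SciPy stand in for the custom morphology pipeline described above.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

# Two overlapping disks standing in for a touching nucleus/micronucleus pair.
yy, xx = np.mgrid[0:120, 0:120]
binary = (((yy - 55) ** 2 + (xx - 45) ** 2) < 30 ** 2) | \
         (((yy - 65) ** 2 + (xx - 85) ** 2) < 18 ** 2)

distance = ndi.distance_transform_edt(binary)        # distance to the background
coords = peak_local_max(distance, min_distance=20, labels=binary)
markers = np.zeros(distance.shape, dtype=int)
for i, (r, c) in enumerate(coords, start=1):         # one marker per local maximum
    markers[r, c] = i
labels = watershed(-distance, markers, mask=binary)  # split the touching blobs

print("separated regions:", labels.max())            # expected: 2
```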

  16. Design and Implementation of an Automated Illuminating, Culturing, and Sampling System for Microbial Optogenetic Applications.

    PubMed

    Stewart, Cameron J; McClean, Megan N

    2017-02-19

    Optogenetic systems utilize genetically-encoded proteins that change conformation in response to specific wavelengths of light to alter cellular processes. There is a need for culturing and measuring systems that incorporate programmed illumination and stimulation of optogenetic systems. We present a protocol for building and using a continuous culturing apparatus to illuminate microbial cells with programmed doses of light, and automatically acquire and analyze images of cells in the effluent. The operation of this apparatus as a chemostat allows the growth rate and the cellular environment to be tightly controlled. The effluent of the continuous cell culture is regularly sampled and the cells are imaged by multi-channel microscopy. The culturing, sampling, imaging, and image analysis are fully automated so that dynamic responses in the fluorescence intensity and cellular morphology of cells sampled from the culture effluent are measured over multiple days without user input. We demonstrate the utility of this culturing apparatus by dynamically inducing protein production in a strain of Saccharomyces cerevisiae engineered with an optogenetic system that activates transcription.

  17. An introduction to the BANNING design automation system for shuttle microelectronic hardware development

    NASA Technical Reports Server (NTRS)

    Mcgrady, W. J.

    1979-01-01

    The BANNING MOS design system is presented. It complements rather than supplants the normal design activities associated with the design and fabrication of low-power digital electronic equipment. BANNING is user-oriented and requires no programming experience to use effectively. It provides the user with a simulation capability to aid circuit design and eliminates most of the manual operations involved in the layout and artwork generation of integrated circuits. An example of its operation is given and some additional background reading is provided.

  18. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  19. Design of artificial genetic regulatory networks with multiple delayed adaptive responses*

    NASA Astrophysics Data System (ADS)

    Kaluza, Pablo; Inoue, Masayo

    2016-06-01

    Genetic regulatory networks with adaptive responses are widely studied in biology. Usually, models consisting only of a few nodes have been considered. They present one input receptor for activation and one output node where the adaptive response is computed. In this work, we design genetic regulatory networks with many receptors and many output nodes able to produce delayed adaptive responses. This design is performed by using an evolutionary algorithm of mutations and selections that minimizes an error function defined by the adaptive response in signal shapes. We present several examples of network constructions with a predefined required set of adaptive delayed responses. We show that an output node can have different kinds of responses as a function of the activated receptor. Additionally, complex network structures are presented since processing nodes can be involved in several input-output pathways. Supplementary material in the form of one nets file is available from the Journal web page at http://dx.doi.org/10.1140/epjb/e2016-70172-9
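
    The mutation-selection loop itself can be sketched in a few lines; here a small saturating dynamical network (a toy stand-in for the regulatory dynamics in the paper) is evolved so that its output node produces a delayed, adaptive pulse in response to a step input at the receptor node.

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes, n_steps = 6, 40
input_node, output_node = 0, n_nodes - 1

t = np.arange(n_steps)
target = t * np.exp(-t / 5.0)           # desired delayed, adaptive response
target /= target.max()

def response(W):
    x = np.zeros(n_nodes)
    out = np.empty(n_steps)
    for k in range(n_steps):
        u = np.zeros(n_nodes)
        u[input_node] = 1.0             # step input switched on at t = 0
        x = np.tanh(W @ x + u)          # saturating node update
        out[k] = x[output_node]
    return out

def error(W):
    return np.sum((response(W) - target) ** 2)

W = 0.1 * rng.standard_normal((n_nodes, n_nodes))
best = error(W)
for _ in range(3000):                   # evolutionary loop: mutate, keep if better
    W_new = W.copy()
    i, j = rng.integers(n_nodes, size=2)
    W_new[i, j] += 0.1 * rng.standard_normal()
    e = error(W_new)
    if e < best:
        W, best = W_new, e
print(f"final response error: {best:.4f}")
```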

  20. Automation and adaptation: Nurses' problem-solving behavior following the implementation of bar coded medication administration technology.

    PubMed

    Holden, Richard J; Rivera-Rodriguez, A Joy; Faye, Héléne; Scanlon, Matthew C; Karsh, Ben-Tzion

    2013-08-01

    The most common change facing nurses today is new technology, particularly bar coded medication administration technology (BCMA). However, there is a dearth of knowledge on how BCMA alters nursing work. This study investigated how BCMA technology affected nursing work, particularly nurses' operational problem-solving behavior. Cognitive systems engineering observations and interviews were conducted after the implementation of BCMA in three nursing units of a freestanding pediatric hospital. Problem-solving behavior, associated problems, and goals, were specifically defined and extracted from observed episodes of care. Three broad themes regarding BCMA's impact on problem solving were identified. First, BCMA allowed nurses to invent new problem-solving behavior to deal with pre-existing problems. Second, BCMA made it difficult or impossible to apply some problem-solving behaviors that were commonly used pre-BCMA, often requiring nurses to use potentially risky workarounds to achieve their goals. Third, BCMA created new problems that nurses were either able to solve using familiar or novel problem-solving behaviors, or unable to solve effectively. Results from this study shed light on hidden hazards and suggest three critical design needs: (1) ecologically valid design; (2) anticipatory control; and (3) basic usability. Principled studies of the actual nature of clinicians' work, including problem solving, are necessary to uncover hidden hazards and to inform health information technology design and redesign.

  1. Automation and adaptation: Nurses’ problem-solving behavior following the implementation of bar coded medication administration technology

    PubMed Central

    Holden, Richard J.; Rivera-Rodriguez, A. Joy; Faye, Héléne; Scanlon, Matthew C.; Karsh, Ben-Tzion

    2012-01-01

    The most common change facing nurses today is new technology, particularly bar coded medication administration technology (BCMA). However, there is a dearth of knowledge on how BCMA alters nursing work. This study investigated how BCMA technology affected nursing work, particularly nurses’ operational problem-solving behavior. Cognitive systems engineering observations and interviews were conducted after the implementation of BCMA in three nursing units of a freestanding pediatric hospital. Problem-solving behavior, associated problems, and goals, were specifically defined and extracted from observed episodes of care. Three broad themes regarding BCMA’s impact on problem solving were identified. First, BCMA allowed nurses to invent new problem-solving behavior to deal with pre-existing problems. Second, BCMA made it difficult or impossible to apply some problem-solving behaviors that were commonly used pre-BCMA, often requiring nurses to use potentially risky workarounds to achieve their goals. Third, BCMA created new problems that nurses were either able to solve using familiar or novel problem-solving behaviors, or unable to solve effectively. Results from this study shed light on hidden hazards and suggest three critical design needs: (1) ecologically valid design; (2) anticipatory control; and (3) basic usability. Principled studies of the actual nature of clinicians’ work, including problem solving, are necessary to uncover hidden hazards and to inform health information technology design and redesign. PMID:24443642

  2. Analysis and design of a high power laser adaptive phased array transmitter

    NASA Technical Reports Server (NTRS)

    Mevers, G. E.; Soohoo, J. F.; Winocur, J.; Massie, N. A.; Southwell, W. H.; Brandewie, R. A.; Hayes, C. L.

    1977-01-01

    The feasibility of delivering substantial quantities of optical power to a satellite in low earth orbit from a ground based high energy laser (HEL) coupled to an adaptive antenna was investigated. Diffraction effects, atmospheric transmission efficiency, adaptive compensation for atmospheric turbulence effects, including the servo bandwidth requirements for this correction, and adaptive compensation for thermal blooming were examined. To evaluate possible HEL sources, atmospheric investigations were performed for the CO2, (C-12)(O-18)2 isotope, CO and DF wavelengths using output antenna locations at both sea level and mountain top. Results indicate that both excellent atmospheric and adaptation efficiency can be obtained for mountain top operation with a (C-12)(O-18)2 isotope laser operating at 9.1 um, or a CO laser operating on a single line (P10) at about 5.0 um, which was a close second in the evaluation. Four adaptive power transmitter system concepts were generated and evaluated, based on overall system efficiency, reliability, size and weight, advanced technology requirements and potential cost. A multiple-source phased array was selected for detailed conceptual design. The system uses a unique adaptation technique of phase locking independent laser oscillators, which allows it to be both relatively inexpensive and highly reliable, with a predicted overall power transfer efficiency of 53%.

  3. Neural network-based adaptive controller design of robotic manipulators with an observer.

    PubMed

    Sun, F; Sun, Z; Woo, P Y

    2001-01-01

    A neural network (NN)-based adaptive controller with an observer is proposed for the trajectory tracking of robotic manipulators with unknown dynamics nonlinearities. It is assumed that the robotic manipulator has only joint angle position measurements. A linear observer is used to estimate the robot joint angle velocity, while NNs are employed to further improve the control performance of the controlled system by approximating the modified robot dynamics function. The adaptive controller for robots with an observer can guarantee the uniform ultimate bounds of the tracking errors and the observer errors as well as the bounds of the NN weights. For performance comparisons, the conventional adaptive algorithm with an observer, using linearity in parameters of the robot dynamics, is also developed in the same control framework as the NN approach for online approximation of the unknown nonlinearities of the robot dynamics. The main theoretical results for designing such an observer-based adaptive controller with the NN approach, using multilayer NNs with sigmoidal activation functions, as well as with the conventional adaptive approach using linearity in parameters of the robot dynamics, are given. Performance comparisons between the NN approach and the conventional adaptation approach with an observer are carried out through simulation studies to show the advantages of the proposed control approaches.

  4. E-ELT M4 adaptive unit final design and construction: a progress report

    NASA Astrophysics Data System (ADS)

    Biasi, Roberto; Manetti, Mauro; Andrighettoni, Mario; Angerer, Gerald; Pescoller, Dietrich; Patauner, Christian; Gallieni, Daniele; Tintori, Matteo; Mantegazza, Marco; Fumi, Pierluigi; Lazzarini, Paolo; Briguglio, Runa; Xompero, Marco; Pariani, Giorgio; Riccardi, Armando; Vernet, Elise; Pettazzi, Lorenzo; Lilley, Paul; Cayrel, Marc

    2016-07-01

    The E-ELT M4 adaptive unit is a fundamental part of the E-ELT: it provides the facility level adaptive optics correction that compensates the wavefront distortion induced by atmospheric turbulence and partially corrects the structural deformations caused by wind. The unit is based on the contactless, voice-coil technology already successfully deployed on several large adaptive mirrors, like the LBT, Magellan and VLT adaptive secondary mirrors. It features a 2.4m diameter flat mirror, controlled by 5316 actuators and divided in six segments. The reference structure is monolithic and the cophasing between the segments is guaranteed by the contactless embedded metrology. The mirror correction commands are usually transferred as modal amplitudes, that are checked by the M4 controller through a smart real-time algorithm that is capable to handle saturation effects. A large hexapod provides the fine positioning of the unit, while a rotational mechanism allows switching between the two Nasmyth foci. The unit has entered the final design and construction phase in July 2015, after an advanced preliminary design. The final design review is planned for fall 2017; thereafter, the unit will enter the construction and test phase. Acceptance in Europe after full optical calibration is planned for 2022, while the delivery to Cerro Armazones will occur in 2023. Even if the fundamental concept has remained unchanged with respect to the other contactless large deformable mirrors, the specific requirements of the E-ELT unit posed new design challenges that required very peculiar solutions. Therefore, a significant part of the design phase has been focused on the validation of the new aspects, based on analysis, numerical simulations and experimental tests. Several experimental tests have been executed on the Demonstration Prototype, which is the 222 actuators prototype developed in the frame of the advanced preliminary design. We present the main project phases, the current design

  5. Automated PET-only quantification of amyloid deposition with adaptive template and empirically pre-defined ROI

    NASA Astrophysics Data System (ADS)

    Akamatsu, G.; Ikari, Y.; Ohnishi, A.; Nishida, H.; Aita, K.; Sasaki, M.; Yamamoto, Y.; Sasaki, M.; Senda, M.

    2016-08-01

    Amyloid PET is useful for early and/or differential diagnosis of Alzheimer's disease (AD). Quantification of amyloid deposition using PET has been employed to improve diagnosis and to monitor AD therapy, particularly in research. Although MRI is often used for segmentation of gray matter and for spatial normalization into standard Montreal Neurological Institute (MNI) space, where the region-of-interest (ROI) template is defined, 3D MRI is not always available in clinical practice. The purpose of this study was to examine the feasibility of PET-only amyloid quantification with an adaptive template and a pre-defined standard ROI template that has been empirically generated from typical cases. A total of 68 subjects who underwent brain 11C-PiB PET were examined. The 11C-PiB images were non-linearly spatially normalized to the standard MNI T1 atlas using the same transformation parameters as the MRI-based normalization. The automatic-anatomical-labeling-ROI (AAL-ROI) template was applied to the PET images. All voxel values were normalized by the mean value of the cerebellar cortex to generate SUVR-scaled images. Eleven typical positive images and eight typical negative images were normalized and averaged, respectively, and were used as the positive and negative templates. Positive and negative masks, which consist of voxels with SUVR ⩾1.7, were extracted from both templates. The empirical PiB-prone ROI (EPP-ROI) was generated by subtracting the negative mask from the positive mask. The 11C-PiB image of each subject was non-rigidly normalized to the positive and negative templates, respectively, and the one with higher cross-correlation was adopted. The EPP-ROI was then inversely transformed to the individual PET images. We evaluated differences in SUVR between the standard MRI-based method and the PET-only method, and additionally evaluated whether the PET-only method would correctly categorize 11C-PiB scans as positive or negative. Significant correlation was observed between the SUVRs
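
    The SUVR scaling and the empirical ROI construction described above reduce to a few array operations once all images are spatially normalised; the sketch below assumes NumPy arrays with illustrative shapes and uses the 1.7 threshold quoted in the abstract.

```python
import numpy as np

def suvr_image(pet: np.ndarray, cerebellum_mask: np.ndarray) -> np.ndarray:
    """Scale every voxel by the mean uptake in the cerebellar cortex."""
    return pet / pet[cerebellum_mask].mean()

def epp_roi(positive_template: np.ndarray, negative_template: np.ndarray,
            threshold: float = 1.7) -> np.ndarray:
    """Empirical PiB-prone ROI: voxels above threshold in the averaged positive
    template but not in the averaged negative template."""
    return (positive_template >= threshold) & ~(negative_template >= threshold)

# Toy volumes standing in for spatially normalised 11C-PiB data.
rng = np.random.default_rng(4)
pet = rng.uniform(0.5, 3.0, size=(4, 4, 4))
cereb = np.zeros_like(pet, dtype=bool)
cereb[0] = True                                  # illustrative cerebellar mask
pos_tpl = rng.uniform(1.0, 2.5, size=pet.shape)  # averaged typical-positive template
neg_tpl = rng.uniform(0.8, 1.6, size=pet.shape)  # averaged typical-negative template

roi = epp_roi(pos_tpl, neg_tpl)
print("mean SUVR in EPP-ROI:", suvr_image(pet, cereb)[roi].mean())
```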

  6. Design and practices for use of automated drilling and sample handling in MARTE while minimizing terrestrial and cross contamination.

    PubMed

    Miller, David P; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources--whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination)--to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  7. HotSpot Wizard 2.0: automated design of site-specific mutations and smart libraries in protein engineering

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Sebestova, Eva; Vavra, Ondrej; Musil, Milos; Brezovsky, Jan; Damborsky, Jiri

    2016-01-01

    HotSpot Wizard 2.0 is a web server for automated identification of hot spots and design of smart libraries for engineering proteins’ stability, catalytic activity, substrate specificity and enantioselectivity. The server integrates sequence, structural and evolutionary information obtained from 3 databases and 20 computational tools. Users are guided through the processes of selecting hot spots using four different protein engineering strategies and optimizing the resulting library's size by narrowing down a set of substitutions at individual randomized positions. The only required input is a query protein structure. The results of the calculations are mapped onto the protein's structure and visualized with a JSmol applet. HotSpot Wizard lists annotated residues suitable for mutagenesis and can automatically design appropriate codons for each implemented strategy. Overall, HotSpot Wizard provides comprehensive annotations of protein structures and assists protein engineers with the rational design of site-specific mutations and focused libraries. It is freely available at http://loschmidt.chemi.muni.cz/hotspotwizard. PMID:27174934

  8. Investigating the intrinsic cleanliness of automated handling designed for EUV mask pod-in-pod systems

    NASA Astrophysics Data System (ADS)

    Brux, O.; van der Walle, P.; van der Donck, J. C. J.; Dress, P.

    2011-11-01

    Extreme Ultraviolet Lithography (EUVL) is the most promising solution for technology nodes of 16 nm (hp) and below. However, several unique EUV mask challenges must be resolved for a successful launch of the technology into the market. Uncontrolled introduction of particles and/or contamination into the EUV scanner significantly increases the risk of device yield loss and potentially scanner down-time. With the absence of a pellicle to protect the surface of the EUV mask, a zero particle adder regime between final clean and the point-of-exposure is critical for the active areas of the mask. A Dual Pod concept for handling EUV masks had been proposed by the industry as a means to minimize the risk of mask contamination during transport and storage. SuSS-HamaTech introduces MaskTrackPro InSync as a fully automated solution for the handling of EUV masks in and out of this Dual Pod System, and it therefore constitutes an interface between various tools inside the fab. The intrinsic cleanliness of each individual handling and storage step of the inner shell (EIP) of this Dual Pod and the EUV mask inside the InSync tool has been investigated to confirm the capability for minimizing the risk of cross-contamination. An Entegris Dual Pod EUV-1000A-A110 has been used for the qualification. The particle detection for the qualification procedure was executed with TNO's RapidNano Particle Scanner, qualified for particle sizes down to 50nm (PSL equivalent). It has been shown that the target specification of < 2 particles @ 60nm per 25 cycles has been achieved. In cases where added particles were measured, the EIP has been identified as a potential root cause for Ni particle generation. Any direct Ni-Al contact has to be avoided to mitigate the risk of material abrasion.

  9. Design and progress toward a multi-conjugate adaptive optics system for distributed aberration correction

    SciTech Connect

    Baker, K; Olivier, S; Tucker, J; Silva, D; Gavel, D; Lim, R; Gratrix, E

    2004-08-17

    This article investigates the use of a multi-conjugate adaptive optics system to improve the system's field of view. The emphasis of this research is to develop techniques to improve the performance of optical systems with applications to horizontal imaging. The design and wave optics simulations of the proposed system are given. Preliminary results from the multi-conjugate adaptive optics system are also presented. The experimental system utilizes a liquid-crystal spatial light modulator and an interferometric wave-front sensor for correction and sensing of the phase aberrations, respectively.

  10. DI2ADEM: an adaptive hypermedia designed to improve access to relevant medical information.

    PubMed

    Pagesy, R; Soula, G; Fieschi, M

    2000-01-01

    The World Wide Web (web) provides the same type of information to widely different users, and these users must then find the information suitable for their needs within the package offered. The authors present the DI2ADEM project, designed to take the user into account and intended to provide each user with appropriate medical information. To do so, DI2ADEM proposes an adaptive hypermedia based on the management of meta-knowledge about the user and knowledge about the information that can be delivered. An adaptive hypermedia prototype devoted to paediatric oncology was implemented on the intranet of a university hospital.

  11. Design of adaptive filter amplifier in UV communication based on DSP

    NASA Astrophysics Data System (ADS)

    Lv, Zhaoshun; Wu, Hanping; Li, Junyu

    2016-10-01

    To address the problem of the weak signal at the receiving end in UV communication, we design a high-gain, continuously adjustable adaptive filter amplifier. After proposing the overall technical specifications and analyzing the working principle of the signal amplifier, we use an LMH6629MF chip and two AD797BN chips to achieve three-stage cascade amplification, and a TMS320VC5509A DSP to implement digital filtering. Design and verification with Multisim, Protel 99SE and CCS show that the amplifier can realize continuously adjustable amplification from 1000 to 10000 times without distortion, with a gain error of <= 4% over the 1000-10000 range; the equivalent input noise voltage of the amplification circuit is <= 6 nV/√Hz over 30-45 kHz, and the adaptive filtering function is realized. The design provides a theoretical reference and technical support for weak UV signal processing.
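
    The adaptive filtering stage can be illustrated with a standard LMS adaptive line enhancer that pulls a weak tone out of broadband noise; the sampling rate, tone frequency, filter length and step size below are illustrative and are not taken from the TMS320VC5509A implementation.

```python
import numpy as np

fs, f0, n = 200_000, 40_000, 4000        # 200 kHz sampling, 40 kHz tone (illustrative)
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * f0 * t)
rng = np.random.default_rng(5)
noisy = clean + 0.8 * rng.standard_normal(n)

taps, mu = 32, 0.005
w = np.zeros(taps)
out = np.zeros(n)
for k in range(taps, n):
    x = noisy[k - taps:k][::-1]          # reference: one-sample-delayed input
    y = w @ x                            # adaptive linear predictor output
    e = noisy[k] - y                     # prediction error
    w += 2 * mu * e * x                  # LMS weight update
    out[k] = y                           # predictor tracks the periodic component

print("noise power:", np.mean((noisy - clean) ** 2).round(3),
      " residual after LMS:", np.mean((out[1000:] - clean[1000:]) ** 2).round(3))
```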

  12. Laboratory systems integration: robotics and automation.

    PubMed

    Felder, R A

    1991-01-01

    Robotic technology is going to have a profound impact on the clinical laboratory of the future. Faced with increased pressure to reduce health care spending yet increase services to patients, many laboratories are looking for alternatives to the inflexible or "fixed" automation found in many clinical analyzers. Robots are being examined by many clinical pathologists as an attractive technology which can adapt to the constant changes in laboratory testing. Already, laboratory designs are being altered to accommodate robotics and automated specimen processors. However, the use of robotics and computer intelligence in the clinical laboratory is still in its infancy. Successful examples of robotic automation exist in several laboratories. Investigators have used robots to automate endocrine testing, high performance liquid chromatography, and specimen transportation. Large commercial laboratories are investigating the use of specimen processors which combine the use of fixed automation and robotics. Robotics have also reduced the exposure of medical technologists to specimens infected with viral pathogens. The successful examples of clinical robotics applications were a result of the cooperation of clinical chemists, engineers, and medical technologists. At the University of Virginia we have designed and implemented a robotic critical care laboratory. Initial clinical experience suggests that robotic performance is reliable; however, staff acceptance and utilization require continuing education. We are also developing a robotic cyclosporine assay, which promises to greatly reduce the labor costs of this analysis. The future will bring lab-wide automation that will fully integrate computer artificial intelligence and robotics. Specimens will be transported by mobile robots. Specimen processing, aliquotting, and scheduling will be automated. (ABSTRACT TRUNCATED AT 250 WORDS)

  13. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram per litre level in a single analysis of a few minutes, without any prior sample pre-concentration or pre-treatment steps. Having in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories has been utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second article reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.

  14. Design and Inference for the Intent to Treat Principle using Adaptive Treatment Strategies and Sequential Randomization

    PubMed Central

    Dawson, Ree; Lavori, Philip W.

    2015-01-01

    Nonadherence to assigned treatment jeopardizes the power and interpretability of intent-to-treat comparisons from clinical trial data and continues to be an issue for effectiveness studies, despite their pragmatic emphasis. We posit that new approaches to design need to complement developments in methods for causal inference to address nonadherence, in both experimental and practice settings. This paper considers the conventional study design for psychiatric research and other medical contexts, in which subjects are randomized to treatments that are fixed throughout the trial and presents an alternative that converts the fixed treatments into an adaptive intervention that reflects best practice. The key element is the introduction of an adaptive decision point midway into the study to address a patient's reluctance to remain on treatment before completing a full-length trial of medication. The clinical uncertainty about the appropriate adaptation prompts a second randomization at the new decision point to evaluate relevant options. Additionally, the standard ‘all-or-none’ principal stratification (PS) framework is applied to the first stage of the design to address treatment discontinuation that occurs too early for a mid-trial adaptation. Drawing upon the adaptive intervention features, we develop assumptions to identify the PS causal estimand and introduce restrictions on outcome distributions to simplify Expectation-Maximization calculations. We evaluate the performance of the PS setup, with particular attention to the role played by a binary covariate. The results emphasize the importance of collecting covariate data for use in design and analysis. We consider the generality of our approach beyond the setting of psychiatric research. PMID:25581413

  15. Evaluation of Green Infrastructure Designs Using the Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...

  16. Green Infrastructure Design Evaluation Using the Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...

  17. Automated design and optimization of flexible booster autopilots via linear programming. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Hauser, F. D.; Szollosi, G. D.; Lakin, W. S.

    1972-01-01

    COEBRA, the Computerized Optimization of Elastic Booster Autopilots, is an autopilot design program. The bulk of the design criteria is presented in the form of minimum allowed gain/phase stability margins. COEBRA has two optimization phases: (1) a phase to maximize stability margins; and (2) a phase to optimize structural bending moment load relief capability in the presence of minimum requirements on gain/phase stability margins.
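
    A toy version of the margin-maximisation phase can be posed as a linear program in which gain/phase margins are assumed to respond linearly to small gain changes and the worst margin surplus is maximised; the sensitivity matrix, current margins and requirements below are invented for illustration, and SciPy stands in for the original linear programming code.

```python
import numpy as np
from scipy.optimize import linprog

S = np.array([[0.8, -0.2, 0.1],      # d(margin_i)/d(gain_j), linearised (made up)
              [0.3,  0.6, -0.4],
              [-0.1, 0.5,  0.7]])
m0 = np.array([4.0, 3.0, 5.0])       # margins at the current design point
m_min = np.array([6.0, 6.0, 6.0])    # required minimum margins
dg_max = 2.0                         # trust region on gain changes

# Variables: [dg1, dg2, dg3, t]; maximise t subject to m0 + S dg >= m_min + t.
c = np.array([0.0, 0.0, 0.0, -1.0])
A_ub = np.hstack([-S, np.ones((3, 1))])
b_ub = m0 - m_min
bounds = [(-dg_max, dg_max)] * 3 + [(None, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("gain changes:", res.x[:3], " worst-case margin surplus:", res.x[3])
```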

  18. Enhanced Aircraft Design Capability for the Automated Structural Optimization System. Phase 1.

    DTIC Science & Technology

    1996-01-31

    The GOA operates through creation and evolution. Creation takes place in the GOA in the form of a finite number of designs randomly generated to form the initial population, whether or not those designs are feasible. Evolution is then applied to the population to produce a new population of, hopefully, better designs. The evolution operates on "chromosomal" diploid strings that are closer, in structure, to human codings than traditional GOA haploid strings; for example, the human code carries 23 pairs of chromosomes.

  19. Design Framework for an Adaptive MOOC Enhanced by Blended Learning: Supplementary Training and Personalized Learning for Teacher Professional Development

    ERIC Educational Resources Information Center

    Gynther, Karsten

    2016-01-01

    The research project has developed a design framework for an adaptive MOOC that complements the MOOC format with blended learning. The design framework consists of a design model and a series of learning design principles which can be used to design in-service courses for teacher professional development. The framework has been evaluated by…

  20. Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter

    NASA Technical Reports Server (NTRS)

    Aggarwal, Pravin; Hull, Patrick V.

    2015-01-01

    Designing and manufacturing space flight vehicle structures is a skill set that has grown considerably at NASA during the last several years. Beginning with the Ares program and followed by the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multi-Purpose Crew Vehicle (MPCV) spacecraft adapter. Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above-mentioned hardware. In particular, the experience of this in-house design work led to increased manufacturing infrastructure at both Marshall Space Flight Center (MSFC) and the Michoud Assembly Facility (MAF), improved skill sets in both analysis and design, and hands-on experience in building and testing full-scale MPCV spacecraft adapter (MSA) hardware. The hardware design and development processes, from initiation to CDR and finally flight, resulted in many challenges and experiences that produced valuable lessons. This paper builds on NASA's recent experience in designing and fabricating flight hardware and examines the design and development processes used, as well as the challenges and lessons learned, from initial design, loads estimation and mass constraints, through structural optimization and affordability, to release of production drawings and hardware manufacturing. While there are many documented design processes which a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.