Science.gov

Sample records for adaptive automation design

  1. Team-Centered Perspective for Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2003-01-01

    Automation represents a very active area of human factors research. The journal Human Factors published a special issue on automation in 1985. Since then, hundreds of scientific studies have been published examining the nature of automation and its interaction with human performance. However, despite a dramatic increase in research investigating human factors issues in aviation automation, there remain areas that need further exploration. This NASA Technical Memorandum describes a new area of automation design and research, called adaptive automation. It discusses the concepts and outlines the human factors issues associated with the new method of adaptive function allocation. The primary focus is on human-centered design, and specifically on ensuring that adaptive automation is designed from a team-centered perspective. The document shows that adaptive automation has many human factors issues in common with traditional automation design. Much like the introduction of other new technologies and paradigm shifts, adaptive automation presents an opportunity to remediate current problems but poses new ones for human-automation interaction in aerospace operations. The review here is intended to communicate the philosophical perspective and direction of adaptive automation research conducted under the Aerospace Operations Systems (AOS), Physiological and Psychological Stressors and Factors (PPSF) project.

  2. Adaptive and Adaptable Automation Design: A Critical Review of the Literature and Recommendations for Future Research

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kaber, David B.

    2006-01-01

    This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload and situation awareness, the effectiveness of various automation invocation philosophies and strategies to function allocation in adaptive systems, and the role of user modeling in adaptive interface design and the performance implications of adaptive interface technology.

  3. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    PubMed

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  5. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems.

  6. Designing Automated Adaptive Support to Improve Student Helping Behaviors in a Peer Tutoring Activity

    ERIC Educational Resources Information Center

    Walker, Erin; Rummel, Nikol; Koedinger, Kenneth R.

    2011-01-01

    Adaptive collaborative learning support systems analyze student collaboration as it occurs and provide targeted assistance to the collaborators. Too little is known about how to design adaptive support to have a positive effect on interaction and learning. We investigated this problem in a reciprocal peer tutoring scenario, where two students take…

  7. Three Experiments Examining the Use of Electroencephalogram,Event-Related Potentials, and Heart-Rate Variability for Real-Time Human-Centered Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    Adaptive automation represents an advanced form of human-centered automation design. The approach to automation provides for real-time and model-based assessments of human-automation interaction, determines whether the human has entered into a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not yet matured, numerous challenges remain, including the criteria for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.

  8. Effects of adaptive task allocation on monitoring of automated systems

    NASA Technical Reports Server (NTRS)

    Parasuraman, R.; Mouloua, M.; Molloy, R.

    1996-01-01

    The effects of adaptive task allocation on monitoring for automation failure during multitask flight simulation were examined. Participants monitored an automated engine status task while simultaneously performing tracking and fuel management tasks over three 30-min sessions. Two methods of adaptive task allocation, both involving temporary return of the automated engine status task to the human operator ("human control"), were examined as a possible countermeasure to monitoring inefficiency. For the model-based adaptive group, the engine status task was allocated to all participants in the middle of the second session for 10 min, following which it was again returned to automation control. The same occurred for the performance-based adaptive group, but only if an individual participant's monitoring performance up to that point did not meet a specified criterion. For the nonadaptive control groups, the engine status task remained automated throughout the experiment. All groups had low probabilities of detection of automation failures for the first 40 min spent with automation. However, following the 10-min intervening period of human control, both adaptive groups detected significantly more automation failures during the subsequent blocks under automation control. The results show that adaptive task allocation can enhance monitoring of automated systems. Both model-based and performance-based allocation improved monitoring of automation. Implications for the design of automated systems are discussed.
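
    The performance-based allocation rule described above amounts to a simple criterion check at the hand-off point. The sketch below is illustrative only: the function name, detection-rate criterion, and timing constants are assumptions, not parameters reported in the study.

    ```python
    def allocate_engine_task(detection_rate, minutes_elapsed,
                             criterion=0.6, handoff_start=40, handoff_length=10):
        """Decide who controls the engine-status task at a point in the session.

        Performance-based variant: the task is temporarily returned to the
        operator ("human control") only if monitoring performance has fallen
        below a criterion when the hand-off window opens; otherwise it stays
        automated. All numeric values are illustrative assumptions.
        """
        in_window = handoff_start <= minutes_elapsed < handoff_start + handoff_length
        if in_window and detection_rate < criterion:
            return "human"       # temporary return to manual control
        return "automation"      # task stays (or returns) under automation
    ```

    The model-based variant of the study corresponds to dropping the `detection_rate` test: every participant receives the hand-off during the window regardless of performance.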

  9. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.

  10. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in an automatic mode that included a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
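
    The engagement index driving the task cycling in this line of research is the ratio of EEG beta power to the sum of alpha and theta power. A minimal sketch of the switching logic might look like the following; the cut-point values are illustrative assumptions (in practice they are derived from each participant's baseline), not the study's parameters.

    ```python
    def engagement_index(beta, alpha, theta):
        """EEG engagement index: beta / (alpha + theta)."""
        return beta / (alpha + theta)

    def select_task_level(index, low, high):
        """Map the index onto the three tracking-task levels.

        Negative-feedback logic: under-engagement hands the task back to
        the pilot to raise taskload; over-engagement (high workload)
        automates it to shed load. `low` and `high` are illustrative
        cut-points, assumed here rather than taken from the study.
        """
        if index < low:
            return "manual"           # raise engagement by adding taskload
        if index > high:
            return "automatic"        # shed workload
        return "adaptive_aiding"
    ```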

  11. Adaptive function allocation reduces performance costs of static automation

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian

    1993-01-01

    Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.

  12. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization. PMID:27034378

  13. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  14. Design automation for integrated circuits

    NASA Astrophysics Data System (ADS)

    Newell, S. B.; de Geus, A. J.; Rohrer, R. A.

    1983-04-01

    Consideration is given to the development status of the use of computers in automated integrated circuit design methods, which promise the minimization of both design time and design error incidence. Integrated circuit design encompasses two major tasks: logic specification, in which the goal is a logic diagram that accurately represents the desired electronic function, and physical specification, in which the goal is an exact description of the physical locations of all circuit elements and their interconnections on the chip. Design automation not only saves money by reducing design and fabrication time, but also helps the community of systems and logic designers to work more innovatively. Attention is given to established design automation methodologies, programmable logic arrays, and design shortcuts.

  15. Design automation for integrated optics

    NASA Astrophysics Data System (ADS)

    Condrat, Christopher

    Recent breakthroughs in silicon photonics technology are enabling the integration of optical devices into silicon-based semiconductor processes. Photonics technology enables high-speed, high-bandwidth, and high-fidelity communications on the chip-scale---an important development in an increasingly communications-oriented semiconductor world. Significant developments in silicon photonic manufacturing and integration are also enabling investigations into applications beyond that of traditional telecom: sensing, filtering, signal processing, quantum technology---and even optical computing. In effect, we are now seeing a convergence of communications and computation, where the traditional roles of optics and microelectronics are becoming blurred. As the applications for opto-electronic integrated circuits (OEICs) are developed, and manufacturing capabilities expand, design support is necessary to fully exploit the potential of this optics technology. Such design support for moving beyond custom-design to automated synthesis and optimization is not well developed. Scalability requires abstractions, which in turn enables and requires the use of optimization algorithms and design methodology flows. Design automation represents an opportunity to take OEIC design to a larger scale, facilitating design-space exploration, and laying the foundation for current and future optical applications---thus fully realizing the potential of this technology. This dissertation proposes design automation for integrated optic system design. Using a building-block model for optical devices, we provide an EDA-inspired design flow and methodologies for optical design automation. Underlying these flows and methodologies are new supporting techniques in behavioral and physical synthesis, as well as device-resynthesis techniques for thermal-aware system integration. We also provide modeling for optical devices and determine optimization and constraint parameters that guide the automation

  16. Intelligent design system for design automation

    NASA Astrophysics Data System (ADS)

    Shakeri, Cirrus; Deif, Ismail; Katragadda, Prasanna; Knutson, Stanley

    2000-10-01

    In order to succeed in today's global, competitive market, companies need continuous improvements in their product development processes. These improvements should result in expending fewer resources on the design process while achieving better quality. Automating the design process reduces resources needed and allows designers to spend more time on creative aspects that improve the quality of design. For the last three decades, engineers and designers have been searching for better ways to automate the product development process. For certain classes of design problems, which cover a large portion of real-world design situations, the process can be automated using knowledge-based systems. These are design problems in which the knowledge sources are known in advance. Using techniques from Knowledge-Based Engineering, knowledge is codified and inserted into a knowledge-based system. The system activates the design knowledge, automatically generating designs that satisfy the design constraints. To increase the return on investment of building automated design systems, knowledge management methodologies and techniques are required for capturing, formalizing, storing, and searching design knowledge.

  17. Adaptation as organism design

    PubMed Central

    Gardner, Andy

    2009-01-01

    The problem of adaptation is to explain the apparent design of organisms. Darwin solved this problem with the theory of natural selection. However, population geneticists, whose responsibility it is to formalize evolutionary theory, have long neglected the link between natural selection and organismal design. Here, I review the major historical developments in theory of organismal adaptation, clarifying what adaptation is and what it is not, and I point out future avenues for research. PMID:19793739

  18. Automated Core Design

    SciTech Connect

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2005-07-15

    Multistate searching methods are a subfield of distributed artificial intelligence that aims to provide both principles for construction of complex systems involving multiple states and mechanisms for coordination of independent agents' actions. This paper proposes a multistate searching algorithm with reinforcement learning for the automatic core design of a boiling water reactor. The characteristics of this algorithm are that the coupling structure and the coupling operation suitable for the assigned problem are assumed and an optimal solution is obtained by mutual interference in multistate transitions using multiagents. Calculations in an actual plant confirmed that the proposed algorithm increased the convergence ability of the optimization process.

  19. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would provide other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs may be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in its early stages was that of cost. It took a significant investment in software and hardware for the prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to someone going for radiograph or ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard. 

  20. Adaptive control of surface finish in automated turning processes

    NASA Astrophysics Data System (ADS)

    García-Plaza, E.; Núñez, P. J.; Martín, A. R.; Sanz, A.

    2012-04-01

    The primary aim of this study was to design and develop an on-line control system for surface finish in automated CNC turning processes. The control system consisted of two basic phases: during the first phase, surface roughness was monitored through cutting force signals; the second phase involved a closed-loop adaptive control system based on data obtained during the monitoring of the cutting process. The system ensures that surface roughness is maintained at optimum values by adjusting the feed rate through communication with the PLC of the CNC machine. A monitoring and adaptive control system has been developed that enables the real-time monitoring of surface roughness during CNC turning operations. The system detects and prevents faults in automated turning processes, and applies corrective measures during the cutting process that raise quality and reliability, reducing the need for quality control.
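
    A closed loop of this kind lowers the feed rate when the monitored roughness drifts above target and restores it when quality allows. The proportional correction below is a minimal illustration only; the gain, feed limits, and linear error term are assumptions, not the controller described in the paper.

    ```python
    def adjust_feed_rate(feed, ra_measured, ra_target,
                         gain=0.1, feed_min=0.05, feed_max=0.4):
        """One step of a proportional feed-rate correction (feed in mm/rev).

        If the roughness estimated from cutting-force signals exceeds the
        target Ra, feed is reduced (roughness grows with feed); if quality
        is better than required, feed may rise toward the productivity
        limit. Gain and bounds are illustrative values.
        """
        relative_error = (ra_measured - ra_target) / ra_target
        new_feed = feed * (1.0 - gain * relative_error)
        # Clamp to the machine's admissible feed range
        return max(feed_min, min(feed_max, new_feed))
    ```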

  1. Parallel automated adaptive procedures for unstructured meshes

    NASA Technical Reports Server (NTRS)

    Shephard, M. S.; Flaherty, J. E.; Decougny, H. L.; Ozturan, C.; Bottasso, C. L.; Beall, M. W.

    1995-01-01

    Consideration is given to the techniques required to support adaptive analysis of automatically generated unstructured meshes on distributed memory MIMD parallel computers. The key areas of new development are focused on the support of effective parallel computations when the structure of the numerical discretization, the mesh, is evolving, and in fact constructed, during the computation. All the procedures presented operate in parallel on already distributed mesh information. Starting from a mesh definition in terms of a topological hierarchy, techniques to support the distribution, redistribution, and communication of the mesh entities over the processors are given, and algorithms to dynamically balance processor workload based on the migration of mesh entities are given. A procedure to automatically generate meshes in parallel, starting from CAD geometric models, is given. Parallel procedures to enrich the mesh through local mesh modifications are also given. Finally, the combination of these techniques to produce a parallel automated finite element analysis procedure for rotorcraft aerodynamics calculations is discussed and demonstrated.

  2. System reliability, performance and trust in adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. PMID:26360226

  3. Automated design of flexible linkers.

    PubMed

    Manion, Charles; Arlitt, Ryan; Campbell, Matthew I; Tumer, Irem; Stone, Rob; Greaney, P Alex

    2016-03-14

    This paper presents a method for the systematic and automated design of flexible organic linkers for construction of metal-organic frameworks (MOFs) in which flexibility, compliance, or other mechanically exotic properties originate at the linker level rather than from the framework kinematics. Our method couples a graph grammar method for systematically generating linker-like molecules with molecular dynamics modeling of linkers' mechanical response. Using this approach we have generated a candidate pool of >59,000 hypothetical linkers. We screen linker candidates according to their mechanical behaviors under large deformation, and extract fragments common to the most performant candidate materials. To demonstrate the general approach to MOF design we apply our system to designing linkers for pressure-switching MOFs: MOFs that undergo reversible structural collapse after a stress threshold is exceeded. PMID:26687337

  4. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  5. Adaptive Sampling Designs.

    ERIC Educational Resources Information Center

    Flournoy, Nancy

    Designs for sequential sampling procedures that adapt to cumulative information are discussed. A familiar illustration is the play-the-winner rule in which there are two treatments; after a random start, the same treatment is continued as long as each successive subject registers a success. When a failure occurs, the other treatment is used until…
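
    The play-the-winner rule is simple enough to state in a few lines: after a random start, keep the current treatment after each success and switch after each failure. The sketch below makes the random start an explicit parameter so the assignment sequence follows deterministically from the observed outcomes; the two-treatment labels "A" and "B" are illustrative.

    ```python
    def play_the_winner(start, results):
        """Assign subjects to treatments by the play-the-winner rule.

        `start` is the treatment chosen at random for the first subject
        ("A" or "B"); `results` is the per-subject success/failure
        sequence (True = success). The same treatment is continued as
        long as successes occur; a failure switches to the other arm.
        """
        treatment = start
        assignments = []
        for success in results:
            assignments.append(treatment)
            if not success:
                treatment = "B" if treatment == "A" else "A"
        return assignments
    ```

    For example, with a random start on A and outcomes success, success, failure, success, the rule assigns A, A, A, then switches to B.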

  6. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.

  7. Assessing Working Memory in Spanish-Speaking Children: Automated Working Memory Assessment Battery Adaptation

    ERIC Educational Resources Information Center

    Injoque-Ricle, Irene; Calero, Alejandra D.; Alloway, Tracy P.; Burin, Debora I.

    2011-01-01

    The Automated Working Memory Assessment battery was designed to assess verbal and visuospatial passive and active working memory processing in children and adolescents. The aim of this paper is to present the adaptation and validation of the AWMA battery to Argentinean Spanish-speaking children aged 6 to 11 years. Verbal subtests were adapted and…

  8. Rebound: A Framework for Automated Component Adaptation

    NASA Technical Reports Server (NTRS)

    Penix, John; Lau, Sonie (Technical Monitor)

    1998-01-01

    The REBOUND adaptation framework organizes a collection of adaptation tactics in a way that they can be selected based on the components available for adaptation. Adaptation tactics are specified formally in terms of the relationship between the component to be adapted and the resulting adapted component. The tactic specifications are used as matching conditions for specification-based component retrieval, creating a 'retrieval for adaptation' scenario. The results of specification matching are used to guide component adaptation. Several examples illustrate how the framework guides component and tactic selection and how basic tactics are composed to form more powerful tactics.

  9. Explicit control of adaptive automation under different levels of environmental stress.

    PubMed

    Sauer, Jürgen; Kao, Chung-Shan; Wastell, David; Nickel, Peter

    2011-08-01

    This article examines the effectiveness of three different forms of explicit control of adaptive automation under low- and high-stress conditions, operationalised by different levels of noise. In total, 60 participants were assigned to one of three types of automation design (free, prompted and forced choice). They were trained for 4 h on a highly automated simulation of a process control environment, called AutoCAMS. This was followed by a 4-h testing session under noise exposure and quiet conditions. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that all three forms of explicit control of adaptive automation were able to attenuate the negative effects of noise. This was partly because operators opted for higher levels of automation under noise. It also emerged that forced choice showed marginal advantages over the two other automation modes. Statement of Relevance: This work is relevant to the design of adaptive automation since it emphasises the need to consider the impact of work-related stressors during task completion. In the presence of stressors, different forms of operator support through automation may be required than under more favourable working conditions. PMID:21846313

  10. INITIATORS AND TRIGGERING CONDITIONS FOR ADAPTIVE AUTOMATION IN ADVANCED SMALL MODULAR REACTORS

    SciTech Connect

    Katya L. Le Blanc; Johanna H. Oxstrand

    2014-04-01

    It is anticipated that Advanced Small Modular Reactors (AdvSMRs) will employ high degrees of automation. High levels of automation can enhance system performance, but often at the cost of reduced human performance. Automation can lead to human out-of-the-loop issues, unbalanced workload, complacency, and other problems if it is not designed properly. Researchers have proposed adaptive automation (defined as dynamic or flexible allocation of functions) as a way to get the benefits of higher levels of automation without the human performance costs. Adaptive automation has the potential to balance operator workload and enhance operator situation awareness by allocating functions to the operators in a way that is sensitive to overall workload and capabilities at the time of operation. However, there are still a number of questions regarding how to effectively design adaptive automation to achieve that potential. One of those questions is related to how to initiate (or trigger) a shift in automation in order to provide maximal sensitivity to operator needs without introducing undesirable consequences (such as unpredictable mode changes). Several triggering mechanisms for shifts in adaptive automation have been proposed, including: operator initiated, critical events, performance-based, physiological measurement, model-based, and hybrid methods. As part of a larger project to develop design guidance for human-automation collaboration in AdvSMRs, researchers at Idaho National Laboratory have investigated the effectiveness and applicability of each of these triggering mechanisms in the context of AdvSMR. Researchers reviewed the empirical literature on adaptive automation and assessed each triggering mechanism based on the human-system performance consequences of employing that mechanism. Researchers also assessed the practicality and feasibility of using the mechanism in the context of an AdvSMR control room. Results indicate that there are tradeoffs associated with each…
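
A hybrid triggering mechanism of the kind surveyed in this record might combine a performance measure with a physiological workload index. The sketch below is a hypothetical illustration; its thresholds and return convention are assumptions, not INL's design guidance:

```python
def hybrid_trigger(task_error_rate, workload_index,
                   error_threshold=0.2, workload_threshold=0.7):
    """Hybrid adaptive-automation trigger (performance + physiological).

    Returns the recommended shift: +1 to raise the level of automation,
    -1 to lower it, 0 to leave the current allocation unchanged.
    All thresholds are illustrative assumptions.
    """
    overloaded = (task_error_rate > error_threshold
                  or workload_index > workload_threshold)
    underloaded = (task_error_rate < error_threshold / 2
                   and workload_index < workload_threshold / 2)
    if overloaded:
        return +1   # allocate more functions to the automation
    if underloaded:
        return -1   # hand functions back to the operator
    return 0
```

Combining the two signals is one way to avoid the unpredictable mode changes the record warns about: a shift occurs only when an explicit, inspectable condition is met.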

  11. Computer automation for feedback system design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Mathematical techniques and explanations of various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmission and the translation of time-domain specifications into frequency-domain specifications.

  12. Automated Hardware Design via Evolutionary Search

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.

    2000-01-01

    The goal of this research is to investigate the application of evolutionary search to the process of automated engineering design. Evolutionary search techniques involve the simulation of Darwinian mechanisms by computer algorithms. In recent years, such techniques have attracted much attention because they are able to tackle a wide variety of difficult problems and frequently produce acceptable solutions. The results obtained are usually functional, often surprising, and typically "messy" because the algorithms are told to concentrate on the overriding objective and not on elegance or simplicity. Automated design offers several advantages. First, faster design cycles translate into time and, hence, cost savings. Second, automated design techniques can be made to scale well and hence better deal with increasing amounts of design complexity. Third, design quality can increase because design properties can be specified a priori; for example, by specifying size and weight up front, the automated technique can be driven toward a device smaller and lighter than the best known design. The domain of electronic circuit design is an advantageous platform in which to study automated design techniques because it is a rich design space that is well understood, permitting human-created designs to be compared to machine-generated designs. The goal of the effort developed for circuit design was to automatically produce high-level integrated electronic circuit designs whose properties permit physical implementation in silicon. This process entailed designing an effective evolutionary algorithm and solving a difficult multiobjective optimization problem. FY 99 saw many accomplishments in this effort.
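
Evolutionary search as described in this record can be illustrated with a minimal genetic algorithm. The toy "one-max" objective stands in for the circuit-design fitness function, and all parameters are illustrative assumptions:

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60, seed=1):
    """Minimal generational GA: tournament selection, one-point crossover,
    bit-flip mutation. Illustrative of evolutionary search in general,
    not of the specific circuit-design system described in the record."""
    rng = random.Random(seed)
    pop = [[rng.randrange(2) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, genome_len)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:                  # bit-flip mutation
                i = rng.randrange(genome_len)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy objective: maximize the number of 1-bits ("one-max").
best = evolve(sum)
```

On this toy problem the population typically converges to a near-all-ones genome within a few dozen generations.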

  13. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  14. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.

  15. Automated CAD design for sculptured airfoil surfaces

    NASA Astrophysics Data System (ADS)

    Murphy, S. D.; Yeagley, S. R.

    1990-11-01

    The design of tightly toleranced sculptured surfaces such as those for airfoils requires a significant design effort in order to machine the tools that create these surfaces. Because of the quantity of numerical data required to describe airfoil surfaces, a CAD approach is necessary. Although this approach by itself yields productivity gains, much larger gains can be achieved by automating the design process. This paper discusses an application in which automating the design process on the CAD system produced an eightfold improvement in productivity.

  16. Design considerations for automated packaging operations

    SciTech Connect

    Fahrenholtz, J.; Jones, J.; Kincy, M.

    1993-12-31

    The paper is based on work performed at Sandia National Laboratories to automate DOE packaging operations. It is a general summary of work from several projects which may be applicable to other packaging operations. Examples are provided of robotic operations which have been demonstrated as well as operations that are currently being developed. General design considerations for packages and for automated handling systems are described.

  17. Automated adaptive inference of phenomenological dynamical models

    PubMed Central

    Daniels, Bryan C.; Nemenman, Ilya

    2015-01-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved. PMID:26293508
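
The core idea of adapting model complexity to the available data can be illustrated with a generic stand-in: selecting a polynomial degree by the Bayesian information criterion, which penalizes complexity much as the adaptive models described here do. This is not the authors' method, only a sketch of the principle:

```python
import math

def fit_poly(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations,
    solved by Gauss-Jordan elimination (no external libraries)."""
    n, k = len(xs), degree + 1
    A = [[x ** j for j in range(k)] for x in xs]
    # Augmented normal-equation matrix [A^T A | A^T y].
    M = [[sum(A[i][a] * A[i][b] for i in range(n)) for b in range(k)]
         + [sum(A[i][a] * ys[i] for i in range(n))] for a in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
    coef = [M[i][k] / M[i][i] for i in range(k)]
    rss = sum((ys[i] - sum(c * xs[i] ** j for j, c in enumerate(coef))) ** 2
              for i in range(n))
    return coef, rss

def select_degree(xs, ys, max_degree=5):
    """Pick the degree minimizing BIC = n*log(RSS/n) + k*log(n):
    complexity grows only when the data support it."""
    n = len(xs)
    best = None
    for d in range(max_degree + 1):
        _, rss = fit_poly(xs, ys, d)
        bic = n * math.log(max(rss, 1e-12) / n) + (d + 1) * math.log(n)
        if best is None or bic < best[0]:
            best = (bic, d)
    return best[1]
```

With scarce data the penalty term dominates and a simple model wins; with more data, added complexity can pay for itself, which is the adaptive behaviour the record describes.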

  18. Automated adaptive inference of phenomenological dynamical models

    NASA Astrophysics Data System (ADS)

    Daniels, Bryan C.; Nemenman, Ilya

    2015-08-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.

  19. An Automated Approach to Instructional Design Guidance.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    This paper describes the Guided Approach to Instructional Design Advising (GAIDA), an automated instructional design tool that incorporates techniques of artificial intelligence. GAIDA was developed by the U.S. Air Force Armstrong Laboratory to facilitate the planning and production of interactive courseware and computer-based training materials.…

  20. Automated database design technology and tools

    NASA Technical Reports Server (NTRS)

    Shen, Stewart N. T.

    1988-01-01

    The Automated Database Design Technology and Tools research project results are summarized in this final report. Comments on the state of the art in various aspects of database design are provided, and recommendations made for further research for SNAP and NAVMASSO future database applications.

  1. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  2. Automation of assertion testing - Grid and adaptive techniques

    NASA Technical Reports Server (NTRS)

    Andrews, D. M.

    1985-01-01

    Assertions can be used to automate the process of testing software. Two methods for automating the generation of input test data are described in this paper. One method selects the input values of variables at regular intervals in a 'grid'. The other, adaptive testing, uses assertion violations as a measure of errors detected and generates new test cases based on test results. The important features of assertion testing are that: it can be used throughout the entire testing cycle; it provides automatic notification of error conditions; and it can be used with automatic input generation techniques which eliminate the subjectivity in choosing test data.
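
The two input-generation techniques described in this record can be sketched against a toy program under test; the assertion, the seeded bug, and the refinement rule are illustrative assumptions:

```python
def grid_inputs(lo, hi, steps):
    """Grid technique: sample the input range at regular intervals."""
    return [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]

def adaptive_test(func, assertion, lo, hi, rounds=6, steps=5):
    """Adaptive technique: start from a coarse grid, then refine the
    search interval around inputs that produced assertion violations."""
    violations = []
    for _ in range(rounds):
        for x in grid_inputs(lo, hi, steps):
            if not assertion(x, func(x)):
                violations.append(x)
        if not violations:
            break
        # Narrow the search interval around the latest violation found.
        width = (hi - lo) / (steps - 1)
        lo, hi = violations[-1] - width, violations[-1] + width
    return violations

# Toy program under test: asserted never to return a negative value,
# but it does for inputs between 2 and 3 (a seeded bug for illustration).
def buggy_quadratic(x):
    return (x - 2) * (x - 3)

violations = adaptive_test(buggy_quadratic, lambda x, y: y >= 0, 0.0, 5.0)
```

The grid pass locates a violating region; the adaptive passes then concentrate further test cases inside it, mirroring the record's use of violations to steer new test generation.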

  3. Automated solar collector installation design

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-08-26

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, and to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties of these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives.

  4. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  5. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, C.; Gray, G.

    1998-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations.
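
A minimal sanity check for such a decomposition is to multiply the gate matrices together and compare the product to the target unitary. The sketch below verifies the textbook single-qubit identity HZH = X; it illustrates only the verification step, not an automated decomposition search:

```python
def matmul(a, b):
    """2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def compose(gates):
    """Unitary implemented by a gate sequence. The first gate listed is
    applied first, so the matrix product is taken right-to-left."""
    u = [[1, 0], [0, 1]]
    for g in gates:
        u = matmul(g, u)
    return u

s = 2 ** -0.5
H = [[s, s], [s, -s]]   # Hadamard gate
Z = [[1, 0], [0, -1]]   # Pauli-Z gate
X = [[0, 1], [1, 0]]    # Pauli-X gate

# Classic identity: the sequence [H, Z, H] implements X.
U = compose([H, Z, H])
```

An automated designer would search over such sequences until `compose` reproduces the target matrix to within numerical tolerance.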

  6. Design of Inhouse Automated Library Systems.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1984-01-01

    Examines six steps inherent to development of in-house automated library system: (1) problem definition, (2) requirement specifications, (3) analysis of alternatives and solutions, (4, 5) design and implementation of hardware and software, and (6) evaluation. Practical method for comparing and weighting options is illustrated and explained. A…

  7. Automated Tract Extraction via Atlas Based Adaptive Clustering

    PubMed Central

    Tunç, Birkan; Parker, William A.; Ingalhalikar, Madhura; Verma, Ragini

    2014-01-01

    Advancements in imaging protocols such as high angular resolution diffusion-weighted imaging (HARDI) and in tractography techniques are expected to drive an increase in tract-based analyses. Statistical analyses over white matter tracts can contribute greatly towards understanding structural mechanisms of the brain, since tracts are representative of the connectivity pathways. The main challenge with tract-based studies is extracting the tracts of interest in a consistent and comparable manner over a large group of individuals without manually drawing inclusion and exclusion regions of interest. In this work, we design a framework for automated extraction of white matter tracts. The framework introduces three main components, namely a connectivity based fiber representation, a fiber clustering atlas, and a clustering approach called Adaptive Clustering. The fiber representation relies on the connectivity signatures of fibers to establish an easy correspondence between different subjects. A group-wise clustering of these fibers, represented by their connectivity signatures, is then used to generate a fiber bundle atlas. Finally, Adaptive Clustering incorporates the previously generated clustering atlas as a prior to cluster the fibers of a new subject automatically. Experiments on the HARDI scans of healthy individuals acquired repeatedly demonstrate the applicability, reliability and repeatability of our approach in extracting white matter tracts. By alleviating the seed region selection or the inclusion/exclusion ROI drawing requirements that are usually handled by trained radiologists, the proposed framework expands the range of possible clinical applications and establishes the ability to perform tract-based analyses with large samples. PMID:25134977

  8. Banning design automation software implementation

    NASA Technical Reports Server (NTRS)

    Kuehlthau, R. L.

    1975-01-01

    Research aimed at developing a system of computer programs to aid engineering in the design, fabrication, and testing of large-scale integrated circuits, hybrid circuits, and printed circuit boards is reported. The automatic layout programs, analysis programs, and interface programs are discussed.

  9. Automated mixed traffic vehicle design AMTV 2

    NASA Technical Reports Server (NTRS)

    Johnston, A. R.; Marks, R. A.; Cassell, P. L.

    1982-01-01

    The design of an improved and enclosed Automated Mixed Traffic Transit (AMTT) vehicle is described. AMTT is an innovative concept for low-speed tram-type transit in which suitable vehicles are equipped with sensors and controls to permit them to operate in an automated mode on existing road or walkway surfaces. The vehicle chassis and body design are presented in terms of sketches and photographs. The functional design of the sensing and control system is presented, and modifications which could be made to the baseline design for improved performance, in particular to incorporate a 20-mph capability, are also discussed. The vehicle system is described at the block-diagram-level of detail. Specifications and parameter values are given where available.

  10. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software tools integrated with j5 add significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation. PMID:24395369

  11. Psychophysiological Control of a Cognitive Task Using Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Freeman, Frederick; Pope, Alan T. (Technical Monitor)

    2001-01-01

    The major focus of the present proposal was to examine psychophysiological variables related to hazardous states of awareness induced by monitoring automated systems. With the increased use of automation in today's work environment, people's roles in the workplace are being redefined from that of active participant to one of passive monitor. Although the introduction of automated systems has a number of benefits, there are also a number of disadvantages regarding worker performance. Byrne and Parasuraman have argued for the use of psychophysiological measures in the development and the implementation of adaptive automation. While both performance-based and model-based adaptive automation have been studied, the use of psychophysiological measures, especially EEG, offers the advantage of real-time evaluation of the state of the subject. The current study used the closed-loop system, developed at NASA Langley Research Center, to control the state of awareness of subjects while they performed a cognitive vigilance task. Previous research in our laboratory, supported by NASA, has demonstrated that, in an adaptive automation, closed-loop environment, subjects perform a tracking task better under a negative than a positive feedback condition. In addition, this condition produces less subjective workload and larger P300 event-related potentials to auditory stimuli presented in a concurrent oddball task. We have also recently shown that the closed-loop system used to control the level of automation in a tracking task can also be used to control the event rate of stimuli in a vigilance monitoring task. By changing the event rate based on the subject's index of arousal, we have been able to produce improved monitoring, relative to various control groups. We have demonstrated in our initial closed-loop experiments with the vigilance paradigm that using a negative feedback contingency (i.e., increasing event rates when the EEG index is low and decreasing event rates when…
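
The negative feedback contingency described in this record can be sketched as a simple controller that raises the stimulus event rate when the arousal index is low and lowers it when the index is high; the thresholds, step size, and rate bounds below are illustrative assumptions:

```python
def closed_loop_event_rate(index_samples, rate=10.0, step=1.0,
                           low=0.3, high=0.7, min_rate=2.0, max_rate=30.0):
    """Negative-feedback contingency: increase the stimulus event rate
    when the EEG engagement index is low, decrease it when the index is
    high, and leave it unchanged in between. All numeric parameters are
    illustrative assumptions, not values from the study."""
    rates = []
    for idx in index_samples:
        if idx < low:
            rate = min(max_rate, rate + step)   # under-aroused: speed up
        elif idx > high:
            rate = max(min_rate, rate - step)   # over-aroused: slow down
        rates.append(rate)
    return rates
```

The loop is "negative" in the control-theoretic sense: the adjustment always opposes the deviation of the index from the target band.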

  12. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  13. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  14. j5 DNA assembly design automation software.

    PubMed

    Hillson, Nathan J; Rosengarten, Rafael D; Keasling, Jay D

    2012-01-20

    Recent advances in Synthetic Biology have yielded standardized and automatable DNA assembly protocols that enable a broad range of biotechnological research and development. Unfortunately, the experimental design required for modern scar-less multipart DNA assembly methods is frequently laborious, time-consuming, and error-prone. Here, we report the development and deployment of a web-based software tool, j5, which automates the design of scar-less multipart DNA assembly protocols including SLIC, Gibson, CPEC, and Golden Gate. The key innovations of the j5 design process include cost optimization, leveraging DNA synthesis when cost-effective to do so, the enforcement of design specification rules, hierarchical assembly strategies to mitigate likely assembly errors, and the instruction of manual or automated construction of scar-less combinatorial DNA libraries. Using a GFP expression testbed, we demonstrate that j5 designs can be executed with the SLIC, Gibson, or CPEC assembly methods, used to build combinatorial libraries with the Golden Gate assembly method, and applied to the preparation of linear gene deletion cassettes for E. coli. The DNA assembly design algorithms reported here are generally applicable to broad classes of DNA construction methodologies and could be implemented to supplement other DNA assembly design tools. Taken together, these innovations save researchers time and effort, reduce the frequency of user design errors and off-target assembly products, decrease research costs, and enable scar-less multipart and combinatorial DNA construction at scales unfeasible without computer-aided design. PMID:23651006
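
The cost-benefit idea mentioned in this record (outsourcing a fragment to DNA synthesis when that is cheaper than in-house assembly) can be sketched as a toy comparison; all prices below are illustrative assumptions, not j5's actual cost model:

```python
def cheaper_to_synthesize(fragment_len_bp, synthesis_cost_per_bp=0.09,
                          assembly_fixed_cost=15.0, oligo_cost=7.0):
    """Toy cost-benefit check in the spirit of j5's analysis: outsource a
    fragment to a DNA synthesis provider when that is cheaper than
    assembling it in-house from two flanking-homology oligos.
    All prices are hypothetical placeholders."""
    synthesis = fragment_len_bp * synthesis_cost_per_bp
    in_house = assembly_fixed_cost + 2 * oligo_cost
    return synthesis < in_house
```

Short fragments favor synthesis; long ones favor in-house assembly, since the synthesis cost grows with length while the in-house cost is roughly fixed per junction.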

  15. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  16. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  17. LBT adaptive secondary preliminary design

    NASA Astrophysics Data System (ADS)

    Gallieni, Daniele; Del Vecchio, Ciro; Anaclerio, E.; Lazzarini, P. G.

    2000-07-01

    We report on the design of the two Gregorian adaptive secondary mirrors of the Large Binocular Telescope. Each adaptive secondary is a Zerodur shell having an external diameter of 911 mm and a thickness of about 1.5 mm. The deformable mirror is controlled by a pattern of 918 electromagnetic actuators. Its shape is referred to a stable ULE back plate by means of capacitive sensors co-located to the actuators pattern. The preliminary design of the system is addressed with particular attention to the reference plate optimization.

  18. Design of the hybrid automated reliability predictor

    NASA Technical Reports Server (NTRS)

    Geist, R.; Trivedi, K.; Dugan, J. B.; Smotherman, M.

    1983-01-01

    The design of the Hybrid Automated Reliability Predictor (HARP), now under development at Duke University, is presented. The HARP approach to reliability prediction is characterized by a decomposition of the overall model into fault-occurrence and fault-handling sub-models. The fault-occurrence model is a non-homogeneous Markov chain which is solved analytically, while the fault-handling model is a Petri Net which is simulated. HARP provides automated analysis of sensitivity to uncertainties in the input parameters and in the initial state specifications. It then produces a predicted reliability band as a function of mission time, as well as estimates of the improvement (narrowing of the band) to be gained by a specified amount of reduction in uncertainty.
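
HARP's decomposition into an analytically solved fault-occurrence model can be illustrated with a homogeneous stand-in (the record's chain is non-homogeneous): a two-component parallel system with a constant per-component failure rate, solved both in closed form and by integrating the Kolmogorov equations:

```python
import math

def duplex_reliability(lam, t):
    """Closed-form reliability of a two-component parallel system with
    constant failure rate lam per component and no repair: the system is
    up while at least one component survives.
    A homogeneous stand-in for a fault-occurrence Markov chain."""
    return 2 * math.exp(-lam * t) - math.exp(-2 * lam * t)

def duplex_reliability_numeric(lam, t, steps=100000):
    """Same chain solved by forward-Euler integration of the Kolmogorov
    equations over the states (2 up, 1 up, failed)."""
    p2, p1 = 1.0, 0.0
    dt = t / steps
    for _ in range(steps):
        dp2 = -2 * lam * p2            # both components exposed
        dp1 = 2 * lam * p2 - lam * p1  # inflow from 2-up, outflow to failed
        p2 += dp2 * dt
        p1 += dp1 * dt
    return p2 + p1
```

The two solutions agree closely, which is the kind of cross-check an analytic fault-occurrence submodel makes possible.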

  19. Automated Antenna Design with Evolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Linden, Derek; Hornby, Greg; Lohn, Jason; Globus, Al; Krishnakumar, K.

    2006-01-01

    Current methods of designing and optimizing antennas by hand are time and labor intensive, and limit complexity. Evolutionary design techniques can overcome these limitations by searching the design space and automatically finding effective solutions. In recent years, evolutionary algorithms have shown great promise in finding practical solutions in large, poorly understood design spaces. In particular, spacecraft antenna design has proven tractable to evolutionary design techniques. Researchers have been investigating evolutionary antenna design and optimization since the early 1990s, and the field has grown in recent years as computer speed has increased and electromagnetic simulators have improved. Two requirements-compliant antennas, one for ST5 and another for TDRS-C, have been automatically designed by evolutionary algorithms. The ST5 antenna is slated to fly this year, and a TDRS-C phased array element has been fabricated and tested. Such automated evolutionary design is enabled by medium-to-high quality simulators and fast modern computers to evaluate computer-generated designs. Evolutionary algorithms automate cut-and-try engineering, substituting automated search through millions of potential designs for intelligent search by engineers through a much smaller number of designs. For evolutionary design, the engineer chooses the evolutionary technique, parameters and the basic form of the antenna, e.g., single wire for ST5 and crossed-element Yagi for TDRS-C. Evolutionary algorithms then search for optimal configurations in the space defined by the engineer. NASA's Space Technology 5 (ST5) mission will launch three small spacecraft to test innovative concepts and technologies. Advanced evolutionary algorithms were used to automatically design antennas for ST5. The combination of wide beamwidth for a circularly-polarized wave and wide impedance bandwidth made for a challenging antenna design problem. From past experience in designing wire antennas, we chose to
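
The search loop the abstract describes can be sketched as a minimal (1+1) evolution strategy. The geometry encoding, the mutation operator, and the `gain_proxy` fitness function below are purely illustrative stand-ins; a real workflow would score each candidate with an electromagnetic simulator instead.

```python
import random

def evolve(fitness, init, mutate, generations=200, seed=0):
    """Minimal (1+1) evolution strategy: keep a mutant only if it improves fitness."""
    rng = random.Random(seed)
    best = init
    best_fit = fitness(best)
    for _ in range(generations):
        cand = mutate(best, rng)
        f = fitness(cand)
        if f > best_fit:
            best, best_fit = cand, f
    return best, best_fit

# Toy stand-in for a simulator score: prefer wire segment lengths near 0.25 m.
def gain_proxy(lengths):
    return -sum((x - 0.25) ** 2 for x in lengths)

def perturb(lengths, rng):
    # Gaussian mutation of each segment length, clamped to stay physical.
    return [max(0.01, x + rng.gauss(0, 0.02)) for x in lengths]

best, score = evolve(gain_proxy, [0.1, 0.4, 0.3], perturb)
```

Because only improving mutants are accepted, the returned fitness never falls below that of the initial design; the engineer's role, as the abstract notes, is to fix the encoding and the search parameters.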

  20. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
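
The identification-tree idea can be illustrated with a toy data structure: a path of conditions that all hold for a design element yields a threat. The node fields, the element representation, and the example threat below are assumptions for illustration, not AutSEC's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ThreatNode:
    condition: str                       # property a design element must exhibit
    threat: Optional[str] = None         # threat reported when the path matches
    children: List["ThreatNode"] = field(default_factory=list)

def identify(node, element, found=None):
    """Walk an identification tree, collecting threats along paths whose
    conditions all hold for the element."""
    if found is None:
        found = []
    if node.condition in element["properties"]:
        if node.threat:
            found.append(node.threat)
        for child in node.children:
            identify(child, element, found)
    return found

# Hypothetical fragment: an unencrypted flow across a trust boundary
# is open to eavesdropping.
tree = ThreatNode("data_flow", children=[
    ThreatNode("crosses_trust_boundary", children=[
        ThreatNode("unencrypted", threat="information disclosure (eavesdropping)"),
    ]),
])

element = {"name": "login request",
           "properties": {"data_flow", "crosses_trust_boundary", "unencrypted"}}
threats = identify(tree, element)
```

A mitigation tree would be walked the same way, mapping each identified threat to candidate countermeasures weighted by cost.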

  2. Automated Procedure for Roll Pass Design

    NASA Astrophysics Data System (ADS)

    Lambiase, F.; Langella, A.

    2009-04-01

    The aim of this work has been to develop an automatic roll pass design method, capable of minimizing the number of roll passes. The adoption of artificial intelligence technologies, particularly expert systems, and a hybrid model for the surface profile evaluation of rolled bars, has allowed us to model the search for the minimal sequence with a tree path search. This approach permitted a geometrical optimization of roll passes while allowing automation of the roll pass design process. Moreover, the heuristic nature of the inferential engine contributes a great deal toward reducing search time, thus allowing such a system to be employed for industrial purposes. Finally, this new approach was compared with other recently developed automatic systems to validate and measure possible improvements among them.
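
The minimal-sequence search can be illustrated as a breadth-first search over candidate pass sequences; the reduction bound and tolerance below are invented placeholders, and the actual system layers expert-system heuristics and a surface-profile model on top of such a search.

```python
from collections import deque

def min_passes(area0, target, max_reduction=0.3, tol=0.05):
    """Breadth-first search over pass sequences: each roll pass reduces the
    bar's cross-section by at most `max_reduction`; BFS returns the fewest
    passes reaching the target area (a toy stand-in for the tree path search)."""
    queue = deque([(area0, 0)])
    while queue:
        area, passes = queue.popleft()
        if area <= target * (1 + tol):
            return passes
        # branch on two candidate reductions: aggressive and gentle
        for r in (max_reduction, max_reduction / 2):
            queue.append((area * (1 - r), passes + 1))
    return None

n = min_passes(area0=100.0, target=30.0)
```

BFS guarantees minimality in number of passes, which is exactly the objective the abstract states; heuristics in the inference engine prune this tree to keep search times industrially practical.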

  3. Automating analog design: Taming the shrew

    NASA Technical Reports Server (NTRS)

    Barlow, A.

    1990-01-01

    The pace of progress in the design of integrated circuits continues to amaze observers inside and outside of the industry. Three decades ago, a 50 transistor chip was a technological wonder. Fifteen years later, a 5000 transistor device would 'wow' the crowds. Today, 50,000 transistor chips will earn a 'not too bad' assessment, but it takes 500,000 to really leave an impression. In 1975 a typical ASIC device had 1000 transistors, took one year to first samples (and two years to production) and sold for about 5 cents per transistor. Today's 50,000 transistor gate array takes about 4 months from spec to silicon, works the first time, and sells for about 0.02 cents per transistor. Fifteen years ago, the single most laborious and error-prone step in IC design was the physical layout. Today, most ICs never see the hand of a layout designer: an automatic place-and-route tool converts the engineer's computer-captured schematic to a complete physical design using a gate array or a library of standard cells, also created by software rather than by designers. CAD has also been a generous benefactor to the digital design process. The architect of today's digital systems creates the design using an RTL or other high-level simulator. Then the designer pushes a button to invoke the logic synthesizer-optimizer tool. A fault analyzer checks the result for testability and suggests where scan-based cells will improve test coverage. One obstinate holdout amidst this parade of progress is the automation of analog design and its reduction to semi-custom techniques. This paper investigates the application of CAD techniques to analog design.

  4. Automated Design Space Exploration with Aspen

    DOE PAGES Beta

    Spafford, Kyle L.; Vetter, Jeffrey S.

    2015-01-01

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
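
The exploration workflow can be sketched with a stdlib-only stand-in: a performance model evaluated over discretized parameter ranges. A nonlinear solver would replace the exhaustive loop in practice, and the runtime formula here is an illustrative analytic model, not an actual Aspen model.

```python
import itertools

def explore(model, ranges):
    """Evaluate a performance model over discretized parameter ranges and
    return the cheapest design point (a solver would replace this loop)."""
    best_point, best_cost = None, float("inf")
    for point in itertools.product(*ranges.values()):
        params = dict(zip(ranges.keys(), point))
        cost = model(params)
        if cost < best_cost:
            best_point, best_cost = params, cost
    return best_point, best_cost

# Hypothetical analytic model: runtime of a blocked FFT-like kernel as a
# function of core count and block size (illustrative formula only).
def runtime(p):
    flops = 1e9 / p["cores"]
    overhead = 5e4 * p["cores"] + 1e7 / p["block"]
    return flops + overhead

space = {"cores": [1, 2, 4, 8, 16], "block": [16, 32, 64, 128]}
design, cost = explore(runtime, space)
```

Formulating the same question as a nonlinear program, as the abstract describes, scales to continuous ranges where an exhaustive sweep would not.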

  5. Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wilson, Brent G.

    Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisor systems; and (3)…

  6. FRACSAT: Automated design synthesis for future space architectures

    NASA Astrophysics Data System (ADS)

    Mackey, R.; Uckun, S.; Do, Minh; Shah, J.

    This paper describes the algorithmic basis and development of FRACSAT (FRACtionated Spacecraft Architecture Toolkit), a new approach to conceptual design, cost-benefit analysis, and detailed trade studies for space systems. It provides an automated capability for exploration of candidate spacecraft architectures, leading users to near-optimal solutions with respect to user-defined requirements, risks, and program uncertainties. FRACSAT utilizes a sophisticated planning algorithm (PlanVisioner) to perform a quasi-exhaustive search for candidate architectures, constructing candidates from an extensible model-based representation of space system components and functions. These candidates are then evaluated with emphasis on the business case, computing the expected design utility and system costs as well as risk, presenting the user with a greatly reduced selection of candidates. The user may further refine the search according to cost or benefit uncertainty, adaptability, or other performance metrics as needed.

  7. Automated design tools for biophotonic systems

    NASA Astrophysics Data System (ADS)

    Vacca, Giacomo; Lehtimäki, Hannu; Karras, Tapio; Murphy, Sean

    2014-03-01

    Traditional design methods for flow cytometers and other complex biophotonic systems are increasingly recognized as a major bottleneck in instrumentation development. The many manual steps involved in the analysis and translation of the design, from optical layout to a detailed mechanical model and ultimately to a fully functional instrument, are labor-intensive and prone to wasteful trial-and-error iterations. We have developed two complementary, linked technologies that address this problem: one design tool (LiveIdeas™) provides an intuitive environment for interactive, real-time simulations of system-level performance; the other tool (BeamWise™) automates the generation of mechanical 3D CAD models based on those simulations. The strength of our approach lies in a parametric modeling strategy that breaks boundaries between engineering subsystems (e.g., optics and fluidics) to predict critical behavior of the instrument as a whole. The results: a 70 percent reduction in early-stage project effort, significantly enhancing the probability of success by virtue of a more efficient exploration of the design space.

  8. Automated mechanical ventilation: adapting decision making to different disease states.

    PubMed

    Lozano-Zahonero, S; Gottlieb, D; Haberthür, C; Guttmann, J; Möller, K

    2011-03-01

    The purpose of the present study is to introduce a novel methodology for adapting and upgrading decision-making strategies concerning mechanical ventilation with respect to different disease states into our fuzzy-based expert system, AUTOPILOT-BT. The special features are: (1) extraction of clinical knowledge in analogy to the daily routine; (2) an automated process to obtain the required information and to create fuzzy sets; (3) a controller that employs the derived fuzzy rules to achieve the desired ventilation status. For demonstration, this study focuses exclusively on the control of arterial CO(2) partial pressure (p(a)CO(2)). Clinical knowledge from 61 anesthesiologists was acquired using a questionnaire, from which different disease-specific fuzzy sets were generated to control p(a)CO(2). For both patients with healthy lungs and patients with acute respiratory distress syndrome (ARDS), the fuzzy sets show different shapes. The fuzzy set "normal", i.e., the "target p(a)CO(2) area", ranges from 35 to 39 mmHg for healthy lungs and from 39 to 43 mmHg for ARDS lungs. With the new fuzzy sets our AUTOPILOT-BT reaches the target p(a)CO(2) within at most three consecutive changes of ventilator settings. Thus, clinical knowledge can be extended and updated, and the resulting mechanical ventilation therapies can be individually adapted, analyzed, and evaluated. PMID:21069471
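
The disease-specific fuzzy sets can be sketched with trapezoidal membership functions built around the target ranges quoted in the abstract (35-39 mmHg healthy, 39-43 mmHg ARDS); the ramp widths are assumptions, not values from AUTOPILOT-BT.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside [a, d], 1 on [b, c], linear ramps."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Illustrative "normal paCO2" sets per disease state; ramp widths are assumed.
normal = {
    "healthy": lambda x: trapezoid(x, 33, 35, 39, 41),
    "ards":    lambda x: trapezoid(x, 37, 39, 43, 45),
}

# A paCO2 of 40 mmHg is fully "normal" for an ARDS lung but only partially
# normal for a healthy lung, so the controller would adjust ventilation
# in the healthy case only.
m_ards = normal["ards"](40.0)
m_healthy = normal["healthy"](40.0)
```

Shifting the set per disease state is exactly how the same fuzzy controller yields different ventilator behavior for healthy and ARDS lungs.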

  9. Adaptive clinical trial designs in oncology

    PubMed Central

    Zang, Yong; Lee, J. Jack

    2015-01-01

    Adaptive designs have become popular in clinical trial and drug development. Unlike traditional trial designs, adaptive designs use accumulating data to modify the ongoing trial without undermining the integrity and validity of the trial. As a result, adaptive designs provide a flexible and effective way to conduct clinical trials. The designs have potential advantages of improving the study power, reducing sample size and total cost, treating more patients with more effective treatments, identifying efficacious drugs for specific subgroups of patients based on their biomarker profiles, and shortening the time for drug development. In this article, we review adaptive designs commonly used in clinical trials and investigate several aspects of the designs, including the dose-finding scheme, interim analysis, adaptive randomization, biomarker-guided randomization, and seamless designs. For illustration, we provide examples of real trials conducted with adaptive designs. We also discuss practical issues from the perspective of using adaptive designs in oncology trials. PMID:25811018

  10. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  11. Rapid iterative reanalysis for automated design

    NASA Technical Reports Server (NTRS)

    Bhatia, K. G.

    1973-01-01

    A method for iterative reanalysis in automated structural design is presented for a finite-element analysis using the direct stiffness approach. A basic feature of the method is that the generalized stiffness and inertia matrices are expressed as functions of structural design parameters, and these generalized matrices are expanded in Taylor series about the initial design. Only the linear terms are retained in the expansions. The method is approximate because it uses static condensation, modal reduction, and the linear Taylor series expansions. The exact linear representation of the expansions of the generalized matrices is also described, and a basis for the present method is established. Results of applications of the present method to the recalculation of the natural frequencies of two simple platelike structural models are presented and compared with results obtained by using a commonly applied analysis procedure as a reference. In general, the results are in good agreement. A comparison of the computer times required for the use of the present method and the reference method indicated that the present method required substantially less time for reanalysis. Although the results presented are for relatively small-order problems, the present method will become more efficient relative to the reference method as the problem size increases. An extension of the present method to static reanalysis is described, and a basis for unifying the static and dynamic reanalysis procedures is presented.
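
The linear Taylor expansion of the stiffness about the initial design can be shown with a single-degree-of-freedom sketch: the generalized stiffness is updated as K(p0 + dp) ~ K(p0) + (dK/dp) dp and reused to re-estimate a natural frequency without reassembling the model. The numerical values below are illustrative, not taken from the report.

```python
import math

def reanalysis_frequency(k0, dk_dp, m, dp):
    """First-order reanalysis: linearized stiffness update, then the
    natural frequency f = sqrt(K/m) / (2*pi) of the condensed model."""
    k = k0 + dk_dp * dp          # K(p0 + dp) ~ K(p0) + dK/dp * dp
    return math.sqrt(k / m) / (2 * math.pi)

# Single-DOF sketch: a plate-like model condensed to one generalized
# coordinate whose stiffness grows with thickness p (illustrative values).
k0, dk_dp, m = 4.0e6, 2.0e6, 10.0     # N/m, N/m per mm, kg
f_initial = reanalysis_frequency(k0, dk_dp, m, 0.0)
f_updated = reanalysis_frequency(k0, dk_dp, m, 0.5)   # +0.5 mm thickness
```

The saving in the full method comes from never refactorizing the condensed system: only the cheap linear update and the eigenvalue estimate are repeated per design iteration.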

  12. ERIS adaptive optics system design

    NASA Astrophysics Data System (ADS)

    Marchetti, Enrico; Le Louarn, Miska; Soenke, Christian; Fedrigo, Enrico; Madec, Pierre-Yves; Hubin, Norbert

    2012-07-01

    The Enhanced Resolution Imager and Spectrograph (ERIS) is the next-generation instrument planned for the Very Large Telescope (VLT) and the Adaptive Optics Facility (AOF). It is an AO-assisted instrument that will make use of the Deformable Secondary Mirror and the new Laser Guide Star Facility (4LGSF), and it is planned for the Cassegrain focus of the telescope UT4. The project is currently in its Phase A, awaiting approval to continue to the next phases. The adaptive optics system of ERIS will include two wavefront sensors (WFS) to maximize the coverage of the proposed science cases. The first is a high-order 40x40 Pyramid WFS (PWFS) for on-axis Natural Guide Star (NGS) observations. The second is a high-order 40x40 Shack-Hartmann WFS for single Laser Guide Star (LGS) observations. The PWFS, with appropriate sub-aperture binning, will also serve as a low-order NGS WFS in support of the LGS mode, with a field-of-view patrolling capability of 2 arcmin diameter. Both WFSs will be equipped with the very low read-out noise CCD220-based camera developed for the AOF. The real-time reconstruction and control is provided by a SPARTA real-time platform adapted to support both WFS modes. In this paper we present the ERIS AO system in all its main aspects: opto-mechanical design, real-time computer design, and control and calibration strategy. Particular emphasis is given to the system performance obtained via dedicated numerical simulations.

  13. A Comparison of a Brain-Based Adaptive System and a Manual Adaptable System for Invoking Automation

    NASA Technical Reports Server (NTRS)

    Bailey, Nathan R.; Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Scott, Lorissa A.

    2004-01-01

    Two experiments are presented that examine alternative methods for invoking automation. In each experiment, participants were asked to perform simultaneously a monitoring task and a resource management task as well as a tracking task that changed between automatic and manual modes. The monitoring task required participants to detect failures of an automated system to correct aberrant conditions under either high or low system reliability. Performance on each task was assessed as well as situation awareness and subjective workload. In the first experiment, half of the participants worked with a brain-based system that used their EEG signals to switch the tracking task between automatic and manual modes. The remaining participants were yoked to participants from the adaptive condition and received the same schedule of mode switches, but their EEG had no effect on the automation. Within each group, half of the participants were assigned to either the low or high reliability monitoring task. In addition, within each combination of automation invocation and system reliability, participants were separated into high and low complacency potential groups. The results revealed no significant effects of automation invocation on the performance measures; however, the high complacency individuals demonstrated better situation awareness when working with the adaptive automation system. The second experiment was the same as the first with one important exception. Automation was invoked manually. Thus, half of the participants pressed a button to invoke automation for 10 s. The remaining participants were yoked to participants from the adaptable condition and received the same schedule of mode switches, but they had no control over the automation. The results showed that participants who could invoke automation performed more poorly on the resource management task and reported higher levels of subjective workload. Further, those who invoked automation more frequently performed

  14. Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.

    2002-01-01

    Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early-stage sensory and information acquisition functions. The present research compares the effects of AA applied across the entire range of information processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA is dependent on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when it is applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, than when it is applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA, as compared to completely manual control. These results are discussed in terms of implications for AA design for aviation.

  15. DASLL. Printed Circuit Board Design Automation

    SciTech Connect

    Magnuson, W.G.Jr.; Willett, G.W.

    1983-06-03

    DASLL (Design Automation System at Lawrence Livermore) is a set of computer programs for printed circuit board (PCB) layout. The DASLL system can process a number of PCB trimlines, including: DEC 1, 2, 4, and 6 high configurations, CLI, Augat, Varian, and several rectangular geometries; others can be added. Over 800 components and generic package types are available in DASLLDB, the system reference library. Two-layer boards with non-gridded (structured) power and ground busses are supported, and PCB densities of approximately 1.2 square inches per equivalent IC (or less dense) are best accommodated by DASLL. The system has been used to make etch artwork and drill tapes (starting with a schematic drawing) for a six IC CLI board in less than two working days. Initial processing will produce reports and computer printer-plots which can be used to verify the input. Final output can include silkscreen photo-artwork, PCB etch photo-artwork, punched paper tapes for the SLO-SYN and Pratt-Whitney N/C drill machines, and computer listings of signal strings, parts lists, etc.

  16. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  17. Achieving runtime adaptability through automated model evolution and variant selection

    NASA Astrophysics Data System (ADS)

    Mosincat, Adina; Binder, Walter; Jazayeri, Mehdi

    2014-01-01

    Dynamically adaptive systems propose adaptation by means of variants that are specified in the system model at design time and allow for a fixed set of different runtime configurations. However, in a dynamic environment, unanticipated changes may result in the inability of the system to meet its quality requirements. To allow the system to react to these changes, this article proposes a solution for automatically evolving the system model by integrating new variants and periodically validating the existing ones based on updated quality parameters. To illustrate this approach, the article presents a BPEL-based framework using a service composition model to represent the functional requirements of the system. The framework estimates quality of service (QoS) values based on information provided by a monitoring mechanism, ensuring that changes in QoS are reflected in the system model. The article shows how the evolved model can be used at runtime to increase the system's autonomic capabilities and delivered QoS.
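
The periodic validation-and-selection step might look like the following sketch, where latency is the single monitored QoS attribute. The names and the one-metric requirement are assumptions for illustration, not the BPEL framework's actual API.

```python
def select_variant(variants, qos_estimates, requirement_ms):
    """Re-validate variants against monitored QoS and pick the best one that
    still meets the requirement; returning None signals that no existing
    variant fits and the model should evolve (integrate a new variant)."""
    valid = [v for v in variants
             if qos_estimates[v]["latency_ms"] <= requirement_ms]
    if not valid:
        return None
    return min(valid, key=lambda v: qos_estimates[v]["latency_ms"])

# Hypothetical service-composition variants with QoS values from monitoring.
variants = ["composition_A", "composition_B", "composition_C"]
qos = {"composition_A": {"latency_ms": 120},
       "composition_B": {"latency_ms": 80},
       "composition_C": {"latency_ms": 200}}
choice = select_variant(variants, qos, requirement_ms=150)
```

Feeding monitored QoS back into `qos_estimates` is what keeps the model's variants honest as the environment drifts, which is the article's central point.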

  18. Design of automated system for management of arrival traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1989-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. The focus here is on the design of two of these tools: the Descent Advisor, which provides automation tools for managing descent traffic, and the Traffic Management Advisor, which generates optimum landing schedules. The algorithms, automation modes, and graphical interfaces incorporated in the design are described.

  19. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general-purpose optimization program. Because of the general-purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by the experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC; the ADS program is also available from COSMIC). The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining; that is, it works from hypotheses to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher, with a central memory requirement of approximately 512K of 8-bit bytes. This program was developed in 1986.
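
The backward-chaining, confidence-threshold flavor of the inference engine can be sketched as follows. The MYCIN-style certainty-factor combination and the example rules are assumptions, since the abstract does not specify how EXADS combines evidence.

```python
def confirm(hypothesis, rules, answers, threshold=0.9):
    """Backward-chain from a hypothesis to facts: a rule fires when all of its
    conditions are answered 'yes', contributing its certainty factor; the
    hypothesis is confirmed once accumulated confidence reaches the threshold."""
    confidence = 0.0
    for conditions, cf in rules.get(hypothesis, []):
        if all(answers.get(c) == "yes" for c in conditions):
            # Combine certainty factors MYCIN-style: new = old + cf * (1 - old)
            confidence = confidence + cf * (1 - confidence)
    return confidence >= threshold, confidence

# Hypothetical rules for recommending a gradient-based optimizer; the real
# EXADS rule base (~200 rules) spans its three problem categories.
rules = {"use_gradient_method": [
    ({"objective_smooth"}, 0.7),
    ({"gradients_cheap", "few_variables"}, 0.8),
]}
answers = {"objective_smooth": "yes",
           "gradients_cheap": "yes",
           "few_variables": "yes"}
ok, conf = confirm("use_gradient_method", rules, answers)
```

With both rules firing, confidence reaches 0.7 + 0.8 * 0.3 = 0.94, clearing the 90% acceptance level the abstract describes.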

  20. An Intelligent Automation Platform for Rapid Bioprocess Design

    PubMed Central

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  1. Designing Domain-Specific HUMS Architectures: An Automated Approach

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban

    2004-01-01

    The HUMS automation system automates the design of HUMS architectures. The automated design process involves selection of solutions from a large space of designs as well as pure synthesis of designs. Hence the overall objective is to efficiently search for or synthesize designs or parts of designs in the database and to integrate them to form the entire system design. The automation system adopts two approaches in order to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are equipped with a suite of qualitative and quantitative techniques that enable a) the selection of matching component instances, b) the determination of design parameters, c) the evaluation of candidate designs at the component level and at the system level, d) the performance of cost-benefit analyses, e) the performance of trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design, and operation of HUMS systems in order to economically produce optimal, domain-specific designs.

  2. TARDIS: An Automation Framework for JPL Mission Design and Navigation

    NASA Technical Reports Server (NTRS)

    Roundhill, Ian M.; Kelly, Richard M.

    2014-01-01

    Mission Design and Navigation at the Jet Propulsion Laboratory has implemented an automation framework tool to assist in orbit determination and maneuver design analysis. This paper describes the lessons learned from previous automation tools and how they have been implemented in this tool. In addition this tool has revealed challenges in software implementation, testing, and user education. This paper describes some of these challenges and invites others to share their experiences.

  3. Designing for Productive Adaptations of Curriculum Interventions

    ERIC Educational Resources Information Center

    Debarger, Angela Haydel; Choppin, Jeffrey; Beauvineau, Yves; Moorthy, Savitha

    2013-01-01

    Productive adaptations at the classroom level are evidence-based curriculum adaptations that are responsive to the demands of a particular classroom context and still consistent with the core design principles and intentions of a curriculum intervention. The model of design-based implementation research (DBIR) offers insights into complexities and…

  4. Parameter Studies, time-dependent simulations and design with automated Cartesian methods

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael

    2005-01-01

    Over the past decade, NASA has made a substantial investment in developing adaptive Cartesian grid methods for aerodynamic simulation. Cartesian-based methods played a key role in both the Space Shuttle Accident Investigation and in NASA's return-to-flight activities. The talk will provide an overview of recent technological developments, focusing on the generation of large-scale aerodynamic databases, automated CAD-based design, and time-dependent simulations of bodies in relative motion. Automation, scalability, and robustness underlie all of these applications, and research in each of these topics will be presented.

  5. Drought Adaptation Mechanisms Should Guide Experimental Design.

    PubMed

    Gilbert, Matthew E; Medina, Viviana

    2016-08-01

    The mechanism, or hypothesis, of how a plant might be adapted to drought should strongly influence experimental design. For instance, an experiment testing for water conservation should be distinct from a damage-tolerance evaluation. We define here four new, general mechanisms for plant adaptation to drought such that experiments can be more easily designed based upon the definitions. A series of experimental methods are suggested together with appropriate physiological measurements related to the drought adaptation mechanisms. The suggestion is made that the experimental manipulation should match the rate, length, and severity of soil water deficit (SWD) necessary to test the hypothesized type of drought adaptation mechanism. PMID:27090148

  6. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.
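The pause-and-reverse capability described above can be sketched as a reversible operation stack. This is a minimal illustration only; the class names (`AssemblyStep`, `Sequencer`) are invented here and are not taken from the LaRC software.

```python
class AssemblyStep:
    """A primitive operation paired with its inverse."""
    def __init__(self, name, do, undo):
        self.name, self.do, self.undo = name, do, undo

class Sequencer:
    """Runs steps in order; on a failure (or a supervisor request) it
    reverses every completed step in last-in, first-out order."""
    def __init__(self):
        self.completed = []

    def execute(self, steps):
        for step in steps:
            try:
                step.do()
                self.completed.append(step)
            except RuntimeError:
                self.reverse_all()  # supervisor-style error recovery
                return False
        return True

    def reverse_all(self):
        while self.completed:
            self.completed.pop().undo()

def fail(msg):
    raise RuntimeError(msg)

# Usage: a strut is fetched, then installation jams, so the fetch is
# automatically reversed.
log = []
fetch = AssemblyStep("fetch", lambda: log.append("fetched"),
                     lambda: log.append("returned"))
install = AssemblyStep("install", lambda: fail("jam"), lambda: None)
ok = Sequencer().execute([fetch, install])
print(ok, log)  # False ['fetched', 'returned']
```

The last-in, first-out reversal mirrors the report's requirement that any operation can be paused and backed out without leaving the structure in an inconsistent state.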

  7. A Toolset for Supporting Iterative Human-Automation Interaction in Design

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2010-01-01

    The addition of automation has greatly extended humans' capability to accomplish tasks, including those that are difficult, complex and safety critical. The majority of Human-Automation Interaction (HAI) results in more efficient and safe operations; however, certain unexpected automation behaviors or "automation surprises" can be frustrating and, in certain safety-critical operations (e.g., transportation, manufacturing control, medicine), may result in injuries or the loss of life (Mellor, 1994; Leveson, 1995; FAA, 1995; BASI, 1998; Sheridan, 2002). This paper describes the development of a design tool that enables the rapid development and evaluation of automation prototypes. The ultimate goal of the work is to provide a design platform upon which automation surprise vulnerability analyses can be integrated.

  8. Survey of adaptive control using Liapunov design

    NASA Technical Reports Server (NTRS)

    Lindorff, D. P.; Carroll, R. L.

    1972-01-01

    A survey was made of the literature devoted to the synthesis of model-tracking adaptive systems based on application of Liapunov's second method. The basic synthesis procedure is introduced and a critical review of extensions made to the theory since 1966 is made. The extensions relate to design for relative stability, reduction of order techniques, design with disturbance, design with time variable parameters, multivariable systems, identification, and an adaptive observer.
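As a concrete instance of the synthesis procedure surveyed above, the standard first-order model-reference design via Liapunov's second method takes the following textbook form (a generic illustration, not an equation set reproduced from this report):

```latex
% First-order plant and reference model
\dot{x} = a x + b u, \qquad \dot{x}_m = a_m x_m + b_m r, \quad a_m < 0
% Control law with adjustable gains; tracking error
u = \theta_r r + \theta_x x, \qquad e = x - x_m
% Candidate Lyapunov function (\gamma > 0; tildes denote gain errors)
V = \tfrac{1}{2} e^2
  + \frac{|b|}{2\gamma}\left(\tilde{\theta}_r^{\,2} + \tilde{\theta}_x^{\,2}\right)
% Adaptive laws chosen so that \dot{V} = a_m e^2 \le 0
\dot{\theta}_r = -\gamma\, e\, r\, \operatorname{sgn}(b), \qquad
\dot{\theta}_x = -\gamma\, e\, x\, \operatorname{sgn}(b)
```

With these update laws the cross terms in \(\dot{V}\) cancel, leaving \(\dot{V} = a_m e^2 \le 0\), so the tracking error is driven to zero while the gain errors remain bounded.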

  9. Design of Center-TRACON Automation System

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Davis, Thomas J.; Green, Steven

    1993-01-01

    A system for the automated management and control of terminal area traffic, referred to as the Center-TRACON Automation System (CTAS), is being developed at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver area and Dallas/Ft. Worth area air traffic control facilities. This paper will review the CTAS architecture and automation functions, as well as the integration of CTAS into the existing operational system. CTAS consists of three types of integrated tools that provide computer-generated advisories for both en-route and terminal area controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), generates runway assignments, landing sequences and landing times for all arriving aircraft, including those originating from nearby feeder airports. TMA also assists in runway configuration control and flow management. Another tool, the Descent Advisor (DA), generates clearances for the en-route controllers handling arrival flows to metering gates. The DA's clearances ensure fuel-efficient and conflict-free descents to the metering gates at specified crossing times. In the terminal area, the Final Approach Spacing Tool (FAST) provides heading and speed advisories that help controllers produce an accurately spaced flow of aircraft on the final approach course. Databases consisting of several hundred aircraft performance models, airline-preferred operational procedures, and a three-dimensional wind model support the operation of CTAS. The first component of CTAS, the Traffic Management Advisor, is being evaluated at the Denver TRACON and the Denver Air Route Traffic Control Center. The second component, the Final Approach Spacing Tool, will be evaluated in several stages at the Dallas/Fort Worth Airport beginning in October 1993. An initial stage of the Descent Advisor tool is being prepared for testing at the Denver Center.
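The core of a TMA-style landing scheduler can be illustrated with a first-come, first-served sketch: sequence aircraft by estimated time of arrival, then push each scheduled time back until a required runway separation is met. This is a toy stand-in, not the actual CTAS/TMA algorithm, and the separation value is arbitrary.

```python
def schedule_landings(etas, separation=90.0):
    """etas: dict of aircraft -> estimated arrival time in seconds.
    Returns a dict of assigned landing times that respects a minimum
    runway separation, in first-come, first-served order."""
    order = sorted(etas, key=etas.get)  # sequence aircraft by ETA
    assigned, last = {}, None
    for ac in order:
        # Land at the ETA if possible, else wait out the separation.
        t = etas[ac] if last is None else max(etas[ac], last + separation)
        assigned[ac] = t
        last = t
    return assigned

# UAL2 arrives only 30 s behind AAL1, so it is delayed to 90 s;
# DAL3 is already far enough behind and keeps its ETA.
times = schedule_landings({"AAL1": 0.0, "UAL2": 30.0, "DAL3": 200.0})
print(times)  # {'AAL1': 0.0, 'UAL2': 90.0, 'DAL3': 200.0}
```

Real arrival metering adds runway assignment, aircraft-type-dependent separations, and resequencing, which this sketch deliberately omits.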

  10. Automating the design process - Progress, problems, prospects, potential.

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1973-01-01

    The design process for large aerospace vehicles is discussed, with particular emphasis on structural design. Problems with current procedures are identified. Then, the contributions possible from automating the design process (defined as the best combination of men and computers) are considered. Progress toward automated design in the aerospace and other communities is reviewed, including NASA studies of the potential development of Integrated Programs for Aerospace-Vehicle Design (IPAD). The need for and suggested directions of future research on the design process, both technical and social, are discussed. Although much progress has been made to exploit the computer in design, it is concluded that technology is available to begin using the computer to speed communications and management as well as calculations in the design process and thus build man-computer teams that can design better, faster and cheaper.

  11. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    SciTech Connect

    Williams, Joshua M.

    2012-06-12

    Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. Other factors, such as cost, ergonomics, maintenance, and efficiency, also affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine if and what feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. For concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated on task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements/constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates.
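The alternating decompose-then-embody loop described above can be sketched as follows. The verb-noun rule and the component database here are toy stand-ins invented for illustration; the paper's actual functional basis and criteria are richer.

```python
# Rule set: a 'verb-noun' function may decompose into sub-functions.
DECOMPOSITION_RULES = {
    ("transport", "material"): [("grasp", "material"),
                                ("move", "material"),
                                ("release", "material")],
}

# Component database: leaf functions map to candidate embodiments,
# each scored against design criteria (single score for simplicity).
COMPONENT_DB = {
    ("grasp", "material"): [("gripper", 0.8), ("vacuum cup", 0.6)],
    ("move", "material"): [("robot arm", 0.9), ("conveyor", 0.7)],
    ("release", "material"): [("gripper", 0.8)],
}

def synthesize(function):
    """Alternate between function decomposition and embodiment:
    decompose while a rule applies, then pick the best-scoring
    component for each leaf function."""
    if function in DECOMPOSITION_RULES:
        design = []
        for sub in DECOMPOSITION_RULES[function]:
            design.extend(synthesize(sub))
        return design
    candidates = COMPONENT_DB.get(function, [])
    return [max(candidates, key=lambda c: c[1])[0]] if candidates else []

print(synthesize(("transport", "material")))
# ['gripper', 'robot arm', 'gripper']
```

A full implementation would carry multiple criteria, operator compatibility ratings, and constraint checks through each embodiment step rather than a single scalar score.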

  12. Design of the Automated Rendezvous and Capture Docking System

    NASA Technical Reports Server (NTRS)

    Cruzen, Craig A.; Lomas, James J.

    1999-01-01

    This paper describes the Automated Rendezvous and Capture (AR&C) system that was designed and is being tested at NASA's Marshall Space Flight Center (MSFC). The AR&C system incorporates some of the latest innovations in Global Positioning System (GPS), laser sensor technologies and automated mission sequencing algorithms as well as the capability for ground and crew monitoring and commanding. This paper summarizes the variety of mission scenarios supported by the AR&C system. It also describes the major components of the AR&C system including the Guidance, Navigation and Control system, GPS receivers, relative navigation filter and the Video Guidance Sensor. A discussion of the safety and reliability issues confronted during the design follows. By designing a safe and robust automated system, space mission operations cost can be reduced by decreasing the number of ground personnel required for the extensive mission design, preflight planning and training typically required for rendezvous and docking missions.

  13. Design automation for complex CMOS/SOS LSI hybrid substrates

    NASA Technical Reports Server (NTRS)

    Ramondetta, P. W.; Smiley, J. W.

    1976-01-01

    A design-automation approach used to develop thick-film hybrid packages is described. The hybrid packages produced combine thick-film and silicon-on-sapphire (SOS) large-scale integration (LSI) technologies to bring the on-chip performance level of SOS to the subsystem level. Packing densities are improved by a factor of eight over ceramic dual in-line packaging; interchip wiring capacitance is low. Due to significant time savings, the design-automation approach presented can be expected to yield a 3:1 reduction in cost over the use of manual methods for the initial design of a hybrid.

  14. Adapting Assessment Procedures for Delivery via an Automated Format.

    ERIC Educational Resources Information Center

    Kelly, Karen L.; And Others

    The Office of Personnel Management (OPM) decided to explore alternative examining procedures for positions covered by the Administrative Careers with America (ACWA) examination. One requirement for new procedures was that they be automated for use with OPM's recently developed Microcomputer Assisted Rating System (MARS), a highly efficient system…

  15. Automated AI-based designer of electrical distribution systems

    NASA Astrophysics Data System (ADS)

    Sumic, Zarko

    1992-03-01

    Designing the electrical supply system for new residential developments (plat design) is an everyday task for electric utility engineers. Presently this task is carried out manually, resulting in an overdesigned, costly, and nonstandardized solution. As an ill-structured and open-ended problem, plat design is difficult to automate with conventional approaches such as operational research or CAD. Additional complexity in automating plat design is imposed by the need to process spatial data such as circuit maps, records, and construction plans. The intelligent decision support system for automated electrical plat design (IDSS for AEPD) is an engineering tool aimed at automating plat design. IDSS for AEPD combines the functionality of geographic information systems (GIS), a geographically referenced database, with the sophistication of artificial intelligence (AI) to deal with the complexity inherent in design problems. A blackboard problem-solving architecture, built around the INGRES relational database and the NEXPERT Object expert system shell, has been chosen to accommodate the diverse knowledge sources and data models. The GIS's principal task is to create, structure, and formalize the real-world representation required by the rule-based reasoning portion of the AEPD. IDSS's capability to support and enhance the engineer's design, rather than only automate the design process through a prescribed computation, makes it a preferred choice among the possible techniques for AEPD. This paper presents the results of the knowledge acquisition and knowledge engineering process, together with conceptual design issues for the AEPD tool. To verify the proposed concept, a comparison of results obtained by the AEPD tool with the design obtained by an experienced human designer is given.

  16. Automated IDEF3 and IDEF4 systems design specification document

    NASA Technical Reports Server (NTRS)

    Friel, Patricia Griffith; Blinn, Thomas M.

    1989-01-01

    The current design is presented for the automated IDEF3 and IDEF4 tools. The philosophy behind the tool designs is described, as well as the conceptual view of the interacting components of the two tools. Finally, a detailed description is presented of the existing designs for the tools using IDEF3 process descriptions and IDEF4 diagrams. In the preparation of these designs, the IDEF3 and IDEF4 methodologies were very effective in defining the structure and operation of the tools. The experience in designing systems in this fashion was very valuable and resulted in future systems being designed in this way. However, the number of IDEF3 and IDEF4 diagrams that were produced using a Macintosh for this document attests to the need for an automated tool to simplify this design process.

  17. How the new optoelectronic design automation industry is taking advantage of preexisting EDA standards

    NASA Astrophysics Data System (ADS)

    Nesmith, Kevin A.; Carver, Susan

    2014-05-01

    With the advancement of design processes down to sub-7 nm levels, the Electronic Design Automation (EDA) industry appears to be reaching the end of its advancements as the size of the silicon atom becomes the limiting factor. Or is it? The commercial viability of mass-producing silicon photonics is bringing about the Optoelectronic Design Automation (OEDA) industry. With the science of photonics in its infancy, adding these circuits to ever-more-complex electronic designs will allow for new generations of advancements. Learning from the past 50 years of the EDA industry's mistakes and missed opportunities, the photonics industry is starting with electronic standards and extending them to become photonically aware. Adapting pre-existing standards to this relatively new industry will allow for easier integration into the present infrastructure and faster time to market.

  18. A Fully Automated Method for CT-on-Rails-Guided Online Adaptive Planning for Prostate Cancer Intensity Modulated Radiation Therapy

    SciTech Connect

    Li, Xiaoqiang; Quan, Enzhuo M.; Li, Yupeng; Pan, Xiaoning; Zhou, Yin; Wang, Xiaochun; Du, Weiliang; Kudchadker, Rajat J.; Johnson, Jennifer L.; Kuban, Deborah A.; Lee, Andrew K.; Zhang, Xiaodong

    2013-08-01

    Purpose: This study was designed to validate a fully automated adaptive planning (AAP) method which integrates automated recontouring and automated replanning to account for interfractional anatomical changes in prostate cancer patients receiving adaptive intensity modulated radiation therapy (IMRT) based on daily repeated computed tomography (CT)-on-rails images. Methods and Materials: Nine prostate cancer patients treated at our institution were randomly selected. For the AAP method, contours on each repeat CT image were automatically generated by mapping the contours from the simulation CT image using deformable image registration. An in-house automated planning tool incorporated into the Pinnacle treatment planning system was used to generate the original and the adapted IMRT plans. The cumulative dose–volume histograms (DVHs) of the target and critical structures were calculated based on the manual contours for all plans and compared with those of plans generated by the conventional method, that is, shifting the isocenters by aligning the images based on the center of the volume (COV) of the prostate (prostate COV-aligned). Results: The target coverage from the AAP method was acceptable for every patient, while 1 of the 9 patients showed target underdosing in the prostate COV-aligned plans. Compared with the prostate COV-aligned plans, the AAP method reduced the normalized volume receiving at least 70 Gy (V70) and the mean dose by 8.9% and 6.4 Gy for the rectum, and by 4.3% and 5.3 Gy for the bladder. Conclusions: The AAP method, which is fully automated, is effective for online replanning to compensate for target dose deficits and critical organ overdosing caused by interfractional anatomical changes in prostate cancer.
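The dose-volume metrics compared in this study can be sketched from per-voxel structure doses. This is a minimal illustration of a cumulative DVH, Vx (the volume fraction receiving at least x Gy), and mean dose; the sample dose values are invented and clinical DVH computation works on full 3D dose grids.

```python
def v_at_least(doses, threshold):
    """Fraction of the structure's volume receiving >= threshold (Gy)."""
    return sum(1 for d in doses if d >= threshold) / len(doses)

def mean_dose(doses):
    """Mean dose over the structure (Gy)."""
    return sum(doses) / len(doses)

def cumulative_dvh(doses, bin_gy=10.0, max_gy=80.0):
    """Cumulative DVH as (dose level, volume fraction >= level) points."""
    levels = [i * bin_gy for i in range(int(max_gy / bin_gy) + 1)]
    return [(lvl, v_at_least(doses, lvl)) for lvl in levels]

# Hypothetical per-voxel rectum doses (Gy), equal voxel volumes assumed.
rectum = [12.0, 35.0, 71.0, 74.0, 50.0, 68.0, 80.0, 22.0]
print(round(v_at_least(rectum, 70.0), 3))  # 0.375  (i.e., V70 = 37.5%)
print(round(mean_dose(rectum), 1))         # 51.5
```

Comparing two plans then reduces to computing these metrics on each plan's dose distribution over the same (manually verified) contours, as the study does.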

  19. Dynamic optimization and adaptive controller design

    NASA Astrophysics Data System (ADS)

    Inamdar, S. R.

    2010-10-01

    In this work I present a new type of adaptive tracking controller which employs dynamic optimization to optimize the current value of the controller action for the temperature control of a nonisothermal continuously stirred tank reactor (CSTR). We begin with a two-state model of the nonisothermal CSTR, consisting of the mass and heat balance equations, and then add cooling system dynamics to eliminate input multiplicity. The initial design value is obtained using local stability of steady states, where the approach temperature for cooling action is specified as a steady state and a design specification. Later we make a correction in the dynamics, where the material balance is manipulated to use feed concentration as a system parameter as an adaptive control measure, in order to avoid actuator saturation for the main control loop. The analysis leading to the design of a dynamic optimization based parameter adaptive controller is presented. An important component of this mathematical framework is reference trajectory generation to form an adaptive control measure.
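For reference, the two-state nonisothermal CSTR balances such an abstract refers to are, in their standard textbook form (the paper's exact model, including the added cooling dynamics, may differ):

```latex
% Mass balance on reactant A (first-order Arrhenius kinetics)
\frac{dC_A}{dt} = \frac{q}{V}\left(C_{A,f} - C_A\right)
  - k_0\, e^{-E/(R T)}\, C_A
% Energy balance with reaction heat release and jacket cooling
\frac{dT}{dt} = \frac{q}{V}\left(T_f - T\right)
  + \frac{(-\Delta H)}{\rho C_p}\, k_0\, e^{-E/(R T)}\, C_A
  + \frac{U A}{V \rho C_p}\left(T_c - T\right)
```

Here \(C_A\) and \(T\) are the two states, \(q/V\) is the dilution rate, and the coolant temperature \(T_c\) (or feed concentration \(C_{A,f}\), as the abstract notes) serves as the manipulated input.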

  20. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  1. Conceptual design of an aircraft automated coating removal system

    SciTech Connect

    Baker, J.E.; Draper, J.V.; Pin, F.G.; Primm, A.H.; Shekhar, S.

    1996-05-01

    Paint stripping of the U.S. Air Force's large transport aircraft is currently a labor-intensive, manual process. Significant reductions in costs, personnel, and turnaround time can be accomplished by the judicious use of automation in some process tasks. This paper presents the conceptual design of a coating removal system for the tail surfaces of the C-5 plane. Emphasis is placed on technology selection to optimize human-automation synergy with respect to overall costs, throughput, quality, safety, and reliability. Trade-offs between field-proven and research-requiring technologies, and between expected gain and cost and complexity, have led to a conceptual design which is semi-autonomous (relying on the human for task specification and disturbance handling) yet incorporates sensor-based automation (for sweep path generation and tracking, surface following, stripping quality control and tape/breach handling).

  2. Automating expert role to determine design concept in Kansei Engineering

    NASA Astrophysics Data System (ADS)

    Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd

    2016-02-01

    Affect has become imperative in product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables discovery of consumers' emotions and formulation of a guide to design products that win consumers in the competitive market. Albeit a powerful technology, there is no rule of thumb in its analysis and interpretation process. KE expertise is required to determine sets of related Kansei and the significant concept of emotion. Many research endeavors are handicapped by the limited number of available and accessible KE experts. This work simulates the role of experts with the use of the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility in KE. The algorithm is designed to learn the process by implementing training datasets taken from previous KE research works. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype to automate the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE research advocates will benefit from this system to automatically determine significant design concepts.

  3. A Case Study in CAD Design Automation

    ERIC Educational Resources Information Center

    Lowe, Andrew G.; Hartman, Nathan W.

    2011-01-01

    Computer-aided design (CAD) software and other product life-cycle management (PLM) tools have become ubiquitous in industry during the past 20 years. Over this time they have continuously evolved, becoming programs with enormous capabilities, but the companies that use them have not evolved their design practices at the same rate. Due to the…

  4. Generative Representations for Computer-Automated Design Systems

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design programs is the representation with which they encode designs. If the representation cannot encode a certain design, then the design program cannot produce it. Similarly, a poor representation makes some types of designs extremely unlikely to be created. Here we define generative representations as those representations which can create and reuse organizational units within a design and argue that reuse is necessary for design systems to scale to more complex and interesting designs. To support our argument we describe GENRE, an evolutionary design program that uses both a generative and a non-generative representation, and compare the results of evolving designs with both types of representations.

  5. Automated Design of the Europa Orbiter Tour

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Strange, Nathan J.; Longuski, James M.; Bonfiglio, Eugene P.; Taylor, Irene (Technical Monitor)

    2000-01-01

    In this paper we investigate tours of the Jovian satellites Europa, Ganymede, and Callisto for the Europa Orbiter Mission. The principal goal of the tour design is to lower arrival V∞ for the final Europa encounter while meeting all of the design constraints. Key constraints arise from considering the total time of the tour and the radiation dosage of a tour. These tours may employ 14 or more encounters with the Jovian satellites; hence there is an enormous number of possible sequences of these satellites to investigate. We develop a graphical method that greatly aids the design process.

  7. Automated radiation hard ASIC design tool

    NASA Technical Reports Server (NTRS)

    White, Mike; Bartholet, Bill; Baze, Mark

    1993-01-01

    A commercial based, foundry independent, compiler design tool (ChipCrafter) with custom radiation hardened library cells is described. A unique analysis approach allows low hardness risk for Application Specific IC's (ASIC's). Accomplishments, radiation test results, and applications are described.

  8. CMOS-array design-automation techniques

    NASA Technical Reports Server (NTRS)

    Feller, A.; Lombardt, T.

    1979-01-01

    This thirty-four-page report discusses the design of a 4,096-bit complementary metal oxide semiconductor (CMOS) read-only memory (ROM). The CMOS ROM is either mask- or laser-programmable. The report is divided into six sections: section one describes the background of ROM chips; section two presents design goals for the chip; section three discusses chip implementation and chip statistics; conclusions and recommendations are given in sections four through six.

  9. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
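The reuse argument above can be made concrete with a toy generative representation: a design encoded as an assembly procedure whose `repeat` operator reuses a sub-procedure. This is an illustration invented here, not the encoding used in the antenna or 3D-object work.

```python
def interpret(program, state=None):
    """Execute a nested program of ('forward',), ('turn',) and
    ('repeat', n, subprogram) operators on a grid turtle; return the
    list of visited points (the 'design' the procedure constructs)."""
    if state is None:
        state = {"pos": (0, 0), "heading": (1, 0), "trace": [(0, 0)]}
    for op in program:
        if op[0] == "forward":
            x, y = state["pos"]
            dx, dy = state["heading"]
            state["pos"] = (x + dx, y + dy)
            state["trace"].append(state["pos"])
        elif op[0] == "turn":  # rotate heading 90 degrees counterclockwise
            dx, dy = state["heading"]
            state["heading"] = (-dy, dx)
        elif op[0] == "repeat":  # reuse: run a sub-procedure n times
            for _ in range(op[1]):
                interpret(op[2], state)
    return state["trace"]

# One sub-procedure ('forward' then 'turn'), written once and reused
# four times, traces a closed square -- regularity encoded compactly.
square = [("repeat", 4, [("forward",), ("turn",)])]
print(interpret(square))
# [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
```

A non-generative encoding would have to list all four sides explicitly; the generative one captures the square's regularity in a single reusable unit, which is the scaling advantage the abstract argues for.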

  10. DESIGN OF SMALL AUTOMATION WORK CELL SYSTEM DEMONSTRATIONS

    SciTech Connect

    C. TURNER; J. PEHL; ET AL

    2000-12-01

    The introduction of automation systems into many of the facilities dealing with the production, use and disposition of nuclear materials has been an ongoing objective. Many previous attempts have been made, using a variety of monolithic and, in some cases, modular technologies. Many of these attempts were less than successful, owing to the difficulty of the problem, the lack of maturity of the technology, and overoptimism about the capabilities of a particular system. Consequently, it is not surprising that suggestions that automation can reduce worker Occupational Radiation Exposure (ORE) levels are often met with skepticism and caution. The development of effective demonstrations of these technologies is of vital importance if automation is to become an acceptable option for nuclear material processing environments. The University of Texas Robotics Research Group (UTRRG) has been pursuing the development of technologies to support modular small automation systems (each of less than 5 degrees-of-freedom) and the design of those systems for more than two decades. Properly designed and implemented, these technologies have the potential to reduce the worker ORE associated with work in nuclear materials processing facilities. Successful development of systems for these applications requires the development of technologies that meet the requirements of the applications. These application requirements form a general set of rules that applicable technologies and approaches need to adhere to, but in and of themselves are generally insufficient for the design of a specific automation system. For the design of an appropriate system, the associated task specifications and relationships need to be defined. These task specifications also provide a means by which appropriate technology demonstrations can be defined. Based on the requirements and specifications of the operations of the Advanced Recovery and Integrated Extraction System (ARIES) pilot line at Los Alamos National

  11. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications to the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  12. Automated database design from natural language input

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Segami, Carlos; Delaune, Carl

    1995-01-01

    Users and programmers of small systems typically do not have the skills needed to design a database schema from an English description of a problem. This paper describes a system that automatically designs databases for such small applications from English descriptions provided by end-users. Although the system was motivated by the space applications at Kennedy Space Center, and portions of it have been designed with that idea in mind, it can be applied to different situations. The system consists of two major components: a natural language understander and a problem-solver. The paper briefly describes the knowledge representation structures constructed by the natural language understander and then explains the problem-solver in detail.

  13. Theoretical considerations in designing operator interfaces for automated systems

    NASA Technical Reports Server (NTRS)

    Norman, Susan D.

    1987-01-01

    The domains most amenable to techniques based on artificial intelligence (AI) are those that are systematic or for which a systematic domain can be generated. In aerospace systems, many operational tasks are systematic owing to the highly procedural nature of the applications. However, aerospace applications can also be nonprocedural, particularly in the event of a failure or an unexpected event. Several techniques are discussed for designing automated systems for real-time, dynamic environments, particularly when a 'breakdown' occurs. A breakdown is defined as operation of an automated system outside its predetermined, conceptual domain.

  14. Towards automation of user interface design

    NASA Technical Reports Server (NTRS)

    Gastner, Rainer; Kraetzschmar, Gerhard K.; Lutz, Ernst

    1992-01-01

    This paper suggests an approach to automatic software design in the domain of graphical user interfaces. There are still some drawbacks in existing user interface management systems (UIMSs), which basically offer only quantitative layout specifications via direct manipulation. Our approach suggests a convenient way to get a default graphical user interface that may be customized and redesigned easily in further prototyping cycles.

  15. Library Automation Design for Visually Impaired People

    ERIC Educational Resources Information Center

    Yurtay, Nilufer; Bicil, Yucel; Celebi, Sait; Cit, Guluzar; Dural, Deniz

    2011-01-01

    Speech synthesis is a technology used in many different areas in computer science. This technology can bring a solution to reading activity of visually impaired people due to its text to speech conversion. Based on this problem, in this study, a system is designed needed for a visually impaired person to make use of all the library facilities in…

  16. Automated Design of Complex Dynamic Systems

    PubMed Central

    Hermans, Michiel; Schrauwen, Benjamin; Bienstman, Peter; Dambre, Joni

    2014-01-01

    Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from the complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent non-linear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure and its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied to abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of both machine learning and optimal control have a largely unexplored overlapping area which envelops a novel design methodology of smart and highly complex physical systems. PMID:24497969

  17. A comparative survey of non-adaptive pooling designs

    SciTech Connect

    Balding, D.J.; Bruno, W.J.; Torney, D.C.

    1996-12-31

    Pooling (or "group testing") designs for screening clone libraries for rare "positives" are described and compared. We focus on non-adaptive designs in which, in order both to facilitate automation and to minimize the total number of pools required in multiple screenings, all the pools are specified in advance of the experiments. The designs considered include deterministic designs, such as set-packing designs, the widely-used "row and column" designs and the more general "transversal" designs, as well as random designs such as "random incidence" and "random k-set" designs. A range of possible performance measures is considered, including the expected numbers of unresolved positive and negative clones, and the probability of a one-pass solution. We describe a flexible strategy in which the experimenter chooses a compromise between the random k-set and the set-packing designs. In general, the latter have superior performance while the former are nearly as efficient and are easier to construct. 39 refs., 1 fig., 4 tabs.
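    The random k-set design mentioned above is easy to sketch: each clone is placed in exactly k randomly chosen pools, a pool tests positive iff it contains a positive clone, and any clone whose k pools all test positive remains a candidate. The sketch below is illustrative only; the pool counts, k, and function names are our own choices, not from the record.

```python
import random

def random_k_set_design(n_clones, n_pools, k, seed=0):
    """Assign each clone to exactly k randomly chosen pools (random k-set design)."""
    rng = random.Random(seed)
    return {clone: frozenset(rng.sample(range(n_pools), k))
            for clone in range(n_clones)}

def screen(design, positives):
    """A pool tests positive iff it contains at least one positive clone."""
    hot = set()
    for clone in positives:
        hot |= design[clone]
    return hot

def candidate_positives(design, hot_pools):
    """Clones whose pools all tested positive cannot be ruled out."""
    return [c for c, pools in design.items() if pools <= hot_pools]

design = random_k_set_design(n_clones=1000, n_pools=40, k=4)
true_positives = {17, 501}
hot = screen(design, true_positives)
found = candidate_positives(design, hot)
assert set(true_positives) <= set(found)  # a non-adaptive design never loses a positive
```

A one-pass solution occurs exactly when `found` contains no clones beyond the true positives; with more pools or larger k, the chance of that rises at the cost of more tests.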

  18. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
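    The by-product noted in the record, the global probability of correct classification under statistically independent decision rules, is simply the product of the per-node correct-decision probabilities along the path to a leaf. A minimal illustration (the per-node accuracies below are invented for the example):

```python
def global_p_correct(node_accuracies):
    """Global probability of correct classification for one leaf, assuming the
    decision rules at each node along the path are statistically independent:
    the product of the per-node probabilities of a correct decision."""
    p = 1.0
    for acc in node_accuracies:
        p *= acc
    return p

# A leaf reached through three decision nodes with these per-node accuracies:
p = global_p_correct([0.98, 0.95, 0.90])   # product = 0.8379
```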

  19. Design of automation tools for management of descent traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1988-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan view traffic display consisting of a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively in combination with a mouse input device to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system consisting of descent advisor algorithm, a library of aircraft performance models, national airspace system data bases, and interactive display software has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational

  20. Automated Adaptive Brightness in Wireless Capsule Endoscopy Using Image Segmentation and Sigmoid Function.

    PubMed

    Shrestha, Ravi; Mohammed, Shahed K; Hasan, Md Mehedi; Zhang, Xuechao; Wahid, Khan A

    2016-08-01

    Wireless capsule endoscopy (WCE) plays an important role in the diagnosis of gastrointestinal (GI) diseases by capturing images of the human small intestine. Accurate diagnosis of endoscopic images depends heavily on the quality of the captured images. Along with image resolution and frame rate, brightness is an important parameter that influences image quality and drives the design of an efficient illumination system. Such a design involves the choice and placement of a proper light source and its ability to illuminate the GI surface with proper brightness. Light-emitting diodes (LEDs) are normally used as sources, with modulated pulses used to control LED brightness. In practice, under- and over-illumination are very common in WCE; the former produces dark images, and the latter produces bright images with high power consumption. In this paper, we propose a low-power and efficient illumination system that is based on an automated brightness algorithm. The scheme is adaptive in nature, i.e., the brightness level is controlled automatically in real-time while the images are being captured. The captured images are segmented into four equal regions and the brightness level of each region is calculated. Then an adaptive sigmoid function is used to find the optimized brightness level, and accordingly a new value of the duty cycle of the modulated pulse is generated to capture future images. The algorithm is fully implemented in a capsule prototype and tested with endoscopic images. Commercial capsules like Pillcam and Mirocam were also used in the experiment. The results show that the proposed algorithm works well in controlling the brightness level according to the environmental conditions; as a result, good-quality images are captured with an average 40% brightness level, which reduces the power consumption of the capsule. PMID:27333609
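    The adaptive-brightness loop described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the four-quadrant segmentation and sigmoid mapping follow the abstract, but the target brightness and gain parameters are assumptions.

```python
import math

def region_brightness(gray_image):
    """Mean brightness of the four equal quadrants of a grayscale image
    (given as nested lists of pixel values in 0-255)."""
    h, w = len(gray_image), len(gray_image[0])
    row_halves = [(0, h // 2), (h // 2, h)]
    col_halves = [(0, w // 2), (w // 2, w)]
    means = []
    for r0, r1 in row_halves:
        for c0, c1 in col_halves:
            vals = [gray_image[r][c] for r in range(r0, r1) for c in range(c0, c1)]
            means.append(sum(vals) / len(vals))
    return means

def next_duty_cycle(region_means, target=128.0, gain=0.05):
    """Map the average brightness error through a sigmoid to an LED duty
    cycle in [0, 1]; target and gain are assumed values for illustration."""
    error = target - sum(region_means) / len(region_means)
    return 1.0 / (1.0 + math.exp(-gain * error))

image = [[60] * 8 for _ in range(8)]          # a uniformly dark frame
duty = next_duty_cycle(region_brightness(image))
```

A dark frame yields a large positive error, so the sigmoid pushes the duty cycle toward 1 and the next frame is captured with brighter illumination; an over-bright frame drives it toward 0.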

  1. Automated interferometric synthetic aperture microscopy and computational adaptive optics for improved optical coherence tomography.

    PubMed

    Xu, Yang; Liu, Yuan-Zhi; Boppart, Stephen A; Carney, P Scott

    2016-03-10

    In this paper, we introduce an algorithm framework for the automation of interferometric synthetic aperture microscopy (ISAM). Under this framework, common processing steps such as dispersion correction, Fourier domain resampling, and computational adaptive optics aberration correction are carried out as metrics-assisted parameter search problems. We further present the results of this algorithm applied to phantom and biological tissue samples and compare with manually adjusted results. With the automated algorithm, near-optimal ISAM reconstruction can be achieved without manual adjustment. At the same time, the technical barrier for the nonexpert using ISAM imaging is also significantly lowered. PMID:26974799

  2. Flexible receiver adapter formal design review

    SciTech Connect

    Krieg, S.A.

    1995-06-13

    This memo summarizes the results of the Formal (90%) Design Review process and meetings held to evaluate the design of the Flexible Receiver Adapters, support platforms, and associated equipment. The equipment is part of the Flexible Receiver System used to remove, transport, and store long length contaminated equipment and components from both the double and single-shell underground storage tanks at the 200 area tank farms.

  3. An automated methodology development. [software design for combat simulation]

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  4. Automated design of multiple encounter gravity-assist trajectories

    NASA Technical Reports Server (NTRS)

    Longuski, James M.; Williams, Steve N.

    1990-01-01

    Given a range of initial launch dates and a set of target planets, a new approach to planetary mission design is developed, using an automated method for finding all conic solutions. Each point on the diagrams reproduced represents a single ballistic trajectory and is computed by modeling the trajectory as a conic section and solving the corresponding Lambert problem for each set of launch and arrival dates. An example that prescribes a launch period of 1975-2050 and target planets Uranus, Saturn, Jupiter, Neptune, and Pluto is described, whereby all possible grand tour missions of this class are found, including the Voyager II trajectory. It is determined that this automated design tool may be applied to a variety of multiple-encounter gravity-assist trajectories that are being considered for future missions.

  5. Automated and Adaptive Mission Planning for Orbital Express

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Knight, Russell; Jones, Grailing; Tran, Daniel; Koblick, Darin

    2008-01-01

    The Orbital Express space mission was a Defense Advanced Research Projects Agency (DARPA)-led demonstration of on-orbit satellite servicing scenarios, autonomous rendezvous, fluid transfers of hydrazine propellant, and robotic arm transfers of Orbital Replacement Unit (ORU) components. Boeing's Autonomous Space Transport Robotic Operations (ASTRO) vehicle provided the servicing to Ball Aerospace's Next Generation Serviceable Satellite (NextSat) client. For communication opportunities, operations used the high-bandwidth ground-based Air Force Satellite Control Network (AFSCN) along with the relatively low-bandwidth geosynchronous space-borne Tracking and Data Relay Satellite System (TDRSS) network. Mission operations were conducted out of the RDT&E Support Complex (RSC) at Kirtland Air Force Base in New Mexico. All mission objectives were met successfully: the first of several autonomous rendezvous was demonstrated on May 5, 2007; autonomous free-flyer capture was demonstrated on June 22, 2007; and the fluid and ORU transfers throughout the mission were successful. Planning operations for the mission were conducted by a team of personnel including the Flight Directors, who were responsible for verifying the steps and contacts within the procedures; the Rendezvous Planners, who would compute the locations and visibilities of the spacecraft; the Scenario Resource Planners (SRPs), who were concerned with the assignment of communications windows, monitoring of resources, and sending commands to the ASTRO spacecraft; and the Mission Planners, who would interface with the real-time operations environment, process planning products, and coordinate activities with the SRP. The SRP position was staffed by JPL personnel who used the Automated Scheduling and Planning ENvironment (ASPEN) to model and enforce mission and satellite constraints. The lifecycle of a plan began three weeks outside its execution on-board. During the planning timeframe, many aspects could change the plan

  6. Computerized Adaptive Testing System Design: Preliminary Design Considerations.

    ERIC Educational Resources Information Center

    Croll, Paul R.

    A functional design model for a computerized adaptive testing (CAT) system was developed and presented through a series of hierarchy plus input-process-output (HIPO) diagrams. System functions were translated into system structure: specifically, into 34 software components. Implementation of the design in a physical system was addressed through…

  7. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  8. Applying Utility Functions to Adaptation Planning for Home Automation Applications

    NASA Astrophysics Data System (ADS)

    Bratskas, Pyrros; Paspallis, Nearchos; Kakousis, Konstantinos; Papadopoulos, George A.

    A pervasive computing environment typically comprises multiple embedded devices that may interact together and with mobile users. These users are part of the environment, and they experience it through a variety of devices embedded in the environment. This perception involves technologies which may be heterogeneous, pervasive, and dynamic. Due to the highly dynamic properties of such environments, the software systems running on them have to face problems such as user mobility, service failures, or resource and goal changes which may happen in an unpredictable manner. To cope with these problems, such systems must be autonomous and self-managed. In this chapter we deal with a special kind of ubiquitous environment, a smart home environment, and introduce a user-preference-based model for adaptation planning. The model, which dynamically forms a set of configuration plans for resources, reasons automatically and autonomously, using utility functions, about which plan is likely to best achieve the user's goals with respect to resource availability and user needs.
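    A utility-based plan selector of the kind described can be sketched as below. The plan structure, preference weights, and resource-shortfall penalty are illustrative assumptions, not the chapter's actual model.

```python
def utility(plan, user_prefs, resources):
    """Score a configuration plan: weighted satisfaction of user preferences,
    penalized by any shortfall between needed and available resources."""
    score = sum(user_prefs.get(prop, 0.0) * val
                for prop, val in plan["properties"].items())
    shortfall = sum(max(0.0, need - resources.get(res, 0.0))
                    for res, need in plan["needs"].items())
    return score - shortfall

def select_plan(plans, user_prefs, resources):
    """Pick the plan with the highest utility under current resources."""
    return max(plans, key=lambda p: utility(p, user_prefs, resources))

plans = [
    {"name": "hi-res-video", "properties": {"quality": 1.0}, "needs": {"bandwidth": 8.0}},
    {"name": "lo-res-video", "properties": {"quality": 0.4}, "needs": {"bandwidth": 1.0}},
]
prefs = {"quality": 1.0}
best = select_plan(plans, prefs, {"bandwidth": 2.0})
```

With only 2.0 units of bandwidth available, the shortfall penalty makes the lower-quality plan win; if bandwidth later rises, re-running the selector adapts the configuration automatically.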

  9. Crew aiding and automation: A system concept for terminal area operations, and guidelines for automation design

    NASA Technical Reports Server (NTRS)

    Dwyer, John P.

    1994-01-01

    This research and development program comprised two efforts: the development of guidelines for the design of automated systems, with particular emphasis on automation design that takes advantage of contextual information, and the concept-level design of a crew aiding system, the Terminal Area Navigation Decision Aiding Mediator (TANDAM). This concept outlines a system capable of organizing navigation and communication information and assisting the crew in executing the operations required in descent and approach. In support of this endeavor, problem definition activities were conducted, comprising terminal area navigation analyses and operational familiarization exercises addressing the terminal area navigation problem. Both airborne and ground-based (ATC) elements of aircraft control were extensively researched. The TANDAM system concept was then specified, and the crew interface and associated systems described. Additionally, three descent and approach scenarios were devised in order to illustrate the principal functions of the TANDAM system concept in relation to the crew, the aircraft, and ATC. A plan for the evaluation of the TANDAM system was established. The guidelines were developed based on reviews of relevant literature, and on experience gained in the design effort.

  10. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that are guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
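    The GA loop described above (score, select parents, recombine, mutate, iterate) can be sketched on a toy problem. Here the simple OneMax fitness (count of 1 bits) stands in for the compute-intensive circuit evaluation that the paper farms out to slave nodes, so the sketch shows the algorithm's shape rather than the paper's application.

```python
import random

rng = random.Random(42)

def fitness(bits):
    """OneMax: number of 1 bits (a stand-in for circuit evaluation)."""
    return sum(bits)

def select(pop):
    """Tournament selection: the fittest of three random individuals."""
    return max(rng.sample(pop, 3), key=fitness)

def crossover(a, b):
    """Single-point recombination of two parent bit strings."""
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.01):
    """Flip each bit independently with a small probability."""
    return [1 - b if rng.random() < rate else b for b in bits]

pop = [[rng.randint(0, 1) for _ in range(32)] for _ in range(40)]
for _ in range(60):                      # iterate until the generation limit
    pop = [mutate(crossover(select(pop), select(pop))) for _ in pop]

best = max(pop, key=fitness)
```

In the master-slave arrangement the paper describes, the `fitness` calls for a generation would be distributed to slave nodes, while the master runs `select`, `crossover`, and `mutate`.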

  11. Numerical design of an adaptive aileron

    NASA Astrophysics Data System (ADS)

    Amendola, Gianluca; Dimino, Ignazio; Concilio, Antonio; Magnifico, Marco; Pecora, Rosario

    2016-04-01

    The study described herein is aimed at investigating the feasibility of an innovative full-scale camber-morphing aileron device. In the framework of the "Adaptive Aileron" project, an international cooperation between Italy and Canada, this goal was pursued through the integration of different morphing concepts in a wing-tip prototype. As widely demonstrated in recent European projects such as Clean Sky JTI and SARISTU, wing trailing-edge morphing may lead to significant drag reduction (up to 6%) at off-design flight points by adapting chord-wise camber variations in cruise to compensate for A/C weight reduction following fuel consumption. That research focused on the flap region as the most immediate solution for implementing structural adaptations. However, there is also a growing interest in extending morphing functionalities to the aileron region while preserving its main functionality of controlling aircraft directional stability. In fact, the external region of the wing seems to be the most effective in producing "lift over drag" improvements by morphing. Thus, the objective of the presented research is to achieve a certain drag reduction at off-design flight points by adapting the wing shape and lift distribution through static deflections. In perspective, the developed device could also be used as a load-alleviation system to reduce gust effects, augmenting its frequency bandwidth. In this paper, the preliminary design of the adaptive aileron is first presented, assessed on the basis of the external aerodynamic loads. The primary structure is made of 5 segmented ribs, distributed along 4 bays, each split into three consecutive parts, connected with spanwise stringers. The aileron shape modification is then implemented by means of an actuation system, based on a classical quick-return mechanism, suitably adapted for the presented application. Finite element analyses were assessed for properly sizing the load-bearing structure and actuation systems and for

  12. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry, from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing. Curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  13. An automated quality assessor for Ada object-oriented designs

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.

    1988-01-01

    A tool for evaluating object-oriented designs (OODs) for Ada software is described. The tool assumes a design expressed as a hierarchy of object diagrams. A design of this type identifies the objects of a system, an interface to each object, and the usage relationships between objects. When such a design is implemented in Ada, objects become packages, interfaces become package specifications, and usage relationships become Ada `with' clauses and package references. An automated quality assessor has been developed that is based on flagging undesirable design constructs. For convenience, distinctions are made among three levels of severity: questionable, undesirable, and hazardous. A questionable construct is one that may well be appropriate. An undesirable construct is one that should be changed because it is potentially harmful to the reliability, maintainability, or reusability of the software. A hazardous construct is one that is undesirable and that introduces a high level of risk.

  14. The CADSS design automation system. [computerized design language for small digital systems]

    NASA Technical Reports Server (NTRS)

    Franke, E. A.

    1973-01-01

    This research was designed to implement and extend a previously defined design automation system for the design of small digital structures. A description is included of the higher-level language developed to describe systems as a sequence of register-transfer operations. The system simulator, which is used to determine whether the original description is correct, is also discussed. The design automation system produces tables describing the state transitions of the system and the operation of all registers. In addition, all Boolean equations specifying system operation are minimized and converted to NAND gate structures. Suggestions for further extensions to the system are also given.

  15. Electronic Design Automation (EDA) Roadmap Taskforce Report, Design of Microprocessors

    NASA Astrophysics Data System (ADS)

    1999-04-01

    The goal of this project was to support the establishment of tool interoperability standards for the semiconductor industry. Accomplishments include the publication of the 'EDA Industry Standards Roadmap - 1996' and the 'EDA Roadmap Taskforce Report - Design of Microprocessors.'

  16. Adaptive strategies for materials design using uncertainties

    DOE PAGESBeta

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-21

    Here, we compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
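    A selector that uses prediction uncertainty, as the record describes, can be illustrated with the expected-improvement acquisition function. The record does not name the authors' exact selector, so this is a representative example with invented candidate predictions.

```python
import math

def expected_improvement(mu, sigma, best):
    """Expected improvement over the best value observed so far, for a
    prediction with mean mu and uncertainty sigma (Gaussian assumption)."""
    if sigma == 0.0:
        return max(0.0, mu - best)
    z = (mu - best) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # standard normal pdf
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2)))          # standard normal cdf
    return (mu - best) * Phi + sigma * phi

# Candidate materials: (predicted modulus mu, prediction uncertainty sigma).
# Values are invented for illustration.
candidates = {"A": (250.0, 1.0), "B": (245.0, 30.0)}
best_so_far = 248.0
pick = max(candidates, key=lambda m: expected_improvement(*candidates[m], best_so_far))
```

Candidate B has the lower predicted value but much higher uncertainty, so expected improvement selects it: this is exactly the exploration behavior that a selector ignoring uncertainty would miss.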

  17. Adaptive Strategies for Materials Design using Uncertainties

    NASA Astrophysics Data System (ADS)

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-01

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.

  18. Adaptive Strategies for Materials Design using Uncertainties

    PubMed Central

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-01

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties. PMID:26792532

  19. Adaptive Mesh Refinement for Microelectronic Device Design

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Lou, John; Norton, Charles

    1999-01-01

    Finite element and finite volume methods are used in a variety of design simulations when it is necessary to compute fields throughout regions that contain varying materials or geometry. Convergence of the simulation can be assessed by uniformly increasing the mesh density until an observable quantity stabilizes. Depending on the electrical size of the problem, uniform refinement of the mesh may be computationally infeasible due to memory limitations. Similarly, depending on the geometric complexity of the object being modeled, uniform refinement can be inefficient since regions that do not need refinement add to the computational expense. In either case, convergence to the correct (measured) solution is not guaranteed. Adaptive mesh refinement methods attempt to selectively refine the region of the mesh that is estimated to contain proportionally higher solution errors. The refinement may be obtained by decreasing the element size (h-refinement), by increasing the order of the element (p-refinement) or by a combination of the two (h-p refinement). A successful adaptive strategy refines the mesh to produce an accurate solution measured against the correct fields without undue computational expense. This is accomplished by the use of a) reliable a posteriori error estimates, b) hierarchal elements, and c) automatic adaptive mesh generation. Adaptive methods are also useful when problems with multi-scale field variations are encountered. These occur in active electronic devices that have thin doped layers and also when mixed physics is used in the calculation. The mesh needs to be fine at and near the thin layer to capture rapid field or charge variations, but can coarsen away from these layers where field variations smoothen and charge densities are uniform. This poster will present an adaptive mesh refinement package that runs on parallel computers and is applied to specific microelectronic device simulations. 
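The refine-the-worst-cell strategy the abstract describes can be illustrated with a one-dimensional toy. The error indicator below (midpoint interpolation error) is a crude stand-in for the a posteriori estimates the poster discusses, and the steep tanh layer plays the role of a thin doped region:

```python
import math

def refine(f, a, b, tol=1e-3, max_cells=200):
    """Toy 1-D h-refinement: repeatedly split the cell whose midpoint
    interpolation error (a crude a posteriori estimate) is largest."""
    cells = [(a, b)]
    while len(cells) < max_cells:
        errs = [abs(f((l + r) / 2) - (f(l) + f(r)) / 2) for l, r in cells]
        worst = max(range(len(cells)), key=errs.__getitem__)
        if errs[worst] < tol:
            break                          # estimated error small everywhere
        l, r = cells.pop(worst)
        m = (l + r) / 2
        cells += [(l, m), (m, r)]          # h-refine: halve the worst cell
    return sorted(cells)

# Steep internal layer, a 1-D stand-in for a thin doped region
f = lambda x: math.tanh(50 * (x - 0.3))
cells = refine(f, 0.0, 1.0)
```

The resulting mesh is graded, as the abstract describes: cells near the layer at x = 0.3 end up much smaller than those in the smooth region, where field variations smoothen.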

  20. An hierarchical system architecture for automated design, fabrication, and repair

    NASA Technical Reports Server (NTRS)

    Cliff, R. A.

    1981-01-01

    The architecture of an automated system which has the following properties is described: (1) if it is presented with a final product specification (within its capabilities) it will do the detailed design (all the way down to the raw materials if need be) and then produce that product; (2) if a faulty final product is presented to the system, it will repair it. Interesting extensions of this architecture would be the ability to add fabricator nodes when required and the ability to add entire ranks when required. This sort of system would be a useful component of a self-replicating system (used in space exploration).

  1. An automated approach to magnetic divertor configuration design

    NASA Astrophysics Data System (ADS)

    Blommaert, M.; Dekeyser, W.; Baelmans, M.; Gauger, N. R.; Reiter, D.

    2015-01-01

    Automated methods based on optimization can greatly assist computational engineering design in many areas. In this paper an optimization approach to the magnetic design of a nuclear fusion reactor divertor is proposed and applied to a tokamak edge magnetic configuration in a first feasibility study. The approach is based on reduced models for magnetic field and plasma edge, which are integrated with a grid generator into one sensitivity code. The design objective chosen here for demonstrative purposes is to spread the divertor target heat load as much as possible over the entire target area. Constraints on the separatrix position are introduced to eliminate physically irrelevant magnetic field configurations during the optimization cycle. A gradient projection method is used to ensure stable cost function evaluations during optimization. The concept is applied to a configuration with typical Joint European Torus (JET) parameters and it automatically provides plausible configurations with reduced heat load.
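The gradient projection method the abstract mentions alternates a descent step with a projection onto the constraint set (there, the separatrix-position constraints). A minimal sketch on a toy box-constrained problem, not the paper's actual divertor code:

```python
def gp_step(x, grad, alpha, project):
    """One iterate of the gradient projection method: a steepest-descent
    step followed by projection back onto the feasible set."""
    return project([xi - alpha * gi for xi, gi in zip(x, grad)])

# Toy stand-in: minimise sum(x_i^2) subject to box bounds [0.5, 2.0]
project = lambda x: [min(max(xi, 0.5), 2.0) for xi in x]
x = [1.5, 0.4]
for _ in range(50):
    x = gp_step(x, [2 * xi for xi in x], alpha=0.1, project=project)
```

Because every iterate is projected, cost-function evaluations only ever occur at feasible (physically meaningful) configurations, which is the stability property the paper exploits.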

  2. Automated detection scheme of architectural distortion in mammograms using adaptive Gabor filter

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Ruriha; Teramoto, Atsushi; Matsubara, Tomoko; Fujita, Hiroshi

    2013-03-01

    Breast cancer is a serious health concern for all women. Computer-aided detection for mammography has been used for detecting mass and micro-calcification. However, the automated detection of architectural distortion remains challenging with respect to sensitivity. In this study, we propose a novel automated method for detecting architectural distortion. Our method consists of the analysis of the mammary gland structure, detection of the distorted region, and reduction of false positive results. We developed an adaptive Gabor filter for analyzing the mammary gland structure that sets its filter parameters depending on the thickness of the gland structure. As for post-processing, healthy mammary glands that run from the nipple to the chest wall are eliminated by angle analysis. Moreover, background mammary glands are removed based on the intensity output image obtained from the adaptive Gabor filter. The distorted region of the mammary gland is then detected as an initial candidate using a concentration index followed by binarization and labeling. False positives in the initial candidate are eliminated using 23 types of characteristic features and a support vector machine. In the experiments, we compared the automated detection results with interpretations by a radiologist using 50 cases (200 images) from the Digital Database of Screening Mammography (DDSM). As a result, the true positive rate was 82.72%, and the number of false positives per image was 1.39. These results indicate that the proposed method may be useful for detecting architectural distortion in mammograms.
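For reference, the real part of a Gabor kernel is straightforward to construct. In an adaptive scheme like the one described, the wavelength and envelope width would be chosen per region from the estimated gland thickness; the parameter names below are illustrative, not the authors' implementation:

```python
import math

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a Gabor kernel: a Gaussian envelope times a cosine
    carrier oriented along angle theta."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)   # rotate axes
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            carrier = math.cos(2 * math.pi * xr / wavelength)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

# In an adaptive scheme, wavelength/sigma would track local gland thickness
k = gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0)
```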

  3. Factors in the design of adaptive antennas

    NASA Astrophysics Data System (ADS)

    Barton, P.

    A brief review of adaptive antenna technology is given, and two topic areas are addressed. The first is concerned with the general difficulties encountered in design, in particular the avoidance of nulling wanted signals, the provision of an adequate rate of convergence towards a desired characteristic, and the degradation of null depths caused by the proximity of the platform and by dispersion in the array and receiving channels. The second topic concerns specific design approaches. Closed loop processors, in which the array output is sensed in order to provide a drive to the weight networks, are exemplified by a feedback loop correlator design and a weight perturbation technique. An example of open-loop control of weight values is also included, and its lack of self-correction is shown to be disadvantageous compared to the closed loop approach. Advanced methods, associated with sample matrix inversion, are also summarized.
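The closed-loop weight control the review describes, in which the sensed array output drives the weight networks, belongs to the same family as the textbook least-mean-squares (LMS) rule. The sketch below is a two-tap identification toy for illustration, not one of the reviewed designs:

```python
def lms_step(weights, x, d, mu):
    """One least-mean-squares update: the error between desired and
    actual array output feeds back into the weights."""
    y = sum(w * xi for w, xi in zip(weights, x))   # weighted array output
    e = d - y                                      # sensed error signal
    return [w + mu * e * xi for w, xi in zip(weights, x)], e

# Identify the response d = 2*x0 - x1 from cycling training snapshots
weights = [0.0, 0.0]
for _ in range(30):
    for x in [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]:
        d = 2 * x[0] - x[1]
        weights, err = lms_step(weights, x, d, mu=0.5)
```

The self-correcting behaviour visible here (the error keeps driving the weights toward the desired response) is exactly what the open-loop approach in the review lacks.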

  4. A Tutorial on Adaptive Design Optimization

    PubMed Central

    Myung, Jay I.; Cavagnaro, Daniel R.; Pitt, Mark A.

    2013-01-01

    Experimentation is ubiquitous in the field of psychology and fundamental to the advancement of its science, and one of the biggest challenges for researchers is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. The recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial paper introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists to conduct “smart” experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond. PMID:23997275

  5. Automated Microscopy: Macro Language Controlling a Confocal Microscope and its External Illumination: Adaptation for Photosynthetic Organisms.

    PubMed

    Steinbach, Gábor; Kaňa, Radek

    2016-04-01

    Photosynthesis research employs several biophysical methods, including the detection of fluorescence. Even though fluorescence is a key method to detect photosynthetic efficiency, it has not been applied/adapted to single-cell confocal microscopy measurements to examine photosynthetic microorganisms. Experiments with photosynthetic cells may require automation to perform a large number of measurements with different parameters, especially concerning light conditions. However, commercial microscopes support custom protocols (through Time Controller offered by Olympus or Experiment Designer offered by Zeiss) that are often unable to provide special set-ups and connection to external devices (e.g., for irradiation). Our new system combining an Arduino microcontroller with the Cell⊕Finder software was developed for controlling Olympus FV1000 and FV1200 confocal microscopes and the attached hardware modules. Our software/hardware solution offers (1) a text file-based macro language to control the imaging functions of the microscope; (2) programmable control of several external hardware devices (light sources, thermal controllers, actuators) during imaging via the Arduino microcontroller; (3) the Cell⊕Finder software with ergonomic user environment, a fast selection method for the biologically important cells and precise positioning feature that reduces unwanted bleaching of the cells by the scanning laser. Cell⊕Finder can be downloaded from http://www.alga.cz/cellfinder. The system was applied to study changes in fluorescence intensity in Synechocystis sp. PCC6803 cells under long-term illumination. Thus, we were able to describe the kinetics of phycobilisome decoupling. Microscopy data showed that phycobilisome decoupling appears slowly after long-term (>1 h) exposure to high light. PMID:27050040

  6. Adaptive design of visual perception experiments

    NASA Astrophysics Data System (ADS)

    O'Connor, John D.; Hixson, Jonathan; Thomas, James M., Jr.; Peterson, Matthew S.; Parasuraman, Raja

    2010-04-01

    Meticulous experimental design may not always prevent confounds from affecting experimental data acquired during visual perception experiments. Although experimental controls reduce the potential effects of foreseen sources of interference, interaction, or noise, they are not always adequate for preventing the confounding effects of unforeseen forces. Visual perception experimentation is vulnerable to unforeseen confounds because of the nature of the associated cognitive processes involved in the decision task. Some confounds are beyond the control of experimentation, such as what a participant does immediately prior to experimental participation, or the participant's attitude or emotional state. Other confounds may occur through ignorance of practical control methods on the part of the experiment's designer. The authors conducted experiments related to experimental fatigue and initially achieved significant results that were, upon re-examination, attributable to a lack of adequate controls. Re-examination of the original results and the processes and events that led to them yielded a second experimental design with more experimental controls and significantly different results. The authors propose that designers of visual perception experiments can benefit from planning to use a test-fix-test or adaptive experimental design cycle, so that unforeseen confounds in the initial design can be remedied.

  7. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    NASA Technical Reports Server (NTRS)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  8. Dual adaptive control: Design principles and applications

    NASA Technical Reports Server (NTRS)

    Mookerjee, Purusottam

    1988-01-01

    The design of an actively adaptive dual controller based on an approximation of the stochastic dynamic programming equation for a multi-step horizon is presented. A dual controller that can enhance identification of the system while controlling it at the same time is derived for multi-dimensional problems. This dual controller uses sensitivity functions of the expected future cost with respect to the parameter uncertainties. A passively adaptive cautious controller and the actively adaptive dual controller are examined. In many instances, the cautious controller is seen to turn off while the latter avoids the turn-off of the control and the slow convergence of the parameter estimates, characteristic of the cautious controller. The algorithms have been applied to a multi-variable static model which represents a simplified linear version of the relationship between the vibration output and the higher harmonic control input for a helicopter. Monte Carlo comparisons based on parametric and nonparametric statistical analysis indicate the superiority of the dual controller over the baseline controller.

  9. Design, Development, and Commissioning of a Substation Automation Laboratory to Enhance Learning

    ERIC Educational Resources Information Center

    Thomas, M. S.; Kothari, D. P.; Prakash, A.

    2011-01-01

    Automation of power systems is gaining momentum across the world, and there is a need to expose graduate and undergraduate students to the latest developments in hardware, software, and related protocols for power automation. This paper presents the design, development, and commissioning of an automation lab to facilitate the understanding of…

  10. Modular implementation of a digital hardware design automation system

    NASA Astrophysics Data System (ADS)

    Masud, M.

    An automation system based on AHPL (A Hardware Programming Language) was developed. The project may be divided into three distinct phases: (1) upgrading of AHPL to make it more universally applicable; (2) implementation of a compiler for the language; and (3) illustration of how the compiler may be used to support several phases of design activities. Several new features were added to AHPL. These include: application-dependent parameters, multiple clocks, asynchronous results, functional registers and primitive functions. The new language, called Universal AHPL, has been defined rigorously. The compiler design is modular. The parsing is done by an automatic parser generated from the SLR(1) BNF grammar of the language. The compiler produces two data bases from the AHPL description of a circuit. The first one is a tabular representation of the circuit, and the second one is a detailed interconnection linked list. The two data bases provide a means to interface the compiler to application-dependent CAD systems.

  11. Valuation of design adaptability in aerospace systems

    NASA Astrophysics Data System (ADS)

    Fernandez Martin, Ismael

    As more information is brought into early stages of the design, more pressure is put on engineers to produce a reliable, high quality, and financially sustainable product. Unfortunately, requirements established at the beginning of a new project by customers, and the environment that surrounds them, continue to change in some unpredictable ways. The risk of designing a system that may become obsolete during early stages of production is currently tackled by the use of robust design simulation, a method that allows designers to simultaneously explore a plethora of design alternatives and requirements with the intention of accounting for uncertain factors in the future. Whereas this design technique has proven to be quite an improvement in design methods, under certain conditions, it fails to account for the change of uncertainty over time and the intrinsic value embedded in the system when certain design features are activated. This thesis introduces the concepts of adaptability and real options to manage risk foreseen in the face of uncertainty at early design stages. The method described herein allows decision-makers to foresee the financial impact of their decisions at the design level, as well as the final exposure to risk. In this thesis, cash flow models, traditionally used to obtain the forecast of a project's value over the years, were replaced with surrogate models that are capable of showing fluctuations on value every few days. This allowed a better implementation of real options valuation, optimization, and strategy selection. Through the option analysis model, an optimization exercise allows the user to obtain the best implementation strategy in the face of uncertainty as well as the overall value of the design feature. Here implementation strategy refers to the decision to include a new design feature in the system, after the design has been finalized, but before the end of its production life. 
The ability to do this in a cost efficient manner after the system
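The real-options reasoning in this abstract can be made concrete with a textbook binomial lattice: the option to add a design feature (pay a cost, capture the project's value) is valued by rolling expected payoffs back through possible future project values. This zero-discount-rate sketch is a generic stand-in, not the thesis's surrogate-model valuation:

```python
def option_value(v0, cost, up, down, p, steps):
    """Binomial-lattice value of the option to add a design feature:
    pay `cost` at expiry to capture the project value. Zero discount
    rate assumed for simplicity."""
    # Project values at expiry after k up-moves and (steps - k) down-moves
    values = [v0 * up ** k * down ** (steps - k) for k in range(steps + 1)]
    layer = [max(v - cost, 0.0) for v in values]           # exercise decision
    for _ in range(steps):                                 # roll the lattice back
        layer = [p * layer[k + 1] + (1 - p) * layer[k]
                 for k in range(len(layer) - 1)]
    return layer[0]

v = option_value(v0=100.0, cost=100.0, up=1.2, down=0.9, p=0.5, steps=1)
```

Even in this one-step example the option has positive value (the downside payoff is floored at zero), which is the asymmetry that makes embedded design adaptability worth paying for.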

  12. Design automation for integrated nonlinear logic circuits (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Van Vaerenbergh, Thomas; Pelc, Jason; Santori, Charles; Bose, Ranojoy; Kielpinski, Dave; Beausoleil, Raymond G.

    2016-05-01

    A key enabler of the IT revolution of the late 20th century was the development of electronic design automation (EDA) tools allowing engineers to manage the complexity of electronic circuits with transistor counts now reaching into the billions. Recently, we have been developing large-scale nonlinear photonic integrated logic circuits for next generation all-optical information processing. At this time a sufficiently powerful EDA-style software tool chain to design this type of complex circuit does not yet exist. Here we describe a hierarchical approach to automating the design and validation of photonic integrated circuits, which can scale to several orders of magnitude higher complexity than the state of the art. Most photonic integrated circuits developed today consist of a small number of components, and only limited hierarchy. For example, a simple photonic transceiver may contain on the order of 10 building-block components, consisting of grating couplers for photonic I/O, modulators, and signal splitters/combiners. Because this is relatively easy to lay out by hand (or with a simple script), existing photonic design tools have relatively little automation in comparison to electronics tools. But demonstrating all-optical logic will require significantly more complex photonic circuits containing up to 1,000 components, hence becoming infeasible to design manually. Our design framework is based on Python-based software from Luceda Photonics, which provides an environment to describe components, simulate their behavior, and export design files (GDS) to foundries for fabrication. At a fundamental level, a photonic component is described as a parametric cell (PCell) similarly to electronics design. PCells are described by geometric characteristics of their layout. A critical part of the design framework is the implementation of PCells as Python objects. 
PCell objects can then use inheritance to simplify design, and hierarchical designs can be made by creating composite
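The PCell-as-Python-object idea can be sketched without the actual Luceda toolkit. The class names and shape primitives below are hypothetical, but they show how inheritance and composition give the hierarchy the abstract relies on:

```python
class PCell:
    """Minimal parametric-cell sketch: a component described by the
    geometric parameters of its layout (hypothetical, not the Luceda API)."""
    def __init__(self, **params):
        self.params = params

    def layout(self):
        """Return a list of primitive shapes; subclasses override this."""
        raise NotImplementedError

class Waveguide(PCell):
    def __init__(self, length, width=0.5):
        super().__init__(length=length, width=width)

    def layout(self):
        w, l = self.params["width"], self.params["length"]
        return [("rect", 0.0, -w / 2, l, w / 2)]   # one rectangle primitive

class Splitter(PCell):
    """Hierarchy through composition: a composite cell's layout is the
    union of its children's layouts."""
    def __init__(self, arm_length):
        super().__init__(arm_length=arm_length)
        self.children = [Waveguide(arm_length), Waveguide(arm_length)]

    def layout(self):
        return [shape for child in self.children for shape in child.layout()]

circuit = Splitter(arm_length=10.0)
```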

  13. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains. PMID:21609273
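The heuristic-generation idea can be miniaturized: fix a template for placement heuristics and search over it per instance. The random search over linear scoring rules below is a deliberate caricature of the paper's genetic programming over expression trees, kept small for illustration:

```python
import random

def pack(items, capacity, score):
    """Online bin packing driven by a scoring heuristic: each item goes
    into the feasible open bin whose (item, free-space) score is highest,
    or into a new bin if none fits."""
    bins = []
    for item in items:
        feasible = [i for i, load in enumerate(bins) if load + item <= capacity]
        if feasible:
            best = max(feasible, key=lambda i: score(item, capacity - bins[i]))
            bins[best] += item
        else:
            bins.append(item)
    return bins

def evolve(items, capacity, gens=200, seed=0):
    """Search for a good scoring rule for THIS instance; random search
    over linear rules stands in for genetic programming over trees."""
    rng = random.Random(seed)
    best_w, best_n = None, float("inf")
    for _ in range(gens):
        w = [rng.uniform(-1, 1) for _ in range(3)]
        score = lambda it, free, w=w: w[0] * it + w[1] * free + w[2] * it * free
        n = len(pack(items, capacity, score))
        if n < best_n:
            best_w, best_n = w, n
    return best_w, best_n

rng = random.Random(1)
items = [rng.randint(1, 50) for _ in range(60)]
best_w, best_n = evolve(items, capacity=100)
```

Note that, as in the paper, nothing in the search machinery changes with the problem instance; only the evaluation (number of bins used) does.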

  14. The design of an automated electrolytic enrichment apparatus for tritium

    SciTech Connect

    Myers, J.L.

    1994-12-01

    The Radiation Analytical Sciences (RAS) Section at Lawrence Livermore National Laboratory performs analysis of low-level tritium concentrations in various natural water samples from the Tri-Valley Area, the DOE Nevada Test Site, Site 300 in Tracy, CA, and various other places around the world. Tritium, a radioactive isotope of hydrogen present at low levels, is pre-concentrated in the RAS laboratory using an electrolytic enrichment apparatus. These enriched waters are later analyzed by liquid scintillation counting to determine the activity of tritium. The enrichment procedure and the subsequent purification process by vacuum distillation are currently undertaken manually, and hence are highly labor-intensive. The whole process typically takes about 2 to 3 weeks to complete a batch of 30 samples, with dedicated personnel operating the process. The goal is to automate the entire process, specifically making the operation PC-LabVIEW™ controlled with real-time monitoring capability. My involvement was in the design and fabrication of a prototypical automated electrolytic enrichment cell. Work will be done on optimizing the electrolytic process by assessing the different parameters of the enrichment procedure. Hardware and software development have also been an integral component of this project.

  15. Automated Design of Restraint Layer of an Inflatable Vessel

    NASA Technical Reports Server (NTRS)

    Spexarth, Gary

    2007-01-01

    A Mathcad computer program largely automates the design and analysis of the restraint layer (the primary load-bearing layer) of an inflatable vessel that consists of one or more sections having cylindrical, toroidal, and/or spherical shape(s). A restraint layer typically comprises webbing in the form of multiple straps. The design task includes choosing indexing locations along the straps, computing the load at every location in each strap, computing the resulting stretch at each location, and computing the amount of undersizing required of each strap so that, once the vessel is inflated and the straps thus stretched, the vessel can be expected to assume the desired shape. Prior to the development of this program, the design task was performed by use of a difficult-to-use spreadsheet program that required manual addition of rows and columns depending on the numbers of strap rows and columns of a given design. In contrast, this program is completely parametric and includes logic that automatically adds or deletes rows and columns as needed. With minimal input from the user, this program automatically computes indexing locations, strap lengths, undersizing requirements, and all design data required to produce detailed drawings and assembly procedures. It also generates textual comments that help the user understand the calculations.
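For a cylindrical section, the per-strap chain of computations the article lists (load, stretch, undersizing) reduces to a few thin-shell relations. The formulas below are the standard ones, offered as a sketch rather than the Mathcad program's actual model:

```python
import math

def strap_undersizing(pressure, radius, strap_spacing, stiffness):
    """Hoop load per circumferential strap, its elastic stretch, and the
    undersized manufactured (cut) length for a cylindrical section."""
    load = pressure * radius * strap_spacing       # N per strap: p*r per unit width
    inflated_len = 2 * math.pi * radius            # target strap length when inflated
    stretch = load * inflated_len / stiffness      # Hooke's law: delta = F*L/(E*A)
    return inflated_len - stretch                  # cut length before inflation

# Illustrative numbers: 1 atm internal pressure, 1.5 m radius,
# 10 cm strap spacing, 2 MN effective strap stiffness
cut = strap_undersizing(pressure=101325.0, radius=1.5,
                        strap_spacing=0.1, stiffness=2.0e6)
```

The cut length is shorter than the inflated circumference by exactly the predicted stretch, which is the undersizing requirement the program computes at every indexing location.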

  16. Automated optimum design of wing structures. Deterministic and probabilistic approaches

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1982-01-01

    The automated optimum design of airplane wing structures subjected to multiple behavior constraints is described. The structural mass of the wing is considered the objective function. The maximum stress, wing tip deflection, root angle of attack, and flutter velocity during the pull up maneuver (static load), the natural frequencies of the wing structure, and the stresses induced in the wing structure due to landing and gust loads are suitably constrained. Both deterministic and probabilistic approaches are used for finding the stresses induced in the airplane wing structure due to landing and gust loads. A wing design is represented by a uniform beam with a cross section in the form of a hollow symmetric double wedge. The airfoil thickness and chord length are the design variables, and a graphical procedure is used to find the optimum solutions. A supersonic wing design is represented by finite elements. The thicknesses of the skin and the web and the cross sectional areas of the flanges are the design variables, and nonlinear programming techniques are used to find the optimum solution.

  17. Effects of extended lay-off periods on performance and operator trust under adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-03-01

    Little is known about the long-term effects of system reliability when operators do not use a system during an extended lay-off period. To examine threats to skill maintenance, 28 participants twice operated a simulation of a complex process control system for 2.5 h, with an 8-month retention interval between sessions. Operators were provided with an adaptable support system, which operated at one of the following reliability levels: 60%, 80% or 100%. Results showed that performance, workload, and trust remained stable at the second testing session, but operators lost self-confidence in their system management abilities. Finally, the effects of system reliability observed at the first testing session were largely found again at the second session. The findings overall suggest that adaptable automation may be a promising means to support operators in maintaining their performance at the second testing session. PMID:26603139

  18. [Design of an Incremental and Open Laboratory Automation System].

    PubMed

    Xie, Chuanfen; Chen, Yueping; Wang, Zhihong

    2015-07-01

    Recent years have witnessed great development of TLA (Total Laboratory Automation) technology; however, its application has hit the bottlenecks of high cost and limited openness to other parties' instruments. Specifically speaking, the initial purchase of the medical devices requires a large sum of money, and the new system can hardly be made compatible with existing equipment. This thesis proposes a new approach to system implementation: through incremental upgrades, the initial capital investment can be reduced, and through open architecture and interfaces, seamless connection of different devices can be achieved. This thesis elaborates on the standards that open architecture design should follow in the aspects of mechanics, electro-communication and information interaction, and on the key technology points in system implementation. PMID:26665947

  19. Embedded design based virtual instrument program for positron beam automation

    NASA Astrophysics Data System (ADS)

    Jayapandian, J.; Gururaj, K.; Abhaya, S.; Parimala, J.; Amarendra, G.

    2008-10-01

    Automation of positron beam experiment with a single chip embedded design using a programmable system on chip (PSoC) which provides easy interfacing of the high-voltage DC power supply is reported. Virtual Instrument (VI) control program written in Visual Basic 6.0 ensures the following functions (i) adjusting of sample high voltage by interacting with the programmed PSoC hardware, (ii) control of personal computer (PC) based multi channel analyzer (MCA) card for energy spectroscopy, (iii) analysis of the obtained spectrum to extract the relevant line shape parameters, (iv) plotting of relevant parameters and (v) saving the file in the appropriate format. The present study highlights the hardware features of the PSoC hardware module as well as the control of MCA and other units through programming in Visual Basic.

  20. Fast automated protein NMR data collection and assignment by ADAPT-NMR on Bruker spectrometers

    NASA Astrophysics Data System (ADS)

    Lee, Woonghee; Hu, Kaifeng; Tonelli, Marco; Bahrami, Arash; Neuhardt, Elizabeth; Glass, Karen C.; Markley, John L.

    2013-11-01

    ADAPT-NMR (Assignment-directed Data collection Algorithm utilizing a Probabilistic Toolkit in NMR) supports automated NMR data collection and backbone and side chain assignment for [U-13C, U-15N]-labeled proteins. Given the sequence of the protein and data for the orthogonal 2D 1H-15N and 1H-13C planes, the algorithm automatically directs the collection of tilted plane data from a variety of triple-resonance experiments so as to follow an efficient pathway toward the probabilistic assignment of 1H, 13C, and 15N signals to specific atoms in the covalent structure of the protein. Data collection and assignment calculations continue until the addition of new data no longer improves the assignment score. ADAPT-NMR was first implemented on Varian (Agilent) spectrometers [A. Bahrami, M. Tonelli, S.C. Sahu, K.K. Singarapu, H.R. Eghbalnia, J.L. Markley, PLoS One 7 (2012) e33173]. Because of broader interest in the approach, we present here a version of ADAPT-NMR for Bruker spectrometers. We have developed two AU console programs (ADAPT_ORTHO_run and ADAPT_NMR_run) that run under TOPSPIN Versions 3.0 and higher. To illustrate the performance of the algorithm on a Bruker spectrometer, we tested one protein, chlorella ubiquitin (76 amino acid residues), that had been used with the Varian version: the Bruker and Varian versions achieved the same level of assignment completeness (98% in 20 h). As a more rigorous evaluation of the Bruker version, we tested a larger protein, BRPF1 bromodomain (114 amino acid residues), which yielded an automated assignment completeness of 86% in 55 h. Both experiments were carried out on a 500 MHz Bruker AVANCE III spectrometer equipped with a z-gradient 5 mm TCI probe. ADAPT-NMR is available at http://pine.nmrfam.wisc.edu/ADAPT-NMR in the form of pulse programs, the two AU programs, and instructions for installation and use.
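The stopping rule described (collect data until new data no longer improves the assignment score) has a simple skeleton. Here `experiments` and `assign_score` are hypothetical stand-ins for the tilted-plane acquisitions and the probabilistic assignment scorer:

```python
def adaptive_collection(experiments, assign_score):
    """Skeleton of the ADAPT-NMR stopping rule: keep adding the next
    data set until the assignment score stops improving."""
    data, best = [], float("-inf")
    for block in experiments:
        data.append(block)
        score = assign_score(data)
        if score <= best:
            data.pop()            # the new block added no information: stop
            break
        best = score
    return data, best

# Toy run: the score saturates once "enough" data has been collected
blocks = [3, 2, 1, 1, 4]
data, best = adaptive_collection(blocks, lambda d: min(sum(d), 6))
```

The loop terminates after the first non-improving acquisition, which is what keeps total spectrometer time bounded in the real system.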

  1. Fast automated protein NMR data collection and assignment by ADAPT-NMR on Bruker spectrometers.

    PubMed

    Lee, Woonghee; Hu, Kaifeng; Tonelli, Marco; Bahrami, Arash; Neuhardt, Elizabeth; Glass, Karen C; Markley, John L

    2013-11-01

    ADAPT-NMR (Assignment-directed Data collection Algorithm utilizing a Probabilistic Toolkit in NMR) supports automated NMR data collection and backbone and side chain assignment for [U-(13)C, U-(15)N]-labeled proteins. Given the sequence of the protein and data for the orthogonal 2D (1)H-(15)N and (1)H-(13)C planes, the algorithm automatically directs the collection of tilted plane data from a variety of triple-resonance experiments so as to follow an efficient pathway toward the probabilistic assignment of (1)H, (13)C, and (15)N signals to specific atoms in the covalent structure of the protein. Data collection and assignment calculations continue until the addition of new data no longer improves the assignment score. ADAPT-NMR was first implemented on Varian (Agilent) spectrometers [A. Bahrami, M. Tonelli, S.C. Sahu, K.K. Singarapu, H.R. Eghbalnia, J.L. Markley, PLoS One 7 (2012) e33173]. Because of broader interest in the approach, we present here a version of ADAPT-NMR for Bruker spectrometers. We have developed two AU console programs (ADAPT_ORTHO_run and ADAPT_NMR_run) that run under TOPSPIN Versions 3.0 and higher. To illustrate the performance of the algorithm on a Bruker spectrometer, we tested one protein, chlorella ubiquitin (76 amino acid residues), that had been used with the Varian version: the Bruker and Varian versions achieved the same level of assignment completeness (98% in 20 h). As a more rigorous evaluation of the Bruker version, we tested a larger protein, BRPF1 bromodomain (114 amino acid residues), which yielded an automated assignment completeness of 86% in 55 h. Both experiments were carried out on a 500 MHz Bruker AVANCE III spectrometer equipped with a z-gradient 5 mm TCI probe. ADAPT-NMR is available at http://pine.nmrfam.wisc.edu/ADAPT-NMR in the form of pulse programs, the two AU programs, and instructions for installation and use. PMID:24091140

  2. Test Information Targeting Strategies for Adaptive Multistage Testing Designs.

    ERIC Educational Resources Information Center

    Luecht, Richard M.; Burgin, William

    Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…

  3. Automated design of multiphase space missions using hybrid optimal control

    NASA Astrophysics Data System (ADS)

    Chilan, Christian Miguel

    A modern space mission is assembled from multiple phases or events such as impulsive maneuvers, coast arcs, thrust arcs and planetary flybys. Traditionally, a mission planner would resort to intuition and experience to develop a sequence of events for the multiphase mission and to find the space trajectory that minimizes propellant use by solving the associated continuous optimal control problem. This strategy, however, will most likely yield a sub-optimal solution, as the problem is sophisticated for several reasons. For example, the number of events in the optimal mission structure is not known a priori and the system equations of motion change depending on what event is current. In this work a framework for the automated design of multiphase space missions is presented using hybrid optimal control (HOC). The method developed uses two nested loops: an outer-loop that handles the discrete dynamics and finds the optimal mission structure in terms of the categorical variables, and an inner-loop that performs the optimization of the corresponding continuous-time dynamical system and obtains the required control history. Genetic algorithms (GA) and direct transcription with nonlinear programming (NLP) are introduced as methods of solution for the outer-loop and inner-loop problems, respectively. Automation of the inner-loop, continuous optimal control problem solver, required two new technologies. The first is a method for the automated construction of the NLP problems resulting from the use of a direct solver for systems with different structures, including different numbers of categorical events. The method assembles modules, consisting of parameters and constraints appropriate to each event, sequentially according to the given mission structure. The other new technology is for a robust initial guess generator required by the inner-loop NLP problem solver. Two new methods were developed for cases including low-thrust trajectories. The first method, based on GA
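The two nested loops described above (a discrete outer loop over mission structures, a continuous inner loop over control parameters) can be sketched in miniature. Everything below is invented for illustration: the event types, the stand-in propellant-cost model, the simple random search replacing the genetic algorithm, and the coordinate-descent line search replacing the NLP solver.

```python
import random

# Toy inner loop: given a discrete mission structure (a tuple of event
# types), optimize one continuous parameter per event by coordinate
# descent on a hypothetical propellant-cost model.
def inner_loop_cost(structure):
    params = [0.0] * len(structure)
    def cost(p):
        # illustrative cost: each 'burn' wants its parameter near 1.0,
        # each 'coast' near 0.0; every extra event adds fixed overhead
        total = 0.1 * len(structure)
        for ev, x in zip(structure, p):
            target = 1.0 if ev == "burn" else 0.0
            total += (x - target) ** 2
        return total
    for i in range(len(params)):
        best = min((cost(params[:i] + [v] + params[i + 1:]), v)
                   for v in [j / 20 for j in range(-20, 41)])
        params[i] = best[1]
    return cost(params)

# Toy outer loop: random search (standing in for a GA) over mission
# structures of varying length built from categorical event types.
def outer_loop(seed=0, generations=200):
    rng = random.Random(seed)
    best_structure, best_cost = None, float("inf")
    for _ in range(generations):
        n = rng.randint(1, 5)
        structure = tuple(rng.choice(["burn", "coast"]) for _ in range(n))
        c = inner_loop_cost(structure)
        if c < best_cost:
            best_structure, best_cost = structure, c
    return best_structure, best_cost
```

The separation mirrors the HOC framework: only the outer loop sees the categorical variables, and the inner loop is re-run from scratch for each candidate structure.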

  4. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of error states of the filter and tuning of filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
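ENFAD's two iterative optimization loops (error-state selection outside, parameter tuning inside, each scored by Monte Carlo simulation) can be sketched as follows. The error sources, the noise/lag cost model, and the single tuning parameter are hypothetical stand-ins, not taken from ENFAD.

```python
import itertools
import random
import statistics

# Hypothetical "truth": three error sources contribute to navigation
# error unless they are modeled as filter error states; one tuning
# parameter q trades noise rejection against lag.
TRUE_SOURCES = {"gyro_bias": 1.0, "accel_bias": 0.5, "clock_drift": 0.2}

def monte_carlo_cost(states, q, runs=200, seed=1):
    # mean-squared navigation error over a batch of Monte Carlo runs
    rng = random.Random(seed)
    unmodeled = sum(v for k, v in TRUE_SOURCES.items() if k not in states)
    errs = []
    for _ in range(runs):
        noise = rng.gauss(0.0, 1.0 / (1.0 + q))  # shrinks as q grows
        lag = 0.05 * q                           # penalty that grows with q
        errs.append((unmodeled + lag + noise) ** 2)
    return statistics.mean(errs)

def design_filter():
    best = (float("inf"), None, None)
    # Loop 1: candidate error-state selections (all non-empty subsets).
    for r in range(1, len(TRUE_SOURCES) + 1):
        for states in itertools.combinations(TRUE_SOURCES, r):
            # Loop 2: tune the parameter for this fixed selection.
            for q in [i * 0.5 for i in range(0, 21)]:
                c = monte_carlo_cost(frozenset(states), q)
                if c < best[0]:
                    best = (c, states, q)
    return best
```

On this toy model the design loop correctly discovers that modeling all three error sources, with a moderate tuning value, minimizes the Monte Carlo error.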

  5. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, P.; Beaudet, P.

    1980-01-01

The classification of large dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data is considered. Decision tree classification, a popular approach to the problem, is characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class. An automated technique for effective decision tree design which relies only on a priori statistics is presented. This procedure utilizes a set of two dimensional canonical transforms and Bayes table look-up decision rules. An optimal design at each node is derived based on the associated decision table. A procedure for computing the global probability of correct classification is also provided. An example is given in which class statistics obtained from an actual LANDSAT scene are used as input to the program. The resulting decision tree design has an associated probability of correct classification of .76 compared to the theoretically optimum .79 probability of correct classification associated with a full dimensional Bayes classifier. Recommendations for future research are included.
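As a toy illustration of designing a decision node from a priori statistics alone (not the paper's canonical-transform procedure), consider a single threshold node separating two 1-D Gaussian classes and the resulting probability of correct classification:

```python
import math

def gauss_cdf(x, mu, sigma):
    # Gaussian cumulative distribution function via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def node_threshold(mu1, mu2):
    # with equal priors and equal variances, the Bayes rule for a
    # single node reduces to thresholding at the midpoint of the means
    return 0.5 * (mu1 + mu2)

def prob_correct(mu1, mu2, sigma, p1=0.5):
    # probability of correct classification at this node,
    # assuming mu1 < mu2 (class 1 is decided below the threshold)
    t = node_threshold(mu1, mu2)
    return p1 * gauss_cdf(t, mu1, sigma) + (1 - p1) * (1 - gauss_cdf(t, mu2, sigma))
```

For means 0 and 2 with unit variance, the node achieves about 0.84 probability of correct classification, computed purely from class statistics with no training samples, which is the spirit of the a priori design described above.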

  6. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of D2 GSM smart cards or the extraction of (secret) passwords from German T-online users, show that serious flaws can also occur in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  7. The automated design of materials far from equilibrium

    NASA Astrophysics Data System (ADS)

    Miskin, Marc Z.

Automated design is emerging as a powerful concept in materials science. By combining computer algorithms, simulations, and experimental data, new techniques are being developed that start with high level functional requirements and identify the ideal materials that achieve them. This represents a radically different picture of how materials become functional in which technological demand drives material discovery, rather than the other way around. At the frontiers of this field, materials systems previously considered too complicated can start to be controlled and understood. Particularly promising are materials far from equilibrium. Material robustness, high strength, self-healing and memory are properties displayed by several materials systems that are intrinsically out of equilibrium. These and other properties could be revolutionary, provided they can first be controlled. This thesis conceptualizes and implements a framework for designing materials that are far from equilibrium. We show how, even in the absence of a complete physical theory, design from the top down is possible and lends itself to producing physical insight. As a prototype system, we work with granular materials: collections of athermal, macroscopic identical objects, since these materials function both as an essential component of industrial processes as well as a model system for many non-equilibrium states of matter. We show that by placing granular materials in the context of design, benefits emerge simultaneously for fundamental and applied interests. As first steps, we use our framework to design granular aggregates with extreme properties like high stiffness and softness. We demonstrate control over nonlinear effects by producing exotic aggregates that stiffen under compression. Expanding on our framework, we conceptualize new ways of thinking about material design when automatic discovery is possible. We show how to build rules that link particle shapes to arbitrary granular packing

  8. Automated database design for large-scale scientific applications

    NASA Astrophysics Data System (ADS)

    Papadomanolakis, Stratos

The need for large-scale scientific data management is today more pressing than ever, as modern sciences need to store and process terabyte-scale data volumes. Traditional systems, relying on filesystems and custom data access and processing code, do not scale for multi-terabyte datasets. Therefore, supporting today's data-driven sciences requires the development of new data management capabilities. This Ph.D. dissertation develops techniques that allow modern Database Management Systems (DBMS) to efficiently handle large scientific datasets. Several recent successful DBMS deployments target applications like astronomy, that manage collections of objects or observations (e.g. galaxies, spectra) and can easily store their data in a commercial relational DBMS. Query performance for such systems critically depends on the database physical design, the organization of database structures such as indexes and tables. This dissertation develops algorithms and tools for automating the physical design process. Our tools allow databases to tune themselves, providing efficient query execution in the presence of large data volumes and complex query workloads. For more complex applications dealing with multidimensional and time-varying data, standard relational DBMS are inadequate. Efficiently supporting such applications requires the development of novel indexing and query processing techniques. This dissertation develops an indexing technique for unstructured tetrahedral meshes, a multidimensional data organization used in finite element analysis applications. Our technique outperforms existing multidimensional indexing techniques and has the advantage that it can easily be integrated with standard DBMS, providing existing systems with the ability to handle spatial data with minor modifications.

  9. Assessing Adaptive Instructional Design Tools and Methods in ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project within the Information Society Technologies program that is providing design methods and tools to guide a training designer according to the latest cognitive science and standardization principles. ADAPT[IT] addresses users in two significantly…

  10. Survey of adaptive control using Liapunov design

    NASA Technical Reports Server (NTRS)

    Lindorff, D. P.; Carroll, R. L.

    1973-01-01

    A survey of the literature in which Liapunov's second method is used in determining the control law is presented, with emphasis placed on the model-tracking adaptive control problem. Forty references are listed. Following a brief tutorial exposition of the adaptive control problem, the techniques for treating reduction of order, disturbance and time-varying parameters, multivariable systems, identification, and adaptive observers are discussed. The method is critically evaluated, particularly with respect to possibilities for application.

  11. Adaptive time-lapse optimized survey design for electrical resistivity tomography monitoring

    NASA Astrophysics Data System (ADS)

    Wilkinson, Paul B.; Uhlemann, Sebastian; Meldrum, Philip I.; Chambers, Jonathan E.; Carrière, Simon; Oxby, Lucy S.; Loke, M. H.

    2015-10-01

    Adaptive optimal experimental design methods use previous data and results to guide the choice and design of future experiments. This paper describes the formulation of an adaptive survey design technique to produce optimal resistivity imaging surveys for time-lapse geoelectrical monitoring experiments. These survey designs are time-dependent and, compared to dipole-dipole or static optimized surveys that do not change over time, focus a greater degree of the image resolution on regions of the subsurface that are actively changing. The adaptive optimization method is validated using a controlled laboratory monitoring experiment comprising a well-defined cylindrical target moving along a trajectory that changes its depth and lateral position. The algorithm is implemented on a standard PC in conjunction with a modified automated multichannel resistivity imaging system. Data acquisition using the adaptive survey designs requires no more time or power than with comparable standard surveys, and the algorithm processing takes place while the system batteries recharge. The results show that adaptively designed optimal surveys yield a quantitative increase in image quality over and above that produced by using standard dipole-dipole or static (time-independent) optimized surveys.

  12. Fully automated and adaptive detection of amyloid plaques in stained brain sections of Alzheimer transgenic mice.

    PubMed

    Feki, Abdelmonem; Teboul, Olivier; Dubois, Albertine; Bozon, Bruno; Faure, Alexis; Hantraye, Philippe; Dhenain, Marc; Delatour, Benoit; Delzescaux, Thierry

    2007-01-01

    Automated detection of amyloid plaques (AP) in post mortem brain sections of patients with Alzheimer disease (AD) or in mouse models of the disease is a major issue to improve quantitative, standardized and accurate assessment of neuropathological lesions as well as of their modulation by treatment. We propose a new segmentation method to automatically detect amyloid plaques in Congo Red stained sections based on adaptive thresholds and a dedicated amyloid plaque/tissue modelling. A set of histological sections focusing on anatomical structures was used to validate the method in comparison to expert segmentation. Original information concerning global amyloid load have been derived from 6 mouse brains which opens new perspectives for the extensive analysis of such a data in 3-D and the possibility to integrate in vivo-post mortem information for diagnosis purposes. PMID:18044661

  13. Designing and Generating Educational Adaptive Hypermedia Applications

    ERIC Educational Resources Information Center

    Retalis, Symeon; Papasalouros, Andreas

    2005-01-01

    Educational Adaptive Hypermedia Applications (EAHA) provide personalized views on the learning content to individual learners. They also offer adaptive sequencing (navigation) over the learning content based on rules that stem from the user model requirements and the instructional strategies. EAHA are gaining the focus of the research community as…

  14. Automated Design of Noise-Minimal, Safe Rotorcraft Trajectories

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Venable, K. Brent; Lindsay, James

    2012-01-01

NASA and the international community are investing in the development of a commercial transportation infrastructure that includes the increased use of rotorcraft, specifically helicopters and aircraft such as 40-passenger civil tiltrotors. Rotorcraft have a number of advantages over fixed wing aircraft, primarily in not requiring direct access to the primary fixed wing runways. As such they can operate at an airport without directly interfering with major air carrier and commuter aircraft operations. However, there is significant concern over the impact of noise on the communities surrounding the transportation facilities. In this paper we propose to address the rotorcraft noise problem by exploiting powerful search techniques coming from artificial intelligence, coupled with simulation and field tests, to design trajectories that are expected to reduce the amount of ground noise generated. This paper investigates the use of simulation based on predictive physical models to facilitate the search for low-noise trajectories using a class of automated search algorithms called local search. A novel feature of this approach is the ability to incorporate constraints into the problem formulation that address passenger safety and comfort.

  15. Design and implementation of an automated compound management system in support of lead optimization.

    PubMed

    Quintero, Catherine; Kariv, Ilona

    2009-06-01

    To meet the needs of the increasingly rapid and parallelized lead optimization process, a fully integrated local compound storage and liquid handling system was designed and implemented to automate the generation of assay-ready plates directly from newly submitted and cherry-picked compounds. A key feature of the system is the ability to create project- or assay-specific compound-handling methods, which provide flexibility for any combination of plate types, layouts, and plate bar-codes. Project-specific workflows can be created by linking methods for processing new and cherry-picked compounds and control additions to produce a complete compound set for both biological testing and local storage in one uninterrupted workflow. A flexible cherry-pick approach allows for multiple, user-defined strategies to select the most appropriate replicate of a compound for retesting. Examples of custom selection parameters include available volume, compound batch, and number of freeze/thaw cycles. This adaptable and integrated combination of software and hardware provides a basis for reducing cycle time, fully automating compound processing, and ultimately increasing the rate at which accurate, biologically relevant results can be produced for compounds of interest in the lead optimization process. PMID:19487770

  16. Adaptation to an automated platform of algorithmic combinations of advantageous mutations in genes generated using amino acid scanning mutational strategy.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recent mutational strategies for generating and screening of genes for optimized traits, including directed evolution, domain shuffling, random mutagenesis, and site-directed mutagenesis, have been adapted for automated platforms. Here we discuss the amino acid scanning mutational strategy and its ...

  17. Effects of a psychophysiological system for adaptive automation on performance, workload, and the event-related potential P300 component

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.
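A minimal sketch of the kind of engagement-index-driven mode switching described above, assuming the commonly cited index beta/(alpha + theta); the windowed comparison rule, the window length, and the mode labels are hypothetical illustrations, not the study's actual switching criterion:

```python
def engagement_index(alpha, beta, theta):
    # EEG engagement index: ratio of beta band power to alpha + theta
    return beta / (alpha + theta)

def next_mode(history, window=3):
    # Hypothetical rule: compare the newest index against the mean of
    # the preceding window; rising engagement keeps the operator in the
    # loop (manual), falling engagement hands the task to automation.
    if len(history) <= window:
        return "manual"
    baseline = sum(history[-window - 1:-1]) / window
    return "manual" if history[-1] >= baseline else "automatic"
```

A closed-loop system of this shape switches task modes continuously as band powers are re-estimated, which is the negative-feedback arrangement the yoked-control comparison in the study was designed to isolate.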

  18. DESIGN HANDBOOK FOR AUTOMATION OF ACTIVATED SLUDGE WASTEWATER TREATMENT PLANTS

    EPA Science Inventory

    This report is a systems engineering handbook for the automation of activated sludge wastewater treatment processes. Process control theory and application are discussed to acquaint the reader with terminology and fundamentals. Successful unit process control strategies currently...

  19. Robust design of configurations and parameters of adaptable products

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Chen, Yongliang; Xue, Deyi; Gu, Peihua

    2014-03-01

An adaptable product can satisfy different customer requirements by changing its configuration and parameter values during the operation stage. Design of adaptable products aims at reducing the environmental impact through replacement of multiple different products with single adaptable ones. Due to the complex architecture, multiple functional requirements, and changes of product configurations and parameter values in operation, the impact of uncertainties on the functional performance measures needs to be considered in the design of adaptable products. In this paper, a robust design approach is introduced to identify the optimal design configuration and parameters of an adaptable product whose functional performance measures are the least sensitive to uncertainties. An adaptable product in this paper is modeled by both configurations and parameters. At the configuration level, methods to model different product configuration candidates in design and different product configuration states in operation to satisfy design requirements are introduced. At the parameter level, four types of product/operating parameters and relations among these parameters are discussed. A two-level optimization approach is developed to identify the optimal design configuration and its parameter values of the adaptable product. A case study is implemented to illustrate the effectiveness of the newly developed robust adaptable design method.

  20. Fast Model Adaptation for Automated Section Classification in Electronic Medical Records.

    PubMed

    Ni, Jian; Delaney, Brian; Florian, Radu

    2015-01-01

Medical information extraction is the automatic extraction of structured information from electronic medical records, where such information can be used for improving healthcare processes and medical decision making. In this paper, we study one important medical information extraction task called section classification. The objective of section classification is to automatically identify sections in a medical document and classify them into one of the pre-defined section types. Training section classification models typically requires large amounts of human-labeled training data to achieve high accuracy. Annotating institution-specific data, however, can be both expensive and time-consuming, which poses a big hurdle to adapting a section classification model to new medical institutions. In this paper, we apply two advanced machine learning techniques, active learning and distant supervision, to reduce annotation cost and achieve fast model adaptation for automated section classification in electronic medical records. Our experiment results show that active learning reduces the annotation cost and time by more than 50%, and distant supervision can achieve good model accuracy using weakly labeled training data only. PMID:26262005
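The active-learning idea (spend annotation effort on the documents the current model is least certain about) can be sketched with a simple uncertainty-sampling selector; the scoring scale and selection rule here are generic stand-ins, not the paper's system:

```python
def uncertainty(score):
    # map a binary classifier score in [0, 1] to an uncertainty in
    # [0, 1]: 1.0 at the decision boundary (0.5), 0.0 at either extreme
    return 1.0 - abs(score - 0.5) * 2.0

def select_for_annotation(scores, budget):
    # pick the `budget` items the model is least certain about;
    # these are the ones sent to human annotators in each round
    ranked = sorted(range(len(scores)),
                    key=lambda i: uncertainty(scores[i]),
                    reverse=True)
    return ranked[:budget]
```

Iterating select-annotate-retrain is what drives the reported reduction in labeling cost: confidently classified sections never consume annotation budget.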

  1. Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2006-01-01

Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means for obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually non-existent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.
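The key property claimed above (one forward solve plus one backward adjoint solve yields sensitivities for every design variable at once) can be demonstrated on a toy scalar system x[k+1] = a*x[k] + u[k] with cost J = 0.5*x[N]**2; this example is ours, not code from the NASA work:

```python
def simulate(a, x0, u):
    # forward pass: one trajectory under controls u
    xs = [x0]
    for uk in u:
        xs.append(a * xs[-1] + uk)
    return xs

def adjoint_gradient(a, x0, u):
    # one forward pass plus one backward (adjoint) pass gives the
    # gradient of J with respect to every control u[k]
    xs = simulate(a, x0, u)
    lam = xs[-1]                   # lambda_N = dJ/dx_N = x_N
    grad = [0.0] * len(u)
    for k in reversed(range(len(u))):
        grad[k] = lam              # dJ/du_k = lambda_{k+1}
        lam = a * lam              # lambda_k = a * lambda_{k+1}
    return grad

def finite_difference_gradient(a, x0, u, h=1e-6):
    # brute-force check: one extra forward solve per design variable
    def cost(uv):
        return 0.5 * simulate(a, x0, uv)[-1] ** 2
    g = []
    for k in range(len(u)):
        up = list(u)
        up[k] += h
        g.append((cost(up) - cost(u)) / h)
    return g
```

The finite-difference check needs one additional simulation per design variable, while the adjoint pass costs roughly one simulation total, which is why adjoint methods scale to hundreds of design variables.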

  2. Application of Adaptive Autopilot Designs for an Unmanned Aerial Vehicle

    NASA Technical Reports Server (NTRS)

    Shin, Yoonghyun; Calise, Anthony J.; Motter, Mark A.

    2005-01-01

    This paper summarizes the application of two adaptive approaches to autopilot design, and presents an evaluation and comparison of the two approaches in simulation for an unmanned aerial vehicle. One approach employs two-stage dynamic inversion and the other employs feedback dynamic inversions based on a command augmentation system. Both are augmented with neural network based adaptive elements. The approaches permit adaptation to both parametric uncertainty and unmodeled dynamics, and incorporate a method that permits adaptation during periods of control saturation. Simulation results for an FQM-117B radio controlled miniature aerial vehicle are presented to illustrate the performance of the neural network based adaptation.

  3. Design and Implementation of an Open, Interoperable Automated Demand Response Infrastructure

    SciTech Connect

    Piette, Mary Ann; Kiliccote, Sila; Ghatikar, Girish

    2007-10-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automating demand response (DR). Automating DR allows greater levels of participation and improved reliability and repeatability of the demand response and customer facilities. Automated DR systems have been deployed for critical peak pricing and demand bidding and are being designed for real time pricing. The system is designed to generate, manage, and track DR signals between utilities and Independent System Operators (ISOs) to aggregators and end-use customers and their control systems.
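A highly simplified sketch of a DR signal round trip, with invented field names and shed levels; production systems of this kind use standardized signal formats (the LBNL work in this area fed into what became OpenADR) rather than ad hoc JSON:

```python
import json

def make_dr_signal(event_id, level):
    # utility/ISO side: publish a DR event signal (fields are invented)
    return json.dumps({"event_id": event_id, "level": level})

# facility side: map a signal level to a fractional load-shed amount
# (illustrative values, e.g. pre-programmed setpoint adjustments)
SHED_TABLE = {"normal": 0.0, "moderate": 0.15, "high": 0.30}

def respond(signal_json):
    signal = json.loads(signal_json)
    return SHED_TABLE.get(signal["level"], 0.0)
```

The point of the architecture described above is that the shed strategies live in the customer's control system, pre-programmed against a small, open signal vocabulary, so participation can be automated and repeatable.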

  4. Model-Based Design of Air Traffic Controller-Automation Interaction

    NASA Technical Reports Server (NTRS)

    Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.

  5. Micro-force compensation in automated micro-object positioning using adaptive neural networks

    NASA Astrophysics Data System (ADS)

    Shahini, M.; Melek, W. W.; Yeow, J. T. W.

    2009-09-01

This paper proposes a novel approach for controlled pushing of a micro-sized object along a desired path. Challenges associated with this control task due to the presence of dominating micro-forces are carefully studied, and a solution based on the application of artificial neural networks is introduced. A nonlinear controller is proposed for controlled pushing of micro-objects which guarantees the stability of the closed-loop system in the Lyapunov sense. An experimental setup is designed to validate the performance of the proposed controller. Results suggest that artificial neural networks present a promising tool for the design of adaptive controllers to accurately manipulate objects at the microscopic scale.

  6. Multidimensional Adaptive Testing with Optimal Design Criteria for Item Selection

    ERIC Educational Resources Information Center

    Mulder, Joris; van der Linden, Wim J.

    2009-01-01

    Several criteria from the optimal design literature are examined for use with item selection in multidimensional adaptive testing. In particular, it is examined what criteria are appropriate for adaptive testing in which all abilities are intentional, some should be considered as a nuisance, or the interest is in the testing of a composite of the…

  7. Situation Awareness Implications of Adaptive Automation of Air Traffic Controller Information Processing Functions

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; McClernon, Christopher K.; Perry, Carlene M.; Segall, Noa

    2004-01-01

    The goal of this research was to define a measure of situation awareness (SA) in an air traffic control (ATC) task and to assess the influence of adaptive automation (AA) of various information processing functions on controller perception, comprehension and projection. The measure was also to serve as a basis for defining and developing an approach to triggering dynamic control allocations, as part of AA, based on controller SA. To achieve these objectives, an enhanced version of an ATC simulation (Multitask©) was developed for use in two human factors experiments. The simulation captured the basic functions of Terminal Radar Approach Control (TRACON) and was capable of presenting to operators four different modes of control, including information acquisition, information analysis, decision making and action implementation automation, as well as a completely manual control mode. The SA measure that was developed as part of the research was based on the Situation Awareness Global Assessment Technique (SAGAT), previous goal-directed task analyses of en route control and TRACON, and a separate cognitive task analysis on the ATC simulation. The results of the analysis on Multitask were used as a basis for formulating SA queries as part of the SAGAT-based approach to measuring controller SA, which was used in the experiments. A total of 16 subjects were recruited for both experiments. Half the subjects were used in Experiment #1, which focused on assessing the sensitivity and reliability of the SA measurement approach in the ATC simulation. Comparisons were made of manual versus automated control. The remaining subjects were used in the second experiment, which was intended to more completely describe the SA implications of AA applied to specific controller information processing functions, and to describe how the measure could ultimately serve as a trigger of dynamic function allocations in the application of AA to ATC. Comparisons were made of the

  8. A framework for automated contour quality assurance in radiation therapy including adaptive techniques

    NASA Astrophysics Data System (ADS)

    Altman, M. B.; Kavanaugh, J. A.; Wooten, H. O.; Green, O. L.; DeWees, T. A.; Gay, H.; Thorstad, W. L.; Li, H.; Mutic, S.

    2015-07-01

    Contouring of targets and normal tissues is one of the largest sources of variability in radiation therapy treatment plans. Contours thus require a time-intensive and error-prone quality assurance (QA) evaluation, limitations that also impede adaptive radiotherapy (ART). Here, an automated system for contour QA is developed using historical data (the ‘knowledge base’). A pilot study was performed with a knowledge base derived from 9 contours each from 29 head-and-neck treatment plans. Size, shape, relative position, and other clinically relevant metrics and heuristically derived rules are determined. Metrics are extracted from input patient data and compared against rules determined from the knowledge base; a computer-learning component allows the metrics to evolve with more input data, including patient-specific data for ART. Nine additional plans containing 42 unique contouring errors were analyzed; 40/42 errors were detected, along with 9 false positives. The results of this study imply that knowledge-based contour QA could enhance the safety and effectiveness of RT treatment plans and increase the efficiency of the treatment planning process, reducing labor and the cost of therapy for patients.
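    The metric-versus-rule check described above can be sketched in a few lines; the specific metrics (volume, centroid offset), the mean ± 3 SD acceptance rule, and the numbers are illustrative assumptions, not the paper's actual feature set:

    ```python
    from statistics import mean, stdev

    # Learn per-metric acceptance ranges (mean +/- 3 SD) from historical plans,
    # then flag input contours whose metrics fall outside them. Metric names,
    # values, and the 3-SD rule are illustrative assumptions.
    def learn_rules(knowledge_base):
        rules = {}
        for m in knowledge_base[0]:
            values = [plan[m] for plan in knowledge_base]
            mu, sd = mean(values), stdev(values)
            rules[m] = (mu - 3 * sd, mu + 3 * sd)
        return rules

    def check_contour(metrics, rules):
        """Return the metrics that violate the learned acceptance ranges."""
        return [m for m, v in metrics.items()
                if not rules[m][0] <= v <= rules[m][1]]

    # Historical knowledge base: volume (cc) and centroid offset (mm) per plan.
    kb = [{"volume_cc": 30 + i, "centroid_mm": 5 + 0.1 * i} for i in range(29)]
    rules = learn_rules(kb)
    print(check_contour({"volume_cc": 120.0, "centroid_mm": 5.5}, rules))
    # ['volume_cc'] -- the implausibly large volume is flagged for review
    ```

    A learning component of the kind the abstract mentions would simply re-run `learn_rules` as new accepted plans (or a patient's own prior fractions, for ART) join the knowledge base.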

  9. Automated Detection of Microaneurysms Using Scale-Adapted Blob Analysis and Semi-Supervised Learning

    SciTech Connect

    Adal, Kedir M.; Sidebe, Desire; Ali, Sharib; Chaum, Edward; Karnowski, Thomas Paul; Meriaudeau, Fabrice

    2014-01-07

    Despite several attempts, automated detection of microaneurysms (MAs) from digital fundus images remains an open issue. This is due to the subtle appearance of MAs against the surrounding tissues. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs in an image, and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are then introduced to characterize these blob regions. A semi-supervised learning approach, which requires few manually annotated examples, is also proposed to train a classifier to detect true MAs. The developed system is built using only a few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the-art techniques as well as the applicability of the proposed features to the analysis of fundus images.

  10. An Adaptive Web-Based Support to e-Education in Robotics and Automation

    NASA Astrophysics Data System (ADS)

    di Giamberardino, Paolo; Temperini, Marco

    The paper presents the hardware and software architecture of a remote laboratory, with robotics and automation applications, devised to support e-teaching and e-learning activities at the undergraduate level in computer engineering. The hardware is composed of modular structures based on the Lego Mindstorms components: they are reasonably sophisticated in terms of functions, quite easy to use, and sufficiently affordable in terms of cost. Moreover, since the robots are intrinsically modular with respect to the number and distribution of sensors and actuators, they are easily and quickly reconfigurable. A web application makes the laboratory and its robots available via the Internet. The software framework allows the teacher to define, for the course under her/his responsibility, a learning path made of different and differently complex exercises, graduated in terms of the "difficulty" they pose and the "competence" that the solver is expected to have shown. The learning path of exercises is adapted to the individual learner's progressively growing competence: at any moment, only a subset of the exercises is available (depending on how close their levels of competence and difficulty are to those of the exercises already solved by the learner).
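    The competence-gated availability described above can be sketched as a simple filter; the numeric difficulty levels, the closeness window, and the exercise names are hypothetical:

    ```python
    # Gate exercises by how close their difficulty is to the learner's
    # demonstrated competence. Levels, window, and names are hypothetical.
    def available_exercises(exercises, learner_competence, window=1):
        """exercises maps name -> difficulty level; returns attemptable names."""
        return sorted(name for name, diff in exercises.items()
                      if abs(diff - learner_competence) <= window)

    path = {"line-follow": 1, "obstacle-avoid": 2, "arm-pick": 3, "multi-robot": 5}
    print(available_exercises(path, learner_competence=2))
    # ['arm-pick', 'line-follow', 'obstacle-avoid'] -- "multi-robot" stays locked
    ```

    As the learner solves exercises and the competence estimate rises, re-running the filter unlocks the next band of the path.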

  11. Mental workload dynamics in adaptive interface design

    NASA Technical Reports Server (NTRS)

    Hancock, Peter A.; Chignell, Mark H.

    1988-01-01

    In examining the role of time in mental workload, the authors present a different perspective from which to view the problem of assessment. Mental workload is plotted in three dimensions, whose axes represent the effective time for action, the perceived distance from the desired goal state, and the level of effort required to achieve the time-constrained goal. This representation allows the generation of isodynamic workload contours that incorporate the factors of operator skill and equifinality of effort. An adaptive interface for dynamic task reallocation is described that uses this form of assessment to reconcile the joint aims of stable operator loading and acceptable primary-task performance by the total system.

  12. 28. 'TOWER DESIGN NO. 11, ADAPTED FROM NO. 9,' drawn ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. 'TOWER DESIGN NO. 11, ADAPTED FROM NO. 9,' drawn by project architect Alfred Eichler, undated, ca. 1934. - Sacramento River Bridge, Spanning Sacramento River at California State Highway 275, Sacramento, Sacramento County, CA

  13. Robotic design for an automated uranium solution enrichment system

    SciTech Connect

    Horley, E.C.; Beugelsdijk, T.; Biddle, R.S.; Bronisz, L.E.; Hansen, W.J.; Li, T.K.; Sampson, T.E.; Walton, G.

    1990-01-01

    A method to automate solution enrichment analysis by gamma-ray spectroscopy is being developed at Los Alamos National Laboratory. Both passive and x-ray fluorescence (XRF) analyses will be remotely performed to determine the amounts of {sup 235}U and total uranium in sample containers. A commercial laboratory robot will be used to process up to 40 batch and 8 priority samples in an unattended mode. Samples will be read by a bar-code reader to determine measurement requirements, then assayed by either or both of the gamma-ray and XRF instruments. The robot will be responsible for moving the sample containers and operating all shield doors and shutters. In addition to reducing hardware complexity, this feature will also allow manual operation of the instruments if the robot fails. This automated system will reduce personnel radiation exposure and increase the reliability and repeatability of the measurements.

  14. Development of automated power system management techniques. [spacecraft design

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Skelly, L. A.; Weiner, H.

    1978-01-01

    The basic approach in the automated power system management (APSM) implementation is to use one central microprocessor for the overall power system supervision and several local microprocessors dedicated to one or more major subassemblies to perform simple monitoring and control functions. Communication between the central and each local processor is through a dedicated two-wire network employing serial data transfer. The block diagrams of the processors, the data bus characteristics, and the software functions and organization are presented.

  15. Design of Pel Adaptive DPCM coding based upon image partition

    NASA Astrophysics Data System (ADS)

    Saitoh, T.; Harashima, H.; Miyakawa, H.

    1982-01-01

    A Pel Adaptive DPCM coding system based on image partition is developed that possesses coding characteristics superior to those of the Block Adaptive DPCM coding system. This method uses multiple DPCM coding loops and nonhierarchical cluster analysis. It is found that the coding performance of the Pel Adaptive DPCM coding method differs depending on the subject images. The Pel Adaptive DPCM coder designed using these methods is shown to yield a maximum performance advantage of 2.9 dB for the Girl and Couple images and 1.5 dB for the Aerial image, although no advantage was obtained for the Moon image. These results show an improvement over the optimally designed Block Adaptive DPCM coding method proposed by Saito et al. (1981).
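    For readers unfamiliar with DPCM, a minimal non-adaptive previous-pel loop with a uniform quantizer shows the basic mechanics; the step size and sample scan line are illustrative, and the paper's pel-adaptive switching among multiple loops is not reproduced:

    ```python
    # Previous-pel DPCM with a uniform quantizer: the encoder tracks the
    # decoder's reconstruction so quantization error cannot accumulate.
    # Step size and the sample scan line are illustrative.
    def dpcm_encode(samples, step=4):
        recon_prev, codes = 0, []
        for s in samples:
            q = round((s - recon_prev) / step)   # quantized prediction error
            codes.append(q)
            recon_prev += q * step               # decoder-tracked reconstruction
        return codes

    def dpcm_decode(codes, step=4):
        recon, out = 0, []
        for q in codes:
            recon += q * step
            out.append(recon)
        return out

    line = [10, 12, 20, 40, 41, 39]
    print(dpcm_decode(dpcm_encode(line)))        # each value within step/2 of input
    ```

    A pel-adaptive scheme in the spirit of the paper would run several such loops with different predictors or step sizes and choose among them per pixel.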

  16. Designing of smart home automation system based on Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-03-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages, and they are well suited to academic research. This paper proposes a method for implementing a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access a graph of total power consumption with respect to time from anywhere in the world using their Dropbox account. An Android application has been developed to channelize the monitoring and controlling of home appliances remotely. The application controls the output pins of the Raspberry Pi: pressing the corresponding key turns any desired appliance "on" or "off". Systems can range from simple room-lighting control to smart microcontroller-based hybrid systems incorporating several additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through a cloud-based data storage protocol, reliability, energy efficiency, etc.
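    The key-to-pin control logic can be sketched without hardware; on a real Raspberry Pi these writes would go through a GPIO library, but here a plain dict stands in for the pins, and the pin numbers and appliance names are made up:

    ```python
    # A dict stands in for the Raspberry Pi GPIO pins so the key-to-pin logic
    # runs anywhere; pin numbers and appliance names are illustrative.
    APPLIANCE_PIN = {"lamp": 17, "fan": 27, "heater": 22}
    pins = {pin: False for pin in APPLIANCE_PIN.values()}   # False = off

    def handle_key(appliance, turn_on):
        """Mimic the app's key press: set the appliance's pin and report it."""
        pins[APPLIANCE_PIN[appliance]] = turn_on
        state = "on" if turn_on else "off"
        return f"{appliance} {state} (GPIO {APPLIANCE_PIN[appliance]})"

    print(handle_key("lamp", True))    # lamp on (GPIO 17)
    print(handle_key("fan", False))    # fan off (GPIO 27)
    ```

    On the actual board, the dict assignment would be replaced by the GPIO library's output call for the mapped pin.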

  17. Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.; Roth, Emilie

    2014-01-01

    Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced a class of failure modes, and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex inter-related factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains, and have resulted in increased efforts to develop training, procedures, regulations and guidance material (CAST, 2008, IAEA, 2001, FAA, 2013, ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being presented. We will describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, and describe some approaches that could be used to address these challenges. We will draw from experience with the aviation, spaceflight and nuclear power domains.

  18. The design of an automated verification of redundant systems

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1972-01-01

    Handbook describes design processes, presents design considerations and techniques, gives tutorial material on implementation and methodology, shows design aids, illustrates use of design aids and application samples, and identifies general practices to be adhered to or avoided.

  19. Dynamics of adaptive structures: Design through simulations

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alexander, S.

    1993-01-01

    The helical bi-morph actuator/sensor concept, which mimics the change of helical waveform in bacterial flagella, is perhaps the first application of the motions of living species to the longitudinal deployment of space structures. However, no dynamical analysis was performed to explain the waveform change mechanisms. The objective is to review various deployment concepts from the dynamics point of view and to introduce dynamical considerations from the outset as part of the design process. Specifically, the impact of incorporating combined static mechanisms and dynamic design considerations on deployment performance during the reconfiguration stage is studied in terms of improved controllability, maneuvering duration, and a joint singularity index. It is shown that intermediate configurations during articulation play an important role in improved joint mechanism design and overall structural deployability.

  20. Mission Design Evaluation Using Automated Planning for High Resolution Imaging of Dynamic Surface Processes from the ISS

    NASA Technical Reports Server (NTRS)

    Knight, Russell; Donnellan, Andrea; Green, Joseph J.

    2013-01-01

    A challenge for any proposed mission is to demonstrate convincingly that the proposed systems will in fact deliver the science promised. Funding agencies and mission design personnel are becoming ever more skeptical of the abstractions that form the basis of the current state of the practice with respect to approximating science return. To address this, we have been using automated planning and scheduling technology to provide actual coverage campaigns that provide better predictive performance with respect to science return for a given mission design and set of mission objectives given implementation uncertainties. Specifically, we have applied an adaptation of ASPEN and SPICE to the Eagle-Eye domain that demonstrates the performance of the mission design with respect to coverage of science imaging targets that address climate change and disaster response. Eagle-Eye is an Earth-imaging telescope that has been proposed to fly aboard the International Space Station (ISS).

  1. An automated tool for the design and assessment of space systems

    NASA Technical Reports Server (NTRS)

    Dalcambre, Lois M. L.; Landry, Steve P.

    1990-01-01

    Space systems can be characterized as both large and complex, but they often rely on reusable subcomponents. One problem in the design of such systems is the representation and validation of the system, particularly at the higher levels of management. An automated tool is described for the representation, refinement, and validation of such complex systems based on a formal design theory, the Theory of Plausible Design. In particular, the steps necessary to automate the tool and make it a competent, usable assistant are described.

  2. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift

    PubMed Central

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  3. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift.

    PubMed

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable efforts in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  4. Designing Microcomputer Networks (and) LANS: A New Technology to Improve Library Automation.

    ERIC Educational Resources Information Center

    Ivie, Evan L.; Farr, Rick C.

    1984-01-01

    Two articles address the design of microcomputer networks and the use of local area computer networks (LAN) to improve library automation. Topics discussed include network design criteria, media for local networks, transmission mode, typical communication protocols, user interface, basic local network architectures, and examples of microcomputer…

  5. Reliability-Based Design of a Safety-Critical Automation System: A Case Study

    NASA Technical Reports Server (NTRS)

    Carroll, Carol W.; Dunn, W.; Doty, L.; Frank, M. V.; Hulet, M.; Alvarez, Teresa (Technical Monitor)

    1994-01-01

    In 1986, NASA funded a project to modernize the NASA Ames Research Center Unitary Plan Wind Tunnels, including the replacement of obsolescent controls with a modern, automated distributed control system (DCS). The project effort on this system included an independent safety analysis (ISA) of the automation system. The purpose of the ISA was to evaluate the completeness of the hazard analyses which had already been performed on the Modernization Project. The ISA approach followed a tailoring of the risk assessment approach widely used on existing nuclear power plants. The tailoring of the nuclear industry oriented risk assessment approach to the automation system and its role in reliability-based design of the automation system is the subject of this paper.

  6. Design of Low Complexity Model Reference Adaptive Controllers

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Schaefer, Jacob; Johnson, Marcus; Nguyen, Nhan

    2012-01-01

    Flight research experiments have demonstrated that adaptive flight controls can be an effective technology for improving aircraft safety in the event of failures or damage. However, the nonlinear, time-varying nature of adaptive algorithms continues to challenge traditional methods for the verification and validation testing of safety-critical flight control systems. Increasingly complex adaptive control theories and designs are emerging, which only makes these testing challenges more difficult. A potential first step toward the acceptance of adaptive flight controllers by aircraft manufacturers, operators, and certification authorities is a very simple design that operates as an augmentation to a non-adaptive baseline controller. Three such controllers were developed as part of a National Aeronautics and Space Administration flight research experiment to determine the appropriate level of complexity required to restore acceptable handling qualities to an aircraft that has suffered failures or damage. The controllers consist of the same basic design but incorporate incrementally increasing levels of complexity. Derivations of the controllers and their adaptive parameter update laws are presented along with details of the controllers' implementations.

  7. The Potential of Adaptive Design in Animal Studies.

    PubMed

    Majid, Arshad; Bae, Ok-Nam; Redgrave, Jessica; Teare, Dawn; Ali, Ali; Zemke, Daniel

    2015-01-01

    Clinical trials are the backbone of medical research, and are often the last step in the development of new therapies for use in patients. Prior to human testing, however, preclinical studies using animal subjects are usually performed in order to provide initial data on the safety and effectiveness of prospective treatments. These studies can be costly and time consuming, and may also raise concerns about the ethical treatment of animals when potentially harmful procedures are involved. Adaptive design is a process by which the methods used in a study may be altered while it is being conducted in response to preliminary data or other new information. Adaptive design has been shown to be useful in reducing the time and costs associated with clinical trials, and may provide similar benefits in preclinical animal studies. The purpose of this review is to summarize various aspects of adaptive design and evaluate its potential for use in preclinical research. PMID:26473839

  8. The Potential of Adaptive Design in Animal Studies

    PubMed Central

    Majid, Arshad; Bae, Ok-Nam; Redgrave, Jessica; Teare, Dawn; Ali, Ali; Zemke, Daniel

    2015-01-01

    Clinical trials are the backbone of medical research, and are often the last step in the development of new therapies for use in patients. Prior to human testing, however, preclinical studies using animal subjects are usually performed in order to provide initial data on the safety and effectiveness of prospective treatments. These studies can be costly and time consuming, and may also raise concerns about the ethical treatment of animals when potentially harmful procedures are involved. Adaptive design is a process by which the methods used in a study may be altered while it is being conducted in response to preliminary data or other new information. Adaptive design has been shown to be useful in reducing the time and costs associated with clinical trials, and may provide similar benefits in preclinical animal studies. The purpose of this review is to summarize various aspects of adaptive design and evaluate its potential for use in preclinical research. PMID:26473839

  9. Automation and Accountability in Decision Support System Interface Design

    ERIC Educational Resources Information Center

    Cummings, Mary L.

    2006-01-01

    When the human element is introduced into decision support system design, entirely new layers of social and ethical issues emerge but are not always recognized as such. This paper discusses those ethical and social impact issues specific to decision support systems and highlights areas that interface designers should consider during design with an…

  10. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
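    A toy version of device-value evolution (topology evolution and parallelism omitted) conveys the idea; the component, target cutoff frequency, and GA parameters are all illustrative:

    ```python
    import math, random

    # Toy genetic algorithm evolving one device value (a resistor) so an RC
    # low-pass filter hits a 1 kHz cutoff; C is fixed at 100 nF. All parameters
    # are illustrative assumptions, not the paper's circuit-construction language.
    C = 100e-9
    TARGET_HZ = 1000.0

    def fitness(r):
        cutoff = 1.0 / (2 * math.pi * r * C)
        return -abs(cutoff - TARGET_HZ)            # closer to target = fitter

    def evolve(pop_size=30, generations=80, seed=1):
        rng = random.Random(seed)
        pop = [rng.uniform(100.0, 100000.0) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]          # elitist selection
            children = [max(1.0, p * rng.gauss(1.0, 0.05))   # ~5% mutation
                        for p in rng.choices(parents, k=pop_size - len(parents))]
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print(f"R = {best:.0f} ohm")    # converges near the ideal ~1592 ohm
    ```

    The paper's system additionally evolves circuit topology and device counts and distributes the evaluation across processors; this sketch keeps only the selection-mutation core.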

  11. Application of the Modular Automated Reconfigurable Assembly System (MARAS) concept to adaptable vision gauging and parts feeding

    NASA Technical Reports Server (NTRS)

    By, Andre Bernard; Caron, Ken; Rothenberg, Michael; Sales, Vic

    1994-01-01

    This paper presents the first phase results of a collaborative effort between university researchers and a flexible assembly systems integrator to implement a comprehensive modular approach to flexible assembly automation. This approach, named MARAS (Modular Automated Reconfigurable Assembly System), has been structured to support multiple levels of modularity in terms of both physical components and system control functions. The initial focus of the MARAS development has been on parts gauging and feeding operations for cylinder lock assembly. This phase is nearing completion and has resulted in the development of a highly configurable system for vision gauging functions on a wide range of small components (2 mm to 100 mm in size). The reconfigurable concepts implemented in this adaptive Vision Gauging Module (VGM) are now being extended to applicable aspects of the singulating, selecting, and orienting functions required for the flexible feeding of similar mechanical components and assemblies.

  12. The design of digital-adaptive controllers for VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Stengel, R. F.; Broussard, J. R.; Berry, P. W.

    1976-01-01

    Design procedures for VTOL automatic control systems have been developed and are presented. Using linear-optimal estimation and control techniques as a starting point, digital-adaptive control laws have been designed for the VALT Research Aircraft, a tandem-rotor helicopter which is equipped for fully automatic flight in terminal area operations. These control laws are designed to interface with velocity-command and attitude-command guidance logic, which could be used in short-haul VTOL operations. Developments reported here include new algorithms for designing non-zero-set-point digital regulators, design procedures for rate-limited systems, and algorithms for dynamic control trim setting.
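    The non-zero-set-point regulator idea can be sketched on a scalar discrete plant; the plant coefficients, gain, and set point below are assumptions, not the VALT helicopter model:

    ```python
    # Scalar discrete plant x[k+1] = a*x[k] + b*u[k]; hold x at a non-zero set
    # point. The trim input u_ss satisfies x_set = a*x_set + b*u_ss, and feedback
    # corrects deviations. Coefficients, gain, and set point are illustrative.
    a, b = 0.9, 0.5
    x_set = 2.0
    u_ss = (1 - a) * x_set / b         # steady-state trim input
    K = 1.2                            # feedback gain (assumed, e.g. from LQ design)

    x = 0.0
    for _ in range(40):
        u = u_ss + K * (x_set - x)     # command = trim + feedback on deviation
        x = a * x + b * u

    print(round(x, 3))                 # 2.0: the state settles on the set point
    ```

    Without the trim term, pure feedback on the deviation would leave a steady-state offset; computing `u_ss` from the plant model is the essence of a non-zero-set-point regulator.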

  13. CAMERA: a compact, automated, laser adaptive optics system for small aperture telescopes

    NASA Astrophysics Data System (ADS)

    Britton, Matthew; Velur, Viswa; Law, Nick; Choi, Philip; Penprase, Bryan E.

    2008-07-01

    CAMERA is an autonomous laser guide star adaptive optics system designed for small aperture telescopes. This system is intended to be mounted permanently on such a telescope to provide large amounts of flexibly scheduled observing time, delivering high angular resolution imagery in the visible and near infrared. The design employs a Shack-Hartmann wavefront sensor, a 12x12 actuator MEMS device for high order wavefront compensation, and a solid state 355 nm Nd:YAG laser to generate a guide star. Commercial CCD and InGaAs detectors provide coverage in the visible and near infrared. CAMERA operates by selecting targets from a queue populated by users and executing these observations autonomously. This robotic system is targeted toward applications that are difficult to address using classical observing strategies: surveys of very large target lists, recurrently scheduled observations, and rapid-response followup of transient objects. The system has been designed and costed, and a lab testbed has been developed to evaluate key components and validate autonomous operations.

  14. Precision of maximum likelihood estimation in adaptive designs.

    PubMed

    Graf, Alexandra Christine; Gutjahr, Georg; Brannath, Werner

    2016-03-15

    There has been increasing interest in trials that allow for design adaptations like sample size reassessment or treatment selection at an interim analysis. Ignoring the adaptive and multiplicity issues in such designs leads to an inflation of the type 1 error rate, and treatment effect estimates based on the maximum likelihood principle become biased. Whereas the methodological issues concerning hypothesis testing are well understood, it is not clear how to deal with parameter estimation in designs where adaptation rules are not fixed in advance, so that, in practice, the maximum likelihood estimate (MLE) is used. It is therefore important to understand the behavior of the MLE in such designs. The investigation of bias and mean squared error (MSE) is complicated by the fact that the adaptation rules need not be fully specified in advance and, hence, are usually unknown. To investigate bias and MSE under such circumstances, we search for the sample size reassessment and selection rules that lead to the maximum bias or maximum MSE. Generally, this leads to an overestimation of bias and MSE, which can be reduced by imposing realistic constraints on the rules, such as a maximum sample size. We consider designs that start with k treatment groups and a common control, where selection of a single treatment and control is performed at the interim analysis with the possibility of reassessing each of the sample sizes. We consider the case of unlimited sample size reassessment as well as several realistically restricted sample size reassessment rules. PMID:26459506
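    A small Monte Carlo experiment illustrates the selection bias discussed above: with several arms of equal true effect, estimating the interim-selected "best" arm from its own data biases the MLE upward. Arm count, sample size, and variance are illustrative, and no sample size reassessment is modeled:

    ```python
    import random

    # k arms with the same true mean; pick the best-looking arm and use its own
    # sample mean as the MLE. Averaged over many simulated trials, the estimate
    # sits above the true value. All simulation parameters are illustrative.
    def selected_arm_estimate(k=4, n=50, true_mean=0.0, rng=random):
        arm_means = [sum(rng.gauss(true_mean, 1.0) for _ in range(n)) / n
                     for _ in range(k)]
        return max(arm_means)          # MLE for the arm selected as "best"

    rng = random.Random(0)
    avg = sum(selected_arm_estimate(rng=rng) for _ in range(2000)) / 2000
    print(avg > 0.1)                   # True: clear positive bias though truth is 0
    ```

    The paper's harder question is the worst case of this bias when the selection and reassessment rules themselves are unknown, which this fixed-rule simulation does not address.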

  15. Automating software design and configuration for a small spacecraft

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

    The Open Prototype for Educational NanoSats (OPEN) is a framework for the development of low-cost spacecraft. It will allow users to build a 1-U (10 cm x 10 cm x 11 cm, 1.33 kg) CubeSat-class spacecraft with a parts budget of approximately $5,000. Work is underway to develop software to assist users in configuring the spacecraft and validating its compliance with integration and launch standards. Each prospective configuration requires a unique software configuration, combining pre-built modules for controlling base components, custom control software for custom-developed and payload components, and overall mission management and control software (which itself will be a combination of standard components and mission-specific control logic). This paper presents a system for automating standard component configuration and creating templates to facilitate the creation and integration of components that must be (or which the developer desires to be) custom-developed for the particular mission or spacecraft.

  16. Design of microcontroller based system for automation of streak camera

    SciTech Connect

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-08-15

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  17. Design of microcontroller based system for automation of streak camera

    NASA Astrophysics Data System (ADS)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-08-01

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  18. Missile guidance law design using adaptive cerebellar model articulation controller.

    PubMed

    Lin, Chih-Min; Peng, Ya-Fu

    2005-05-01

    An adaptive cerebellar model articulation controller (CMAC) is proposed for command to line-of-sight (CLOS) missile guidance law design. In this design, the three-dimensional (3-D) CLOS guidance problem is formulated as a tracking problem of a time-varying nonlinear system. The adaptive CMAC control system is comprised of a CMAC and a compensation controller. The CMAC control is used to imitate a feedback linearization control law and the compensation controller is utilized to compensate the difference between the feedback linearization control law and the CMAC control. The online adaptive law is derived based on the Lyapunov stability theorem to learn the weights of receptive-field basis functions in CMAC control. In addition, in order to relax the requirement of approximation error bound, an estimation law is derived to estimate the error bound. Then the adaptive CMAC control system is designed to achieve satisfactory tracking performance. Simulation results for different engagement scenarios illustrate the validity of the proposed adaptive CMAC-based guidance law. PMID:15940993
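
    For readers unfamiliar with the CMAC architecture, the sketch below is a minimal, generic one-dimensional tile-coding CMAC trained with a simple LMS rule to approximate a nonlinear function. It illustrates the receptive-field table lookup the controller relies on, not the paper's Lyapunov-derived adaptive law or the guidance application; all parameter values are illustrative.

```python
import numpy as np

class CMAC:
    """Minimal 1-D cerebellar model articulation controller (tile coding)."""
    def __init__(self, n_tilings=8, n_tiles=32, lo=-1.0, hi=1.0, lr=0.1):
        self.n_tilings, self.n_tiles = n_tilings, n_tiles
        self.lo, self.width = lo, (hi - lo) / n_tiles
        self.w = np.zeros((n_tilings, n_tiles + 1))   # one weight table per tiling
        self.lr = lr / n_tilings

    def _active(self, x):
        # each tiling is offset by a fraction of a tile width
        offsets = np.arange(self.n_tilings) / self.n_tilings * self.width
        idx = ((x - self.lo + offsets) / self.width).astype(int)
        return np.clip(idx, 0, self.n_tiles)

    def predict(self, x):
        return self.w[np.arange(self.n_tilings), self._active(x)].sum()

    def train(self, x, target):
        # LMS update on the weights of the currently active receptive fields
        err = target - self.predict(x)
        self.w[np.arange(self.n_tilings), self._active(x)] += self.lr * err

# learn sin(pi * x) on [-1, 1]
cmac = CMAC()
rng = np.random.default_rng(1)
for _ in range(5000):
    x = rng.uniform(-1, 1)
    cmac.train(x, np.sin(np.pi * x))
print(f"abs error at x=0.5: {abs(cmac.predict(0.5) - np.sin(np.pi * 0.5)):.3f}")
```

    The overlapping tilings give the coarse-coded generalization that makes CMAC training fast, which is why it suits online adaptive control loops like the one in the paper.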

  19. Using IDE in Instructional Design: Encouraging Reflective Instruction Design Through Automated Design Tools.

    ERIC Educational Resources Information Center

    Russell, Daniel M.; Kelley, Loretta

    The Instructional Design Environment (IDE), a computer assisted instruction tool for instructional design, has been incorporated into the curriculum and instructional development in mathematics instruction in the Stanford Teacher Education Program (STEP). (STEP is a 12-month program leading to an M.A. in education which emphasizes content focus…

  20. Automated design of a uniform distribution using faceted reflectors

    NASA Astrophysics Data System (ADS)

    Cassarly, William J.; David, Stuart R.; Jenkins, David G.; Riser, Andrew; Davenport, Thomas L.

    2000-07-01

    Faceted reflectors are a ubiquitous means for providing uniform illumination in many commercial lighting products, examples being newer flashlights, department-store display lighting, and the faceted reflectors found in overhead projectors. However, the design of faceted reflectors using software has often been more limited by the tools available to design them than by the imagination of the designers. One of the keys to enabling a broader range of design options has been to allow more complex surfaces using constructive solid geometry (CSG). CSG uses Boolean operations on basic geometric primitives to define shapes to create individual facets. In this paper, we describe an improved faceted reflector design algorithm and use it to create a wide range of CSG-based reflectors. The performance of various reflectors is compared using a Monte Carlo ray-trace method.

  1. Experimental Design to Evaluate Directed Adaptive Mutation in Mammalian Cells

    PubMed Central

    Chiaro, Christopher R; May, Tobias

    2014-01-01

    Background We describe the experimental design for a methodological approach to determine whether directed adaptive mutation occurs in mammalian cells. Identification of directed adaptive mutation would have profound practical significance for a wide variety of biomedical problems, including disease development and resistance to treatment. In adaptive mutation, the genetic or epigenetic change is not random; instead, the presence and type of selection influences the frequency and character of the mutation event. Adaptive mutation can contribute to the evolution of microbial pathogenesis, cancer, and drug resistance, and may become a focus of novel therapeutic interventions. Objective Our experimental approach was designed to distinguish between 3 types of mutation: (1) random mutations that are independent of selective pressure, (2) undirected adaptive mutations that arise when selective pressure induces a general increase in the mutation rate, and (3) directed adaptive mutations that arise when selective pressure induces targeted mutations that specifically influence the adaptive response. The purpose of this report is to introduce an experimental design and describe limited pilot experiment data (not to describe a complete set of experiments); hence, it is an early report. Methods An experimental design based on immortalization of mouse embryonic fibroblast cells is presented that links clonal cell growth to reversal of an inactivating polyadenylation site mutation. Thus, cells exhibit growth only in the presence of both the countermutation and an inducing agent (doxycycline). The type and frequency of mutation in the presence or absence of doxycycline will be evaluated. Additional experimental approaches would determine whether the cells exhibit a generalized increase in mutation rate and/or whether the cells show altered expression of error-prone DNA polymerases or of mismatch repair proteins. Results We performed the initial stages of characterizing our system

  2. Adaptive multibeam phased array design for a Spacelab experiment

    NASA Technical Reports Server (NTRS)

    Noji, T. T.; Fass, S.; Fuoco, A. M.; Wang, C. D.

    1977-01-01

    The parametric tradeoff analyses and design for an Adaptive Multibeam Phased Array (AMPA) for a Spacelab experiment are described. This AMPA Experiment System was designed with particular emphasis on maximizing channel capacity and minimizing implementation and cost impacts for future austere maritime and aeronautical users, operating with a low gain hemispherical coverage antenna element, low effective radiated power, and low antenna gain-to-system noise temperature ratio.

  3. One window on the world of design automation

    SciTech Connect

    Stamm, D.A.

    1983-01-01

    Rapidly increasing VLSI design complexity is most often cited as the primary reason for the emergence of the engineering workstation. Many other factors, however, play an even more crucial role in the development of this new technology, including the advent of custom devices, the standardisation of logic design methodology, and advances in memory density, display technology and disc technology. A discussion of these factors serves as an introduction to a description of the Daisy Logician workstation.

  4. Automated design of the surface positions of protein helices.

    PubMed Central

    Dahiyat, B. I.; Gordon, D. B.; Mayo, S. L.

    1997-01-01

    Using a protein design algorithm that quantitatively considers side-chain interactions, the design of surface residues of alpha helices was examined. Three scoring functions were tested: a hydrogen-bond potential, a hydrogen-bond potential in conjunction with a penalty for uncompensated burial of polar hydrogens, and a hydrogen-bond potential in combination with helix propensity. The solvent exposed residues of a homodimeric coiled coil based on GCN4-p1 were designed by using the Dead-End Elimination Theorem to find the optimal amino acid sequence for each scoring function. The corresponding peptides were synthesized and characterized by circular dichroism spectroscopy and size exclusion chromatography. The designed peptides were dimeric and nearly 100% helical at 1 degree C, with melting temperatures from 69-72 degrees C, over 12 degrees C higher than GCN4-p1, whereas a random hydrophilic sequence at the surface positions produced a peptide that melted at 15 degrees C. Analysis of the designed sequences suggests that helix propensity is the key factor in sequence design for surface helical positions. PMID:9194194
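
    The Dead-End Elimination Theorem cited above prunes rotamers that provably cannot be part of the global minimum-energy sequence. The sketch below applies the Goldstein elimination criterion to small random self and pairwise energy tables; the energies are illustrative placeholders, not the paper's hydrogen-bond or helix-propensity potentials.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pos, n_rot = 4, 3
E_self = rng.normal(0, 1, (n_pos, n_rot))                # self energies
E_pair = rng.normal(0, 1, (n_pos, n_rot, n_pos, n_rot))  # pairwise energies
for i in range(n_pos):                                   # no same-position pairs
    E_pair[i, :, i, :] = 0.0
E_pair = (E_pair + E_pair.transpose(2, 3, 0, 1)) / 2     # symmetrize

alive = np.ones((n_pos, n_rot), bool)

def goldstein_pass():
    """One sweep of the Goldstein DEE criterion; returns True if anything pruned."""
    pruned = False
    for i in range(n_pos):
        for r in range(n_rot):
            if not alive[i, r]:
                continue
            for t in range(n_rot):
                if t == r or not alive[i, t]:
                    continue
                # r is a dead end if switching r -> t lowers the energy
                # no matter which rotamers the other positions adopt
                gap = E_self[i, r] - E_self[i, t]
                for j in range(n_pos):
                    if j == i:
                        continue
                    diffs = E_pair[i, r, j] - E_pair[i, t, j]
                    gap += diffs[alive[j]].min()
                if gap > 0:
                    alive[i, r] = False
                    pruned = True
                    break
    return pruned

while goldstein_pass():
    pass
print("rotamers remaining per position:", alive.sum(axis=1))
```

    Because the rotamer belonging to the global minimum-energy conformation can never satisfy the elimination inequality, at least one rotamer always survives at each position, and repeated sweeps shrink the search space the scoring functions must explore.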

  5. Interactive spatial tools for the design of regional adaptation strategies.

    PubMed

    Eikelboom, T; Janssen, R

    2013-09-01

    Regional adaptation strategies are plans that consist of feasible measures to shift a region towards a system that is flexible and robust to future climate changes. They apply to regional impacts of climate change and are embedded in broader planning. Multiple adaptation frameworks and guidelines exist that describe the development stages of regional adaptation strategies. Spatial information plays a key role in the design of adaptation measures, as both the effects of climate change and many adaptation measures have spatial impacts. Interactive spatial support tools such as drawing, simulation and evaluation tools can assist the development process. This paper shows how to connect tasks derived from the actual development stages to spatial support tools in an interactive multi-stakeholder context. This link helps to decide which spatial tools are suited to support which stages in the development process of regional adaptation strategies. The practical implication of the link is illustrated for three case study workshops in the Netherlands. The regional planning workshops combine expertise from both scientists and stakeholders with an interactive mapping device. This approach triggered participants to share their expertise and stimulated integration of knowledge. PMID:23137917

  6. Automation and Schema Acquisition in Learning Elementary Computer Programming: Implications for the Design of Practice.

    ERIC Educational Resources Information Center

    Van Merrienboer, Jeroen J. G.; Paas, Fred G. W. C.

    1990-01-01

    Discussion of computer programming at the secondary level focuses on automation and schema acquisition as two processes important in learning cognitive skills such as programming. Their effects on learning outcomes and transfer of training are examined, the importance of worked examples is highlighted, and instructional design principles are…

  7. Design and development of a semi-automated module for the preparation of metallic PET radionuclides

    NASA Astrophysics Data System (ADS)

    Trejo-Ballado, F.; Lopez-Rodriguez, V.; Gaspar-Carcamo, R. E.; Hurtado-Chong, G.; Avila-Rodriguez, Miguel A.

    2012-12-01

    Methods for the production of metallic radionuclides have been widely reported, and most share a common ion chromatography purification technique. The aim of this work is to design and develop a semi-automated, remotely controlled module for the purification of metallic PET radionuclides via cation exchange chromatography.

  8. Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu; Campbell, Richard L.

    2014-01-01

    The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.

  9. Design, development, test, and evaluation of an automated analytical electrophoresis apparatus

    NASA Technical Reports Server (NTRS)

    Bartels, P. A.; Bier, M.

    1977-01-01

    An Automated Analytical Electrophoresis Apparatus (AAEA) was designed, developed, assembled, and preliminarily tested. The AAEA was demonstrated to be a feasible apparatus for automatically acquiring, displaying, and storing (and eventually analyzing) electrophoresis mobility data from living blood cells. The apparatus and the operation of its major assemblies are described in detail.

  10. Graphic design principles for automated document segmentation and understanding

    NASA Astrophysics Data System (ADS)

    Vega-Riveros, J. Fernando; Santos Villalobos, Hector J.

    2006-01-01

    When designers develop a document layout, their objective is to convey a specific message and provoke a specific response from the audience. Design principles provide the foundation for identifying document components and relations among them to extract implicit knowledge from the layout. Variable Data Printing enables the production of personalized printing jobs for which traditional proofing of all the job instances could be unfeasible. This paper explains a rule-based system that uses design principles to segment and understand document content. The system uses the design principles of repetition, proximity, alignment, similarity, and contrast as the foundation for its document segmentation and understanding strategy, which is closely related to the recognition of artifacts produced by the infringement of the constraints articulated in the document layout. There are two main modules in the tool: the geometric analysis module and the design rule engine. The geometric analysis module extracts explicit knowledge from the data provided in the document. The design rule module uses the information provided by the geometric analysis to establish logical units inside the document. We used a subset of XSL-FO, sufficient for designing documents with an adequate amount of complexity. The system identifies components such as headers, paragraphs, lists, and images, and determines the relations between them, such as header-paragraph, header-list, etc. The system provides accurate information about the geometric properties of the components, detects the elements of the documents, and identifies corresponding components between a proofed instance and the rest of the instances in a Variable Data Printing job.

  11. Frequency Adaptability and Waveform Design for OFDM Radar Space-Time Adaptive Processing

    SciTech Connect

    Sen, Satyabrata; Glover, Charles Wayne

    2012-01-01

    We propose an adaptive waveform design technique for an orthogonal frequency division multiplexing (OFDM) radar signal employing a space-time adaptive processing (STAP) technique. We observe that there are inherent variabilities of the target and interference responses in the frequency domain. Therefore, the use of an OFDM signal can not only increase the frequency diversity of our system, but also improve the target detectability by adaptively modifying the OFDM coefficients in order to exploit the frequency-variabilities of the scenario. First, we formulate a realistic OFDM-STAP measurement model considering the sparse nature of the target and interference spectra in the spatio-temporal domain. Then, we show that the optimal STAP-filter weight-vector is equal to the generalized eigenvector corresponding to the minimum generalized eigenvalue of the interference and target covariance matrices. With numerical examples we demonstrate that the resultant OFDM-STAP filter-weights are adaptable to the frequency-variabilities of the target and interference responses, in addition to the spatio-temporal variabilities. Hence, by better utilizing the frequency variabilities, we propose an adaptive OFDM-waveform design technique, and consequently gain a significant amount of STAP-performance improvement.
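
    The generalized-eigenvector solution mentioned in the abstract can be computed by whitening with the interference covariance and solving an ordinary Hermitian eigenproblem. The sketch below uses small synthetic covariance matrices (illustrative stand-ins, not an actual OFDM-STAP measurement model) and checks the resulting filter against the known rank-one optimum, for which the maximal SINR equals s^H R^-1 s.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6  # spatio-temporal degrees of freedom

# hypothetical covariance matrices (Hermitian, positive definite)
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
R = A @ A.conj().T + n * np.eye(n)           # interference + noise covariance
s = rng.normal(size=n) + 1j * rng.normal(size=n)
S = np.outer(s, s.conj())                    # rank-1 target covariance

# Whiten with Cholesky (R = L L^H), then solve an ordinary eigenproblem:
# maximizing w^H S w / w^H R w is the generalized eigenvalue problem.
L = np.linalg.cholesky(R)
Linv = np.linalg.inv(L)
M = Linv @ S @ Linv.conj().T
vals, vecs = np.linalg.eigh(M)               # eigenvalues in ascending order
w = Linv.conj().T @ vecs[:, -1]              # principal generalized eigenvector

sinr = (w.conj() @ S @ w).real / (w.conj() @ R @ w).real
print(f"output SINR of generalized-eigenvector filter: {sinr:.3f}")
```

    For a rank-one target covariance this recovers the classical w proportional to R^-1 s matched filter; the abstract's formulation in terms of the minimum generalized eigenvalue of the (interference, target) pair is the same optimum stated the other way around.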

  12. Automation for pattern library creation and in-design optimization

    NASA Astrophysics Data System (ADS)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    Semiconductor manufacturing technologies are becoming increasingly complex with every passing node. Newer technology nodes are pushing the limits of optical lithography and requiring multiple exposures with exotic material stacks for each critical layer. All of this added complexity usually amounts to further restrictions in what can be designed. Furthermore, the designs must be checked against all these restrictions in verification and sign-off stages. Design rules are intended to capture all the manufacturing limitations such that yield can be maximized for any given design adhering to all the rules. Most manufacturing steps employ some sort of model-based simulation which characterizes the behavior of each step. The lithography models play a very big part in the overall yield and design restrictions in patterning. However, lithography models are not practical to run during design creation due to their slow and prohibitive run times. Furthermore, the models are not usually given to foundry customers because of the confidential and sensitive nature of every foundry's processes. The design layout locations where a model flags unacceptable simulated results can be used to define pattern rules which can be shared with customers. With advanced technology nodes we see a large growth of pattern-based rules. This is because pattern matching is very fast and the rules themselves can be very complex to describe in a standard DRC language. Therefore, the patterns are left as either pattern layout clips or abstracted into pattern-like syntax which a pattern matcher can use directly. The patterns themselves can be multi-layered with "fuzzy" designations such that groups of similar patterns can be found using one description. The pattern matcher is often integrated with a DRC tool such that verification and signoff can be done in one step. The patterns can be layout constructs that are "forbidden", "waived", or simply low-yielding in nature. The patterns can also

  13. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The Qualitative Simulation Tool (QST), an expert-system-like model building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components are discussed.

  14. Scar-less multi-part DNA assembly design automation

    DOEpatents

    Hillson, Nathan J.

    2016-06-07

    The present invention provides a method of designing an implementation of a DNA assembly. In an exemplary embodiment, the method includes (1) receiving a list of DNA sequence fragments to be assembled together and an order in which to assemble the DNA sequence fragments, (2) designing DNA oligonucleotides (oligos) for each of the DNA sequence fragments, and (3) creating a plan for adding flanking homology sequences to each of the DNA oligos. In an exemplary embodiment, the method includes (1) receiving a list of DNA sequence fragments to be assembled together and an order in which to assemble the DNA sequence fragments, (2) designing DNA oligonucleotides (oligos) for each of the DNA sequence fragments, and (3) creating a plan for adding optimized overhang sequences to each of the DNA oligos.
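
    As an illustration of the flanking-homology step, the sketch below builds Gibson-style PCR primers whose 5' tails are copied from the neighbouring fragments in the assembly order (assuming a circular assembly). The overlap and primer lengths and the fragment sequences are assumed for illustration; this is a generic sketch, not the patented method.

```python
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMP)[::-1]

def add_flanking_homology(fragments, overlap=20, primer_len=18):
    """For each fragment (in assembly order), build forward/reverse primers
    whose 5' tails overlap the neighbouring fragments (circular assembly)."""
    designs = []
    n = len(fragments)
    for i, frag in enumerate(fragments):
        prev_frag = fragments[(i - 1) % n]
        next_frag = fragments[(i + 1) % n]
        # forward primer: tail from the previous fragment's 3' end
        fwd = prev_frag[-overlap:] + frag[:primer_len]
        # reverse primer: anneals to this fragment's 3' end, tail reaches
        # into the next fragment's 5' start
        rev = revcomp(frag[-primer_len:] + next_frag[:overlap])
        designs.append({"fragment": i, "fwd_primer": fwd, "rev_primer": rev})
    return designs

frags = ["ATGGCTAGCAAGGAGATATACCATGG" * 2,
         "GGATCCGAATTCGAGCTCCGTCGACA" * 2,
         "AAGCTTGCGGCCGCACTCGAGCACCA" * 2]
for d in add_flanking_homology(frags):
    print(d["fragment"], d["fwd_primer"][:30], "...")
```

    After PCR with these primers, every junction carries identical sequence on both fragments, which is what homology-based (scar-less) assembly chemistries exploit.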

  15. Automated Design and Optimization of Pebble-bed Reactor Cores

    SciTech Connect

    Hans D. Gougar; Abderrafi M. Ougouag; William K. Terry

    2010-07-01

    We present a conceptual design approach for high-temperature gas-cooled reactors using recirculating pebble-bed cores. The design approach employs PEBBED, a reactor physics code specifically designed to solve for and analyze the asymptotic burnup state of pebble-bed reactors, in conjunction with a genetic algorithm to obtain a core that maximizes a fitness value that is a function of user-specified parameters. The uniqueness of the asymptotic core state and the small number of independent parameters that define it suggest that core geometry and fuel cycle can be efficiently optimized toward a specified objective. PEBBED exploits a novel representation of the distribution of pebbles that enables efficient coupling of the burnup and neutron diffusion solvers. With this method, even complex pebble recirculation schemes can be expressed in terms of a few parameters that are amenable to modern optimization techniques. With PEBBED, the user chooses the type and range of core physics parameters that represent the design space. A set of traits, each with acceptable and preferred values expressed by a simple fitness function, is used to evaluate the candidate reactor cores. The stochastic search algorithm automatically drives the generation of core parameters toward the optimal core as defined by the user. The optimized design can then be modeled and analyzed in greater detail using higher resolution and more computationally demanding tools to confirm the desired characteristics. For this study, the design of pebble-bed high temperature reactor concepts subjected to demanding physical constraints demonstrated the efficacy of the PEBBED algorithm.
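
    The stochastic search the abstract describes can be illustrated with a bare-bones genetic algorithm. The fitness function below is a hypothetical stand-in (distance to a preferred point in a normalized parameter space) for a PEBBED core evaluation, which would be far more expensive; the population size, mutation scale, and selection scheme are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def fitness(p):
    # hypothetical stand-in for a PEBBED evaluation: penalize distance
    # from a "preferred" design point in normalized parameter space
    target = np.array([0.3, 0.7, 0.5])
    return -np.sum((p - target) ** 2)

def genetic_search(pop_size=40, n_gen=60, n_par=3, mut=0.05):
    pop = rng.uniform(0, 1, (pop_size, n_par))
    for _ in range(n_gen):
        scores = np.array([fitness(p) for p in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]          # truncation selection
        cut = rng.integers(1, n_par, pop_size // 2)    # one-point crossover
        mates = parents[rng.permutation(len(parents))]
        kids = np.where(np.arange(n_par) < cut[:, None], parents, mates)
        kids = kids + rng.normal(0, mut, kids.shape)   # Gaussian mutation
        pop = np.clip(np.vstack([parents, kids]), 0, 1)
    scores = np.array([fitness(p) for p in pop])
    return pop[scores.argmax()]

best = genetic_search()
print("best parameters found:", np.round(best, 2))
```

    Keeping the parents unmutated each generation makes the search elitist, so the best fitness never degrades, mirroring how the optimized core is retained for later high-resolution analysis.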

  16. Design preferences and cognitive styles: experimentation by automated website synthesis

    PubMed Central

    2012-01-01

    Background This article aims to demonstrate computational synthesis of Web-based experiments in undertaking experimentation on relationships among the participants' design preference, rationale, and cognitive test performance. The exemplified experiments were computationally synthesised, including the websites as materials, experiment protocols as methods, and cognitive tests as protocol modules. This work also exemplifies the use of a website synthesiser as an essential instrument enabling the participants to explore different possible designs, which were generated on the fly, before selection of preferred designs. Methods The participants were given interactive tree and table generators so that they could explore some different ways of presenting causality information in tables and trees as the visualisation formats. The participants gave their preference ratings for the available designs, as well as their rationale (criteria) for their design decisions. The participants were also asked to take four cognitive tests, which focus on the aspects of visualisation and analogy-making. The relationships among preference ratings, rationale, and the results of cognitive tests were analysed by conservative non-parametric statistics including the Wilcoxon test, Kruskal-Wallis test, and Kendall correlation. Results In the test, 41 of the total 64 participants preferred graphical (tree-form) to tabular presentation. Despite the popular preference for graphical presentation, the given tabular presentation was generally rated to be easier than graphical presentation to interpret, especially by those who scored lower in the visualisation and analogy-making tests. Conclusions This piece of evidence helps generate a hypothesis that design preferences are related to specific cognitive abilities. Without the use of computational synthesis, the experiment setup and scientific results would be impractical to obtain. PMID:22748000
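
    Of the conservative statistics listed, the Kendall correlation is simple enough to sketch directly. Below is a tau-a implementation (O(n^2), no tie correction) applied to hypothetical preference ratings and test scores; the data are invented for illustration, not taken from the study.

```python
def kendall_tau(x, y):
    """Kendall tau-a rank correlation: (concordant - discordant) / all pairs."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

prefs = [5, 3, 4, 1, 2]        # hypothetical preference ratings
scores = [90, 60, 80, 30, 50]  # hypothetical cognitive-test scores
print(kendall_tau(prefs, scores))  # -> 1.0 (perfectly concordant ranks)
```

    Because it depends only on pair orderings, Kendall's tau makes no distributional assumptions about the ratings, which is why it suits ordinal preference data like these.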

  17. Automated Verification of Design Patterns with LePUS3

    NASA Technical Reports Server (NTRS)

    Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick

    2009-01-01

    Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of the first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.

  18. An information theoretic approach of designing sparse kernel adaptive filters.

    PubMed

    Liu, Weifeng; Park, Il; Principe, José C

    2009-12-01

    This paper discusses an information theoretic approach of designing sparse kernel adaptive filters. To determine useful data to be learned and remove redundant ones, a subjective information measure called surprise is introduced. Surprise captures the amount of information a datum contains which is transferable to a learning system. Based on this concept, we propose a systematic sparsification scheme, which can drastically reduce the time and space complexity without harming the performance of kernel adaptive filters. Nonlinear regression, short term chaotic time-series prediction, and long term time-series forecasting examples are presented. PMID:19923047
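
    The surprise measure can be sketched under a Gaussian-process reading of the learned system: a datum's surprise is its negative log predictive likelihood given the current dictionary. The toy below (assumed kernel width, noise level, and dictionary; a simplified sketch, not the authors' full sparsification scheme) shows why a sample near an existing centre scores far lower than a novel one and can safely be discarded.

```python
import numpy as np

def gauss(a, b, width=1.0):
    return np.exp(-((a - b) ** 2) / (2 * width ** 2))

def surprise(u, y, centers, targets, noise=0.01):
    """Negative log predictive likelihood of (u, y) under a Gaussian-process
    view of the current dictionary -- an admission criterion for new data."""
    K = np.array([[gauss(c1, c2) for c2 in centers] for c1 in centers])
    k = np.array([gauss(u, c) for c in centers])
    w = np.linalg.solve(K + noise * np.eye(len(centers)), k)
    mean = w @ targets                   # predictive mean
    var = noise + gauss(u, u) - w @ k    # predictive variance
    return 0.5 * np.log(var) + (y - mean) ** 2 / (2 * var)

centers = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # current dictionary
targets = np.sin(centers)                        # values learned so far

s_redundant = surprise(0.1, np.sin(0.1), centers, targets)  # near a centre
s_novel = surprise(3.5, np.sin(3.5), centers, targets)      # far from all
print(f"surprise(redundant)={s_redundant:.2f}  surprise(novel)={s_novel:.2f}")
```

    Thresholding this quantity is what lets the filter keep only informative samples: low-surprise data add little predictive value, so dropping them reduces time and space complexity without harming performance.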

  19. Frequency based design of modal controllers for adaptive optics systems.

    PubMed

    Agapito, Guido; Battistelli, Giorgio; Mari, Daniele; Selvi, Daniela; Tesi, Alberto; Tesi, Pietro

    2012-11-19

    This paper addresses the problem of reducing the effects of wavefront distortions in ground-based telescopes within a "Modal-Control" framework. The proposed approach allows the designer to optimize the Youla parameter of a given modal controller with respect to a relevant adaptive optics performance criterion defined on a "sampled" frequency domain. This feature makes it possible to use turbulence/vibration profiles of arbitrary complexity (even empirical power spectral densities from data), while keeping the controller order at a moderate value. Effectiveness of the proposed solution is also illustrated through an adaptive optics numerical simulator. PMID:23187567

  20. A Hierarchical Adaptive Approach to Optimal Experimental Design

    PubMed Central

    Kim, Woojae; Pitt, Mark A.; Lu, Zhong-Lin; Steyvers, Mark; Myung, Jay I.

    2014-01-01

    Experimentation is at the core of research in the behavioral and neural sciences, yet observations can be expensive and time-consuming to acquire (e.g., MRI scans, responses from infant participants). A major interest of researchers is designing experiments that lead to maximal accumulation of information about the phenomenon under study with the fewest possible observations. In addressing this challenge, statisticians have developed adaptive design optimization methods. This letter introduces a hierarchical Bayes extension of adaptive design optimization that provides a judicious way to exploit two complementary schemes of inference (with past and future data) to achieve even greater accuracy and efficiency in information gain. We demonstrate the method in a simulation experiment in the field of visual perception. PMID:25149697
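
    The non-hierarchical core of adaptive design optimization can be sketched in a few lines: maintain a posterior over the parameter of interest and, before each trial, pick the stimulus that maximizes the mutual information between parameter and response. The toy below estimates a logistic threshold on a grid; the task, grid, slope, and trial count are all illustrative, and the letter's hierarchical Bayes extension is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)

thetas = np.linspace(0.1, 0.9, 41)    # candidate thresholds (grid prior)
designs = np.linspace(0.0, 1.0, 21)   # candidate stimulus intensities

def p_correct(design, theta, slope=10.0):
    return 1.0 / (1.0 + np.exp(-slope * (design - theta)))

def bernoulli_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log(p) - (1 - p) * np.log1p(-p)

def best_design(prior):
    """Stimulus with maximal expected information gain, i.e. the mutual
    information between the parameter and the next binary response."""
    gains = []
    for d in designs:
        p1 = p_correct(d, thetas)      # P(resp = 1 | theta)
        m1 = np.sum(prior * p1)        # marginal P(resp = 1)
        gains.append(bernoulli_entropy(m1) - np.sum(prior * bernoulli_entropy(p1)))
    return designs[int(np.argmax(gains))]

true_theta = 0.62
prior = np.full(len(thetas), 1.0 / len(thetas))
for _ in range(30):                    # adaptive trials
    d = best_design(prior)
    resp = rng.random() < p_correct(d, true_theta)
    likelihood = p_correct(d, thetas) if resp else 1.0 - p_correct(d, thetas)
    prior = prior * likelihood
    prior /= prior.sum()

estimate = thetas[int(prior.argmax())]
print(f"posterior-mode threshold after 30 adaptive trials: {estimate:.2f}")
```

    Each trial is placed where the response is most informative (near the current threshold estimate), which is why adaptive designs need far fewer observations than fixed stimulus schedules.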

  1. CMOS array design automation techniques. [metal oxide semiconductors

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.

    1975-01-01

    A low cost, quick turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations particularly high circuit speed.

  2. Using Adaptive Automation to Increase Operator Performance and Decrease Stress in a Satellite Operations Environment

    ERIC Educational Resources Information Center

    Klein, David C.

    2014-01-01

    As advancements in automation continue to alter the systemic behavior of computer systems in a wide variety of industrial applications, human-machine interactions are increasingly becoming supervisory in nature, with less hands-on human involvement. This maturing of the human role within the human-computer relationship is relegating operations…

  3. Design of an adaptive neural network based power system stabilizer.

    PubMed

    Liu, Wenxin; Venayagamoorthy, Ganesh K; Wunsch, Donald C

    2003-01-01

    Power system stabilizers (PSS) are used to generate supplementary control signals for the excitation system in order to damp the low frequency power system oscillations. To overcome the drawbacks of conventional PSS (CPSS), numerous techniques have been proposed in the literature. Based on the analysis of existing techniques, this paper presents an indirect adaptive neural network based power system stabilizer (IDNC) design. The proposed IDNC consists of a neuro-controller, which is used to generate a supplementary control signal to the excitation system, and a neuro-identifier, which is used to model the dynamics of the power system and to adapt the neuro-controller parameters. The proposed method has the features of a simple structure, adaptivity and fast response. The proposed IDNC is evaluated on a single machine infinite bus power system under different operating conditions and disturbances to demonstrate its effectiveness and robustness. PMID:12850048

  4. Automated design of image operators that detect interest points.

    PubMed

    Trujillo, Leonardo; Olague, Gustavo

    2008-01-01

    This work describes how evolutionary computation can be used to synthesize low-level image operators that detect interesting points on digital images. Interest point detection is an essential part of many modern computer vision systems that solve tasks such as object recognition, stereo correspondence, and image indexing, to name but a few. The design of the specialized operators is posed as an optimization/search problem that is solved with genetic programming (GP), a strategy still mostly unexplored by the computer vision community. The proposed approach automatically synthesizes operators that are competitive with state-of-the-art designs, taking into account an operator's geometric stability and the global separability of detected points during fitness evaluation. The GP search space is defined using simple primitive operations that are commonly found in point detectors proposed by the vision community. The experiments described in this paper extend previous results (Trujillo and Olague, 2006a,b) by presenting 15 new operators that were synthesized through the GP-based search. Some of the synthesized operators can be regarded as improved manmade designs because they employ well-known image processing techniques and achieve highly competitive performance. On the other hand, since the GP search also generates what can be considered as unconventional operators for point detection, these results provide a new perspective to feature extraction research. PMID:19053496

  5. A Multi-Agent Design for Power Distribution Systems Automation

    NASA Astrophysics Data System (ADS)

    Ghorbani, M. Jawad

    A new Multi-Agent System (MAS) design for fault location, isolation, and restoration in power distribution systems is presented. In the proposed approach, when a fault occurs in the Power Distribution System (PDS), the MAS quickly isolates the fault and restores service to the fault-free zones. A hierarchical coordination strategy is introduced to manage the agents; it integrates the advantages of both centralized and decentralized coordination strategies. In this framework, a Zone Agent (ZA) locates and isolates the fault based on locally available information and assists the Feeder Agent (FA) with reconfiguration and restoration. The FA can solve the restoration problem using existing algorithms for the 0-1 Knapsack problem. A novel Q-learning mechanism is also introduced to support the FAs in decision making for restoration. In addition, a distributed MAS-based Load Shedding (LS) technique is used to supply as many higher-priority customers as possible when demand exceeds generation. The design is illustrated by simulation case studies of fault location, isolation, and restoration on the West Virginia Super Circuit (WVSC) and by a hardware implementation of fault location and isolation on a laboratory platform. The results of the case studies demonstrate the performance of the proposed MAS designs.
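    The restoration step that the abstract maps onto the 0-1 Knapsack problem can be sketched with the standard dynamic program: fault-free zones are items (load = weight, customer priority = value) competing for the backup feeder's spare capacity. The zone loads, priorities, and capacity below are illustrative values, not data from the paper.

```python
# Hedged sketch: service restoration as a 0-1 knapsack, as the abstract
# suggests. All numeric values are illustrative.

def restore_zones(zones, spare_capacity):
    """Pick fault-free zones to re-energize.

    zones: list of (load_kw, priority) tuples; spare_capacity: int (kW).
    Classic 0-1 knapsack DP: maximize total priority without exceeding
    the backup feeder's spare capacity. Returns (best_priority, chosen_indices).
    """
    best = [(0, [])] * (spare_capacity + 1)   # best[c] = (priority, chosen)
    for i, (load, prio) in enumerate(zones):
        new_best = best[:]
        for c in range(load, spare_capacity + 1):
            cand_prio = best[c - load][0] + prio
            if cand_prio > new_best[c][0]:
                new_best[c] = (cand_prio, best[c - load][1] + [i])
        best = new_best                        # commit this item's row
    return best[spare_capacity]

# Example: three zones compete for 100 kW of spare feeder capacity.
zones = [(60, 5), (50, 4), (40, 3)]   # (load_kw, priority)
prio, chosen = restore_zones(zones, 100)
```

    Here the best choice is zones 0 and 2 (100 kW, total priority 8), which beats re-energizing zones 1 and 2 (priority 7).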

  6. Design of Adaptive Policy Pathways under Deep Uncertainties

    NASA Astrophysics Data System (ADS)

    Babovic, Vladan

    2013-04-01

    The design of large-scale engineering and infrastructural systems today is growing in complexity. Designers need to consider sociotechnical uncertainties, intricacies, and processes in the long-term strategic deployment and operation of these systems. In this context, water and spatial management is increasingly challenged not only by climate-associated changes such as sea-level rise and increased spatio-temporal variability of precipitation, but also by pressures due to population growth and, in particular, the accelerating rate of urbanisation. Furthermore, the high investment costs and long-term nature of water-related infrastructure projects require a long-term planning perspective, sometimes extending over many decades. Adaptation to such changes is determined not only by what is known or anticipated at present, but also by what will be experienced and learned as the future unfolds, as well as by policy responses to social and water events. As a result, a pathway emerges. Instead of responding to 'surprises' and making decisions on an ad hoc basis, exploring adaptation pathways into the future provides indispensable support for water management decision-making. In this contribution, a structured approach for designing a dynamic adaptive policy, based on the concepts of adaptive policy making and adaptation pathways, is introduced. Such an approach provides flexibility that allows change over time in response to how the future unfolds, what is learned about the system, and changes in societal preferences. The introduced flexibility provides a means for dealing with the complexities of adaptation under deep uncertainties. It enables engineering systems to change in the face of uncertainty so as to reduce impacts from downside scenarios while capitalizing on upside opportunities. This contribution presents a comprehensive framework for the development and deployment of adaptive policy pathways and demonstrates its performance under deep uncertainties on a case study related to urban

  7. Design principles and algorithms for automated air traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1995-01-01

    This paper presents design principles and algorithms for building a real-time scheduler. The primary objective of the scheduler is to assign arriving aircraft to a favorable landing runway and to schedule them to land at times that minimize delays. A further objective of the scheduler is to allocate delays between high-altitude airspace far from the airport and low-altitude airspace near the airport. A method of delay allocation is described that minimizes the average operating cost in the presence of errors in controlling aircraft to a specified landing time.
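    The core assignment-and-scheduling idea can be sketched as a first-come-first-served scheduler that places each arrival on whichever runway yields the earliest feasible landing time, subject to a minimum separation behind the preceding landing. This is a minimal sketch of the generic idea only; the separation value and ETAs are illustrative, not Erzberger's actual parameters or algorithm.

```python
# Hedged sketch: FCFS runway assignment with minimum separation.
# All numeric values are illustrative.

def schedule_arrivals(etas, n_runways, separation=90.0):
    """Schedule arrivals in ETA order.

    etas: estimated times of arrival (seconds).
    Returns a list of (runway, scheduled_time, delay) per aircraft.
    """
    last_landing = [float("-inf")] * n_runways  # last scheduled time per runway
    plan = []
    for eta in sorted(etas):
        # Earliest feasible time on each runway: no earlier than the ETA, and
        # at least `separation` seconds behind the preceding landing.
        options = [max(eta, last_landing[r] + separation) for r in range(n_runways)]
        r = min(range(n_runways), key=lambda i: options[i])
        t = options[r]
        last_landing[r] = t
        plan.append((r, t, t - eta))
    return plan

# Three closely spaced arrivals on a single runway accumulate delay.
plan = schedule_arrivals([0.0, 10.0, 20.0], n_runways=1)
total_delay = sum(d for _, _, d in plan)
```

    With a second runway the same traffic would land with no delay at all, which is why runway assignment and delay allocation are treated together in the abstract.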

  8. DESIGN AND PRELIMINARY VALIDATION OF A RAPID AUTOMATED BIODOSIMETRY TOOL FOR HIGH THROUGHPUT RADIOLOGICAL TRIAGE.

    PubMed

    Chen, Youhua; Zhang, Jian; Wang, Hongliang; Garty, Guy; Xu, Yanping; Lyulko, Oleksandra V; Turner, Helen C; Randers-Pehrson, Gerhard; Simaan, Nabil; Yao, Y Lawrence; Brenner, D J

    2009-01-01

    This paper presents the design, hardware, software, and parameter optimization of a novel robotic automation system: RABiT, a Rapid Automated Biodosimetry Tool for high-throughput radiological triage. The design considerations guiding the hardware and software architecture are presented, with a focus on methods of communication, ease of implementation, and the need for real-time control versus soft-time control cycles. The design and parameter determination for a non-contact PVC capillary laser cutting system are presented. A novel approach for lymphocyte concentration estimation based on computer vision is reported. Experimental evaluations of the system components validate the success of our prototype system in achieving a throughput of 6,000 samples in a period of 18 hours. PMID:21258614

  9. The Chandra automated processing system: challenges, design enhancements, and lessons learned

    NASA Astrophysics Data System (ADS)

    Plummer, David; Grier, John; Masters, Sreelatha

    2006-06-01

    Chandra standard data processing involves hundreds of different types of data products and pipelines. Pipelines are initiated by different types of events or notifications and may depend upon many other pipelines for input data. The Chandra automated processing system (AP) was designed to handle the various notifications and orchestrate the pipeline processing. Certain data sets may require "special" handling that deviates slightly from the standard processing thread. Also, bulk reprocessing of data often involves new processing requirements. Most recently, a new type of processing to produce source catalogs has introduced requirements not anticipated by the original AP design. Managing these complex dependencies and evolving processing requirements in an efficient, flexible, and automated fashion presents many challenges. This paper describes the most significant of these challenges, the AP design changes required to address these issues and the lessons learned along the way.

  10. DESIGN AND PRELIMINARY VALIDATION OF A RAPID AUTOMATED BIODOSIMETRY TOOL FOR HIGH THROUGHPUT RADIOLOGICAL TRIAGE

    PubMed Central

    Chen, Youhua; Zhang, Jian; Wang, Hongliang; Garty, Guy; Xu, Yanping; Lyulko, Oleksandra V.; Turner, Helen C.; Randers-Pehrson, Gerhard; Simaan, Nabil; Yao, Y. Lawrence; Brenner, D. J.

    2010-01-01

    This paper presents the design, hardware, software, and parameter optimization of a novel robotic automation system: RABiT, a Rapid Automated Biodosimetry Tool for high-throughput radiological triage. The design considerations guiding the hardware and software architecture are presented, with a focus on methods of communication, ease of implementation, and the need for real-time control versus soft-time control cycles. The design and parameter determination for a non-contact PVC capillary laser cutting system are presented. A novel approach for lymphocyte concentration estimation based on computer vision is reported. Experimental evaluations of the system components validate the success of our prototype system in achieving a throughput of 6,000 samples in a period of 18 hours. PMID:21258614

  11. Self-Adaptive Stepsize Search Applied to Optimal Structural Design

    NASA Astrophysics Data System (ADS)

    Nolle, L.; Bland, J. A.

    Structural engineering often involves the design of space frames that are required to resist predefined external forces without exhibiting plastic deformation. The weight of the structure, and hence the weight of its constituent members, has to be as low as possible for economic reasons without violating any of the load constraints. Design spaces are usually vast and the computational cost of analyzing a single design is usually high; therefore, not every possible design can be evaluated for real-world problems. In this work, a standard structural design problem, the 25-bar problem, has been solved using self-adaptive stepsize search (SASS), a relatively new search heuristic. This algorithm has only one control parameter and therefore overcomes a drawback of modern search heuristics, i.e., the need to first find a set of optimum control parameter settings for the problem at hand. In this work, SASS outperforms simulated annealing, genetic algorithms, tabu search, and ant colony optimization.
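    The single-control-parameter idea can be sketched as a hill climber whose stepsize grows on success and shrinks on failure. This is an illustrative reconstruction in the spirit of the abstract, not the exact SASS algorithm of Nolle and Bland; the growth factor, iteration budget, and toy objective are assumptions.

```python
import random

# Hedged sketch of a self-adaptive stepsize search: one control parameter
# (gamma) governs how the stepsize adapts. Illustrative only.

def sass_minimize(f, x0, iters=2000, gamma=2.0, seed=0):
    """Minimize f starting from x0; gamma is the lone control parameter."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    step = 1.0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx:          # improvement: accept and be bolder
            x, fx = cand, fc
            step *= gamma
        else:                # failure: reject and refine the search
            step /= gamma
    return x, fx

# Toy stand-in for a structural weight objective: a simple sum of squares.
sphere = lambda v: sum(vi * vi for vi in v)
best, fbest = sass_minimize(sphere, [5.0, -3.0, 2.0])
```

    Because the stepsize self-tunes, there is no separate cooling schedule or population-size choice to calibrate, which is the drawback of other heuristics that the abstract highlights.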

  12. Adaptive antenna design considerations for satellite communication antennas

    NASA Astrophysics Data System (ADS)

    Mayhan, J. T.

    1983-02-01

    The present investigation is concerned with some general considerations inherent in designing an adaptive antenna system for use on a geosynchronous satellite illuminating the Earth field of view. The problem is addressed from the viewpoint of the system designer, who has to determine the required antenna characteristics and the antenna aperture size. Concerning the choice of antenna type, it usually has to be decided whether to use a phased array (PA) or a multiple-beam antenna (MBA). Attention is given to nulling resolution and to the MBA and PA configurations. The choice of which antenna type to use depends on the nulling bandwidth, the number of weighted channels in the adaptive processor, and the overall coverage area to be served by the antenna system.

  13. Adaptive Liver Stereotactic Body Radiation Therapy: Automated Daily Plan Reoptimization Prevents Dose Delivery Degradation Caused by Anatomy Deformations

    SciTech Connect

    Leinders, Suzanne M.; Breedveld, Sebastiaan; Méndez Romero, Alejandra; Schaart, Dennis; Seppenwoolde, Yvette; Heijmen, Ben J.M.

    2013-12-01

    Purpose: To investigate how dose distributions for liver stereotactic body radiation therapy (SBRT) can be improved by using automated, daily plan reoptimization to account for anatomy deformations, compared with setup corrections only. Methods and Materials: For 12 tumors, 3 strategies for dose delivery were simulated. In the first strategy, computed tomography scans made before each treatment fraction were used only for patient repositioning before dose delivery, to correct for detected tumor setup errors. In the adaptive second and third strategies, in addition to the isocenter shift, intensity modulated radiation therapy beam profiles were reoptimized, or both intensity profiles and beam orientations were reoptimized, respectively. All optimizations were performed with a recently published algorithm for automated, multicriteria optimization of both beam profiles and beam angles. Results: In 6 of 12 cases, violations of organ-at-risk (ie, heart, stomach, kidney) constraints of 1 to 6 Gy in single fractions occurred with tumor repositioning only. By using the adaptive strategies, these violations could be avoided (<1 Gy). For 1 case, this required slightly underdosing the planning target volume. For 2 cases with restricted tumor dose in the planning phase to avoid organ-at-risk constraint violations, fraction doses could be increased by 1 and 2 Gy because of more favorable anatomy. Daily reoptimization of both beam profiles and beam angles (third strategy) performed slightly better than reoptimization of profiles only, but the latter required only a few minutes of computation time, whereas full reoptimization took several hours. Conclusions: This simulation study demonstrated that replanning based on daily acquired computed tomography scans can improve liver stereotactic body radiation therapy dose delivery.

  14. Engineering Design and Automation in the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory.

    SciTech Connect

    Wantuck, P. J.; Hollen, R. M.

    2002-01-01

    This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation

  15. Design and development of advanced adaptive polymer lenses

    NASA Astrophysics Data System (ADS)

    Santiago, Freddie

    The dissertation presented here describes advancements made in adaptive polymer lens design and implementation. Singlet and doublet lenses were constructed for visible, short-wavelength infrared (SWIR), and middle-wavelength infrared (MWIR) applications. The lenses are implemented in a variety of tactical imaging systems to demonstrate their performance. A process was developed that defines the allowable fabrication variables, first for APL singlets and then for APL doublets. A first-order finite element model is described that enables going from an optical design to APL fabrication. This model was then extended to the design of fluidic doublets, which are equivalent to their two-element glass counterparts. Two constant-volume fluidic chambers were enclosed by three flexible membranes, resulting in a variable focal length doublet. Chromatic focal shift was then used to compare numerical modeling to experimentally measured results. These same tools, methodology, and process were lastly used in the definition and fabrication of the SWIR and MWIR adaptive polymer lenses for tactical systems. Imaging and illumination systems based on these lenses are presented, notably an adaptive zoom imaging system in the MWIR. This is the first known instance of such a system in this band.

  16. Design of a small automated telescope for Indian universities

    NASA Astrophysics Data System (ADS)

    Anandaram, Mandayam N.; Kagali, B. A.

    We have constructed a computer-controlled telescope using a 0.36-m f/11 Celestron optical tube assembly for teaching and research applications. We have constructed a heavy-duty, fork-type equatorial mount fitted with precision-machined 24-inch drive disks for both axes. These are friction-driven by stepper motors through one-inch rollers. We have used an open-loop control system, triggerable by an ST-4 CCD camera, to acquire and track any target object. Our telescope can home in on any target to within two arc-minutes. We have employed a commercial stepper motor controller card, for which we have written user-friendly, PC-based telescope control software in C. Photometry with a solid-state photometer and imaging with an ST-6 CCD camera are possible. We consider this project suitable for those wishing to construct some parts of a telescope and understand the principles of operation. A simpler model of this telescope could use DC motors instead of stepper motors. We shall be happy to send our design diagrams and details to those interested. This project was funded by the DST, and was assisted by IUCAA, Pune.

  17. Integrated System Design: Promoting the Capacity of Sociotechnical Systems for Adaptation through Extensions of Cognitive Work Analysis.

    PubMed

    Naikar, Neelam; Elix, Ben

    2016-01-01

    This paper proposes an approach for integrated system design, which has the intent of facilitating high levels of effectiveness in sociotechnical systems by promoting their capacity for adaptation. Building on earlier ideas and empirical observations, this approach recognizes that to create adaptive systems it is necessary to integrate the design of all of the system elements, including the interfaces, teams, training, and automation, such that workers are supported in adapting their behavior as well as their structure, or organization, in a coherent manner. Current approaches for work analysis and design are limited in regard to this fundamental objective, especially in cases when workers are confronted with unforeseen events. A suitable starting point is offered by cognitive work analysis (CWA), but while this framework can support actors in adapting their behavior, it does not necessarily accommodate adaptations in their structure. Moreover, associated design approaches generally focus on individual system elements, and those that consider multiple elements appear limited in their ability to facilitate integration, especially in the manner intended here. The proposed approach puts forward the set of possibilities for work organization in a system as the central mechanism for binding the design of its various elements, so that actors can adapt their structure as well as their behavior, in a unified fashion, to handle both familiar and novel conditions. Accordingly, this paper demonstrates how the set of possibilities for work organization in a system may be demarcated independently of the situation, through extensions of CWA, and how it may be utilized in design. This lynchpin, conceptualized in the form of a diagram of work organization possibilities (WOP), is important for preserving a system's inherent capacity for adaptation. Future research should focus on validating these concepts and establishing the feasibility of implementing them in industrial contexts.

  18. Integrated System Design: Promoting the Capacity of Sociotechnical Systems for Adaptation through Extensions of Cognitive Work Analysis

    PubMed Central

    Naikar, Neelam; Elix, Ben

    2016-01-01

    This paper proposes an approach for integrated system design, which has the intent of facilitating high levels of effectiveness in sociotechnical systems by promoting their capacity for adaptation. Building on earlier ideas and empirical observations, this approach recognizes that to create adaptive systems it is necessary to integrate the design of all of the system elements, including the interfaces, teams, training, and automation, such that workers are supported in adapting their behavior as well as their structure, or organization, in a coherent manner. Current approaches for work analysis and design are limited in regard to this fundamental objective, especially in cases when workers are confronted with unforeseen events. A suitable starting point is offered by cognitive work analysis (CWA), but while this framework can support actors in adapting their behavior, it does not necessarily accommodate adaptations in their structure. Moreover, associated design approaches generally focus on individual system elements, and those that consider multiple elements appear limited in their ability to facilitate integration, especially in the manner intended here. The proposed approach puts forward the set of possibilities for work organization in a system as the central mechanism for binding the design of its various elements, so that actors can adapt their structure as well as their behavior—in a unified fashion—to handle both familiar and novel conditions. Accordingly, this paper demonstrates how the set of possibilities for work organization in a system may be demarcated independently of the situation, through extensions of CWA, and how it may be utilized in design. This lynchpin, conceptualized in the form of a diagram of work organization possibilities (WOP), is important for preserving a system's inherent capacity for adaptation. Future research should focus on validating these concepts and establishing the feasibility of implementing them in industrial

  19. A Bayesian adaptive design with biomarkers for targeted therapies

    PubMed Central

    Eickhoff, Jens C; Kim, KyungMann; Beach, Jason; Kolesar, Jill M; Gee, Jason R

    2013-01-01

    Background: Targeted therapies are becoming increasingly important for the treatment of various diseases. Biomarkers are a critical component of a targeted therapy, as they can be used to identify patients who are more likely to benefit from a treatment. Targeted therapies, however, have created major challenges in the design, conduct, and analysis of clinical trials. In traditional clinical trials, treatment effects for various biomarkers are typically evaluated in an exploratory fashion, and only limited information about the predictive values of biomarkers is obtained. Purpose: New study designs are required that effectively evaluate both the diagnostic and the therapeutic implications of biomarkers. Methods: The Bayesian approach provides a useful framework for optimizing the clinical trial design by directly integrating information about biomarkers and clinical outcomes as they become available. We propose a Bayesian covariate-adjusted response-adaptive randomization design, which utilizes individual biomarker profiles and patients' clinical outcomes as they become available during the course of the trial to assign the most efficacious treatment to individual patients. Predictive biomarker subgroups are determined adaptively using a partial least squares regression approach. Results: A series of simulation studies was conducted to examine the operating characteristics of the proposed study design. The simulation studies show that the proposed design efficiently identifies patients who benefit most from a targeted therapy and that there are substantial savings in the sample size requirements when compared to alternative designs. Limitations: The design does not control the type I error in the traditional sense, and a positive result should be confirmed by conducting an independent phase III study focusing on the selected biomarker profile groups. Conclusions: We conclude that the proposed design may serve a useful role in the early efficacy phase of targeted therapy
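    The core adaptive step of response-adaptive randomization can be sketched with Beta-Bernoulli posteriors: the next patient is randomized to an arm with probability equal to the posterior probability that the arm is better. This sketch omits the covariate adjustment and partial-least-squares biomarker machinery of the proposed design; the prior, outcome counts, and Monte Carlo draw count are illustrative assumptions.

```python
import random

# Hedged sketch of response-adaptive randomization with Beta-Bernoulli
# posteriors. Only the core adaptive step is illustrated.

def prob_a_beats_b(succ_a, fail_a, succ_b, fail_b, draws=20000, seed=1):
    """Monte Carlo estimate of P(p_A > p_B) under Beta(1+s, 1+f) posteriors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        pa = rng.betavariate(1 + succ_a, 1 + fail_a)
        pb = rng.betavariate(1 + succ_b, 1 + fail_b)
        wins += pa > pb
    return wins / draws

def next_assignment(succ_a, fail_a, succ_b, fail_b, seed=2):
    """Randomize the next patient to arm A with probability P(A is better)."""
    p = prob_a_beats_b(succ_a, fail_a, succ_b, fail_b)
    return "A" if random.Random(seed).random() < p else "B"

# After 9/10 responses on arm A versus 1/10 on arm B, the randomization
# probability tilts strongly toward A.
p = prob_a_beats_b(9, 1, 1, 9)
```

    In the proposed design this kind of posterior update would additionally be conditioned on each patient's biomarker subgroup, so that the tilt toward the better arm is subgroup-specific.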

  20. Launch vehicle payload adapter design with vibration isolation features

    NASA Astrophysics Data System (ADS)

    Thomas, Gareth R.; Fadick, Cynthia M.; Fram, Bryan J.

    2005-05-01

    Payloads, such as satellites or spacecraft, which are mounted on launch vehicles, are subject to severe vibrations during flight. These vibrations are induced by multiple sources that occur between liftoff and the instant of final separation from the launch vehicle. A direct result of the severe vibrations is that fatigue damage and failure can be incurred by sensitive payload components. For this reason a payload adapter has been designed with special emphasis on its vibration isolation characteristics. The design consists of an annular plate that has top and bottom face sheets separated by radial ribs and close-out rings. These components are manufactured from graphite-epoxy composites to ensure a high stiffness-to-weight ratio. The design is tuned to keep the frequency of the axial vibration mode of the payload on the flexibility of the adapter at a low value. This is the main strategy adopted for isolating the payload from damaging vibrations in the intermediate-to-higher frequency range (45 Hz-200 Hz). A design challenge for this type of adapter is to keep the pitch frequency of the payload above a critical value in order to avoid dynamic interactions with the launch vehicle control system. This high-frequency requirement conflicts with the low axial mode frequency requirement, and this problem is overcome by innovative tuning of the directional stiffnesses of the composite parts. A second design strategy utilized to achieve good isolation characteristics is the use of constrained-layer damping. This feature is particularly effective at keeping the responses to a minimum for one of the most important dynamic loading mechanisms: the almost-tonal vibratory load associated with the resonant burn condition present in any stage powered by a solid rocket motor. The frequency of such a load typically falls in the 45-75 Hz range, and this phenomenon drives the low-frequency design of the adapter. Detailed finite element analysis is

  1. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  2. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)

    2001-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, i.e., with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  3. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES Beta

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; Bi, Changhao; Elsbree, Nick; Jiao, Hong; Kim, Jungkyu; Mathies, Richard; Keasling, Jay D.; Hillson, Nathan J.

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform's capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Finally, taken together, we demonstrate the microfluidic platform's potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  4. The VIADUC project: innovation in climate adaptation through service design

    NASA Astrophysics Data System (ADS)

    Corre, L.; Dandin, P.; L'Hôte, D.; Besson, F.

    2015-07-01

    From the French National Adaptation to Climate Change Plan, the "Drias, les futurs du climat" service has been developed to provide easy access to French regional climate projections. This is a major step in the implementation of French climate services. The usefulness of this service for the end-users and decision makers involved in adaptation planning at a local scale is investigated. To that end, the VIADUC project aims to evaluate and enhance Drias, as well as to imagine future developments in support of adaptation. Climate scientists work together with end-users and a service designer. The designer's role is to propose an innovative approach based on the interaction between scientists and citizens. The chosen end-users are three Natural Regional Parks located in the south-west of France. These parks are administrative entities that gather municipalities sharing a common natural and cultural heritage. They are also rural areas in which specific economic activities take place, and they are therefore concerned with, and involved in, both protecting their environment and setting up sustainable economic development. The first year of the project was dedicated to investigation, including the questioning of relevant representatives. Three key local economic sectors were selected: forestry, pastoral farming, and building. Working groups were composed of technicians, administrative and maintenance staff, policy makers, and climate researchers, and the sectors' needs for climate information were assessed. The lessons learned led to actions that are presented hereinafter.

  5. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    NASA Technical Reports Server (NTRS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential sources of water and other minerals for long-term human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  6. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    ERIC Educational Resources Information Center

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strengths and weaknesses in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  7. Design of an Adaptive Secondary Mirror: A Global Approach

    NASA Astrophysics Data System (ADS)

    Brusa, Guido; del Vecchio, Ciro

    1998-07-01

    We present the mechanical and actuator design of an adaptive secondary mirror that matches the optical requirements of the active and adaptive corrections. Conceived for the particular implementation of the 6.5-m conversion of the Multiple Mirror Telescope, this study is, with small variations of the input parameters, suitable for other telescopes of the same class. We found that a three-layer structure, i.e., a thin deformable shell, a thick reference plate, and a third plate that acts as actuator support and heat sink, is able to provide the required mechanical stability and actuator density. We also found that a simple electromagnetic actuator can be used. This actuator, when optimized, will dissipate a typical power of a few tenths of a watt.

  8. A neuro-adaptive autopilot design for guided munitions

    NASA Astrophysics Data System (ADS)

    Sharma, Manu

    This thesis treats some neural-network-based nonlinear control methodologies. In particular, a neuro-adaptive autopilot for guided munitions is developed. The motivation is to reduce both the effort required and the direct cost incurred for autopilot design for these vehicles. Towards this end, a neural network is used to augment an inverting controller. The network compensates for error present in the inverting logic, thereby providing robustness to parametric uncertainty in the mathematical model of the munition. This equates to a reduced need for expensive wind-tunnel testing. Furthermore, the adaptive nature of the autopilot obviates the requirement for gain scheduling. The methodology is demonstrated on the MK-84 variant of the Joint Direct Attack Munition family of precision-guided munitions. The entire design and tuning procedure is first performed using a simulation based entirely on analytical aerodynamic data generated by Missile DATCOM. The autopilot is then tested on a second simulation, which is based on validated wind-tunnel data. This last step may be viewed as a flight test. A method of augmenting existing linear controllers with neural networks is also addressed. The motivation is to introduce the benefits of adaptation without requiring modifications to the existing architecture. A framework that collapses to many classical and modern forms is considered, to which a corrective control signal is added. The corrective signal is generated by a neural network to force the plant to track a high-order response model that describes the ideal closed-loop dynamics. Subsequently, this philosophy is combined with adaptive backstepping to address unmatched uncertainties in a class of systems in strict-feedback form.

  9. Towards automated on-line adaptation of 2-Step IMRT plans: QUASIMODO phantom and prostate cancer cases

    PubMed Central

    2013-01-01

    1) 2-Step generation for the geometry of the day using the relocated isocenter, with MU transfer from the planning geometry; 2) Adaptation of the widths of the S2 segments to the geometry of the day; 3) Imitation of DMPO fine-tuning for the geometry of the day. Results and conclusion: We have performed automated 2-Step IMRT adaptation for ten prostate adaptation cases. The adapted plans show statistically significant improvement of the target coverage and of the rectum sparing compared to plans in which only the isocenter is relocated. The 2-Step IMRT method may become the core of the automated adaptive radiation therapy system at our department. PMID:24207129

  10. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    NASA Astrophysics Data System (ADS)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually carried out with API tools which allow building original software aiding different engineering activities. In this paper, original software worked out in order to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in standard specialized CAD system tools. This comes from the fact that usually in CAD systems an involute curve is drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base and addendum circles; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, so that gear wheel modelling time is reduced to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are shown in detail.
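
    As an editorial illustration of the 11-point sampling described above, the sketch below generates points along a circle involute between the base and addendum circles, using the standard parameterization x = rb*(cos t + t*sin t), y = rb*(sin t - t*cos t). The function name and radii are hypothetical, not taken from the paper:

```python
import math

def involute_points(r_base: float, r_max: float, n: int = 11):
    """Sample n points along a circle involute from the base circle
    (radius r_base) out to radius r_max (e.g. the addendum circle)."""
    # Roll angle t_max that reaches radius r_max: r = r_base * sqrt(1 + t^2)
    t_max = math.sqrt((r_max / r_base) ** 2 - 1.0)
    pts = []
    for i in range(n):
        t = t_max * i / (n - 1)
        x = r_base * (math.cos(t) + t * math.sin(t))
        y = r_base * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts

# 11-point involute for an assumed 40 mm base circle up to a 50 mm addendum circle
pts = involute_points(40.0, 50.0, n=11)
print(len(pts), pts[0])
```

    Sampling 11 points rather than 3 captures the curvature of the involute between the base and addendum circles, which is the accuracy advantage the abstract describes.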

  11. An Engineered Approach to Stem Cell Culture: Automating the Decision Process for Real-Time Adaptive Subculture of Stem Cells

    PubMed Central

    Ker, Dai Fei Elmer; Weiss, Lee E.; Junkers, Silvina N.; Chen, Mei; Yin, Zhaozheng; Sandbothe, Michael F.; Huh, Seung-il; Eom, Sungeun; Bise, Ryoma; Osuna-Highley, Elvira; Kanade, Takeo; Campbell, Phil G.

    2011-01-01

    Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and developing robotic cell
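
    The forecasting step described above (predicting when confluency will exceed a pre-defined threshold so that operators can be notified 4 hours in advance) can be sketched with a simple linear extrapolation. This is an illustrative stand-in, not the vision system's actual forecasting model; the function name and data are hypothetical:

```python
def hours_until_threshold(times, confluency, threshold=0.8):
    """Predict how many hours until confluency crosses `threshold`
    by fitting a least-squares line to recent measurements."""
    n = len(times)
    mt = sum(times) / n
    mc = sum(confluency) / n
    slope = sum((t - mt) * (c - mc) for t, c in zip(times, confluency)) \
            / sum((t - mt) ** 2 for t in times)
    intercept = mc - slope * mt
    if slope <= 0:
        return None  # culture not growing toward the threshold
    # Predicted crossing time minus the latest measurement time
    return (threshold - intercept) / slope - times[-1]

# Confluency measured every 6 h; roughly linear growth
t = [0, 6, 12, 18]
c = [0.20, 0.35, 0.50, 0.65]
print(hours_until_threshold(t, c, threshold=0.80))  # 6.0 h from the last reading
```

    A monitoring loop would compare this estimate against the 4-hour notification window before sending a text message or e-mail.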

  12. Optimal PID Controller Design Using Adaptive VURPSO Algorithm

    NASA Astrophysics Data System (ADS)

    Zirkohi, Majid Moradi

    2015-04-01

    The purpose of this paper is to improve the Velocity Update Relaxation Particle Swarm Optimization (VURPSO) algorithm. The improved algorithm is called the Adaptive VURPSO (AVURPSO) algorithm. Then, an optimal design of a Proportional-Integral-Derivative (PID) controller is obtained using the AVURPSO algorithm. An adaptive momentum factor is used to regulate the trade-off between the global and the local exploration abilities in the proposed algorithm. This operation helps the system reach the optimal solution quickly and saves computation time. Comparisons on the optimal PID controller design confirm the superiority of the AVURPSO algorithm over the other optimization algorithms mentioned in this paper, namely the VURPSO algorithm, the Ant Colony algorithm, and the conventional approach. Comparisons on the speed of convergence confirm that the proposed algorithm converges faster, in less computation time, to a global optimum value. The proposed AVURPSO can be used in diverse areas of optimization problems such as industrial planning, resource allocation, scheduling, decision making, pattern recognition and machine learning. The proposed AVURPSO algorithm is efficiently used to design an optimal PID controller.
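
    To illustrate the idea of an adaptive momentum factor regulating the global/local search trade-off, here is a minimal PSO sketch in which the momentum (inertia) weight decays linearly over iterations. This is a generic illustration, not the actual AVURPSO update rule; all names and constants are assumptions:

```python
import random

def pso_adaptive_momentum(f, dim, n_particles=20, iters=100,
                          lo=-5.0, hi=5.0, w_max=0.9, w_min=0.4):
    """Minimal PSO sketch: the momentum weight w decays from w_max to
    w_min, shifting from global exploration to local exploitation."""
    rng = random.Random(0)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for it in range(iters):
        w = w_max - (w_max - w_min) * it / (iters - 1)  # adaptive momentum
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + 2.0 * r1 * (pbest[i][d] - pos[i][d])
                             + 2.0 * r2 * (g[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < f(g):
                    g = pos[i][:]
    return g, f(g)

# Tune a toy quadratic "controller cost" with its minimum at (1, 2)
best, cost = pso_adaptive_momentum(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2, dim=2)
print(best, cost)
```

    In a PID application, the search dimensions would be the Kp, Ki and Kd gains and f would evaluate a closed-loop performance index.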

  13. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also guide the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  14. Development of adapted GMR-probes for automated detection of hidden defects in thin steel sheets

    NASA Astrophysics Data System (ADS)

    Pelkner, Matthias; Pohl, Rainer; Kreutzbruck, Marc; Commandeur, Colin

    2016-02-01

    Thin steel sheets with a thickness of 0.3 mm and less are the base materials of many everyday products (cans, batteries, etc.). Potential inhomogeneities such as non-metallic inclusions inside the steel can lead to a rupture of the sheet when it is formed into a product such as a beverage can. Therefore, there is a need to develop automated NDT techniques to detect hidden defects and inclusions in thin sheets during production. For this purpose Tata Steel Europe and BAM, the Federal Institute for Materials Research and Testing (Germany), are collaborating to develop an automated NDT system. Defect detection systems have to be robust against external influences, especially when used in an industrial environment. In addition, such a facility has to achieve high sensitivity and high spatial resolution in terms of detecting small inclusions in the µm regime. In a first step, we carried out a feasibility study to determine which testing method is promising for detecting hidden defects and inclusions inside ferrous thin steel sheets. Two methods were investigated in more detail: magnetic flux leakage testing (MFL) using giant magnetoresistance (GMR) sensor arrays as receivers [1,2], and eddy current testing (ET). The capabilities of both methods were tested with 0.2 mm thick steel samples containing small defects with depths ranging from 5 µm up to 60 µm. Only in the case of GMR-MFL testing were we able to reliably detect some of the hidden defects with a depth of 10 µm, with an SNR better than 10 dB. Here, the lift-off between sensor and surface was 250 µm. On this basis, we investigated different testing scenarios, including velocity tests and different lift-offs. In this contribution we present the results of the feasibility study, leading to first prototypes of GMR probes which are now installed as part of a demonstrator inside a production line.

  15. LBT adaptive secondary units final design and construction

    NASA Astrophysics Data System (ADS)

    Gallieni, Daniele; Anaclerio, Enzo; Lazzarini, Paolo G.; Ripamonti, Angelo; Spairani, Roberto; Del Vecchio, Ciro; Salinari, Piero; Riccardi, Armando; Stefanini, Paolo; Biasi, Roberto

    2003-02-01

    The Large Binocular Telescope will perform its first-level AO correction at visual wavelengths with its two Gregorian secondary mirrors. Each unit is made of a 911 mm diameter, 1.6 mm thick Zerodur shell whose shape is controlled by 672 electromagnetic actuators at a 1 kHz rate. The shape of each mirror is referenced to a 50 mm thick Zerodur backplate through a set of capacitive sensors co-located with the actuators. Each adaptive secondary unit embeds its real-time computer for actuator control and communication. Each unit is aligned in the secondary hub by a 6 d.o.f. hexapod system. The construction of the AO units started this year, while the hexapods were completed in 2001. We present in this paper the final design of the adaptive secondary systems, with particular emphasis on the modifications that we made based on the MMT adaptive secondary experience. We also report the first results of the subsystem development tests.

  16. Future planning and evaluation for automated adaptive minehunting: a roadmap for mine countermeasures theory modernization

    NASA Astrophysics Data System (ADS)

    Garcia, Gregory A.; Wettergren, Thomas A.

    2012-06-01

    This paper presents a discussion of U.S. naval mine countermeasures (MCM) theory modernization in light of advances in the areas of autonomy, tactics, and sensor processing. The unifying theme spanning these research areas concerns the capability for in situ adaptation of processing algorithms, plans, and vehicle behaviors enabled through run-time situation assessment and performance estimation. Independently, each of these technology developments impacts the MCM Measures of Effectiveness [MOE(s)] of time and risk by improving one or more associated Measures of Performance [MOP(s)]; the contribution of this paper is to outline an integrated strategy for realizing the cumulative benefits of these technology enablers to the United States Navy's minehunting capability. An introduction to the MCM problem is provided to frame the importance of the foundational research and the ramifications of the proposed strategy on the MIW community. We then include an overview of current and future adaptive capability research in the aforementioned areas, highlighting a departure from the existing rigid assumption-based approaches while identifying anticipated technology acceptance issues. Consequently, the paper describes an incremental strategy for transitioning from the current minehunting paradigm, where tactical decision aids rely on a priori intelligence and there is little to no in situ adaptation or feedback, to a future vision where unmanned systems, equipped with a representation of the commander's intent, are afforded the authority and ability to adapt to environmental perturbations with minimal human-in-the-loop supervision. The discussion concludes with an articulation of the science and technology issues which the MCM research community must continue to address.

  17. Optical Design for Extremely Large Telescope Adaptive Optics Systems

    SciTech Connect

    Bauman, B J

    2003-11-26

    Designing an adaptive optics (AO) system for extremely large telescopes (ELT's) will present new optical engineering challenges. Several of these challenges are addressed in this work, including first-order design of multi-conjugate adaptive optics (MCAO) systems, pyramid wavefront sensors (PWFS's), and laser guide star (LGS) spot elongation. MCAO systems need to be designed in consideration of various constraints, including deformable mirror size and correction height. The y,ȳ method of first-order optical design is a graphical technique that uses a plot with marginal and chief ray heights as coordinates; the optical system is represented as a segmented line. This method is shown to be a powerful tool in designing MCAO systems. From these analyses, important conclusions about configurations are derived. PWFS's, which offer an alternative to Shack-Hartmann (SH) wavefront sensors (WFS's), are envisioned as the workhorse of layer-oriented adaptive optics. Current approaches use a 4-faceted glass pyramid to create a WFS analogous to a quad-cell SH WFS. PWFS's and SH WFS's are compared and some newly-considered similarities and PWFS advantages are presented. Techniques to extend PWFS's are offered: First, PWFS's can be extended to more pixels in the image by tiling pyramids contiguously. Second, pyramids, which are difficult to manufacture, can be replaced by less expensive lenslet arrays. An approach is outlined to convert existing SH WFS's to PWFS's for easy evaluation of PWFS's. Also, a demonstration of PWFS's in sensing varying amounts of an aberration is presented. For ELT's, the finite altitude and finite thickness of LGS's means that the LGS will appear elongated from the viewpoint of subapertures not directly under the telescope. Two techniques for dealing with LGS spot elongation in SH WFS's are presented. One method assumes that the laser will be pulsed and uses a segmented micro-electromechanical system (MEMS) to track the LGS light subaperture by

  18. Fully probabilistic control design in an adaptive critic framework.

    PubMed

    Herzallah, Randa; Kárný, Miroslav

    2011-12-01

    An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description from the desired one. Practical exploitation of fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem; in particular, very hard multivariate integration and approximate interpolation of the involved multivariate functions. This paper proposes a new fully probabilistic control algorithm that uses adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is the main contribution of this paper. PMID:21752597
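
    The quantity FPD minimizes can be illustrated for the discrete case. The sketch below computes the Kullback-Leibler divergence between a closed-loop behavior distribution and a desired one; the distributions are made-up numbers for illustration only:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) for discrete distributions;
    this is the objective that fully probabilistic design drives toward
    zero (closed loop matching the desired closed loop)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Closed-loop behaviour distribution vs. the desired ("ideal") one
actual  = [0.7, 0.2, 0.1]
desired = [0.6, 0.3, 0.1]
print(kl_divergence(actual, desired))   # small but nonzero
print(kl_divergence(desired, desired))  # exactly 0.0: the design target
```

    D(p||q) is zero only when the two distributions coincide, which is why minimizing it pushes the closed loop toward the desired behavior.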

  19. Optical design of the adaptive optics laser guide star system

    SciTech Connect

    Bissinger, H.

    1994-11-15

    The design of an adaptive optics package for the 3 meter Lick telescope is presented. This instrument package includes a 69-actuator deformable mirror and a Hartmann-type wavefront sensor operating at visible wavelengths; a quadrant detector for the tip-tilt sensor and a tip-tilt mirror to stabilize first-order atmospheric tip-tilt errors. A high-speed computer drives the deformable mirror to achieve near diffraction-limited imagery. The different optical components and their individual design constraints are described. Motorized stages and diagnostic tools are used to operate and maintain alignment throughout observation time from a remote control room. The expected performance is summarized and actual results on astronomical sources are presented.

  20. Design of a motion JPEG (M/JPEG) adapter card

    NASA Astrophysics Data System (ADS)

    Lee, D. H.; Sudharsanan, Subramania I.

    1994-05-01

    In this paper we describe the design of a high-performance JPEG (Joint Photographic Experts Group) Micro Channel adapter card. The card, tested on a range of PS/2 platforms (models 50 to 95), can complete JPEG operations on a 640 by 240 pixel image within 1/60 of a second, thus enabling real-time capture and display of high-quality digital video. The card accepts digital pixels from either a YUV 4:2:2 or an RGB 4:4:4 pixel bus and has been shown to handle up to 2.05 MBytes/second of compressed data. The compressed data is transmitted to a host memory area by Direct Memory Access operations. The card uses a single C-Cube CL550 JPEG processor that complies with baseline JPEG. We give broad descriptions of the hardware that controls the video interface, the CL550, and the system interface. Some critical design points that enhance the overall performance of M/JPEG systems are pointed out. The control of the adapter card is achieved by interrupt-driven software that runs under DOS. The software performs a variety of tasks that include change of color space (RGB or YUV), change of quantization and Huffman tables, odd and even field control, and some diagnostic operations.
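
    A quick back-of-the-envelope check of the figures quoted above, assuming YUV 4:2:2 (2 bytes per pixel) and a 60 Hz field rate; the variable names are illustrative:

```python
# Uncompressed throughput for one 640 x 240 field delivered every 1/60 s,
# at the YUV 4:2:2 rate of 2 bytes per pixel (an assumption for this estimate).
width, height, bytes_per_px, fields_per_s = 640, 240, 2, 60
raw = width * height * bytes_per_px * fields_per_s  # uncompressed bytes/s
compressed = 2.05e6                                  # card's stated compressed max

print(raw)                          # 18432000 bytes/s uncompressed
print(round(raw / compressed, 1))   # roughly 9x compression at peak data rate
```

    So the quoted 2.05 MBytes/s compressed-data ceiling corresponds to roughly 9:1 JPEG compression at full field rate under these assumptions.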

  1. Adaptive strategies in designing the simultaneous global drug development program.

    PubMed

    Yuan, Zhilong; Chen, Gang; Huang, Qin

    2016-01-01

    Many methods have been proposed to account for the potential impact of ethnic/regional factors when extrapolating results from multiregional clinical trials (MRCTs) to targeted ethnic (TE) patients, i.e., "bridging." Most of them either focused on TE patients in the MRCT (i.e., internal bridging) or on a separate local clinical trial (LCT) (i.e., external bridging). Huang et al. (2012) integrated both bridging concepts in their method for the Simultaneous Global Drug Development Program (SGDDP), which designs both the MRCT and the LCT prospectively and combines patients in both trials by ethnic origin, i.e., TE vs. non-TE (NTE). The weighted Z test was used to combine information from TE and NTE patients to test with statistical rigor whether a new treatment is effective in the TE population. In practice, the MRCT is often completed before the LCT. Thus, to increase the power for the SGDDP and/or obtain more informative data in TE patients, we may use the final results from the MRCT to re-evaluate initial assumptions (e.g., effect sizes, variances, weight), and modify the LCT accordingly. We discuss various adaptive strategies for the LCT such as sample size reassessment, population enrichment, endpoint change, and dose adjustment. As an example, we extend a popular adaptive design method to re-estimate the sample size for the LCT, and illustrate it for a normally distributed endpoint. PMID:26098138
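
    The weighted Z test mentioned above can be sketched as follows. The weight convention used here (w on the TE statistic and sqrt(1 - w^2) on the NTE statistic, so the combined statistic remains standard normal under the null) is a common formulation, and the numbers are hypothetical:

```python
import math

def weighted_z(z_te: float, z_nte: float, w: float) -> float:
    """Weighted Z statistic combining targeted-ethnic (TE) and non-TE
    evidence; the sqrt(1 - w^2) coefficient keeps the null distribution
    standard normal. w reflects how much the TE data are emphasised."""
    return w * z_te + math.sqrt(1.0 - w * w) * z_nte

# Hypothetical test statistics from the TE and NTE strata
z = weighted_z(z_te=1.5, z_nte=2.2, w=0.6)
print(round(z, 3))  # 2.66, to be compared against the usual 1.96 cutoff
```

    Re-estimating w after the MRCT reads out is one of the adaptive strategies the abstract discusses.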

  2. Design of infrasound-detection system via adaptive LMSTDE algorithm

    NASA Technical Reports Server (NTRS)

    Khalaf, C. S.; Stoughton, J. W.

    1984-01-01

    A proposed solution to an aviation safety problem is based on passive detection of turbulent weather phenomena through their infrasonic emission. This thesis describes a system design that is adequate for detection and bearing evaluation of infrasound. An array of four sensors, with the appropriate hardware, is used for detection. Bearing evaluation is based on estimates of time delays between sensor outputs. The generalized cross correlation (GCC), as the conventional time-delay estimation (TDE) method, is first reviewed. An adaptive TDE approach, using the least mean square (LMS) algorithm, is then discussed. A comparison between the two techniques is made and the advantages of the adaptive approach are listed. The behavior of the GCC, as a Roth processor, is examined for the anticipated signals. It is shown that the Roth processor has the desired effect of sharpening the peak of the correlation function. It is also shown that the LMSTDE technique is an equivalent implementation of the Roth processor in the time domain. An LMSTDE lead-lag model, with a variable stability coefficient and a convergence criterion, is designed.
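
    As a minimal illustration of time-delay estimation by cross-correlation (the plain correlation peak, without the Roth weighting or the adaptive LMS implementation discussed above; names and signals are hypothetical):

```python
def estimate_delay(x, y):
    """Estimate the integer-sample delay of y relative to x by locating
    the peak of their cross-correlation -- the basic idea underlying GCC
    time-delay estimation between two sensor outputs."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-(n - 1), n):
        s = sum(x[i] * y[i + lag] for i in range(n) if 0 <= i + lag < n)
        if s > best_val:
            best_lag, best_val = lag, s
    return best_lag

# A short pulse and a copy of it delayed by 3 samples
x = [0, 0, 1, 2, 1, 0, 0, 0, 0, 0]
y = [0, 0, 0, 0, 0, 1, 2, 1, 0, 0]
print(estimate_delay(x, y))  # prints 3
```

    Prefiltering the signals before correlating (e.g. Roth weighting) sharpens this peak, which is the property the abstract highlights.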

  3. Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration

    NASA Technical Reports Server (NTRS)

    Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin

    2011-01-01

    The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and

  4. CRISPRmap: an automated classification of repeat conservation in prokaryotic adaptive immune systems.

    PubMed

    Lange, Sita J; Alkhnbashi, Omer S; Rose, Dominic; Will, Sebastian; Backofen, Rolf

    2013-09-01

    Central to Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR)-Cas systems are repeated RNA sequences that serve as Cas-protein-binding templates. Classification is based on the architectural composition of associated Cas proteins; considering repeat evolution is essential to complete the picture. We compiled the largest data set of CRISPRs to date, performed comprehensive, independent clustering analyses and identified a novel set of 40 conserved sequence families and 33 potential structure motifs for Cas-endoribonucleases with some distinct conservation patterns. Evolutionary relationships are presented as a hierarchical map of sequence and structure similarities for both a quick and detailed insight into the diversity of CRISPR-Cas systems. In a comparison with Cas-subtypes, I-C, I-E, I-F and type II were strongly coupled and the remaining type I and type III subtypes were loosely coupled to repeat and Cas1 evolution, respectively. Subtypes with a strong link to CRISPR evolution were almost exclusive to bacteria; nevertheless, we identified rare examples of potential horizontal transfer of I-C and I-E systems into archaeal organisms. Our easy-to-use web server provides an automated assignment of newly sequenced CRISPRs to our classification system and enables more informed choices on future hypotheses in CRISPR-Cas research: http://rna.informatik.uni-freiburg.de/CRISPRmap. PMID:23863837

  5. CRISPRmap: an automated classification of repeat conservation in prokaryotic adaptive immune systems

    PubMed Central

    Lange, Sita J.; Alkhnbashi, Omer S.; Rose, Dominic; Will, Sebastian; Backofen, Rolf

    2013-01-01

    Central to Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR)-Cas systems are repeated RNA sequences that serve as Cas-protein–binding templates. Classification is based on the architectural composition of associated Cas proteins; considering repeat evolution is essential to complete the picture. We compiled the largest data set of CRISPRs to date, performed comprehensive, independent clustering analyses and identified a novel set of 40 conserved sequence families and 33 potential structure motifs for Cas-endoribonucleases with some distinct conservation patterns. Evolutionary relationships are presented as a hierarchical map of sequence and structure similarities for both a quick and detailed insight into the diversity of CRISPR-Cas systems. In a comparison with Cas-subtypes, I-C, I-E, I-F and type II were strongly coupled and the remaining type I and type III subtypes were loosely coupled to repeat and Cas1 evolution, respectively. Subtypes with a strong link to CRISPR evolution were almost exclusive to bacteria; nevertheless, we identified rare examples of potential horizontal transfer of I-C and I-E systems into archaeal organisms. Our easy-to-use web server provides an automated assignment of newly sequenced CRISPRs to our classification system and enables more informed choices on future hypotheses in CRISPR-Cas research: http://rna.informatik.uni-freiburg.de/CRISPRmap. PMID:23863837

  6. Effects of an Advanced Reactor’s Design, Use of Automation, and Mission on Human Operators

    SciTech Connect

    Jeffrey C. Joe; Johanna H. Oxstrand

    2014-06-01

    The roles, functions, and tasks of the human operator in existing light water nuclear power plants (NPPs) are based on sound nuclear and human factors engineering (HFE) principles, are well defined by the plant’s conduct of operations, and have been validated by years of operating experience. However, advanced NPPs whose engineering designs differ from existing light-water reactors (LWRs) will impose changes on the roles, functions, and tasks of the human operators. The plans to increase the use of automation, reduce staffing levels, and add to the mission of these advanced NPPs will also affect the operator’s roles, functions, and tasks. We assert that these factors, which do not appear to have received much attention from the design engineers of advanced NPPs relative to the attention given to conceptual design of these reactors, can have significant risk implications for the operators and overall plant safety if not mitigated appropriately. This paper presents a high-level analysis of a specific advanced NPP and how its engineered design, its plan to use greater levels of automation, and its expanded mission have risk-significant implications for operator performance and overall plant safety.

  7. Validation of the NetSCID: an automated web-based adaptive version of the SCID.

    PubMed

    Brodey, Benjamin B; First, Michael; Linthicum, Jared; Haman, Kirsten; Sasiela, Jordan W; Ayer, David

    2016-04-01

    The present study developed and validated a configurable, adaptive, web-based version of the Structured Clinical Interview for DSM, the NetSCID. The validation included 24 clinicians who administered the SCID and 230 participants who completed the paper SCID and/or the NetSCID. Data-entry errors, branching errors, and clinician satisfaction were quantified. Relative to the paper SCID, the NetSCID resulted in far fewer data-entry and branching errors. Clinicians preferred using the NetSCID and found it easier to administer. PMID:26995238
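
    As an illustration of the branching that the NetSCID automates, the sketch below walks a toy structured interview whose skip logic lives in a data table rather than in the clinician's head. The item IDs, questions, and module layout are hypothetical, not the SCID's.

```python
# Minimal sketch of automated skip-logic branching in a structured
# interview (all item IDs and questions are hypothetical). Each item
# lists its follow-up item keyed by the response, so the software,
# not the clinician, decides which question comes next.

ITEMS = {
    "A1": {"text": "Depressed mood most of the day?",
           "next": {"yes": "A2", "no": "B1"}},
    "A2": {"text": "Duration at least two weeks?",
           "next": {"yes": "A3", "no": "B1"}},
    "A3": {"text": "Marked functional impairment?",
           "next": {"yes": None, "no": None}},
    "B1": {"text": "Excessive worry more days than not?",
           "next": {"yes": None, "no": None}},
}

def administer(responses, start="A1"):
    """Walk the interview, returning the items actually asked."""
    asked, item = [], start
    while item is not None:
        asked.append(item)
        answer = responses[item]            # clinician records yes/no
        item = ITEMS[item]["next"][answer]  # branching is automated
    return asked

# A "no" at A1 skips the rest of the depression module entirely.
path = administer({"A1": "no", "B1": "yes"})
```

    Encoding the skip logic as data is what removes the branching errors the abstract quantifies: the clinician can no longer turn to the wrong page.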

  8. Design of the Dual Conjugate Adaptive Optics Test-bed

    NASA Astrophysics Data System (ADS)

    Sharf, Inna; Bell, K.; Crampton, D.; Fitzsimmons, J.; Herriot, Glen; Jolissaint, Laurent; Lee, B.; Richardson, H.; van der Kamp, D.; Veran, Jean-Pierre

    In this paper, we describe the Multi-Conjugate Adaptive Optics laboratory test-bed presently under construction at the University of Victoria, Canada. The test-bench will be used to support research in the performance of multi-conjugate adaptive optics, turbulence simulators, laser guide stars and miniaturizing adaptive optics. The main components of the test-bed include two micro-machined deformable mirrors, a tip-tilt mirror, four wavefront sensors, a source simulator, a dual-layer turbulence simulator, as well as computational and control hardware. The paper will describe in detail the opto-mechanical design of the adaptive optics module, the design of the hot-air turbulence generator and the configuration chosen for the source simulator. Below, we present a summary of these aspects of the bench. The optical and mechanical design of the test-bed has been largely driven by the particular choice of the deformable mirrors. These are continuous micro-machined mirrors manufactured by Boston Micromachines Corporation. They have a clear aperture of 3.3 mm and are deformed with 140 actuators arranged in a square grid. Although the mirrors have an open-loop bandwidth of 6.6 kHz, their shape can be updated at a sampling rate of 100 Hz. In our optical design, the mirrors are conjugated at 0 km and 10 km in the atmosphere. A planar optical layout was achieved by using four off-axis paraboloids and several folding mirrors. These optics will be mounted on two solid blocks which can be aligned with respect to each other. The wavefront path design accommodates 3 monochromatic guide stars that can be placed either at 90 km or at infinity. The design relies on the natural separation of the beam into 3 parts because of differences in locations of the guide stars in the field of view. In total four wavefront sensors will be procured from Adaptive Optics Associates (AOA) or built in-house: three for the guide stars and the fourth to collect data from the science source output in

  9. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  10. Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Li, Wu; Robinson, Jay

    2016-01-01

    This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.

  11. An automated multi-modal object analysis approach to coronary calcium scoring of adaptive heart isolated MSCT images

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-02-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where in clinical practice the observer must first identify the presence, and then the location, of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. This can be challenging for a human observer as it is difficult to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. The inclusion of false-positive or exclusion of true-positive calcified plaques will alter the patient's calcium score incorrectly, thus leading to the possibility of incorrect treatment prescription. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, exposing the patient to a lower radiation dose; for a progressive disease such as atherosclerosis, where multiple scans may be required, this is beneficial to their health. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method, as well as the Volume scoring method. Elimination of the unwanted regions of the cardiac image slices such as lungs, ribs, and vertebrae is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of a similar intensity and their removal will aid detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground truth scores of three averaged expert observer results. The results presented here are intended to show the requirement and
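
    For context, the conventional Agatston score that such a pipeline computes weights each calcified lesion's area by its peak attenuation. The sketch below implements that standard definition for a single slice (connected-component lesion splitting is omitted for brevity); it is not the authors' code.

```python
import numpy as np

def agatston_weight(max_hu):
    # Standard Agatston density weighting by peak attenuation.
    if max_hu >= 400:
        return 4
    if max_hu >= 300:
        return 3
    if max_hu >= 200:
        return 2
    return 1  # 130-199 HU

def agatston_score(slice_hu, pixel_area_mm2, threshold=130):
    """Score one axial slice: lesion area (mm^2) times a weight from its
    peak attenuation. The whole supra-threshold region is treated as one
    lesion here; a real scorer splits it into connected components."""
    mask = slice_hu >= threshold
    if not mask.any():
        return 0.0
    area = mask.sum() * pixel_area_mm2
    return area * agatston_weight(slice_hu[mask].max())
```

    For example, a two-pixel plaque peaking at 350 HU with 0.5 mm² pixels scores 2 × 0.5 × 3 = 3.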

  12. Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate

    NASA Astrophysics Data System (ADS)

    Samaras, C.; Cook, L.

    2015-12-01

    Infrastructure systems are expected to be functional, durable and safe over long service lives - 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, and institutions and regulations will need to be adaptable for a range of future conditions (conditions of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: Involve all stakeholders (owners, financers, insurance, regulators, affected public, climate/weather scientists, etc.) in key decisions; Use low-regret, adaptive strategies, such as robust decision making and the observational method, comply with relevant standards and regulations, and exceed their requirements where appropriate; Publish design studies and performance/failure investigations to extend the body of knowledge for advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities and account rationally for climate change in revised engineering standards and codes. This presentation presents initial research on decision-making under uncertainty for climate resilient infrastructure design.

  13. ECO fill: automated fill modification to support late-stage design changes

    NASA Astrophysics Data System (ADS)

    Davis, Greg; Wilson, Jeff; Yu, J. J.; Chiu, Anderson; Chuang, Yao-Jen; Yang, Ricky

    2014-03-01

    One of the most critical factors in achieving a positive return for a design is ensuring the design not only meets performance specifications, but also produces sufficient yield to meet the market demand. The goal of design for manufacturability (DFM) technology is to enable designers to address manufacturing requirements during the design process. While new cell-based, DP-aware, and net-aware fill technologies have emerged to provide the designer with automated fill engines that support these new fill requirements, design changes that arrive late in the tapeout process (as engineering change orders, or ECOs) can have a disproportionate effect on tapeout schedules, due to the complexity of replacing fill. If not handled effectively, the impacts on file size, run time, and timing closure can significantly extend the tapeout process. In this paper, the authors examine changes to design flow methodology, supported by new fill technology, that enable efficient, fast, and accurate adjustments to metal fill late in the design process. We present an ECO fill methodology coupled with the support of advanced fill tools that can quickly locate the portion of the design affected by the change, remove and replace only the fill in that area, while maintaining the fill hierarchy. This new fill approach effectively reduces run time, contains fill file size, minimizes timing impact, and minimizes mask costs due to ECO-driven fill changes, all of which are critical factors to ensuring time-to-market schedules are maintained.

  14. Accelerated optimization and automated discovery with covariance matrix adaptation for experimental quantum control

    NASA Astrophysics Data System (ADS)

    Roslund, Jonathan; Shir, Ofer M.; Bäck, Thomas; Rabitz, Herschel

    2009-10-01

    Optimization of quantum systems by closed-loop adaptive pulse shaping offers a rich domain for the development and application of specialized evolutionary algorithms. Derandomized evolution strategies (DESs) are presented here as a robust class of optimizers for experimental quantum control. The combination of stochastic and quasi-local search embodied by these algorithms is especially amenable to the inherent topology of quantum control landscapes. Implementation of DES in the laboratory results in efficiency gains of up to ~9 times that of the standard genetic algorithm, and thus is a promising tool for optimization of unstable or fragile systems. The statistical learning upon which these algorithms are predicated also provides the means for obtaining a control problem’s Hessian matrix with no additional experimental overhead. The forced optimal covariance adaptive learning (FOCAL) method is introduced to enable retrieval of the Hessian matrix, which can reveal information about the landscape’s local structure and dynamic mechanism. Exploitation of such algorithms in quantum control experiments should enhance their efficiency and provide additional fundamental insights.
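
    A minimal cousin of the evolution strategies discussed above is the (1+1)-ES with the classical 1/5th-success-rule step-size adaptation, sketched here on a toy quadratic standing in for the laboratory objective (the derandomized covariance machinery of a full DES is deliberately left out):

```python
import numpy as np

def one_plus_one_es(f, x0, sigma=0.5, iters=200, seed=0):
    """Minimal (1+1) evolution strategy with 1/5th-success-rule
    step-size adaptation; a toy stand-in for the derandomized ES
    used in closed-loop pulse-shaping experiments."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(x.size)  # mutate
        fy = f(y)
        if fy <= fx:                  # success: accept and widen search
            x, fx = y, fy
            sigma *= 1.5
        else:                         # failure: narrow search
            sigma *= 1.5 ** -0.25
    return x, fx

# Toy landscape standing in for a measured control fidelity.
xb, fb = one_plus_one_es(lambda v: float(np.sum((v - 1.0) ** 2)),
                         np.zeros(5))
```

    In a real experiment, `f` would be a (noisy) measurement of the shaped pulse's performance, which is exactly why robust, adaptive step sizing matters.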

  15. Automated registration of large deformations for adaptive radiation therapy of prostate cancer

    SciTech Connect

    Godley, Andrew; Ahunbay, Ergun; Peng Cheng; Li, X. Allen

    2009-04-15

    Available deformable registration methods are often inaccurate over the large organ variations encountered, for example, in the rectum and bladder. The authors developed a novel approach to accurately and effectively register large deformations in the prostate region for adaptive radiation therapy. A software tool combining a fast symmetric demons algorithm and the use of masks was developed in C++ based on ITK libraries to register CT images acquired at planning and before treatment fractions. The deformation field determined was subsequently used to deform the delivered dose to match the anatomy of the planning CT. The large deformations involved required that the bladder and rectum volume be masked with uniform intensities of -1000 and 1000 HU, respectively, in both the planning and treatment CTs. The tool was tested for five prostate IGRT patients. The average rectum planning-to-treatment contour overlap improved from 67% to 93%; the lowest initial overlap was 43%. The average bladder overlap improved from 83% to 98%, with a lowest initial overlap of 60%. Registration regions were set to include a volume receiving 4% of the maximum dose. The average region was 320×210×63 voxels, taking approximately 9 min to register on a dual 2.8 GHz Linux system. The prostate and seminal vesicles were correctly placed even though they are not masked. The accumulated doses for multiple fractions with large deformation were computed and verified. The tool developed can effectively supply the previously delivered dose for adaptive planning to correct for interfractional changes.
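
    The masking step described above is simple to state in code: overwrite the highly variable organ interiors with uniform intensities before registration, so the algorithm matches organ boundaries rather than variable contents. The sketch below shows that step and one plausible contour-overlap measure; the demons registration itself is assumed to be done by an external tool, and the overlap definition is a guess since the abstract does not state it.

```python
import numpy as np

def mask_organs(ct_hu, bladder_mask, rectum_mask,
                bladder_hu=-1000, rectum_hu=1000):
    """Overwrite bladder and rectum voxels with uniform intensities
    (HU values from the abstract) prior to deformable registration.
    Organ masks are assumed to come from contouring."""
    ct = ct_hu.copy()
    ct[bladder_mask] = bladder_hu
    ct[rectum_mask] = rectum_hu
    return ct

def contour_overlap(a, b):
    """Fractional overlap of two binary contours: intersection over
    the smaller volume (one plausible definition)."""
    inter = np.logical_and(a, b).sum()
    return inter / min(a.sum(), b.sum())
```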

  16. Accelerated optimization and automated discovery with covariance matrix adaptation for experimental quantum control

    SciTech Connect

    Roslund, Jonathan; Shir, Ofer M.; Rabitz, Herschel; Baeck, Thomas

    2009-10-15

    Optimization of quantum systems by closed-loop adaptive pulse shaping offers a rich domain for the development and application of specialized evolutionary algorithms. Derandomized evolution strategies (DESs) are presented here as a robust class of optimizers for experimental quantum control. The combination of stochastic and quasi-local search embodied by these algorithms is especially amenable to the inherent topology of quantum control landscapes. Implementation of DES in the laboratory results in efficiency gains of up to ~9 times that of the standard genetic algorithm, and thus is a promising tool for optimization of unstable or fragile systems. The statistical learning upon which these algorithms are predicated also provides the means for obtaining a control problem's Hessian matrix with no additional experimental overhead. The forced optimal covariance adaptive learning (FOCAL) method is introduced to enable retrieval of the Hessian matrix, which can reveal information about the landscape's local structure and dynamic mechanism. Exploitation of such algorithms in quantum control experiments should enhance their efficiency and provide additional fundamental insights.

  17. Microsystem design framework based on tool adaptations and library developments

    NASA Astrophysics Data System (ADS)

    Karam, Jean Michel; Courtois, Bernard; Rencz, Marta; Poppe, Andras; Szekely, Vladimir

    1996-09-01

    Besides foundry facilities, Computer-Aided Design (CAD) tools are also required to move microsystems from research prototypes to an industrial market. This paper describes a Computer-Aided-Design Framework for microsystems, based on selected existing software packages adapted and extended for microsystem technology, assembled with libraries where models are available in the form of standard cells described at different levels (symbolic, system/behavioral, layout). In microelectronics, CAD has already attained a highly sophisticated and professional level, where complete fabrication sequences are simulated and the device and system operation is completely tested before manufacturing. In comparison, the art of microsystem design and modelling is still in its infancy. However, at least for the numerical simulation of the operation of single microsystem components, such as mechanical resonators, thermo-elements, elastic diaphragms, reliable simulation tools are available. For the different engineering disciplines (like electronics, mechanics, optics, etc) a lot of CAD-tools for the design, simulation and verification of specific devices are available, but there is no CAD-environment within which we could perform a (micro-)system simulation due to the different nature of the devices. In general there are two different approaches to overcome this limitation: the first possibility would be to develop a new framework tailored for microsystem-engineering. The second approach, much more realistic, would be to use the existing CAD-tools which contain the most promising features, and to extend these tools so that they can be used for the simulation and verification of microsystems and of the devices involved. These tools are assembled with libraries in a microsystem design environment allowing a continuous design flow. The approach is driven by the wish to make microsystems accessible to a large community of people, including SMEs and non-specialized academic institutions.

  18. Relocatable, Automated Cost-Benefit Analysis for Marine Sensor Network Design

    PubMed Central

    D’Este, Claire; de Souza, Paulo; Sharman, Chris; Allen, Simon

    2012-01-01

    When designing sensor networks, we need to ensure they produce representative and relevant data, but this must be offset by the financial cost of placing sensors. We describe a novel automated method for generating and combining cost and benefit values to decide on the best sensor locations using information about the specific constraints available in most coastal locations. Costs in maintenance, negotiation, equipment, exposure and communication are estimated using hydrodynamic models and Electronic Navigation Charts. Benefits in maximum coverage and reducing overall error are also determined using model output. This method demonstrates equivalent accuracy at predicting the whole system to expert-chosen locations, whilst significantly reducing the estimated costs. PMID:22736982
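
    One simple way to combine cost and benefit values into sensor placements, in the spirit of the method above, is a greedy coverage-per-cost heuristic. The sketch below uses made-up candidate sites, costs, and coverage sets; values derived from hydrodynamic models and navigation charts, as in the paper, would slot into the same structure.

```python
# Greedy coverage-per-cost placement (all site names and numbers are
# hypothetical). Repeatedly pick the affordable candidate with the best
# new-coverage-per-cost ratio until nothing affordable adds coverage.

def place_sensors(candidates, budget):
    """candidates: {name: (cost, set_of_covered_cells)}."""
    chosen, covered, spent = [], set(), 0.0
    while True:
        best, best_ratio = None, 0.0
        for name, (cost, cells) in candidates.items():
            if name in chosen or spent + cost > budget:
                continue
            gain = len(cells - covered)       # newly covered cells
            if gain and gain / cost > best_ratio:
                best, best_ratio = name, gain / cost
        if best is None:
            return chosen, covered
        chosen.append(best)
        spent += candidates[best][0]
        covered |= candidates[best][1]

sites = {"A": (10.0, {1, 2, 3}), "B": (10.0, {3, 4}), "C": (25.0, {5})}
picked, cells = place_sensors(sites, budget=20.0)
```

    Greedy selection is not optimal in general, but for coverage-style benefits it is a standard, cheap baseline against which expert-chosen layouts can be compared.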

  19. Wireless thermal sensor network with adaptive low power design.

    PubMed

    Lee, Ho-Yin; Chen, Shih-Lun; Chen, Chiung-An; Huang, Hong-Yi; Luo, Ching-Hsing

    2007-01-01

    There is an increasing need to develop flexible, reconfigurable, and intelligent low-power wireless sensor network (WSN) systems for healthcare applications. Technical advancements in micro-sensors, MEMS devices, low-power electronics, and radio-frequency circuits have enabled the design and development of such highly integrated systems. In this paper, we present our proposed wireless thermal sensor network system, which is separated into control and data paths, each with its own transmission frequency. The control path sends power and function commands from a computer to each sensor element over 2.4 GHz RF circuits, and the data path transmits measured data at 2.4 GHz in the sensor layer and 60 GHz in higher layers. This hierarchical architecture makes reconfigurable mapping and pipelined applications on the WSN possible, and the adaptive technique efficiently reduces average power consumption by about 60%. PMID:18003354
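
    A power saving of this magnitude can be sanity-checked with a back-of-envelope duty-cycle model: average power is the duty-cycle-weighted mix of active and sleep power. All numbers below are illustrative assumptions, not taken from the paper.

```python
# Back-of-envelope average-power model for adaptive duty cycling
# (active/sleep power and duty cycles are illustrative placeholders).

def avg_power_mw(active_mw, sleep_mw, duty_cycle):
    """Average power of a node active a fraction `duty_cycle` of the
    time and asleep otherwise."""
    return duty_cycle * active_mw + (1 - duty_cycle) * sleep_mw

baseline = avg_power_mw(30.0, 0.05, 0.50)  # naive 50% duty cycle
adaptive = avg_power_mw(30.0, 0.05, 0.20)  # adaptively gated radio
saving = 1 - adaptive / baseline           # roughly the ~60% reported
```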

  20. Toward new design-rule-check of silicon photonics for automated layout physical verifications

    NASA Astrophysics Data System (ADS)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2015-02-01

    A simple analytical model is developed to estimate the power loss and time delay in photonic integrated circuits fabricated on standard SOI wafers. The model is simple enough to be used during physical verification of a circuit layout to check its feasibility for fabrication against a given foundry's specifications, and it allows new design rules to be defined for the layout physical verification process in any electronic design automation (EDA) tool. The model is accurate, as verified by comparison with a finite-element-based full-wave electromagnetic (EM) solver. Because the model is closed form, it circumvents the need for an EM solver in the verification process; this dramatically reduces verification time and allows fast design-rule checking.
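
    A closed-form loss/delay budget of the kind described can be evaluated per net in microseconds, which is what makes it usable inside an EDA rule deck. The sketch below assumes a simple propagation-loss-plus-bend-loss model; all coefficients and limits are illustrative placeholders, not foundry numbers.

```python
# Closed-form per-path loss/delay budget with placeholder coefficients.

C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def path_budget(length_m, n_bends, alpha_db_per_m=200.0,
                bend_loss_db=0.01, group_index=4.2):
    """Estimate insertion loss (dB) and group delay (s) of a waveguide
    path: propagation loss plus a fixed per-bend penalty."""
    loss_db = alpha_db_per_m * length_m + bend_loss_db * n_bends
    delay_s = group_index * length_m / C_M_PER_S
    return loss_db, delay_s

def drc_ok(length_m, n_bends, max_loss_db=3.0, max_delay_s=100e-12):
    """The kind of pass/fail check a rule deck could apply per net."""
    loss, delay = path_budget(length_m, n_bends)
    return loss <= max_loss_db and delay <= max_delay_s
```

    A 1 mm route with 10 bends stays within these example limits; a 20 mm route does not, and would be flagged without ever invoking an EM solver.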

  1. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  2. Design, realization and structural testing of a compliant adaptable wing

    NASA Astrophysics Data System (ADS)

    Molinari, G.; Quack, M.; Arrieta, A. F.; Morari, M.; Ermanni, P.

    2015-10-01

    This paper presents the design, optimization, realization and testing of a novel wing morphing concept, based on distributed compliance structures, and actuated by piezoelectric elements. The adaptive wing features ribs with a selectively compliant inner structure, numerically optimized to achieve aerodynamically efficient shape changes while simultaneously withstanding aeroelastic loads. The static and dynamic aeroelastic behavior of the wing, and the effect of activating the actuators, is assessed by means of coupled 3D aerodynamic and structural simulations. To demonstrate the capabilities of the proposed morphing concept and optimization procedure, the wings of a model airplane are designed and manufactured according to the presented approach. The goal is to replace conventional ailerons and thus achieve controllability in roll purely by morphing. The mechanical properties of the manufactured components are characterized experimentally, and used to create a refined and correlated finite element model. The overall stiffness, strength, and actuation capabilities are experimentally tested and successfully compared with the numerical prediction. To counteract the nonlinear hysteretic behavior of the piezoelectric actuators, a closed-loop controller is implemented, and its capability of accurately achieving the desired shape adaptation is evaluated experimentally. Using the correlated finite element model, the aeroelastic behavior of the manufactured wing is simulated, showing that the morphing concept can provide sufficient roll authority to allow controllability of the flight. The additional degrees of freedom offered by morphing can also be used to vary the plane's lift coefficient, similarly to conventional flaps. The efficiency improvements offered by this technique are evaluated numerically, and compared to the performance of a rigid wing.

  3. PFC design via FRIT Approach for Adaptive Output Feedback Control of Discrete-time Systems

    NASA Astrophysics Data System (ADS)

    Mizumoto, Ikuro; Takagi, Taro; Fukui, Sota; Shah, Sirish L.

    This paper addresses the design of adaptive output feedback control for discrete-time systems with a parallel feedforward compensator (PFC), which is designed to render the augmented controlled system almost strictly positive real (ASPR). A PFC design scheme based on a fictitious reference iterative tuning (FRIT) approach, using only an input/output experimental data set, is proposed for discrete-time systems in order to design an adaptive output feedback control system. Furthermore, the effectiveness of the proposed PFC design method is confirmed through numerical simulations, in which an adaptive control system with an adaptive neural network (NN) is designed for an uncertain discrete-time system.

  4. Optimizing RF gun cavity geometry within an automated injector design system

    SciTech Connect

    Alicia Hofler; Pavel Evtushenko

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized cw superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search in parallel a large parameter space (often non-linear) and in a relatively short time identify promising regions of the space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper will describe an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization and provide examples of its application to existing RF and SRF gun designs.

  5. SP-100 shield design automation process using expert system and heuristic search techniques

    NASA Astrophysics Data System (ADS)

    Marcille, Thomas F.; Protsik, Robert; Deane, Nelson A.; Hoover, Darryl G.

    1993-01-01

    The SP-100 shield subsystem design process has been modified to utilize the GE Corporate Research and Development program, ENGINEOUS (Tong 1990). ENGINEOUS is a software system that automates the use of Computer Aided Engineering (CAE) analysis programs in the engineering design process. The shield subsystem design process incorporates a nuclear subsystems design and performance code, a two-dimensional neutral particle transport code, several input processors and two general purpose neutronic output processors. Coupling these programs within ENGINEOUS provides automatic transition paths between applications, with no source code modifications. ENGINEOUS captures human design knowledge, as well as information about the specific CAE applications, and stores this information in knowledge base files. The knowledge base information is used by the ENGINEOUS expert system to drive knowledge directed and knowledge supplemented search modules to find an optimum shield design for a given reactor definition, ensuring that specified constraints are satisfied. Alternate designs, not accommodated in the optimization design rules, can readily be explored through the use of a parametric study capability.

  6. The accuracy of a designed software for automated localization of craniofacial landmarks on CBCT images

    PubMed Central

    2014-01-01

    Background Two-dimensional projection radiographs have been traditionally considered the modality of choice for cephalometric analysis. To overcome the shortcomings of two-dimensional images, three-dimensional computed tomography (CT) has been used to evaluate craniofacial structures. However, manual landmark detection depends on medical expertise, and the process is time-consuming. The present study was designed to produce software capable of automated localization of craniofacial landmarks on cone beam (CB) CT images based on image registration and to evaluate its accuracy. Methods The software was designed using MATLAB programming language. The technique was a combination of feature-based (principal axes registration) and voxel similarity-based methods for image registration. A total of 8 CBCT images were selected as our reference images for creating a head atlas. Then, 20 CBCT images were randomly selected as the test images for evaluating the method. Three experts twice located 14 landmarks in all 28 CBCT images during two examinations set 6 weeks apart. For each landmark on each image, the distances between the coordinates from the manual and automated detection methods were calculated and reported as mean errors. Results The combined intraclass correlation coefficient for intraobserver reliability was 0.89 and for interobserver reliability 0.87 (95% confidence interval, 0.82 to 0.93). The mean errors of all 14 landmarks were <4 mm. Additionally, 63.57% of landmarks had a mean error of <3 mm compared with manual detection (gold standard method). Conclusion The accuracy of our approach for automated localization of craniofacial landmarks, which was based on combining feature-based and voxel similarity-based methods for image registration, was acceptable. Nevertheless, we recommend repetition of this study using other techniques, such as intensity-based methods. PMID:25223399
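
    The evaluation metric used above, mean localization error against a manual gold standard, is straightforward to compute once both landmark sets are expressed in the same coordinate frame; a minimal sketch:

```python
import numpy as np

def mean_landmark_error(auto_pts, manual_pts):
    """Mean Euclidean distance (mm) between automatically and manually
    located landmarks; inputs are (n_landmarks, 3) arrays in one
    patient coordinate frame."""
    d = np.linalg.norm(np.asarray(auto_pts, float)
                       - np.asarray(manual_pts, float), axis=1)
    return float(d.mean())

def fraction_within(auto_pts, manual_pts, tol_mm=3.0):
    """Fraction of landmarks within a clinical tolerance of the manual
    gold standard (the abstract reports <3 mm and <4 mm cut-offs)."""
    d = np.linalg.norm(np.asarray(auto_pts, float)
                       - np.asarray(manual_pts, float), axis=1)
    return float((d < tol_mm).mean())
```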

  7. Feasibility of Automated Adaptive GCA (Ground Controlled Approach) Controller Training System.

    ERIC Educational Resources Information Center

    Feuge, Robert L.; And Others

    An analysis of the conceptual feasibility of using automatic speech recognition and understanding technology in the design of an advanced training system was conducted. The analysis specifically explored application to Ground Controlled Approach (GCA) controller training. A systems engineering approach was followed to determine the feasibility of…

  8. Automated endmember determination and adaptive spectral mixture analysis using kernel methods

    NASA Astrophysics Data System (ADS)

    Rand, Robert S.; Banerjee, Amit; Broadwater, Joshua

    2013-09-01

    Various phenomena occur in geographic regions that cause pixels of a scene to contain spectrally mixed pixels. The mixtures may be linear or nonlinear. It could simply be that the pixel size of a sensor is too large so many pixels contain patches of different materials within them (linear), or there could be microscopic mixtures and multiple scattering occurring within pixels (non-linear). Often enough, scenes may contain cases of both linear and non-linear mixing on a pixel-by-pixel basis. Furthermore, appropriate endmembers in a scene are not always easy to determine. A reference spectral library of materials may or may not be available, yet, even if a library is available, using it directly for spectral unmixing may not always be fruitful. This study investigates a generalized kernel-based method for spectral unmixing that attempts to determine if each pixel in a scene is linear or non-linear, and adapts to compute a mixture model at each pixel accordingly. The effort also investigates a kernel-based support vector method for determining spectral endmembers in a scene. Two scenes of hyperspectral imagery calibrated to reflectance are used to validate the methods. We test the approaches using a HyMAP scene collected over the Waimanalo Bay region in Oahu, Hawaii, as well as an AVIRIS scene collected over the oil spill region in the Gulf of Mexico during the Deepwater Horizon oil incident.
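
    The linear side of the mixture model can be illustrated with least squares plus a heavily weighted sum-to-one row (a hedged sketch only; the paper's kernel-based nonlinear unmixing and endmember selection are not reproduced here, and `unmix` is an illustrative name):

```python
import numpy as np

def unmix(pixel, endmembers, weight=1e3):
    """Estimate abundances a in the linear mixing model pixel = E @ a,
    enforcing sum(a) == 1 via a heavily weighted augmented row.
    (Nonnegativity would additionally require NNLS; omitted here.)"""
    E = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    x = np.append(pixel, weight)
    a, *_ = np.linalg.lstsq(E, x, rcond=None)
    return a
```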

  9. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI

    PubMed Central

    Churchill, Nathan W.; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C.

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the “pipeline”) significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard “fixed” preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets. PMID:26161667
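
    The outer loop of such a framework reduces to searching over combinations of preprocessing choices for the one maximizing a data-driven score (a schematic sketch; `optimize_pipeline` and the scoring function are illustrative, not the paper's prediction/reproducibility metrics):

```python
import itertools

def optimize_pipeline(option_grid, score):
    """Exhaustively evaluate every combination of preprocessing choices
    and return the pipeline maximizing a combined score. `score` is
    assumed to map a pipeline dict to a float."""
    names = list(option_grid)
    best, best_score = None, float("-inf")
    for combo in itertools.product(*(option_grid[n] for n in names)):
        pipeline = dict(zip(names, combo))
        s = score(pipeline)
        if s > best_score:
            best, best_score = pipeline, s
    return best, best_score
```

    In practice each pipeline would be scored by rerunning the analysis, so adaptive rather than exhaustive search matters as the grid grows.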

  10. Design and performance of an automated video-based laser beam alignment system

    SciTech Connect

    Rundle, W.J.; Kartz, M.W.; Bliss, E.S.; English, R.E. Jr.; Peterson, R.L.; Thompson, G.R.; Uhlich, D.M.

    1992-07-14

    This paper describes the design and performance of an automated, closed-loop, laser beam alignment system. Its function is to sense a beam alignment error in a laser beam transport system and automatically steer mirrors preceding the sensor location as required to maintain beam alignment. The laser beam is sampled by an optomechanical package which uses video cameras to sense pointing and centering errors. The camera outputs are fed to an image processing module, which includes video digitizers and uses image storage and software to sense the centroid of the image. Signals are sent through a VMEbus to an "optical device controller" (ODC), which drives stepper-motor actuators on mirror mounts preceding the beam-sampling location to return the beam alignment to the prescribed condition. Photodiodes are also used to extend the control bandwidth beyond that which is achievable with video cameras. This system has been operated at LLNL in the Atomic Vapor Laser Isotope Separation (AVLIS) program to maintain the alignment of copper and dye laser beams, the latter to within ±2 μrad in pointing and less than 1 mm in centering. The optomechanical design of the instrumented package, which includes lens, mirror, and video mounts in a rigid housing, the automated control system architecture, and the performance of this equipment are described.
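
    The intensity-weighted centroid that such a closed-loop system servos toward a setpoint can be computed directly from a camera frame (an illustrative sketch, not the LLNL implementation; `beam_centroid` is a hypothetical name):

```python
import numpy as np

def beam_centroid(frame, background=0.0):
    """Intensity-weighted centroid (row, col) of a camera frame after
    simple background subtraction."""
    img = np.clip(frame.astype(float) - background, 0.0, None)
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return np.array([(rows * img).sum() / total,
                     (cols * img).sum() / total])
```

    The controller would compare this centroid against the prescribed position and command the stepper-motor mirror mounts to null the difference.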

  12. AMR++: Object-oriented design for adaptive mesh refinement

    SciTech Connect

    Quinlan, D.

    1998-12-01

    The development of object-oriented libraries for scientific computing is complicated by the wide range of applications that are targeted and the complexity and wide range of numerical methods that are used. A problem is to design a library that can be customized to handle a wide range of target applications and increasingly complex numerical methods while maintaining a sufficiently useful library for simple problems. These problems have been classically at odds with one another and have compromised the design of many object-oriented library solutions. In this paper the authors detail the mechanisms used within AMR++, an object-oriented library for Adaptive Mesh Refinement (AMR), to provide the level of extensibility that is required to make AMR++ easily customizable for the more obscure applications while remaining small and simple for less complex applications. The goal has been to have a library complexity that matches the complexity of the target application. These mechanisms are general and extend to other libraries as well.
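
    The kind of error indicator an AMR library applies before building finer patches can be sketched in one dimension (illustrative only, not AMR++ code; `flag_for_refinement` is a hypothetical name):

```python
import numpy as np

def flag_for_refinement(u, tol):
    """Flag 1-D cells whose undivided gradient exceeds tol; both cells
    adjacent to a steep face are marked for refinement."""
    grad = np.abs(np.diff(u))
    flags = np.zeros(u.size, dtype=bool)
    flags[:-1] |= grad > tol
    flags[1:] |= grad > tol
    return flags
```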

  13. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2013-01-08

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  14. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-04-29

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  15. Issues in the design of an executive controller shell for Space Station automation

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Cheeseman, Peter C.

    1986-01-01

    A major goal of NASA's Systems Autonomy Demonstration Project is to focus research in artificial intelligence, human factors, and dynamic control systems in support of Space Station automation. Another goal is to demonstrate the use of these technologies in real space systems, for both ground-based mission support and on-board operations. The design, construction, and evaluation of an intelligent autonomous system shell is recognized as an important part of the Systems Autonomy research program. This paper describes autonomous systems and executive controllers, outlines how these intelligent systems can be utilized within the Space Station, and discusses a number of key design issues that have been raised during some preliminary work to develop an autonomous executive controller shell at NASA Ames Research Center.

  16. ADS: A FORTRAN program for automated design synthesis: Version 1.10

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1985-01-01

    A new general-purpose optimization program for engineering design is described. ADS (Automated Design Synthesis - Version 1.10) is a FORTRAN program for solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level, several options are available so that a total of over 100 possible combinations can be created. Examples of available strategies are sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming. Available optimizers include variable metric methods and the Method of Feasible Directions as examples, and one-dimensional search options include polynomial interpolation and the Golden Section method as examples. Emphasis is placed on ease of use of the program. All information is transferred via a single parameter list. Default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to over-ride these, if desired.
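
    The Golden Section one-dimensional search named above is straightforward to sketch (an illustrative Python rendering, not the ADS FORTRAN source; `golden_section` is a hypothetical name):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Golden Section method: locate the minimizer of a unimodal f on
    [a, b] by shrinking the bracket at the inverse golden ratio."""
    inv_phi = (math.sqrt(5) - 1) / 2          # ~0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                           # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                                 # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2
```

    Only one new function evaluation is needed per iteration, which is why the method suits the innermost level of a layered optimizer.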

  17. Automated design of gravity-assist trajectories to Mars and the outer planets

    NASA Technical Reports Server (NTRS)

    Longuski, James M.; Williams, Steve N.

    1991-01-01

    In this paper, a new approach to planetary mission design is described which automates the search for gravity-assist trajectories. This method finds all conic solutions given a range of launch dates, a range of launch energies and a set of target planets. The new design tool is applied to the problems of finding multiple encounter trajectories to the outer planets and Venus gravity-assist trajectories to Mars. The last four-planet grand tour opportunity (until the year 2153) is identified. It requires an earth launch in 1996 and encounters Jupiter, Uranus, Neptune, and Pluto. Venus gravity-assist trajectories to Mars for the 30 year period 1995-2024 are examined. It is shown that in many cases these trajectories require less launch energy to reach Mars than direct ballistic trajectories.

  18. SIMPLE DESIGN FOR AUTOMATION OF TUNGSTEN(VI) OXIDE TECHNIQUE FOR MEASUREMENT OF NH3, AND HNO3

    EPA Science Inventory

    The tungstic acid technique for collection and analysis of NH3 and HNO3 concentrations in the ambient air has been automated in a simple and cost-effective design. The design allows complete separation of HNO3 and NH3 during detection. Unattended operation in field trials has been...

  19. Using dual-energy x-ray imaging to enhance automated lung tumor tracking during real-time adaptive radiotherapy

    SciTech Connect

    Menten, Martin J.; Fast, Martin F.; Nill, Simeon; Oelfke, Uwe

    2015-12-15

    Purpose: Real-time, markerless localization of lung tumors with kV imaging is often inhibited by ribs obscuring the tumor and poor soft-tissue contrast. This study investigates the use of dual-energy imaging, which can generate radiographs with reduced bone visibility, to enhance automated lung tumor tracking for real-time adaptive radiotherapy. Methods: kV images of an anthropomorphic breathing chest phantom were experimentally acquired and radiographs of actual lung cancer patients were Monte-Carlo-simulated at three imaging settings: low-energy (70 kVp, 1.5 mAs), high-energy (140 kVp, 2.5 mAs, 1 mm additional tin filtration), and clinical (120 kVp, 0.25 mAs). Regular dual-energy images were calculated by weighted logarithmic subtraction of high- and low-energy images and filter-free dual-energy images were generated from clinical and low-energy radiographs. The weighting factor to calculate the dual-energy images was determined by means of a novel objective score. The usefulness of dual-energy imaging for real-time tracking with an automated template matching algorithm was investigated. Results: Regular dual-energy imaging was able to increase tracking accuracy in left–right images of the anthropomorphic phantom as well as in 7 out of 24 investigated patient cases. Tracking accuracy remained comparable in three cases and decreased in five cases. Filter-free dual-energy imaging was only able to increase accuracy in 2 out of 24 cases. In four cases no change in accuracy was observed and tracking accuracy worsened in nine cases. In 9 out of 24 cases, it was not possible to define a tracking template due to poor soft-tissue contrast regardless of input images. The mean localization errors using clinical, regular dual-energy, and filter-free dual-energy radiographs were 3.85, 3.32, and 5.24 mm, respectively. Tracking success was dependent on tumor position, tumor size, imaging beam angle, and patient size. Conclusions: This study has highlighted the influence of
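
    The weighted logarithmic subtraction used to form the dual-energy images can be sketched as follows (illustrative; in the paper the weighting factor is chosen by a novel objective score, and `dual_energy` is a hypothetical name):

```python
import numpy as np

def dual_energy(low, high, w):
    """Weighted logarithmic subtraction of high- and low-energy
    radiographs. With w near the ratio of bone attenuation at the two
    energies, bone contrast cancels while soft tissue is retained."""
    eps = 1e-6  # guard against log(0)
    return np.log(np.clip(high, eps, None)) - w * np.log(np.clip(low, eps, None))
```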

  20. Application of the H-Mode, a Design and Interaction Concept for Highly Automated Vehicles, to Aircraft

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Flemisch, Frank O.; Schutte, Paul C.; Williams, Ralph A.

    2006-01-01

    Driven by increased safety, efficiency, and airspace capacity, automation is playing an increasing role in aircraft operations. As aircraft become increasingly able to autonomously respond to a range of situations with performance surpassing human operators, we are compelled to look for new methods that help us understand their use and guide their design using new forms of automation and interaction. We propose a novel design metaphor to aid the conceptualization, design, and operation of highly-automated aircraft. Design metaphors transfer meaning from common experiences to less familiar applications or functions. A notable example is the "Desktop metaphor" for manipulating files on a computer. This paper describes a metaphor for highly automated vehicles known as the H-metaphor and a specific embodiment of the metaphor known as the H-mode as applied to aircraft. The fundamentals of the H-metaphor are reviewed followed by an overview of an exploratory usability study investigating human-automation interaction issues for a simple H-mode implementation. The envisioned application of the H-mode concept to aircraft is then described as are two planned evaluations.

  1. Designs and concept reliance of a fully automated high-content screening platform.

    PubMed

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2012-10-01

    High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely, an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), in which up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 m linear track system harboring both microscopes, plate washer, bulk dispensers, and a high-capacity incubator allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world. PMID:22797489

  2. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

    Photo enforcement devices for traffic rules such as red lights, toll, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced reference image and video quality solutions to detect and classify such faults.
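
    Two no-reference checks of the kind described, exposure and focus, can be sketched as follows (the thresholds and the name `quality_checks` are illustrative, not taken from the paper):

```python
import numpy as np

def quality_checks(frame, dark=30.0, bright=225.0, blur_tol=50.0):
    """Histogram-based exposure test plus a Laplacian-variance focus
    test; returns a list of detected fault labels."""
    img = frame.astype(float)
    faults = []
    if img.mean() < dark:
        faults.append("underexposed")
    elif img.mean() > bright:
        faults.append("overexposed")
    # 2-D Laplacian via shifted differences; low variance suggests blur.
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    if lap.var() < blur_tol:
        faults.append("blurred_or_flat")
    return faults
```

    As the abstract notes, the order in which such sub-checks run matters, since one fault can masquerade as another.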

  3. Designs and Concept-Reliance of a Fully Automated High Content Screening Platform

    PubMed Central

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2013-01-01

    High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), where up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 meter linear track system harboring both microscopes, plate washer, bulk dispensers, and a high-capacity incubator allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world. PMID:22797489

  4. Accelerated search for materials with targeted properties by adaptive design

    NASA Astrophysics Data System (ADS)

    Xue, Dezhen; Balachandran, Prasanna V.; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-04-01

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ~800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set.
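
    The exploitation/exploration trade-off can be illustrated with a simple lower-confidence-bound selector over a candidate space (a stand-in sketch; the paper's actual inference and global-optimization machinery is not reproduced, and `select_next` is a hypothetical name):

```python
import numpy as np

def select_next(X_measured, y_measured, X_candidates, kappa=1.0, k=3):
    """Pick the next experiment for a minimization target: exploit low
    predicted values (k-nearest-neighbour mean) and explore candidates
    far from anything already measured."""
    d = np.abs(X_candidates[:, None] - X_measured[None, :])   # 1-D distances
    nearest = np.argsort(d, axis=1)[:, :min(k, X_measured.size)]
    pred = y_measured[nearest].mean(axis=1)                   # exploitation
    uncertainty = d.min(axis=1)                               # exploration
    return X_candidates[np.argmin(pred - kappa * uncertainty)]
```

    Iterating this selection, measurement, and refitting loop is the feedback cycle the abstract describes with 9 loops over ~800,000 compositions.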

  5. Designing for Change: Interoperability in a scaling and adapting environment

    NASA Astrophysics Data System (ADS)

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.

  6. Accelerated search for materials with targeted properties by adaptive design.

    PubMed

    Xue, Dezhen; Balachandran, Prasanna V; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-01-01

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ∼800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set. PMID:27079901

  7. Optical Design and Optimization of Translational Reflective Adaptive Optics Ophthalmoscopes

    NASA Astrophysics Data System (ADS)

    Sulai, Yusufu N. B.

    The retina serves as the primary detector for the biological camera that is the eye. It is composed of numerous classes of neurons and support cells that work together to capture and process an image formed by the eye's optics, which is then transmitted to the brain. Loss of sight due to retinal or neuro-ophthalmic disease can prove devastating to one's quality of life, and the ability to examine the retina in vivo is invaluable in the early detection and monitoring of such diseases. Adaptive optics (AO) ophthalmoscopy is a promising diagnostic tool in early stages of development, still facing significant challenges before it can become a clinical tool. The work in this thesis is a collection of projects with the overarching goal of broadening the scope and applicability of this technology. We begin by providing an optical design approach for AO ophthalmoscopes that reduces the aberrations that degrade the performance of the AO correction. Next, we demonstrate how to further improve image resolution through the use of amplitude pupil apodization and non-common path aberration correction. This is followed by the development of a viewfinder which provides a larger field of view for retinal navigation. Finally, we conclude with the development of an innovative non-confocal light detection scheme which improves the non-invasive visualization of retinal vasculature and reveals the cone photoreceptor inner segments in healthy and diseased eyes.

  8. Evidence for adaptive design in human gaze preference.

    PubMed

    Conway, C A; Jones, B C; DeBruine, L M; Little, A C

    2008-01-01

    Many studies have investigated the physical cues that influence face preferences. By contrast, relatively few studies have investigated the effects of facial cues to the direction and valence of others' social interest (i.e. gaze direction and facial expressions) on face preferences. Here we found that participants demonstrated stronger preferences for direct gaze when judging the attractiveness of happy faces than that of disgusted faces, and that this effect of expression on the strength of attraction to direct gaze was particularly pronounced for judgements of opposite-sex faces (study 1). By contrast, no such opposite-sex bias in preferences for direct gaze was observed when participants judged the same faces for likeability (study 2). Collectively, these findings for a context-sensitive opposite-sex bias in preferences for perceiver-directed smiles, but not perceiver-directed disgust, suggest gaze preference functions, at least in part, to facilitate efficient allocation of mating effort, and evince adaptive design in the perceptual mechanisms that underpin face preferences. PMID:17986435

  9. Accelerated search for materials with targeted properties by adaptive design

    PubMed Central

    Xue, Dezhen; Balachandran, Prasanna V.; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-01-01

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ∼800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set. PMID:27079901

  10. Current Practice in Designing Training for Complex Skills: Implications for Design and Evaluation of ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Schuver-van Blanken, Marian J.; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training-Interactive Tools) is a European project coordinated by the Dutch National Aerospace Laboratory. The aim of ADAPT[IT] is to create and validate an effective training design methodology, based on cognitive science and leading to the integration of advanced technologies, so that the…

  11. Design of an automated algorithm for labeling cardiac blood pool in gated SPECT images of radiolabeled red blood cells

    SciTech Connect

    Hebert, T.J. |; Moore, W.H.; Dhekne, R.D.; Ford, P.V.; Wendt, J.A.; Murphy, P.H.; Ting, Y.

    1996-08-01

    The design of an automated computer algorithm for labeling the cardiac blood pool within gated 3-D reconstructions of the radiolabeled red blood cells is investigated. Due to patient functional abnormalities, limited resolution, and noise, certain spatial and temporal features of the cardiac blood pool that one would anticipate finding in every study are not present in certain frames or with certain patients. The labeling of the cardiac blood pool requires an algorithm that only relies upon features present in all patients. The authors investigate the design of a fully-automated region growing algorithm for this purpose.
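
    The core of such a region-growing step can be sketched as a 6-connected flood fill gated by an intensity window (illustrative only; the paper's patient-independent feature logic is not reproduced, and `region_grow` is a hypothetical name):

```python
from collections import deque

import numpy as np

def region_grow(volume, seed, low, high):
    """Label the connected set of voxels reachable from `seed` whose
    intensities lie in [low, high], using 6-connectivity in 3-D."""
    mask = np.zeros(volume.shape, dtype=bool)
    in_range = (volume >= low) & (volume <= high)
    if not in_range[seed]:
        return mask
    queue = deque([seed])
    mask[seed] = True
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < volume.shape[i] for i in range(3)) \
                    and in_range[n] and not mask[n]:
                mask[n] = True
                queue.append(n)
    return mask
```

    The design challenge in the paper is choosing the seed and window from features present in every patient, which this sketch leaves to the caller.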

  12. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment built on an object-oriented, multidisciplinary, distributed framework in the Adaptive Modeling Language (AML), supporting configuration design and parametric CFD grid generation. This report focuses on the parametric CFD grid generation work, which uses novel concepts for defining the interaction between the mesh topology and the geometry so as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  13. Design and development of a microarray processing station (MPS) for automated miniaturized immunoassays.

    PubMed

    Pla-Roca, Mateu; Altay, Gizem; Giralt, Xavier; Casals, Alícia; Samitier, Josep

    2016-08-01

    Here we describe the design and evaluation of a fluidic device for the automatic processing of microarrays, called the microarray processing station or MPS. The microarray processing station, once installed on a commercial microarrayer, allows automating the washing and drying steps, which are often performed manually. The substrate where the assay occurs remains in place during the microarray printing, incubation and processing steps; therefore the addressing of nL volumes of the distinct immunoassay reagents, such as capture and detection antibodies and samples, can be performed on the same coordinate of the substrate with perfect alignment, without requiring any additional mechanical or optical re-alignment methods. This allows independent immunoassays to be performed in a single microarray spot. PMID:27405464

  14. Using Dynamic Simulations and Automated Decision Tools to Design Lunar Habitats

    NASA Technical Reports Server (NTRS)

    Bell, Scott; Rodriguez, Luis; Kortenkamp, David

    2005-01-01

    This paper describes the role of transient simulations, heuristic techniques, and closed-loop integrated control in designing and sizing habitat life support systems. The integration of these three elements allows more accurate requirements to be derived in advance of hardware choices. As a test case, we used a typical lunar surface habitat. Large numbers of habitat configurations were rapidly tested and evaluated using automated decision support tools. Through this process, preliminary sizing for habitat life support systems was derived. Our preliminary results show that by using transient simulations and closed-loop control, we substantially reduced the system mass required to meet mission goals. This has broader implications for general systems analyses and for life support systems. It is likely that transient models, real-time integrated control, and other analyses capable of capturing the uncertainties of systems can be useful for systems analyses much earlier in the system development life cycle than has previously been considered.

  15. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  16. Research Initiatives and Preliminary Results In Automation Design In Airspace Management in Free Flight

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    NASA and the FAA have entered into a joint venture to explore, define, design, and implement a new airspace management operating concept. The fundamental premise of that concept is that technologies and procedures need to be developed for flight deck and ground operations to improve the efficiency, predictability, flexibility, and safety of airspace management and operations. To that end, NASA Ames has undertaken an initial development and exploration of "key concepts" in free flight airspace management technology. Work has addressed human factors issues in automation aiding design, coupled air-ground aiding systems, communication protocols in distributed decision making, and analytic techniques for defining airspace density concepts and operator cognitive load. This paper reports the progress of these efforts, which are not intended to definitively solve the many evolving issues of design for future ATM systems, but to provide preliminary results that chart the parameters of performance and the topology of the analytic effort required. The preliminary research on cockpit display of traffic information, dynamic density definition, distributed decision making, situation awareness models, and human performance models is discussed as it bears on the theme of "design requirements".

  17. Pilot opinions on high level flight deck automation issues: Toward the development of a design philosophy

    NASA Technical Reports Server (NTRS)

    Tenney, Yvette J.; Rogers, William H.; Pew, Richard W.

    1995-01-01

    There has been much concern in recent years about the rapid increase in automation on commercial flight decks. To explore these issues, pilots were surveyed; the survey was composed of three major sections. The first section asked pilots to rate different automation components that exist on the latest commercial aircraft regarding their obtrusiveness and the attention and effort required in using them. The second section addressed general 'automation philosophy' issues. The third section focused on issues related to levels and amount of automation. The results indicate that pilots of advanced aircraft like their automation, use it, and would welcome more automation. However, they also believe that automation has many disadvantages, especially fully autonomous automation. They want their automation to be simple and reliable and to produce predictable results. The biggest needs for higher levels of automation were in pre-flight, communication, systems management, and task management functions, planning as well as response tasks, and high workload situations. There is an irony and a challenge in the implications of these findings: on the one hand pilots would like new automation to be simple and reliable, but they need it to support the most complex part of the job--managing and planning tasks in high workload situations.

  18. Biomarker-Guided Adaptive Trial Designs in Phase II and Phase III: A Methodological Review

    PubMed Central

    Antoniou, Miranta; Jorgensen, Andrea L; Kolamunnage-Dona, Ruwanthi

    2016-01-01

    Background Personalized medicine is a growing area of research which aims to tailor the treatment given to a patient according to one or more personal characteristics. These characteristics can be demographic such as age or gender, or biological such as a genetic or other biomarker. Prior to utilizing a patient’s biomarker information in clinical practice, robust testing in terms of analytical validity, clinical validity and clinical utility is necessary. A number of clinical trial designs have been proposed for testing a biomarker’s clinical utility, including Phase II and Phase III clinical trials which aim to test the effectiveness of a biomarker-guided approach to treatment; these designs can be broadly classified into adaptive and non-adaptive. While adaptive designs allow planned modifications based on accumulating information during a trial, non-adaptive designs are typically simpler but less flexible. Methods and Findings We have undertaken a comprehensive review of biomarker-guided adaptive trial designs proposed in the past decade. We have identified eight distinct biomarker-guided adaptive designs and nine variations from 107 studies. Substantial variability has been observed in terms of how trial designs are described and particularly in the terminology used by different authors. We have graphically displayed the current biomarker-guided adaptive trial designs and summarised the characteristics of each design. Conclusions Our in-depth overview provides future researchers with clarity in definition, methodology and terminology for biomarker-guided adaptive trial designs. PMID:26910238

  19. Adaptation Patterns as a Conceptual Tool for Designing the Adaptive Operation of CSCL Systems

    ERIC Educational Resources Information Center

    Karakostas, Anastasios; Demetriadis, Stavros

    2011-01-01

    While adaptive collaboration support has become the focus of increasingly intense research efforts in the CSCL domain, scarce, however, remain the research-based evidence on pedagogically useful ideas on what and how to adapt during the collaborative learning activity. Based principally on two studies, this work presents a compilation of…

  20. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    NASA Astrophysics Data System (ADS)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is provided for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithm and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and confirmed that the optimized ducted propeller improves hydrodynamic performance as predicted.

  1. An adaptive optics imaging system designed for clinical use.

    PubMed

    Zhang, Jie; Yang, Qiang; Saito, Kenichi; Nozato, Koji; Williams, David R; Rossi, Ethan A

    2015-06-01

    Here we demonstrate a new imaging system that addresses several major problems limiting the clinical utility of conventional adaptive optics scanning light ophthalmoscopy (AOSLO), including its small field of view (FOV), reliance on patient fixation for targeting imaging, and substantial post-processing time. We previously showed an efficient image-based eye tracking method for real-time optical stabilization and image registration in AOSLO. However, in patients with poor fixation, eye motion causes the FOV to drift substantially, causing this approach to fail. We solve that problem here by tracking eye motion at multiple spatial scales simultaneously by optically and electronically integrating a wide FOV SLO (WFSLO) with an AOSLO. This multi-scale approach, implemented with fast tip/tilt mirrors, has a large stabilization range of ± 5.6°. Our method consists of three stages implemented in parallel: 1) coarse optical stabilization driven by a WFSLO image, 2) fine optical stabilization driven by an AOSLO image, and 3) sub-pixel digital registration of the AOSLO image. We evaluated system performance in normal eyes and diseased eyes with poor fixation. Residual image motion with incremental compensation after each stage was: 1) ~2-3 arc minutes (arcmin), 2) ~0.5-0.8 arcmin, and 3) ~0.05-0.07 arcmin, for normal eyes. Performance in eyes with poor fixation was: 1) ~3-5 arcmin, 2) ~0.7-1.1 arcmin, and 3) ~0.07-0.14 arcmin. We demonstrate that this system is capable of reducing image motion by a factor of ~400, on average. This new optical design provides additional benefits for clinical imaging, including a steering subsystem for AOSLO that can be guided by the WFSLO to target specific regions of interest such as retinal pathology and real-time averaging of registered images to eliminate image post-processing. PMID:26114033

  2. An adaptive optics imaging system designed for clinical use

    PubMed Central

    Zhang, Jie; Yang, Qiang; Saito, Kenichi; Nozato, Koji; Williams, David R.; Rossi, Ethan A.

    2015-01-01

    Here we demonstrate a new imaging system that addresses several major problems limiting the clinical utility of conventional adaptive optics scanning light ophthalmoscopy (AOSLO), including its small field of view (FOV), reliance on patient fixation for targeting imaging, and substantial post-processing time. We previously showed an efficient image-based eye tracking method for real-time optical stabilization and image registration in AOSLO. However, in patients with poor fixation, eye motion causes the FOV to drift substantially, causing this approach to fail. We solve that problem here by tracking eye motion at multiple spatial scales simultaneously by optically and electronically integrating a wide FOV SLO (WFSLO) with an AOSLO. This multi-scale approach, implemented with fast tip/tilt mirrors, has a large stabilization range of ± 5.6°. Our method consists of three stages implemented in parallel: 1) coarse optical stabilization driven by a WFSLO image, 2) fine optical stabilization driven by an AOSLO image, and 3) sub-pixel digital registration of the AOSLO image. We evaluated system performance in normal eyes and diseased eyes with poor fixation. Residual image motion with incremental compensation after each stage was: 1) ~2–3 arc minutes (arcmin), 2) ~0.5–0.8 arcmin, and 3) ~0.05–0.07 arcmin, for normal eyes. Performance in eyes with poor fixation was: 1) ~3–5 arcmin, 2) ~0.7–1.1 arcmin, and 3) ~0.07–0.14 arcmin. We demonstrate that this system is capable of reducing image motion by a factor of ~400, on average. This new optical design provides additional benefits for clinical imaging, including a steering subsystem for AOSLO that can be guided by the WFSLO to target specific regions of interest such as retinal pathology and real-time averaging of registered images to eliminate image post-processing. PMID:26114033

  3. Time domain and frequency domain design techniques for model reference adaptive control systems

    NASA Technical Reports Server (NTRS)

    Boland, J. S., III

    1971-01-01

    Some problems associated with the design of model-reference adaptive control systems are considered and solutions to these problems are advanced. The stability of the adapted system is a primary consideration in the development of both the time-domain and the frequency-domain design techniques. Consequently, the use of Liapunov's direct method forms an integral part of the derivation of the design procedures. The application of sensitivity coefficients to the design of model-reference adaptive control systems is considered. An application of the design techniques is also presented.

  4. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    NASA Astrophysics Data System (ADS)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  5. ADS: A FORTRAN program for automated design synthesis, version 1.00

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1984-01-01

    A new general-purpose optimization program for engineering design is described. ADS-1 (Automated Design Synthesis - Version 1) is a FORTRAN program for the solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level, several options are available, so that a total of over 100 possible combinations can be created. Examples of available strategies are sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming. Available optimizers include variable metric methods and the Method of Feasible Directions, while one-dimensional search options include polynomial interpolation and the Golden Section method. Emphasis is placed on ease of use of the program. All information is transferred via a single parameter list. Default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to override these, if desired. The program is demonstrated with a simple structural design example.
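
    One of the one-dimensional search options named in the abstract is the Golden Section method. The following is a minimal, generic Python illustration of that search on a unimodal function (ADS itself is a FORTRAN program; this is not its code):

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    inv_phi = (math.sqrt(5) - 1) / 2  # ~0.618, the inverse golden ratio
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):      # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Example: minimum of (x - 2)^2 + 1 on [0, 5] is at x = 2.
x_min = golden_section(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

    Each iteration shrinks the bracket by the constant factor 0.618, which is what makes the method attractive as a robust, derivative-free inner loop for an optimizer.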

  6. Data fusion-based design for automated fingerprint identification systems (AFIS)

    NASA Astrophysics Data System (ADS)

    Reisman, James G.; Thomopoulos, Stelios C.

    1998-07-01

    This paper presents a data fusion-based approach to designing an Automated Fingerprint Identification System (AFIS). Fingerprint matching methods vary from pattern matching, using ridge structure, orientation, or even the entire fingerprint itself, to point critical matching, using localized features such as ridge discontinuities, e.g. minutiae, or porous structures. Localized matching methods, such as minutiae, tend to yield more compact templates, in general, than pattern based methods. However, the reliability of localized features may be an issue, since they are affected adversely by the quality of the captured fingerprint, i.e. the degree of noise. Minutiae-based matching methods tend to be slower, albeit more accurate, than pattern-based methods. The trade-off in designing a cost-effective AFIS in terms of processing power (CPU) used, matching speed, and accuracy, lies in the choice of the proper matching methods that are selected to optimize performance by maximizing the matching accuracy while minimizing the search time. In this paper we present a systematic design and study of a fusion-based AFIS using a multiplicity of matching methods to optimize system performance and minimize required CPU cost.
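
    The speed/accuracy trade-off described in the abstract can be sketched as a two-stage cascade: a cheap pattern-based matcher prunes the gallery, and weighted score-level fusion with a slower minutiae-based matcher makes the final decision. This is a generic fusion sketch, not the authors' system; all scores, weights, and thresholds below are hypothetical.

```python
def fuse_scores(scores, weights):
    """Weighted-sum score-level fusion of several matchers (scores in [0, 1])."""
    total = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total

def cascade_match(candidates, fast_score, slow_score, prune=0.5, accept=0.8):
    """Two-stage search: a cheap pattern matcher prunes the gallery, and an
    accurate (but slow) minutiae matcher decides among the survivors."""
    survivors = [c for c in candidates if fast_score(c) >= prune]
    hits = [(c, fuse_scores([fast_score(c), slow_score(c)], [1.0, 2.0]))
            for c in survivors]
    return [c for c, s in hits if s >= accept]

# Toy gallery: id -> (pattern similarity, minutiae similarity), both made up.
gallery = {"A": (0.9, 0.95), "B": (0.6, 0.4), "C": (0.2, 0.9)}
fast = lambda c: gallery[c][0]
slow = lambda c: gallery[c][1]
matches = cascade_match(gallery, fast, slow)
```

    Weighting the minutiae score more heavily reflects the abstract's observation that localized matching is slower but more accurate; the pruning stage keeps the expensive matcher off most of the gallery.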

  7. A Comparison of Testlet-Based Test Designs for Computerized Adaptive Testing.

    ERIC Educational Resources Information Center

    Schnipke, Deborah L.; Reese, Lynda M.

    Two-stage and multistage test designs provide a way of roughly adapting item difficulty to test-taker ability. All test takers take a parallel stage-one test, and, based on their scores, they are routed to tests of different difficulty levels in subsequent stages. These designs provide some of the benefits of standard computerized adaptive testing…

  8. Checking design conformance and optimizing manufacturability using automated double patterning decomposition

    NASA Astrophysics Data System (ADS)

    Cork, Chris; Ward, Brian; Barnes, Levi; Painter, Ben; Lucas, Kevin; Luk-Pat, Gerry; Wiaux, Vincent; Verhaegen, Staf; Maenhoudt, Mireille

    2008-03-01

    This paper uses an internally developed automated double patterning decomposition tool to investigate design compliance and describes a number of classes of non-conforming layout. The tool's results then help the designer achieve robust, design-compliant layout.

  9. Design and analysis of the launch vehicle adapter fitting for the petite amateur navy satellite (PANSAT)

    NASA Astrophysics Data System (ADS)

    Gannon, Brian B.

    1994-09-01

    The Petite Amateur Navy Satellite (PANSAT) is a small communications satellite being developed at the Naval Postgraduate School. This thesis provides a structural design and analysis for the adapter fitting which mates PANSAT to the space shuttle Get Away Special (GAS) canister launching system. Launch vehicle loading and interface requirements were combined with PANSAT design priorities to determine design specifications. Structural Dynamics Research Corporation's I-DEAS Master Series software was utilized to model two adapter designs. The finite element solver in I-DEAS was used to analyze the two designs for strength and natural frequency. Design and analysis of fasteners, used to attach the adapter fitting to PANSAT, were also conducted. The results showed that a titanium alloy adapter, which does not shadow the solar arrays, and stainless steel fasteners exceeded all design specifications.

  10. Issues and Challenges in the Design of Culturally Adapted Evidence-Based Interventions

    PubMed Central

    Castro, Felipe González; Barrera, Manuel; Holleran Steiker, Lori K.

    2014-01-01

    This article examines issues and challenges in the design of cultural adaptations that are developed from an original evidence-based intervention (EBI). Recently emerging multistep frameworks or stage models are examined, as these can systematically guide the development of culturally adapted EBIs. Critical issues are also presented regarding whether and how such adaptations may be conducted, and empirical evidence is presented regarding the effectiveness of such cultural adaptations. Recent evidence suggests that these cultural adaptations are effective when applied with certain subcultural groups, although they are less effective when applied with other subcultural groups. Generally, current evidence regarding the effectiveness of cultural adaptations is promising but mixed. Further research is needed to obtain more definitive conclusions regarding the efficacy and effectiveness of culturally adapted EBIs. Directions for future research and recommendations are presented to guide the development of a new generation of culturally adapted EBIs. PMID:20192800

  11. Adapting Dam and Reservoir Design and Operations to Climate Change

    NASA Astrophysics Data System (ADS)

    Roy, René; Braun, Marco; Chaumont, Diane

    2013-04-01

    In order to identify the potential initiatives that dam, reservoir and water resources system owners and operators may undertake to cope with climate change, it is essential to determine the current state of knowledge of its impacts on hydrological variables at regional and local scales. Future climate scenarios derived from climate model simulations can be combined with operational hydrological modeling tools and historical observations to evaluate realistic pathways of future hydrological conditions for specific drainage basins. In the case of hydropower production, those changes in hydrological conditions may have significant economic impacts. For over a decade the state-owned hydropower producer Hydro-Québec has been exploring the physical impacts on its watersheds by relying on climate services in collaboration with Ouranos, a consortium on regional climatology and adaptation to climate change. Previous climate change impact analyses had drawn on different sources of climate simulation data, explored different post-processing approaches, and used hydrological impact models. At a new stage of this collaboration, the operational management of Hydro-Québec aspired to carry out a cost-benefit analysis of considering climate change in the refactoring of hydropower installations. In the process of the project, not only did a set of scenarios of future runoff regimes have to be defined to support the long-term planning decisions of a dam and reservoir operator, but the significance of uncertainties also needed to be communicated and made understood. We provide insight into a case study that took some unexpected turns and leaps by bringing together climate scientists, hydrologists and hydropower operation managers. The study includes the selection of appropriate climate scenarios, the correction of biases, the application of hydrological models and the assessment of uncertainties. However, it turned out that communicating the science properly and

  12. Design of a Tool Integrating Force Sensing With Automated Insertion in Cochlear Implantation.

    PubMed

    Schurzig, Daniel; Labadie, Robert F; Hussong, Andreas; Rau, Thomas S; Webster, Robert J

    2012-04-01

    The quality of hearing restored to a deaf patient by a cochlear implant in hearing preservation cochlear implant surgery (and possibly also in routine cochlear implant surgery) is believed to depend on preserving delicate cochlear membranes while accurately inserting an electrode array deep into the spiral cochlea. Membrane rupture forces and, possibly, other indicators of suboptimal placement are below the threshold detectable by human hands, motivating a force-sensing insertion tool. Furthermore, recent studies have shown significant variability in manual insertion forces and velocities that may explain some instances of imperfect placement. Toward addressing this, an automated insertion tool was recently developed by Hussong et al. Following the same insertion tool concept, in this paper we present mechanical enhancements that improve the surgeon's interface with the device and make it smaller and lighter. We also present the electromechanical design of new components enabling integrated force sensing. The tool is designed to be sufficiently compact and light that it can be mounted to a microstereotactic frame for accurate image-guided preinsertion positioning. The new integrated force sensing system is capable of resolving forces as small as 0.005 N, and we provide experimental illustration of using forces to detect errors in electrode insertion. PMID:23482414

  13. A Practical Approach for Integrating Automatically Designed Fixtures with Automated Assembly Planning

    SciTech Connect

    Calton, Terri L.; Peters, Ralph R.

    1999-07-20

    This paper presents a practical approach for integrating automatically designed fixtures with automated assembly planning. Product assembly problems vary widely; here the focus is on assemblies that are characterized by a single base part to which a number of smaller parts and subassemblies are attached. This method starts with three-dimensional CAD descriptions of an assembly whose assembly tasks require a fixture to hold the base part. It then combines algorithms that automatically design assembly pallets to hold the base part with algorithms that automatically generate assembly sequences. The designed fixtures rigidly constrain and locate the part, obey task constraints, are robust to part shape variations, are easy to load, and are economical to produce. The algorithm is guaranteed to find the global optimum solution that satisfies these and other pragmatic conditions. The assembly planner consists of four main elements: a user interface, a constraint system, a search engine, and an animation module. The planner expresses all constraints at a sequencing level, specifying orders and conditions on part mating operations in a number of ways. Fast replanning enables an interactive plan-view-constrain-replan cycle that aids in constraint discovery and documentation. The combined algorithms guarantee that the fixture will hold the base part without interfering with any of the assembly operations. This paper presents an overview of the planners, the integration approach, and the results of the integrated algorithms applied to several practical manufacturing problems. For these problems, initial high-quality fixture designs and assembly sequences are generated in a matter of minutes with global optimum solutions identified in just over an hour.
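
    The sequencing-level constraints the planner works with can be illustrated with a brute-force sketch: enumerate all assembly orders and keep those consistent with (before, after) precedence pairs on part-mating operations. Real planners use far smarter search than this; the parts and constraints below are hypothetical.

```python
from itertools import permutations

def valid_sequences(parts, precedence):
    """Enumerate assembly orders consistent with precedence constraints,
    expressed as (before, after) pairs on part-mating operations."""
    for order in permutations(parts):
        pos = {p: i for i, p in enumerate(order)}
        if all(pos[a] < pos[b] for a, b in precedence):
            yield order

# Toy product: the base must precede everything; the bracket must precede
# the cover (e.g. the cover hides the bracket's fasteners).
parts = ["base", "bracket", "cover", "bolt"]
precedence = [("base", "bracket"), ("base", "cover"),
              ("base", "bolt"), ("bracket", "cover")]
plans = list(valid_sequences(parts, precedence))
```

    Here every feasible plan starts with the base part, mirroring the single-base-part assemblies the paper targets; fixture constraints would further filter these plans.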

  14. Design of adaptive objective lens for ultrabroad near infrared imaging

    NASA Astrophysics Data System (ADS)

    Lan, Gongpu; Li, Guoqiang

    2016-03-01

    We present a compound adaptive objective lens in which a water-filled membrane lens is inserted between a front group (one lens) and a back group (two lenses). This adaptive objective lens works in the ultrabroad near infrared waveband (760 nm ~ 920 nm) with a volume scan of > 1 mm³ and a resolution of 2.8 μm (calculated at the wavelength of 840 nm). The focal range is 19.5 mm ~ 20.5 mm and the numerical aperture is 0.196. The size of the adaptive lens is 10 mm (diameter) × 17 mm (length). This kind of lens can be widely used in three-dimensional (3D) volume biomedical imaging instruments, such as confocal microscopes, optical coherence tomography (OCT), two-photon microscopes, etc.

  15. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    SciTech Connect

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-05-12

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper.

  16. An Automatic Online Calibration Design in Adaptive Testing

    ERIC Educational Resources Information Center

    Makransky, Guido; Glas, Cees A. W.

    2010-01-01

    An accurately calibrated item bank is essential for a valid computerized adaptive test. However, in some settings, such as occupational testing, there is limited access to test takers for calibration. As a result of the limited access to possible test takers, collecting data to accurately calibrate an item bank in an occupational setting is…

  17. Design of Adaptive Hypermedia Learning Systems: A Cognitive Style Approach

    ERIC Educational Resources Information Center

    Mampadi, Freddy; Chen, Sherry Y.; Ghinea, Gheorghita; Chen, Ming-Puu

    2011-01-01

    In the past decade, a number of adaptive hypermedia learning systems have been developed. However, most of these systems tailor presentation content and navigational support solely according to students' prior knowledge. On the other hand, previous research suggested that cognitive styles significantly affect student learning because they refer to…

  18. Program user's manual for optimizing the design of a liquid or gaseous propellant rocket engine with the automated combustor design code AUTOCOM

    NASA Technical Reports Server (NTRS)

    Reichel, R. H.; Hague, D. S.; Jones, R. T.; Glatt, C. R.

    1973-01-01

    This computer program manual describes in two parts the automated combustor design optimization code AUTOCOM. The program code is written in the FORTRAN 4 language. The input data setup and the program outputs are described, and a sample engine case is discussed. The program structure and programming techniques are also described, along with AUTOCOM program analysis.

  19. Human factors design of automated highway systems: First generation scenarios. Final report, 1 November 1992-1 May 1993

    SciTech Connect

    Tsao, H.S.J.; Hall, R.W.; Shladover, S.E.; Plocher, T.A.; Levitan, L.J.

    1994-12-01

    Attention to driver acceptance and performance issues during system design will be key to the success of the Automated Highway System (AHS). A first step in the process of defining driver roles and driver-system interface requirements for AHS is the definition of system visions and operational scenarios. These scenarios then become the basis for first identifying driver functions and information requirements, and, later, designing the driver's interface to the AHS. In addition, the scenarios provide a framework within which variables that potentially impact the driver can be explored systematically. Seven AHS operational scenarios, each describing a different AHS vision, were defined by varying three system dimensions with special significance for the driver. These three dimensions are: (1) the degree to which automated and manual traffic is separated, (2) the rules for vehicle following and spacing, and (3) the level of automation in traffic flow control. The seven scenarios vary in the complexity of the automated and manual driving maneuvers required, the physical space allowed for maneuvers, and the nature of the resulting demands placed on the driver. Each scenario describes the physical configuration of the system, operational events from entry to exit, and high-level driver functions.

  20. Adaptation.

    PubMed

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells, helps in communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms vary. Adaptive characters of organisms, including adaptive behaviours, increase fitness, so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationship to welfare. In complex animals, feedforward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control, and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms, including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  1. Small Volume Flow Probe for Automated Direct-Injection NMR Analysis: Design and Performance

    NASA Astrophysics Data System (ADS)

    Haner, Ronald L.; Llanos, William; Mueller, Luciano

    2000-03-01

    A detailed characterization of an NMR flow probe for use in direct-injection sample analysis is presented. A 600-MHz, indirect detection NMR flow probe with a 120-μl active volume is evaluated in two configurations: first as a stand-alone small volume probe for the analysis of static, nonflowing solutions, and second as a component in an integrated liquids-handling system used for high-throughput NMR analysis. In the stand-alone mode, 1H lineshape, sensitivity, radiofrequency (RF) homogeneity, and heat transfer characteristics are measured and compared to conventional-format NMR probes of related design. Commonly used descriptive terminology for the hardware, sample regions, and RF coils is reviewed or defined, and test procedures developed for flow probes are described. The flow probe displayed general performance that is competitive with standard probes. Key advantages of the flow probe include high molar sensitivity, ease of use in an automation setup, and superior reproducibility of magnetic field homogeneity, which enables the practical implementation of 1D T2-edited analysis of protein-ligand interactions.

  2. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers

    PubMed Central

    Espah Borujeni, Amin; Mishler, Dennis M.; Wang, Jingzhi; Huso, Walker; Salis, Howard M.

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription–translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications. PMID:26621913

  3. A novel automated instrument designed to determine photosensitivity thresholds (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Aguilar, Mariela C.; Gonzalez, Alex; Rowaan, Cornelis; De Freitas, Carolina; Rosa, Potyra R.; Alawa, Karam; Lam, Byron L.; Parel, Jean-Marie A.

    2016-03-01

    As there is no clinically available instrument to systematically and reliably determine the photosensitivity thresholds of patients with dry eyes, blepharospasm, migraines, traumatic brain injuries, and genetic disorders such as achromatopsia, retinitis pigmentosa and other retinal dysfunctions, a computer-controlled optoelectronic system was designed. The BPEI Photosensitivity System provides light stimuli emitted from a concave bi-cupola array of 210 white LEDs, with intensity varying from 1 to 32,000 lux. The system can utilize either a normal or an enhanced testing mode for subjects with low light tolerance. The automated instrument adjusts the intensity of each light stimulus, and the subject is instructed to indicate discomfort by pressing a hand-held button. Reliability of the responses is tracked during the test. The photosensitivity threshold is then calculated after 10 response reversals. In a preliminary study, we demonstrated that subjects suffering from achromatopsia experienced lower photosensitivity thresholds than normal subjects. Hence, the system can safely and reliably determine the photosensitivity thresholds of healthy and light-sensitive subjects by detecting and quantifying individual differences. Future studies will be performed with this system to determine the photosensitivity threshold differences between normal subjects and subjects suffering from other conditions that affect light sensitivity.
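    The reversal-based procedure described above can be sketched as a simple adaptive staircase: raise the intensity until discomfort is reported, lower it until it is tolerated again, and average the intensities at which the response reversed. The multiplicative step rule, starting intensity, and `discomfort_fn` callback below are illustrative assumptions, not the published BPEI protocol.

```python
def staircase_threshold(discomfort_fn, start=1.0, step=2.0,
                        lo=1.0, hi=32000.0, n_reversals=10):
    """Estimate a photosensitivity threshold with an adaptive staircase.

    discomfort_fn(lux) -> bool stands in for the subject's button press.
    The threshold is the mean intensity over the first n_reversals
    response reversals (10, as in the abstract).
    """
    intensity = start
    direction = +1          # raising intensity until first discomfort report
    reversals = []
    while len(reversals) < n_reversals:
        uncomfortable = discomfort_fn(intensity)
        new_direction = -1 if uncomfortable else +1
        if new_direction != direction:        # response reversed
            reversals.append(intensity)
            direction = new_direction
        # multiplicative step up or down, clamped to the LED array's range
        intensity = min(hi, max(lo, intensity * step ** direction))
    return sum(reversals) / len(reversals)    # mean of reversal intensities
```

    With an idealized subject whose discomfort boundary sits at 100 lux, the staircase oscillates around that boundary and the reversal average brackets it.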

  4. Design and utilization of the drug-excipient chemical compatibility automated system.

    PubMed

    Thomas, V Hayden; Naath, Maryanne

    2008-07-01

    To accelerate clinical formulation development, an excipient compatibility screen should be conducted as early as possible, and it must be rapid, robust and resource sparing. This, however, does not describe the traditional excipient compatibility testing approach, which requires many tedious and labor-intensive manual operations. This study focused on transforming traditional practices into a completely automated screening process to increase sample throughput and realign resources to more urgent areas, while maintaining quality. Using the developed system, a complete on-line performance study was conducted whereby drug-excipient mixtures were weighed, blended and subjected to accelerated stress stability for up to 1 month, followed by sample extraction and HPLC analysis. Compared to off-line traditional study protocols, the system provided similar relative rank order results with equivalent precision and accuracy, while increasing sample throughput. The designed system offers a resource-sparing primary screen for drug-excipient chemical compatibility for solid dosage form development. This approach allows risk assessment analysis, based upon formulation complexity, to be conducted prior to the commitment of resources and candidate selection for clinical development. PMID:18486368

  5. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers.

    PubMed

    Espah Borujeni, Amin; Mishler, Dennis M; Wang, Jingzhi; Huso, Walker; Salis, Howard M

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription-translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications. PMID:26621913

  6. Automated digital microfluidic platform for magnetic-particle-based immunoassays with optimization by design of experiments.

    PubMed

    Choi, Kihwan; Ng, Alphonsus H C; Fobel, Ryan; Chang-Yen, David A; Yarnell, Lyle E; Pearson, Elroy L; Oleksak, Carl M; Fischer, Andrew T; Luoma, Robert P; Robinson, John M; Audet, Julie; Wheeler, Aaron R

    2013-10-15

    We introduce an automated digital microfluidic (DMF) platform capable of performing immunoassays from sample to analysis with minimal manual intervention. This platform features (a) a 90 Pogo pin interface for digital microfluidic control, (b) an integrated (and motorized) photomultiplier tube for chemiluminescent detection, and (c) a magnetic lens assembly which focuses magnetic fields into a narrow region on the surface of the DMF device, facilitating up to eight simultaneous digital microfluidic magnetic separations. The new platform was used to implement a three-level full factorial design of experiments (DOE) optimization for thyroid-stimulating hormone immunoassays, varying (1) the analyte concentration, (2) the sample incubation time, and (3) the sample volume, resulting in an optimized protocol that reduced the detection limit and sample incubation time by up to 5-fold and 2-fold, respectively, relative to those from previous work. To our knowledge, this is the first report of a DOE optimization for immunoassays in a microfluidic system of any format. We propose that this new platform paves the way for a benchtop tool that is useful for implementing immunoassays in near-patient settings, including community hospitals, physicians' offices, and small clinical laboratories. PMID:23978190
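    A three-level full factorial design over three factors, as used in the immunoassay optimization above, enumerates every combination of levels, 3^3 = 27 runs. A minimal sketch; the factor names and level values below are illustrative, not the paper's actual settings:

```python
from itertools import product

def full_factorial(levels):
    """Enumerate all runs of a full factorial design.

    `levels` maps factor name -> list of settings; the run count is the
    product of the number of levels of each factor.
    """
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

runs = full_factorial({
    "analyte_conc":   [0.1, 1.0, 10.0],   # three levels per factor
    "incubation_min": [2, 5, 10],
    "sample_uL":      [0.5, 1.0, 2.0],
})
# three factors at three levels each -> 3**3 = 27 runs to execute on-chip
```

    Each returned dict is one experimental run; a response surface fit over the 27 results then identifies the optimum settings.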

  7. A mathematical basis for the design and design optimization of adaptive trusses in precision control

    NASA Technical Reports Server (NTRS)

    Das, S. K.; Utku, S.; Chen, G.-S.; Wada, B. K.

    1991-01-01

    A mathematical basis for the optimal design of adaptive trusses to be used in supporting precision equipment is provided. The general theory of adaptive structures is introduced, and the global optimization problem of placing a limited number, q, of actuators, so as to maximally achieve precision control and provide prestress, is stated. Two serialized optimization problems, namely, optimal actuator placement for prestress and optimal actuator placement for precision control, are addressed. In the case of prestressing, the computation of a 'desired' prestress is discussed, the interaction between actuators and redundants in conveying the prestress is shown in its mathematical form, and a methodology for arriving at the optimal placement of actuators and additional redundants is discussed. With regard to precision control, an optimal placement scheme (for q actuators) for maximum 'authority' over the precision points is suggested. The results of the two serialized optimization problems are combined to give a suboptimal solution to the global optimization problem. A method for improving this suboptimal actuator placement scheme by iteration is presented.
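    A common way to attack the actuator-placement problem sketched above is greedy selection over candidate locations. The snippet below is a toy illustration only: it assumes a linear influence matrix `B` (precision points by candidate actuators) and uses the smallest singular value of the selected columns as a stand-in "authority" measure; the paper's actual placement criterion and iteration scheme differ.

```python
import numpy as np

def greedy_actuator_placement(B, q):
    """Greedily pick q actuator columns of B (precision points x candidates)
    that maximize the smallest singular value of the selected sub-matrix,
    i.e. strengthen the weakest direction of control authority."""
    chosen, remaining = [], list(range(B.shape[1]))
    for _ in range(q):
        best = max(remaining,
                   key=lambda j: np.linalg.svd(B[:, chosen + [j]],
                                               compute_uv=False)[-1])
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

    For a B whose first two columns act on independent precision points while a third column barely influences any, the greedy pass selects the two strong, complementary actuators.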

  8. Design and comparison of 8x8 optical switches with adaptive wavelength routing algorithm

    NASA Astrophysics Data System (ADS)

    Tsao, Shyh-Lin; Lu, Yu M.

    2001-12-01

    In this paper, wavelength routers based on various 8 X 8 optical wavelength-switching networks are designed. All of the wavelength routers have a three-stage architecture. We also analyze the wavelength crosstalk, SNR and BER of the various 8 X 8 optical switching networks to guide the adaptive wavelength routing choice. The analysis shows the performance of the adaptive routing networks. The 8 X 8 dilated Benes optical switch with adaptive routing gives the best performance among the wavelength routers.

  9. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  10. Decentralized adaptive control of robot manipulators with robust stabilization design

    NASA Technical Reports Server (NTRS)

    Yuan, Bau-San; Book, Wayne J.

    1988-01-01

    Due to geometric nonlinearities and complex dynamics, a decentralized technique for adaptive control for multilink robot arms is attractive. Lyapunov-function theory for stability analysis provides an approach to robust stabilization. Each joint of the arm is treated as a component subsystem. The adaptive controller is made locally stable with servo signals including proportional and integral gains. This results in the bound on the dynamical interactions with other subsystems. A nonlinear controller which stabilizes the system with uniform boundedness is used to improve the robustness properties of the overall system. As a result, the robot tracks the reference trajectories with convergence. This strategy makes computation simple and therefore facilitates real-time implementation.

  11. Context-Adaptive Learning Designs by Using Semantic Web Services

    ERIC Educational Resources Information Center

    Dietze, Stefan; Gugliotta, Alessio; Domingue, John

    2007-01-01

    IMS Learning Design (IMS-LD) is a promising technology aimed at supporting learning processes. IMS-LD packages contain the learning process metadata as well as the learning resources. However, the allocation of resources--whether data or services--within the learning design is done manually at design-time on the basis of the subjective appraisals…

  12. Design and implementation of adaptive PI control schemes for web tension control in roll-to-roll (R2R) manufacturing.

    PubMed

    Raul, Pramod R; Pagilla, Prabhakar R

    2015-05-01

    In this paper, two adaptive Proportional-Integral (PI) control schemes are designed and discussed for control of web tension in Roll-to-Roll (R2R) manufacturing systems. R2R systems are used to transport continuous materials (called webs) on rollers from the unwind roll to the rewind roll. Maintaining web tension at the desired value is critical to many R2R processes such as printing, coating, lamination, etc. Existing fixed gain PI tension control schemes currently used in industrial practice require extensive tuning and do not provide the desired performance for changing operating conditions and material properties. The first adaptive PI scheme utilizes the model reference approach, where the controller gains are estimated based on matching of the actual closed-loop tension control system with an appropriately chosen reference model. The second adaptive PI scheme utilizes the indirect adaptive control approach together with the relay feedback technique to automatically initialize the adaptive PI gains. These adaptive tension control schemes can be implemented on any R2R manufacturing system. The key features of the two adaptive schemes are that their designs are simple for practicing engineers, easy to implement in real-time, and automate the tuning process. Extensive experiments are conducted on a large experimental R2R machine which mimics many features of an industrial R2R machine. These experiments include trials with two different polymer webs and a variety of operating conditions. Implementation guidelines are provided for both adaptive schemes. Experimental results comparing the two adaptive schemes and a fixed gain PI tension control scheme used in industrial practice are provided and discussed. PMID:25555757
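    Relay-feedback initialization of PI gains, as mentioned above, is classically done with the Astrom-Hagglund describing-function estimate of the ultimate gain followed by Ziegler-Nichols PI rules. A hedged sketch of that standard recipe; the paper's exact initialization formulas may differ:

```python
import math

def relay_pi_gains(relay_amp, osc_amp, osc_period):
    """Initialize PI gains from a relay-feedback experiment.

    relay_amp:  amplitude d of the relay switching signal
    osc_amp:    amplitude a of the resulting sustained output oscillation
    osc_period: period Pu of that oscillation

    Returns (Kp, Ki) from the classic Ziegler-Nichols PI rules applied to
    the describing-function estimate of the ultimate gain Ku = 4d/(pi*a).
    """
    ku = 4.0 * relay_amp / (math.pi * osc_amp)  # ultimate gain estimate
    kp = 0.45 * ku                              # Z-N PI proportional gain
    ti = osc_period / 1.2                       # Z-N PI integral time
    return kp, kp / ti                          # Ki = Kp / Ti
```

    The appeal for web-tension control is that a single relay experiment per operating condition replaces extensive manual tuning of the fixed-gain PI loop.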

  13. Design of an ultra-portable field transfer radiometer supporting automated vicarious calibration

    NASA Astrophysics Data System (ADS)

    Anderson, Nikolaus; Thome, Kurtis; Czapla-Myers, Jeffrey; Biggar, Stuart

    2015-09-01

    The University of Arizona Remote Sensing Group (RSG) began outfitting the radiometric calibration test site (RadCaTS) at Railroad Valley, Nevada, in 2004 for automated vicarious calibration of Earth-observing sensors. RadCaTS was upgraded to use RSG custom 8-band ground viewing radiometers (GVRs) beginning in 2011, and currently four GVRs are deployed, providing an average reflectance for the test site. This measurement of ground reflectance is the most critical component of vicarious calibration using the reflectance-based method. In order to ensure the quality of these measurements, RSG has been exploring more efficient and accurate methods of on-site calibration evaluation. This work describes the design of, and initial results from, a small portable transfer radiometer for the purpose of GVR calibration validation on site. Prior to deployment, RSG uses high-accuracy laboratory calibration methods in order to provide radiance calibrations with low uncertainties for each GVR. After deployment, a solar-radiation-based calibration has typically been used. That method is highly dependent on a clear, stable atmosphere, requires at least two people to perform, is time-consuming in post-processing, and depends on several large pieces of equipment. In order to provide more regular and more accurate calibration monitoring, the small portable transfer radiometer is designed for quick, one-person operation and on-site field calibration comparison results. The radiometer is also suited for laboratory calibration use and thus could be used as a transfer radiometer calibration standard for ground viewing radiometers of a RadCalNet site.

  14. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    PubMed Central

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner

  15. Towards Individualized Online Learning: The Design and Development of an Adaptive Web Based Learning Environment

    ERIC Educational Resources Information Center

    Inan, Fethi A.; Flores, Raymond; Ari, Fatih; Arslan-Ari, Ismahan

    2011-01-01

    The purpose of this study was to document the design and development of an adaptive system which individualizes instruction such as content, interfaces, instructional strategies, and resources dependent on two factors, namely student motivation and prior knowledge levels. Combining adaptive hypermedia methods with strategies proposed by…

  16. A Web-Based Adaptive Tutor to Teach PCR Primer Design

    ERIC Educational Resources Information Center

    van Seters, Janneke R.; Wellink, Joan; Tramper, Johannes; Goedhart, Martin J.; Ossevoort, Miriam A.

    2012-01-01

    When students have varying prior knowledge, personalized instruction is desirable. One way to personalize instruction is by using adaptive e-learning to offer training of varying complexity. In this study, we developed a web-based adaptive tutor to teach PCR primer design: the PCR Tutor. We used part of the Taxonomy of Educational Objectives (the…

  17. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    SciTech Connect

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of the DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and description of the demand response automation server (DRAS), the client/server architecture-based middle-ware used to automate the interactions between the utilities or any DR serving entity and their customers for DR programs. Use case diagrams are presented to show the role of the DRAS between utility/ISO and the clients at the facilities.

  18. ADAPT: A knowledge-based synthesis tool for digital signal processing system design

    SciTech Connect

    Cooley, E.S.

    1988-01-01

    A computer aided synthesis tool for expansion, compression, and filtration of digital images is described. ADAPT, the Autonomous Digital Array Programming Tool, uses an extensive design knowledge base to synthesize a digital signal processing (DSP) system. Input to ADAPT can be either a behavioral description in English, or a block level specification via Petri Nets. The output from ADAPT comprises code to implement the DSP system on an array of processors. ADAPT is constructed using C, Prolog, and X Windows on a SUN 3/280 workstation. ADAPT knowledge encompasses DSP component information and the design algorithms and heuristics of a competent DSP designer. The knowledge is used to form queries for design capture, to generate design constraints from the user's responses, and to examine the design constraints. These constraints direct the search for possible DSP components and target architectures. Constraints are also used for partitioning the target systems into less complex subsystems. The subsystems correspond to architectural building blocks of the DSP design. These subsystems inherit design constraints and DSP characteristics from their parent blocks. Thus, a DSP subsystem or parent block, as designed by ADAPT, must meet the user's design constraints. Design solutions are sought by searching the Components section of the design knowledge base. Component behavior which matches or is similar to that required by the DSP subsystems is sought. Each match, which corresponds to a design alternative, is evaluated in terms of its behavior. When a design is sufficiently close to the behavior required by the user, detailed mathematical simulations may be performed to accurately determine exact behavior.

  19. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    NASA Astrophysics Data System (ADS)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems.
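    The advantage of implicit differentiation over explicit finite differencing can be seen on a generic discretized system K(p)u = f: differentiating implicitly gives K du/dp = -(dK/dp)u, one extra solve with the already-assembled system matrix instead of a full re-analysis per design variable. A toy sketch on a 2x2 system; the matrices are illustrative placeholders, not the dissertation's BEM operators.

```python
import numpy as np

def sensitivity_implicit(K, dK_dp, f):
    """du/dp for K(p) u = f via implicit differentiation:
    solve K du/dp = -(dK/dp) u, reusing the same system matrix K."""
    u = np.linalg.solve(K, f)
    return np.linalg.solve(K, -dK_dp @ u)

# Compare against explicit finite differencing on a small example.
K = np.array([[2.0, 1.0], [1.0, 3.0]])
dK = np.array([[1.0, 0.0], [0.0, 0.0]])   # dK/dp
f = np.array([1.0, 1.0])

analytic = sensitivity_implicit(K, dK, f)
h = 1e-6
fd = (np.linalg.solve(K + h * dK, f) - np.linalg.solve(K, f)) / h
```

    The implicit result is exact to solver precision, while the finite-difference estimate carries an O(h) truncation error and costs a fresh factorization per perturbed design variable.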

  20. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  1. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental) or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This begs the question: how does a design team quickly screen and compare novel solutions against conventional solutions for new aerospace challenges? The answer is rapid and flexible conceptual design parametric sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations to meet combinations of mission and technology. This research undertaking contributes to the state-of-the-art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composites, natural laminar flow, thrust vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research undertaking demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to

  2. Adapting the Mathematical Task Framework to Design Online Didactic Objects

    ERIC Educational Resources Information Center

    Bowers, Janet; Bezuk, Nadine; Aguilar, Karen

    2011-01-01

    Designing didactic objects involves imagining how students can conceive of specific mathematical topics and then imagining what types of classroom discussions could support these mental constructions. This study investigated whether it was possible to design Java applets that might serve as didactic objects to support online learning where…

  3. CHANGE WITHOUT BUYING: AN APPLICATION OF ADAPTABLE DESIGN IN APPAREL

    EPA Science Inventory

    Female college students typically wear casual clothing on campus. To meet this need, and to ensure our design would be used more often and for longer by female college students, we decided to focus on casual wear design. Cotton fibers, with relatively poor resiliency, are commonly used in ...

  4. Application of Hybrid Real-Time Power System Simulator for Designing and Researching of Relay Protection and Automation

    NASA Astrophysics Data System (ADS)

    Borovikov, Yu S.; Sulaymanov, A. O.; Andreev, M. V.

    2015-10-01

    Development, research and operation of smart grids (SG) with active-adaptive networks (AAN) are pressing tasks today. The planned integration of high-speed FACTS devices greatly complicates the dynamic properties of power systems, significantly changing the operating conditions of power system equipment. This situation creates a new problem: developing and validating relay protection and automation (RPA) that can operate adequately in SGs and adapt to their regimes. The effectiveness of any solution depends on the available tools - simulators of electric power systems. Analysis of the best-known and most widely used simulators led to the conclusion that they cannot solve this problem. At Tomsk Polytechnic University, a prototype of a hybrid multiprocessor software and hardware system - the Hybrid Real-Time Power System Simulator (HRTSim) - has been developed. Because of its unique features, this simulator can be used for the tasks mentioned. This article introduces the concept of developing and validating relay protection and automation using HRTSim.

  5. Adaptive optoelectronic camouflage systems with designs inspired by cephalopod skins.

    PubMed

    Yu, Cunjiang; Li, Yuhang; Zhang, Xun; Huang, Xian; Malyarchuk, Viktor; Wang, Shuodao; Shi, Yan; Gao, Li; Su, Yewang; Zhang, Yihui; Xu, Hangxun; Hanlon, Roger T; Huang, Yonggang; Rogers, John A

    2014-09-01

    Octopus, squid, cuttlefish, and other cephalopods exhibit exceptional capabilities for visually adapting to or differentiating from the coloration and texture of their surroundings, for the purpose of concealment, communication, predation, and reproduction. Long-standing interest in and emerging understanding of the underlying ultrastructure, physiological control, and photonic interactions has recently led to efforts in the construction of artificial systems that have key attributes found in the skins of these organisms. Despite several promising options in active materials for mimicking biological color tuning, existing routes to integrated systems do not include critical capabilities in distributed sensing and actuation. Research described here represents progress in this direction, demonstrated through the construction, experimental study, and computational modeling of materials, device elements, and integration schemes for cephalopod-inspired flexible sheets that can autonomously sense and adapt to the coloration of their surroundings. These systems combine high-performance, multiplexed arrays of actuators and photodetectors in laminated, multilayer configurations on flexible substrates, with overlaid arrangements of pixelated, color-changing elements. The concepts provide realistic routes to thin sheets that can be conformally wrapped onto solid objects to modulate their visual appearance, with potential relevance to consumer, industrial, and military applications. PMID:25136094

  6. Adaptive optoelectronic camouflage systems with designs inspired by cephalopod skins

    PubMed Central

    Yu, Cunjiang; Li, Yuhang; Zhang, Xun; Huang, Xian; Malyarchuk, Viktor; Wang, Shuodao; Shi, Yan; Gao, Li; Su, Yewang; Zhang, Yihui; Xu, Hangxun; Hanlon, Roger T.; Huang, Yonggang; Rogers, John A.

    2014-01-01

    Octopus, squid, cuttlefish, and other cephalopods exhibit exceptional capabilities for visually adapting to or differentiating from the coloration and texture of their surroundings, for the purpose of concealment, communication, predation, and reproduction. Long-standing interest in and emerging understanding of the underlying ultrastructure, physiological control, and photonic interactions has recently led to efforts in the construction of artificial systems that have key attributes found in the skins of these organisms. Despite several promising options in active materials for mimicking biological color tuning, existing routes to integrated systems do not include critical capabilities in distributed sensing and actuation. Research described here represents progress in this direction, demonstrated through the construction, experimental study, and computational modeling of materials, device elements, and integration schemes for cephalopod-inspired flexible sheets that can autonomously sense and adapt to the coloration of their surroundings. These systems combine high-performance, multiplexed arrays of actuators and photodetectors in laminated, multilayer configurations on flexible substrates, with overlaid arrangements of pixelated, color-changing elements. The concepts provide realistic routes to thin sheets that can be conformally wrapped onto solid objects to modulate their visual appearance, with potential relevance to consumer, industrial, and military applications. PMID:25136094

  7. Adapting the mathematical task framework to design online didactic objects

    NASA Astrophysics Data System (ADS)

    Bowers, Janet; Bezuk, Nadine; Aguilar, Karen

    2011-06-01

    Designing didactic objects involves imagining how students can conceive of specific mathematical topics and then imagining what types of classroom discussions could support these mental constructions. This study investigated whether it was possible to design Java applets that might serve as didactic objects to support online learning where 'discussions' are broadly defined as the conversations students have with themselves as they interact with the dynamic mathematical representations on the screen. Eighty-four pre-service elementary teachers enrolled in hybrid mathematics courses were asked to interact with a series of applets designed to support their understanding of qualitative graphing. The results of the surveys indicate that various design features of the applets did in fact cause perturbations and opportunities for resolutions that enabled the users to 'discuss' their learning by reflecting on their in-class discussions and online activities. The discussion includes four design features for guiding future applet creation.

  8. 75 FR 8968 - Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... HUMAN SERVICES Food and Drug Administration Draft Guidance for Industry on Adaptive Design Clinical... entitled ``Adaptive Design Clinical Trials for Drugs and Biologics.'' The draft guidance provides sponsors... Evaluation and Research (CBER) with information regarding adaptive design clinical trials when used in...

  9. Design of smart composite platforms for adaptive thrust vector control and adaptive laser telescope for satellite applications

    NASA Astrophysics Data System (ADS)

    Ghasemi-Nejhad, Mehrdad N.

    2013-04-01

    This paper presents the design of smart composite platforms for adaptive thrust vector control (TVC) and an adaptive laser telescope for satellite applications. To eliminate disturbances, the proposed adaptive TVC and telescope systems will be mounted on two analogous smart composite platforms with simultaneous precision positioning (pointing) and vibration suppression (stabilizing), SPPVS, with micro-radian pointing resolution, and then mounted on a satellite in two different locations. The adaptive TVC system provides SPPVS with large tip-tilt to potentially eliminate the gimbal systems. The smart composite telescope will be mounted on a smart composite platform with SPPVS and then mounted on a satellite. The laser communication is intended for geosynchronous orbit. The high degree of directionality increases the security of the laser communication signal (as opposed to a diffused RF signal), but also requires sophisticated subsystems for transmission and acquisition. The shorter wavelength of the optical spectrum increases the data transmission rates, but laser systems require large amounts of power, which increases the mass and complexity of the supporting systems. In addition, laser communication in geosynchronous orbit requires an accurate platform with SPPVS capabilities. Therefore, this work also addresses the design of an active composite platform to be used to simultaneously point and stabilize an intersatellite laser communication telescope with micro-radian pointing resolution. The telescope is a Cassegrain receiver that employs two mirrors, one convex (primary) and the other concave (secondary). The distance, as well as the horizontal and axial alignment of the mirrors, must be precisely maintained or else the optical properties of the system will be severely degraded. The alignment will also have to be maintained during thruster firings, which will require vibration suppression capabilities of the system as well. The innovative platform has been

  10. Automated registration of laser Doppler perfusion images by an adaptive correlation approach: application to focal cerebral ischemia in the rat.

    PubMed

    Riyamongkol, Panomkhawn; Zhao, Weizhao; Liu, Yitao; Belayev, Ludmila; Busto, Raul; Ginsberg, Myron D

    2002-12-31

    Hemodynamic changes are extremely important in analyzing responses from a brain subjected to a stimulus or treatment. The Laser Doppler technique has emerged as an important tool in neuroscience research. This non-invasive method scans a low-power laser beam in a raster pattern over a tissue surface to generate a time course of images in units of relative flux change. A Laser Doppler imager (LDI) records cerebral perfusion in both the temporal and the spatial domain. Traditional analysis of LD images has focused on the region-of-interest (ROI) approach, whose analytical accuracy is weakened in experiments that require repositioning between the LDI and the scanned tissue area, owing to the operator's subjective decisions during data collection. This report describes a robust image registration method, based on an adaptive correlation approach, designed to obviate this problem. The assumption in mapping corresponding pixels in two images is that the regions in which these pixels are centered are correlated. Based on this assumption, correlation coefficients are calculated between two regions by moving one region over the other through all possible offsets. To avoid ambiguity in distinguishing maximum correlation coefficients, an adaptive algorithm is adopted. The correspondences are then used to estimate the transformation by linear regression. We used a pair of phantom LD images to test this algorithm. A reliability test was also performed on each of 15 sequential LD images derived from an actual experiment by imposing rotation and translation. The result shows that the calculated transformation parameters (rotation: theta = 7.7+/-0.5 degrees; translation: Delta x = 2.8+/-0.3, Delta y = 4.7+/-0.4) are very close to the preset parameters (rotation: theta = 8 degrees; translation: Delta x = 3, Delta y = 5). This result indicates that this approach is a valuable adjunct to LD perfusion monitoring. An
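
    The correlation-then-regression idea described in this abstract can be sketched in a few lines. The following is an illustrative stand-in, not the authors' implementation: it matches image blocks by normalized cross-correlation and averages the block displacements to estimate a pure translation (the paper additionally recovers rotation).

```python
import numpy as np

def best_match(block, search):
    """Slide `block` over `search` and return the (dy, dx) offset with
    the highest normalized cross-correlation."""
    bh, bw = block.shape
    b = block - block.mean()
    best_cc, best_off = -np.inf, (0, 0)
    for dy in range(search.shape[0] - bh + 1):
        for dx in range(search.shape[1] - bw + 1):
            w = search[dy:dy + bh, dx:dx + bw]
            w = w - w.mean()
            denom = np.sqrt((b * b).sum() * (w * w).sum())
            if denom == 0:
                continue  # flat window, correlation undefined
            cc = (b * w).sum() / denom
            if cc > best_cc:
                best_cc, best_off = cc, (dy, dx)
    return best_off

def estimate_translation(img_a, img_b, block=8, margin=4):
    """Estimate a global (dy, dx) translation from img_a to img_b by
    matching blocks of img_a inside slightly larger search windows of
    img_b and averaging the displacements (the least-squares fit for a
    pure-translation model)."""
    h, w = img_a.shape
    disps = []
    for y in range(margin, h - margin - block, 2 * block):
        for x in range(margin, w - margin - block, 2 * block):
            blk = img_a[y:y + block, x:x + block]
            sy, sx = y - margin, x - margin
            win = img_b[sy:sy + block + 2 * margin,
                        sx:sx + block + 2 * margin]
            dy, dx = best_match(blk, win)
            disps.append((sy + dy - y, sx + dx - x))
    return tuple(np.mean(disps, axis=0))
```

    For example, on a random 64x64 image shifted by (2, 3) pixels, the estimated displacement recovers (2, 3).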

  11. Design of artificial genetic regulatory networks with multiple delayed adaptive responses*

    NASA Astrophysics Data System (ADS)

    Kaluza, Pablo; Inoue, Masayo

    2016-06-01

    Genetic regulatory networks with adaptive responses are widely studied in biology. Usually, models consisting only of a few nodes have been considered. They present one input receptor for activation and one output node where the adaptive response is computed. In this work, we design genetic regulatory networks with many receptors and many output nodes able to produce delayed adaptive responses. This design is performed by using an evolutionary algorithm of mutations and selections that minimizes an error function defined by the adaptive response in signal shapes. We present several examples of network constructions with a predefined required set of adaptive delayed responses. We show that an output node can have different kinds of responses as a function of the activated receptor. Additionally, complex network structures are presented since processing nodes can be involved in several input-output pathways.
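
    The mutation-and-selection search this abstract describes follows a standard pattern. A minimal, generic sketch of such a loop (the network model, mutation operator, and error function of the paper are replaced by caller-supplied placeholders):

```python
import random

def evolve(initial, mutate, error, generations=2000, seed=0):
    """Generic (1+1) mutation-selection loop: propose a mutated
    candidate and keep it only if it does not increase the error.
    `mutate(candidate, rng)` and `error(candidate)` are placeholders
    for the paper's network mutations and response-shape error."""
    rng = random.Random(seed)
    best, best_err = initial, error(initial)
    for _ in range(generations):
        cand = mutate(best, rng)
        cand_err = error(cand)
        if cand_err <= best_err:
            best, best_err = cand, cand_err
    return best, best_err
```

    As a toy usage, minimizing `(x - 3)**2` with Gaussian mutations drives the candidate toward 3.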

  12. SMARTer Discontinuation Trial Designs for Developing an Adaptive Treatment Strategy

    PubMed Central

    Compton, Scott N.; Rynn, Moira A.; Walkup, John T.; Murphy, Susan A.

    2012-01-01

    Abstract Objective Developing evidenced-based practices for the management of childhood psychiatric disorders requires research studies that address how to treat children during both the acute phase of the disorder and beyond. Given the selection of a medication for acute treatment, discontinuation trials are used to evaluate the effects of treatment duration (e.g., time on medication) and/or maintenance strategies following successful acute-phase treatment. Recently, sequential multiple assignment randomized trials (SMART) have been proposed for use in informing sequences of critical clinical decisions such as those mentioned. The objective of this article is to illustrate how a SMART study is related to the standard discontinuation trial design, while addressing additional clinically important questions with similar trial resources. Method The recently completed Child/Adolescent Anxiety Multimodal Study (CAMS), a randomized trial that examined the relative efficacy of three acute-phase treatments for pediatric anxiety disorders, along with a next logical step, a standard discontinuation trial design, is used to clarify the ideas. This example is used to compare the discontinuation trial design relative to the SMART design. Results We find that the standard discontinuation trial can be modified slightly using a SMART design to yield high-quality data that can be used to address a wider variety of questions in addition to the impact of treatment duration. We discuss how this innovative trial design is ultimately more efficient and less costly than the standard discontinuation trial, and may result in more representative comparisons between treatments. Conclusions Mental health researchers who are interested in addressing questions concerning the effects of continued treatment (for different durations) following successful acute-phase treatment should consider SMART designs in place of discontinuation trial designs in their research. SMART designs can be used to

  13. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

    Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among and within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.

  14. Water Infrastructure Adaptation in New Urban Design: Possibilities and Constraints

    EPA Science Inventory

    Natural constraints, including climate change and dynamic socioeconomic development, can significantly impact the way we plan, design, and operate water infrastructure, and thus its ability to deliver reliable, high-quality water supplies and comply with environmental regulations. ...

  15. Twenty-five years of confirmatory adaptive designs: opportunities and pitfalls.

    PubMed

    Bauer, Peter; Bretz, Frank; Dragalin, Vladimir; König, Franz; Wassmer, Gernot

    2016-02-10

    'Multistage testing with adaptive designs' was the title of an article by Peter Bauer that appeared in 1989 in the German journal Biometrie und Informatik in Medizin und Biologie. The journal no longer exists, but the methodology found widespread interest in the scientific community over the past 25 years. The use of such multistage adaptive designs raised many controversial discussions from the beginning, especially after the publication by Bauer and Köhne in 1994 in Biometrics: broad enthusiasm about potential applications of such designs faced critical positions regarding their statistical efficiency. Despite, or possibly because of, this controversy, the methodology and its areas of application grew steadily over the years, with significant contributions from statisticians working in academia, industry and agencies around the world. In the meantime, this type of adaptive design has become the subject of two major regulatory guidance documents in the US and Europe, and the field is still evolving. Developments are particularly noteworthy in the most important applications of adaptive designs, including sample size reassessment, treatment selection procedures, and population enrichment designs. In this article, we summarize the developments over the past 25 years from different perspectives. We provide a historical overview of the early days, review the key methodological concepts and summarize regulatory and industry perspectives on such designs. Then, we illustrate the application of adaptive designs with three case studies, including unblinded sample size reassessment, adaptive treatment selection, and adaptive endpoint selection. We also discuss the availability of software for evaluating and performing such designs. We conclude with a critical review of how expectations from the beginning were fulfilled, and - if not - discuss potential reasons why this did not happen. PMID:25778935

  16. An adaptive two-stage sequential design for sampling rare and clustered populations

    USGS Publications Warehouse

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
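
    The allocation idea in this abstract - directing extra second-stage effort to higher-abundance primary units - can be illustrated with a simple proportional rule. This is an illustrative sketch only, not the estimator or allocation rule from the paper:

```python
def adaptive_second_stage_allocation(stage1_counts, total_effort, base=1):
    """Allocate second-stage sampling effort among primary units:
    every unit receives `base` units of effort, and the remainder is
    split in proportion to the first-stage counts, so higher-abundance
    units get more follow-up effort.  (Illustrative rule; assumes
    total_effort >= base * number of units.)"""
    n = len(stage1_counts)
    remaining = total_effort - base * n
    total = sum(stage1_counts)
    if total == 0 or remaining <= 0:
        return [base] * n
    # round() can make the grand total drift by a unit or two
    extra = [round(remaining * c / total) for c in stage1_counts]
    return [base + e for e in extra]
```

    For instance, with first-stage counts of [10, 0, 5, 5] and 24 units of total effort, the high-count unit receives the largest share while every unit keeps a minimum of one.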

  17. Advanced Air Traffic Management Research (Human Factors and Automation): NASA Research Initiatives in Human-Centered Automation Design in Airspace Management

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Condon, Gregory W. (Technical Monitor)

    1996-01-01

    NASA has initiated a significant thrust of research and development focused on providing the flight crew and air traffic managers automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. The core processes of control and the distribution of decision making in that control are undergoing extensive analysis. From our perspective, the human operators and the procedures by which they interact are the fundamental determinants of the safe, efficient, and flexible operation of the system. In that perspective, we have begun to explore what our experience has taught us will be the most challenging aspects of designing and integrating human-centered automation in the advanced system. We have performed a full mission simulation looking at the role shift to self-separation on board the aircraft, with the rules of the air guiding behavior and the provision of a cockpit display of traffic information and an on-board traffic alert system that seamlessly integrates into TCAS operations. We have performed an initial investigation of the operational impact of "Dynamic Density" metrics on controllers relinquishing and reestablishing full separation authority. (We follow the assumption that responsibility at all times resides with the controller.) This presentation will describe those efforts as well as the process by which we will guide the development of error-tolerant systems that are sensitive to shifts in operator workload levels and dynamic shifts in the operating point of air traffic management.

  18. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  19. Adapting Wood Technology to Teach Design and Engineering

    ERIC Educational Resources Information Center

    Rummel, Robert A.

    2012-01-01

    Technology education has changed dramatically over the last few years. The transition of industrial arts to technology education, and more recently the pursuit of design and engineering, has meant that technology education teachers often need to change their curriculum and course activities to meet the demands of a rapidly changing profession.…

  20. RESIDENTIAL BUILDING ADAPTIVE ENERGY MANAGEMENT SYSTEM (R-BAEMS) DESIGN

    EPA Science Inventory

    The expected outcomes from Phase I included 1) a set of guidelines for implementing R-BAEMS in residential structures from both a retrofit and original design perspective and 2) a cost and energy analysis of R-BAEMS impact on the environment. The status of each of the outcomes...

  1. Optimal adaptive two-stage designs for early phase II clinical trials.

    PubMed

    Shan, Guogen; Wilding, Gregory E; Hutson, Alan D; Gerstenberger, Shawn

    2016-04-15

    Simon's optimal two-stage design has been widely used in early phase clinical trials for Oncology and AIDS studies with binary endpoints. With this approach, the second-stage sample size is fixed when the trial passes the first stage with sufficient activity. Adaptive designs, such as those due to Banerjee and Tsiatis (2006) and Englert and Kieser (2013), are flexible in the sense that the second-stage sample size depends on the response from the first stage, and these designs are often seen to reduce the expected sample size under the null hypothesis as compared with Simon's approach. An unappealing trait of the existing designs is that their second-stage sample size is not guaranteed to be a non-increasing function of the first-stage response rate. In this paper, an efficient intelligent process, the branch-and-bound algorithm, is used to search extensively for the optimal adaptive design with the smallest expected sample size under the null, while the type I and II error rates are maintained and the aforementioned monotonicity characteristic is respected. The proposed optimal design is observed to have smaller expected sample sizes compared to Simon's optimal design, and the maximum total sample size of the proposed adaptive design is very close to that from Simon's method. The proposed optimal adaptive two-stage design is recommended for use in practice to improve the flexibility and efficiency of early phase therapeutic development. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26526165
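
    The Simon design referenced above has operating characteristics that follow directly from the binomial distribution. A sketch of their computation (the stage-1/stage-2 decision rule below is the standard Simon scheme, not the adaptive design proposed in this paper):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(exactly k responses among n patients), responses ~ Bernoulli(p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def simon_operating_chars(r1, n1, r, n, p):
    """Operating characteristics of a Simon two-stage design.

    Stage 1: enroll n1 patients; stop (declare the drug inactive)
    if at most r1 respond.  Stage 2: enroll n - n1 more patients;
    declare the drug inactive if at most r respond in total.

    Returns (PET, expected sample size, P(declare inactive)) at
    true response probability p.
    """
    # probability of early termination after stage 1
    pet = sum(binom_pmf(k, n1, p) for k in range(r1 + 1))
    expected_n = n1 + (1 - pet) * (n - n1)
    # declare inactive: stop early, or continue but total responses <= r
    p_inactive = pet + sum(
        binom_pmf(k1, n1, p)
        * sum(binom_pmf(k2, n - n1, p) for k2 in range(r - k1 + 1))
        for k1 in range(r1 + 1, n1 + 1)
    )
    return pet, expected_n, p_inactive
```

    For a published Simon optimal design for p0 = 0.05 versus p1 = 0.25 (stage 1: 0/9, overall: 2/17), this yields PET ≈ 0.63 and an expected sample size of about 12 under the null, with type I error 1 - P(declare inactive) ≈ 0.047.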

  2. The emergence of grid cells: Intelligent design or just adaptation?

    PubMed

    Kropff, Emilio; Treves, Alessandro

    2008-01-01

    Individual medial entorhinal cortex (mEC) 'grid' cells provide a representation of space that appears to be essentially invariant across environments, modulo simple transformations, in contrast to multiple, rapidly acquired hippocampal maps; it may therefore be established gradually during rodent development. We explore with a simplified mathematical model the possibility that the self-organization of multiple grid fields into a triangular grid pattern may be a single-cell process, driven by firing rate adaptation and slowly varying spatial inputs. A simple analytical derivation indicates that triangular grids are favored asymptotic states of the self-organizing system, and computer simulations confirm that such states are indeed reached during a model learning process, provided it is sufficiently slow to effectively average out fluctuations. The interactions among local ensembles of grid units serve solely to stabilize a common grid orientation. Spatial information, in the real mEC network, may be provided by any combination of feedforward cortical afferents and feedback hippocampal projections from place cells, since either input alone is likely sufficient to yield grid fields. PMID:19021261

  3. Big innovations in a small instrument: technical challenges in a new CCD system design for the Automated Patrol Telescope

    NASA Astrophysics Data System (ADS)

    Miziarski, Stan; Ashley, Michael C. B.; Smith, Greg; Barden, Sam; Dawson, John; Horton, Anthony; Saunders, Will; Brzeski, Jurek; Churilov, Vladimir; Klauser, Urs; Waller, Lew; Mayfield, Don; Correll, David; Phillips, Andre; Whittard, Denis

    2008-07-01

    We describe the design of a new CCD system delivered to the Automated Patrol Telescope at Siding Spring, NSW, Australia, operated by UNSW. A very fast beam (f/1), with a mosaic of two MITLL CCID-34 detectors placed only 1 mm behind the field flattener (which also serves as the dewar window), has called for innovative engineering solutions. This paper describes the design and procedure of the field-flattener mounting, the differential-screw adjustable detector mount, and the dewar suspension on the external ring providing tip/tilt and focus adjustment.

  4. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed to enable distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI) (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA (Visual Basic for Applications) scripts, are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking into account these capabilities and limitations, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  5. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis.

    PubMed

    Vasdev, Neil; Collier, Thomas Lee

    2016-01-01

    Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus, and the consequent variation in protocols, remains a challenge across positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer. PMID:27548189

  6. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer

    NASA Astrophysics Data System (ADS)

    Luiza Bondar, M.; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-01

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.
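
    The Dice similarity coefficient used above to score segmentation accuracy has a direct definition on binary masks. A minimal NumPy sketch (the paper's masks are 3-D CT label volumes; the small 2-D arrays here are only for illustration):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient of two boolean masks: 2|A∩B| / (|A| + |B|)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto   = np.array([[0, 1, 1], [0, 1, 0]])   # automated segmentation
manual = np.array([[0, 1, 1], [1, 1, 0]])   # manual reference contour
print(round(float(dice(auto, manual)), 3))  # 2*3/(3+4) ≈ 0.857
```

The reported values around 0.82-0.85 correspond to substantial volumetric overlap between automated and manual contours.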

  7. Design and analysis of closed-loop decoder adaptation algorithms for brain-machine interfaces.

    PubMed

    Dangi, Siddharth; Orsborn, Amy L; Moorman, Helene G; Carmena, Jose M

    2013-07-01

    Closed-loop decoder adaptation (CLDA) is an emerging paradigm for achieving rapid performance improvements in online brain-machine interface (BMI) operation. Designing an effective CLDA algorithm requires making multiple important decisions, including choosing the timescale of adaptation, selecting which decoder parameters to adapt, crafting the corresponding update rules, and designing CLDA parameters. These design choices, combined with the specific settings of CLDA parameters, will directly affect the algorithm's ability to make decoder parameters converge to values that optimize performance. In this article, we present a general framework for the design and analysis of CLDA algorithms and support our results with experimental data of two monkeys performing a BMI task. First, we analyze and compare existing CLDA algorithms to highlight the importance of four critical design elements: the adaptation timescale, selective parameter adaptation, smooth decoder updates, and intuitive CLDA parameters. Second, we introduce mathematical convergence analysis using measures such as mean-squared error and KL divergence as a useful paradigm for evaluating the convergence properties of a prototype CLDA algorithm before experimental testing. By applying these measures to an existing CLDA algorithm, we demonstrate that our convergence analysis is an effective analytical tool that can ultimately inform and improve the design of CLDA algorithms. PMID:23607558
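
    The convergence measures named in the abstract are easy to state concretely. A minimal sketch, assuming scalar decoder parameters and 1-D Gaussian distributions (the paper's decoders are multivariate; the toy "smooth update" below only illustrates parameters converging toward their optima):

```python
import numpy as np

def mse(theta_hat, theta_star):
    """Mean-squared error between current and optimal decoder parameters."""
    return float(np.mean((np.asarray(theta_hat) - np.asarray(theta_star)) ** 2))

def kl_gauss(mu0, s0, mu1, s1):
    """KL divergence KL(N(mu0, s0^2) || N(mu1, s1^2)) for 1-D Gaussians."""
    return float(np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2 * s1**2) - 0.5)

# a toy CLDA-style update that moves parameters toward their optima each batch
theta, theta_star = np.array([0.0, 2.0]), np.array([1.0, 1.0])
history = []
for _ in range(10):
    theta = theta + 0.5 * (theta_star - theta)   # smooth decoder update
    history.append(mse(theta, theta_star))

print(history[0] > history[-1])   # True: MSE shrinks as the decoder adapts
```

Tracking such measures across simulated updates is the kind of pre-experiment convergence analysis the authors advocate.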

  8. On Adaptive Extended Compatibility Changing Type of Product Design Strategy

    NASA Astrophysics Data System (ADS)

    Wenwen, Jiang; Zhibin, Xie

    The article studies companies' product design and development strategies through research on enterprise positioning and enterprise development histories. It shows that, at different stages of development, different kinds of enterprises adopt different modes of product design and development policy, and that there is a close causal link between a company's development history and its core technology and products. The results indicate that enterprises leading in market, technology and brand adopt a pioneer strategy of product research and development; enterprises that rely on supplying complete sets of services to large leading enterprises adopt a passively duplicating tactic; enterprises with partial advantages in technology, market, management or brand adopt a follow-up strategy; and enterprises in a relatively advantaged position adopt a technology-application strategy centred on optimizing services in the areas of brand culture and market service.

  9. A new approach for designing self-organizing systems and application to adaptive control

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.; Zhang, Shi; Lin, Yueqing; Huang, Song

    1993-01-01

    There is tremendous interest in the design of intelligent machines capable of autonomous learning and skillful performance under complex environments. A major task in designing such systems is to make the system plastic and adaptive when presented with new and useful information and stable in response to irrelevant events. A great body of knowledge, based on neuro-physiological concepts, has evolved as a possible solution to this problem. Adaptive resonance theory (ART) is a classical example under this category. The system dynamics of an ART network is described by a set of differential equations with nonlinear functions. An approach for designing self-organizing networks characterized by nonlinear differential equations is proposed.

  10. Group-Work in the Design of Complex Adaptive Learning Strategies

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis

    2013-01-01

    This paper presents a case study where twelve graduate students undertook the demanding role of the adaptive e-course developer and worked collaboratively on an authentic and complex design task in the context of open and distance tertiary education. The students had to work in groups in order to conceptualise and design a learning scenario for…

  11. Activity Structures for Project-Based Teaching and Learning: Design and Adaptation of Cultural Tools.

    ERIC Educational Resources Information Center

    Polman, Joseph L.

    This paper discusses research on activity structure design in a project-based science classroom and efforts to adapt designs from this setting to an after-school program involving historical inquiry. Common activity structures such as classroom lessons and Initiation-Reply-Evaluation (I-R-E) sequences are important cultural tools that help…

  12. A Framework for Adaptive Learning Design in a Web-Conferencing Environment

    ERIC Educational Resources Information Center

    Bower, Matt

    2016-01-01

    Many recent technologies provide the ability to dynamically adjust the interface depending on the emerging cognitive and collaborative needs of the learning episode. This means that educators can adaptively re-design the learning environment during the lesson, rather than purely relying on preemptive learning design thinking. Based on a…

  13. Systematic design and analysis of laser-guide-star adaptive-optics systems for large telescopes

    SciTech Connect

    Gavel, D.T.; Morris, J.R.; Vernon, R.G.

    1994-02-01

    The authors discuss the design of laser-guided adaptive-optics systems for the large, 8-10-m-class telescopes. Through proper choice of system components and optimized system design, the laser power that is needed at the astronomical site can be kept to a minimum. 37 refs., 9 figs., 3 tabs.

  14. Simple adaptive control system design for a quadrotor with an internal PFC

    SciTech Connect

    Mizumoto, Ikuro; Nakamura, Takuto; Kumon, Makoto; Takagi, Taro

    2014-12-10

    The paper deals with an adaptive control system design problem for a four-rotor helicopter, or quadrotor. A simple adaptive control design scheme with a parallel feedforward compensator (PFC) in the internal loop of the considered quadrotor is proposed based on the backstepping strategy. As is well known, the backstepping strategy is one of the advanced control strategies for nonlinear systems; however, the control algorithm becomes complex if the system has a high relative degree. We show that one can skip some design steps of the backstepping method by introducing a PFC in the inner loop of the considered quadrotor, so that the structure of the obtained controller is simplified and a high-gain-based adaptive feedback control system can be designed. The effectiveness of the proposed method is confirmed through numerical simulations.
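
    The role of the PFC can be seen in a scalar toy loop: adding a parallel feedforward term to the plant output lets a single adaptively tuned gain achieve tracking of the augmented output. The first-order plant, the gains, and the PFC time constant below are illustrative assumptions, not the quadrotor model or the backstepping design of the paper.

```python
dt, gamma, f, tau = 0.01, 50.0, 0.2, 0.05   # assumed gains and PFC time constant
y, z, k, u, r = 0.0, 0.0, 0.0, 0.0, 1.0     # plant, PFC state, adaptive gain, input, ref
for _ in range(2000):
    ea = (y + f * z) - r          # augmented error: plant output plus PFC output
    u = -k * ea                   # simple adaptive output feedback
    k += gamma * ea * ea * dt     # integral (I-type) adaptation law
    z += dt * (u - z) / tau       # first-order PFC driven by the control input
    y += dt * (-y + u)            # illustrative first-order plant
print(abs(ea) < 0.2, k > 0)       # augmented tracking error has become small
```

The augmented output y + f*z tracks the reference; the true output carries a small bias proportional to the PFC gain, which is the usual price of the PFC-based simplification.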

  15. Simple adaptive control system design for a quadrotor with an internal PFC

    NASA Astrophysics Data System (ADS)

    Mizumoto, Ikuro; Nakamura, Takuto; Kumon, Makoto; Takagi, Taro

    2014-12-01

    The paper deals with an adaptive control system design problem for a four-rotor helicopter, or quadrotor. A simple adaptive control design scheme with a parallel feedforward compensator (PFC) in the internal loop of the considered quadrotor is proposed based on the backstepping strategy. As is well known, the backstepping strategy is one of the advanced control strategies for nonlinear systems; however, the control algorithm becomes complex if the system has a high relative degree. We show that one can skip some design steps of the backstepping method by introducing a PFC in the inner loop of the considered quadrotor, so that the structure of the obtained controller is simplified and a high-gain-based adaptive feedback control system can be designed. The effectiveness of the proposed method is confirmed through numerical simulations.

  16. Fuzzy Adaptive Control Design and Discretization for a Class of Nonlinear Uncertain Systems.

    PubMed

    Zhao, Xudong; Shi, Peng; Zheng, Xiaolong

    2016-06-01

    In this paper, tracking control problems are investigated for a class of uncertain nonlinear systems in lower triangular form. First, a state-feedback controller is designed by using adaptive backstepping technique and the universal approximation ability of fuzzy logic systems. During the design procedure, a developed method with less computation is proposed by constructing one maximum adaptive parameter. Furthermore, adaptive controllers with nonsymmetric dead-zone are also designed for the systems. Then, a sampled-data control scheme is presented to discretize the obtained continuous-time controller by using the forward Euler method. It is shown that both proposed continuous and discrete controllers can ensure that the system output tracks the target signal with a small bounded error and the other closed-loop signals remain bounded. Two simulation examples are presented to verify the effectiveness and applicability of the proposed new design techniques. PMID:26208376
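
    The forward Euler discretization used above to obtain the sampled-data controller advances each state by x[k+1] = x[k] + T·f(x[k]). A minimal sketch with a sampled PI-type law and a first-order plant (both illustrative assumptions, not the paper's fuzzy backstepping design):

```python
def simulate(T, steps):
    x, xi = 0.0, 0.0            # plant state, controller integrator state
    r = 1.0                     # constant target signal
    for _ in range(steps):
        e = r - x
        u = 2.0 * e + 1.0 * xi  # continuous-time PI law, sampled
        xi += T * e             # forward Euler update of the integrator
        x += T * (-x + u)       # forward Euler update of plant  ẋ = -x + u
    return x

print(abs(simulate(0.01, 5000) - 1.0) < 1e-3)  # True: small bounded tracking error
```

As in the paper's result, the discretized controller keeps the tracking error small and bounded provided the sampling period T is small relative to the closed-loop time constants.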

  17. Design of an Automated Essay Grading (AEG) System in Indian Context

    ERIC Educational Resources Information Center

    Ghosh, Siddhartha; Fatima, Sameen S.

    2007-01-01

    Automated essay grading or scoring systems are no more a myth, but they are a reality. As of today, the human written (not hand written) essays are corrected not only by examiners/teachers but also by machines. The TOEFL exam is one of the best examples of this application. The students' essays are evaluated both by human and web based automated…

  18. Automated Analysis of the Digitized Second Palomar Sky Survey: System Design, Implementation, and Initial Results

    NASA Astrophysics Data System (ADS)

    Weir, Nicholas

    1995-01-01

    We describe the design, implementation, and initial scientific results of a system for analyzing the Digitized Second Palomar Observatory Sky Survey (DPOSS). The system (SKICAT) facilitates and largely automates the pipeline processing of DPOSS from raw pixel data into calibrated, classified object catalog form. A fundamental constraint limiting the scientific usefulness of optical imaging surveys is the level at which objects may be reliably distinguished as stars, galaxies, or artifacts. The classifier implemented within SKICAT was created using a new machine learning technology, whereby an algorithm determines a near-optimal set of classification rules based upon training examples. Using this approach, we were able to construct a classifier which distinguishes objects to the same level of accuracy as in previous surveys using comparable plate material, but nearly one magnitude fainter (or an equivalent BJ ~ 21.0). Our first analysis of DPOSS using SKICAT is of an overlapping set of four survey fields near the North Galactic Pole, in both the J and F passbands. Through detailed simulations of a subset of these data, we were able to analyze systematic aspects of our detection and measurement procedures, as well as optimize them. We discuss how we calibrate the plate magnitudes to the Gunn-Thuan g and r photometric system using CCD sequences obtained in a program devoted expressly to calibrating DPOSS. Our technique results in an estimated plate-to-plate zero point standard error of under 0.10 mag in g and below 0.05 mag in r, for J and F plates, respectively. Using the catalogs derived from these fields, we compare our differential galaxy counts in g and r with those from recent Schmidt plate surveys as well as predictions from evolutionary and non-evolutionary (NE) galaxy models. We find generally good agreement between our counts and recent NE and mild evolutionary models calibrated to consistently fit bright and faint galaxy counts, colors, and redshift
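
    The idea of inducing classification rules from training examples, as SKICAT's decision-tree learner does, can be shown in miniature with a single-split "stump" learner. The feature (a concentration-like ratio) and the training values are made up for illustration; the real system learns full trees over many catalog attributes.

```python
def learn_stump(samples):
    """Pick the feature threshold that best separates 'star' from 'galaxy'."""
    best = None
    for t in sorted(s[0] for s in samples):
        correct = sum((label == "galaxy") == (x > t) for x, label in samples)
        if best is None or correct > best[1]:
            best = (t, correct)
    return best[0]

# (concentration-like feature, true class) — illustrative training examples
train = [(0.2, "star"), (0.3, "star"), (0.7, "galaxy"), (0.9, "galaxy")]
t = learn_stump(train)
classify = lambda x: "galaxy" if x > t else "star"
print(classify(0.25), classify(0.8))  # star galaxy
```

Real tree learners recurse on such splits, measuring purity with metrics like information gain rather than raw accuracy.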

  19. Equipment development for automated assembly of solar modules

    NASA Technical Reports Server (NTRS)

    Hagerty, J. J.

    1982-01-01

    Prototype equipment was developed which allows for totally automated assembly in the three major areas of module manufacture: cell stringing, encapsulant layup and cure and edge sealing. The equipment is designed to be used in conjunction with a standard Unimate 2000B industrial robot although the design is adaptable to other transport systems.

  20. Adaption of a fragment analysis technique to an automated high-throughput multicapillary electrophoresis device for the precise qualitative and quantitative characterization of microbial communities.

    PubMed

    Trotha, René; Reichl, Udo; Thies, Frank L; Sperling, Danuta; König, Wolfgang; König, Brigitte

    2002-04-01

    The analysis of microbial communities is of increasing importance in the life sciences and bioengineering. Traditional investigation techniques such as culture or cloning methods suffer from many disadvantages: they are unable to give a complete qualitative and quantitative view of the total population of microorganisms, of their interactions with each other, and of their interactions with their environment. Clearly, the determination of static or dynamic balances among microorganisms is of fast-growing interest. The generation of species-specific, fluorescently labeled 16S ribosomal DNA (rDNA) fragments by the terminal restriction fragment length polymorphism (T-RFLP) technique is a suitable tool to overcome the problems of other methods. For the separation of these fragments, polyacrylamide gel sequencers have until now been preferred over capillary sequencers using linear polymers because of their higher electrophoretic resolution and therefore sizing accuracy. Modern capillary sequencers, however, especially multicapillary sequencers, offer a higher grade of automation and the increased throughput necessary for the investigation of complex communities in long-term studies. We therefore adapted a T-RFLP technique to an automated high-throughput multicapillary electrophoresis device (ABI 3100 Genetic Analyzer) with regard to a precise qualitative and quantitative characterization of microbial communities. PMID:11981854

  1. Design and Preliminary Testing of the International Docking Adapter's Peripheral Docking Target

    NASA Technical Reports Server (NTRS)

    Foster, Christopher W.; Blaschak, Johnathan; Eldridge, Erin A.; Brazzel, Jack P.; Spehar, Peter T.

    2015-01-01

    The International Docking Adapter's Peripheral Docking Target (PDT) was designed to allow a docking spacecraft to judge its alignment relative to the docking system. The PDT was designed to be compatible with relative sensors using visible cameras, thermal imagers, or Light Detection and Ranging (LIDAR) technologies. The conceptual design team tested prototype designs and materials to determine the contrast requirements for the features. This paper will discuss the design of the PDT, the methodology and results of the tests, and the conclusions pertaining to PDT design that were drawn from testing.

  2. Multivariable output feedback robust adaptive tracking control design for a class of delayed systems

    NASA Astrophysics Data System (ADS)

    Mirkin, Boris; Gutman, Per-Olof

    2015-02-01

    In this paper, we develop a model reference adaptive control scheme for a class of multi-input multi-output nonlinearly perturbed dynamic systems with unknown time-varying state delays which is also robust with respect to an external disturbance with unknown bounds. The output feedback adaptive control scheme uses feedback actions only, and thus does not require a direct measurement of the command or disturbance signals. A suitable Lyapunov-Krasovskii type functional is introduced to design the adaptation algorithms and to prove stability.

  3. On Adaptive Extended Different Life Cycle of Product Design Strategy

    NASA Astrophysics Data System (ADS)

    Wenwen, Jiang; Zhibin, Xie

    The article studies companies' product design and development strategies by following the whole life cycle of products and the development histories of enterprises. It shows that enterprises of different natures, and enterprises at different stages of development, adopt strategies of different modes, and that there is a close causal link between a company's development history and its core technology and products. The results indicate that across the different stages of development (formation, crisis, sustained stability, payback-driven recovery, secondary stability, and decline) an enterprise should pursue correspondingly different product research and development tactics, such as shrinking, consolidating, or innovation-driven pioneering strategies. An enterprise should break with its routine management mode and introduce different research and development modes to promote its competitiveness effectively.

  4. Low Level Waste Conceptual Design Adaption to Poor Geological Conditions

    SciTech Connect

    Bell, J.; Drimmer, D.; Giovannini, A.; Manfroy, P.; Maquet, F.; Schittekat, J.; Van Cotthem, A.; Van Echelpoel, E.

    2002-02-26

    Since the early eighties, several studies have been carried out in Belgium with respect to a repository for the final disposal of low-level radioactive waste (LLW). In 1998, the Belgian Government decided to restrict future investigations to the four existing nuclear sites in Belgium or sites that might show interest. So far, only two existing nuclear sites have been thoroughly investigated from a geological and hydrogeological point of view. These sites are located in the North-East (Mol-Dessel) and in the mid part (Fleurus-Farciennes) of the country. Both sites have the disadvantage of presenting poor geological and hydrogeological conditions, which are rather unfavorable for accommodating a surface disposal facility for LLW. The underground of the Mol-Dessel site consists of Neogene sand layers about 180 m thick, which cover a clay layer about 100 m thick. These Neogene sands contain, at 20 m depth, a thin clayey layer. The groundwater level is quite close to the surface (0-2 m) and the topography is almost totally flat. The upper layer of the Fleurus-Farciennes site consists of 10 m of silt with poor geomechanical characteristics, overlying sands (only a few meters thick) and Westphalian shales between 15 and 20 m depth. The Westphalian shales are tectonized and strongly weathered. In the past, coal seams were mined out; this activity induced locally important surface subsidence. For both nuclear sites that were investigated, a conceptual design was made that could allow any unfavorable geological or hydrogeological conditions of the site to be overcome. In Fleurus-Farciennes, for instance, the proposed conceptual design of the repository is quite original. It is composed of a shallow, buried concrete cylinder, surrounded by an accessible concrete ring, which allows permanent inspection and control during the whole lifetime of the repository. Stability and drainage systems should be independent of potential differential settlements and subsidence.

  5. Adaptation of NASA technology for the optimum design of orthopedic knee implants.

    PubMed

    Saravanos, D A; Mraz, P J; Davy, D T; Hopkins, D A

    1991-03-01

    NASA technology originally developed for designing aircraft turbine-engine blades has been adapted and applied to orthopedic knee implants. This article describes a method for tailoring an implant for optimal interaction with the environment of the tibia. The implant components are designed to control stresses in the bone for minimizing bone degradation and preventing failures. Engineers expect the tailoring system to improve knee prosthesis design and allow customized implants for individual patients. PMID:10150099

  6. Design of adaptive steganographic schemes for digital images

    NASA Astrophysics Data System (ADS)

    Filler, Tomás; Fridrich, Jessica

    2011-02-01

    Most steganographic schemes for real digital media embed messages by minimizing a suitably defined distortion function. In practice, this is often realized by syndrome codes which offer near-optimal rate-distortion performance. However, the distortion functions are designed heuristically and the resulting steganographic algorithms are thus suboptimal. In this paper, we present a practical framework for optimizing the parameters of additive distortion functions to minimize statistical detectability. We apply the framework to digital images in both spatial and DCT domain by first defining a rich parametric model which assigns a cost of making a change at every cover element based on its neighborhood. Then, we present a practical method for optimizing the parameters with respect to a chosen detection metric and feature space. We show that the size of the margin between support vectors in soft-margin SVMs leads to a fast detection metric and that methods minimizing the margin tend to be more secure w.r.t. blind steganalysis. The parameters obtained by the Nelder-Mead simplex-reflection algorithm for spatial and DCT-domain images are presented and the new embedding methods are tested by blind steganalyzers utilizing various feature sets. Experimental results show that as few as 80 images are sufficient for obtaining good candidates for parameters of the cost model, which allows us to speed up the parameter search.
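
    The additive distortion functions discussed above assign each cover element a change cost from its neighborhood and sum the costs at changed positions. A toy 1-D version with a simple local-flatness cost model (the paper's parametric model over real images is much richer; this only illustrates the additive structure):

```python
def costs(cover):
    # cheap to change where neighbours disagree (textured region),
    # costly where the signal is flat — a crude stand-in cost model
    c = []
    for i, x in enumerate(cover):
        left = cover[max(i - 1, 0)]
        right = cover[min(i + 1, len(cover) - 1)]
        c.append(1.0 / (1.0 + abs(left - x) + abs(right - x)))
    return c

def distortion(cover, stego):
    """Additive distortion: sum of per-element costs at changed positions."""
    rho = costs(cover)
    return sum(r for r, x, y in zip(rho, cover, stego) if x != y)

cover = [10, 10, 10, 14, 9, 10]
stego = [10, 11, 10, 14, 9, 10]   # one change in a flat region
print(round(distortion(cover, stego), 3))  # 1.0 — flat-region changes cost most
```

The framework in the paper searches over the parameters of such cost models (e.g. via Nelder-Mead) to minimize detectability by blind steganalyzers.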

  7. Evolutionary body building: adaptive physical designs for robots.

    PubMed

    Funes, P; Pollack, J

    1998-01-01

    Creating artificial life forms through evolutionary robotics faces a "chicken and egg" problem: Learning to control a complex body is dominated by problems specific to its sensors and effectors, while building a body that is controllable assumes the pre-existence of a brain. The idea of coevolution of bodies and brains is becoming popular, but little work has been done in evolution of physical structure because of the lack of a general framework for doing it. Evolution of creatures in simulation has usually resulted in virtual entities that are not buildable, while embodied evolution in actual robotics is constrained by the slow pace of real time. The work we present takes a step in addressing the problem of body evolution by applying evolutionary techniques to the design of structures assembled out of elementary components that stick together. Evolution takes place in a simulator that computes forces and stresses and predicts stability of three-dimensional brick structures. The final printout of our program is a schematic assembly, which is then built physically. We demonstrate the functionality of this approach to robot body building with many evolved artifacts. PMID:10352237
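
    The evolutionary loop the authors apply to brick structures can be shown in skeletal form: a population of candidate designs is scored by a simulator and improved by selection and mutation. The genome, the "stability" fitness, and all parameters below are toy assumptions standing in for the paper's force-and-stress simulator of 3-D brick assemblies.

```python
import random

random.seed(1)

def fitness(geno):
    # toy stand-in for the physics simulator: reward long cantilevers,
    # but penalize designs whose overhang exceeds the supported length
    length = sum(geno)
    overhang_penalty = max(0, length - 12) ** 2
    return length - overhang_penalty

def evolve(pop_size=20, bits=16, gens=30):
    pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # elitist truncation selection
        children = []
        for p in parents:
            c = p[:]
            c[random.randrange(bits)] ^= 1   # single point mutation
            children.append(c)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

In the actual work the fitness evaluation predicts stability of three-dimensional brick structures, and the winning genome is printed as a buildable assembly schematic.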

  8. Adaptive Clinical Trial Design: An Overview and Potential Applications in Dermatology.

    PubMed

    Elman, Scott A; Ware, James H; Gottlieb, Alice B; Merola, Joseph F

    2016-07-01

    The challenges of drug development, including increasing costs, late-stage drug failures, and the decline in the number of drugs being approved by the US Food and Drug Administration over time, have generated interest in adaptive study designs that have the potential to address these problems. Adaptive trial designs use interim data analysis to amend trials, and have been recognized for more than a decade as a way to increase trial efficiency, partly by the increased probability of demonstrating a drug effect if one exists. In this article, we define adaptive trials; give examples of the most common types; highlight the pros, cons, and ethical considerations of these designs; and illustrate how these tools can be applied to drug development in dermatology. PMID:27157773

  9. Implementation of an Automated Grading System with an Adaptive Learning Component to Affect Student Feedback and Response Time

    ERIC Educational Resources Information Center

    Matthews, Kevin; Janicki, Thomas; He, Ling; Patterson, Laurie

    2012-01-01

    This research focuses on the development and implementation of an adaptive learning and grading system with a goal to increase the effectiveness and quality of feedback to students. By utilizing various concepts from established learning theories, the goal of this research is to improve the quantity, quality, and speed of feedback as it pertains…

  10. A Bayesian decision-theoretic sequential response-adaptive randomization design.

    PubMed

    Jiang, Fei; Jack Lee, J; Müller, Peter

    2013-05-30

    We propose a class of phase II clinical trial designs with sequential stopping and adaptive treatment allocation to evaluate treatment efficacy. Our work is based on two-arm (control and experimental treatment) designs with binary endpoints. Our overall goal is to construct more efficient and ethical randomized phase II trials by reducing the average sample sizes and increasing the percentage of patients assigned to the better treatment arms of the trials. The designs combine the Bayesian decision-theoretic sequential approach with adaptive randomization procedures in order to achieve simultaneous goals of improved efficiency and ethics. The design parameters represent the costs of different decisions, for example, the decisions for stopping or continuing the trials. The parameters enable us to incorporate the actual costs of the decisions in practice. The proposed designs allow the clinical trials to stop early for either efficacy or futility. Furthermore, the designs assign more patients to better treatment arms by applying adaptive randomization procedures. We develop an algorithm based on the constrained backward induction and forward simulation to implement the designs. The algorithm overcomes the computational difficulty of the backward induction method, thereby making our approach practicable. The designs result in trials with desirable operating characteristics under the simulated settings. Moreover, the designs are robust with respect to the response rate of the control group. PMID:23315678
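
    The adaptive-randomization ingredient of such designs can be sketched with Beta-Bernoulli posterior sampling: each patient is allocated by drawing from each arm's posterior response rate, so the better arm accrues more patients as evidence accumulates. This is only the allocation idea; the paper's full design adds decision-theoretic sequential stopping via constrained backward induction, which is omitted here. True response rates and trial size are illustrative.

```python
import random

random.seed(0)

def adaptive_trial(p_control, p_experimental, n_patients):
    # Beta(1,1) priors; allocate each patient by posterior sampling
    s, f, n = [0, 0], [0, 0], [0, 0]   # successes, failures, allocations
    truth = [p_control, p_experimental]
    for _ in range(n_patients):
        draws = [random.betavariate(s[a] + 1, f[a] + 1) for a in (0, 1)]
        arm = draws.index(max(draws))
        n[arm] += 1
        if random.random() < truth[arm]:
            s[arm] += 1
        else:
            f[arm] += 1
    return n

alloc = adaptive_trial(0.2, 0.5, 200)
print(alloc)  # most patients end up on the better (experimental) arm
```

This illustrates the ethical motivation stated in the abstract: adaptive allocation raises the fraction of patients assigned to the better treatment.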

  11. An expert system for choosing the best combination of options in a general-purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.; Barthelemy, J. F. M.

    1985-01-01

    An expert system was developed to aid a user of the Automated Design Synthesis (ADS) general-purpose optimization computer program in selecting the best combination of strategy, optimizer, and one-dimensional search options for solving a problem. There are approximately 100 such combinations available in ADS. The knowledge base contains over 200 rules, and is divided into three categories: constrained problems, unconstrained problems, and constrained problems treated as unconstrained problems. The inference engine is written in LISP and is available on DEC-VAX and IBM PC/XT computers.
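
    The rule-based selection idea can be sketched as a small if-then knowledge base mapping problem characteristics to an option combination. The rules and option names below are illustrative placeholders, not the actual 200-rule knowledge base (though SUMT, feasible-directions, BFGS, and golden-section search are representative of the kinds of options ADS offers).

```python
RULES = [
    (lambda p: p["constrained"] and p["smooth"],
     ("SUMT strategy", "BFGS optimizer", "golden-section search")),
    (lambda p: p["constrained"] and not p["smooth"],
     ("feasible-directions strategy", "conjugate-gradient optimizer",
      "polynomial interpolation")),
    (lambda p: not p["constrained"],
     ("no strategy", "BFGS optimizer", "cubic interpolation")),
]

def recommend(problem):
    # fire the first rule whose condition matches the problem description
    for condition, combo in RULES:
        if condition(problem):
            return combo
    raise ValueError("no applicable rule")

print(recommend({"constrained": False, "smooth": True}))
```

The original system expressed such rules in LISP, partitioned into the three problem categories named in the abstract.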

  12. Design and test of a semi-automated system for metrological verification of non-contact clinic thermometers

    NASA Astrophysics Data System (ADS)

    Giannetti, R., Dr; Sáenz-Nuño, M. A., Dr; Valderrama, J. M.; Fernandez, A.

    2013-09-01

    Clinic thermometers are probably the most widely used measurement instrument in medical facilities (hospitals, clinics, etc.) all around the world, and much of a physician's assessment of a patient's health status depends on the result of such a measurement. In this work, a system to assess the quality of non-contact clinic thermometers is developed and presented; the system's accuracy is designed to make it a useful tool for instrument verification and a basis for future automated calibration facilities.

  13. First-order design of off-axis reflective ophthalmic adaptive optics systems using afocal telescopes.

    PubMed

    Gómez-Vieyra, Armando; Dubra, Alfredo; Malacara-Hernández, Daniel; Williams, David R

    2009-10-12

    Expressions for minimal astigmatism in the image and pupil planes of off-axis afocal reflective telescopes formed by pairs of spherical mirrors are presented. These formulae, which are derived from the marginal ray fan equation, can be used for designing laser cavities, spectrographs and adaptive optics retinal imaging systems. The use, range and validity of these formulae are limited by spherical aberration at small angles and by coma at large angles. This is discussed using examples from adaptive optics retinal imaging systems. The performance of the resulting optical designs is evaluated and compared against the configurations with minimal wavefront RMS, using the defocus-corrected wavefront RMS as a metric. PMID:20372626
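
    The astigmatism such designs must balance comes from using spherical mirrors off axis: for a collimated input at incidence angle θ, the Coddington equations give different tangential and sagittal focal lengths. Shown below is only this underlying single-mirror relation, which the paper's pairwise conditions cancel between the two mirrors of each telescope; the numbers are illustrative.

```python
import math

def focal_lengths(R, theta_deg):
    """Tangential and sagittal focal lengths of a spherical mirror of radius R
    used off-axis at incidence angle theta (collimated input, Coddington)."""
    th = math.radians(theta_deg)
    return (R / 2) * math.cos(th), (R / 2) / math.cos(th)

ft, fs = focal_lengths(R=500.0, theta_deg=10.0)   # e.g. R = 500 mm, 10 degrees
print(round(fs - ft, 2))  # astigmatic focal separation in mm
```

The separation grows rapidly with angle, which is why pairing mirrors at compensating angles is central to these afocal relay designs.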

  14. Multiobjective control design including performance robustness for gust alleviation of a wing with adaptive material actuators

    NASA Astrophysics Data System (ADS)

    Layton, Jeffrey B.

    1997-06-01

    The goal of this paper is to examine the use of covariance control to directly design reduced-order multi-objective controllers for gust alleviation using adaptive materials as the control effector. It will use piezoelectric actuators as control effectors in a finite element model of a full-size wing model. More precisely, the finite element model is of the F-16 Agile Falcon/Active Flexible Wing that is modified to use piezoelectric actuators as control effectors. The paper will also examine the interacting roles of important control design constraints and objectives for designing an aeroservoelastic system. The paper will also present some results of multiobjective control design for the model, illustrating the benefits and complexity of modern practical control design for aeroservoelastic systems that use adaptive materials for actuation.

  15. Design of a Model Reference Adaptive Controller for an Unmanned Air Vehicle

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Matsutani, Megumi; Annaswamy, Anuradha M.

    2010-01-01

    This paper presents the "Adaptive Control Technology for Safe Flight (ACTS)" architecture, which consists of a non-adaptive controller that provides satisfactory performance under nominal flying conditions, and an adaptive controller that provides robustness under off-nominal ones. The design and implementation procedures of both controllers are presented. The aim of these procedures, which encompass both theoretical and practical considerations, is to develop a controller suitable for flight. The ACTS architecture is applied to the Generic Transport Model (GTM) developed by NASA-Langley Research Center. The GTM is a dynamically scaled test model of a transport aircraft for which a flight-test article and a high-fidelity simulation are available. The nominal controller at the core of the ACTS architecture has a multivariable LQR-PI structure while the adaptive one has a direct, model reference structure. The main control surfaces as well as the throttles are used as control inputs. The inclusion of the latter alleviates the pilot's workload by eliminating the need to cancel the pitch coupling generated by changes in thrust. Furthermore, the independent usage of the throttles by the adaptive controller enables their use for attitude control. Advantages and potential drawbacks of adaptation are demonstrated by performing high fidelity simulations of a flight-validated controller and of its adaptive augmentation.
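
    The "direct, model reference structure" named above can be shown in miniature: a plant with an unknown gain, a reference model specifying the desired response, and a feedforward gain tuned by the classic MIT rule. The scalar plant, model, and gains below are illustrative assumptions; the ACTS adaptive controller itself is multivariable and flight-validated.

```python
dt, gamma = 0.001, 0.5
a, b = 1.0, 3.0          # plant  ẏ  = -a*y  + b*k*r   (b unknown to the designer)
am, bm = 1.0, 1.0        # model  ẏm = -am*ym + bm*r   (desired response)
y = ym = 0.0
k = 0.0                  # adaptive feedforward gain; ideal value is bm/b = 1/3
r = 1.0                  # constant reference command
for _ in range(20000):
    e = y - ym
    k -= gamma * e * ym * dt      # MIT rule, with ym as the sensitivity proxy
    y += dt * (-a * y + b * k * r)
    ym += dt * (-am * ym + bm * r)
print(round(k, 2))  # ≈ 0.33: the gain converges toward bm/b
```

Once k reaches bm/b the plant reproduces the reference model exactly and the error, and hence the adaptation, stops.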

  16. Adaptive combinatorial design to explore large experimental spaces: approach and validation.

    PubMed

    Lejay, L V; Shasha, D E; Palenchar, P M; Kouranov, A Y; Cruikshank, A A; Chou, M F; Coruzzi, G M

    2004-12-01

    Systems biology requires mathematical tools not only to analyse large genomic datasets, but also to explore large experimental spaces in a systematic yet economical way. We demonstrate that two-factor combinatorial design (CD), shown to be useful in software testing, can be used to design a small set of experiments that would allow biologists to explore larger experimental spaces. Further, the results of an initial set of experiments can be used to seed further 'Adaptive' CD experimental designs. As a proof of principle, we demonstrate the usefulness of this Adaptive CD approach by analysing data from the effects of six binary inputs on the regulation of genes in the N-assimilation pathway of Arabidopsis. This CD approach identified the more important regulatory signals previously discovered by traditional experiments using far fewer experiments, and also identified examples of input interactions previously unknown. Tests using simulated data show that Adaptive CD suffers from fewer false positives than traditional experimental designs in determining decisive inputs, and succeeds far more often than traditional or random experimental designs in determining when genes are regulated by input interactions. We conclude that Adaptive CD offers an economical framework for discovering dominant inputs and interactions that affect different aspects of genomic outputs and organismal responses. PMID:17051692
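
    A two-factor (pairwise) combinatorial design of the kind described can be generated greedily. The sketch below is an illustrative toy, not the authors' tool: it builds a small set of runs over binary inputs such that every pair of inputs takes all four value combinations somewhere in the set.

```python
from itertools import combinations, product
import random

def pairwise_covering_design(n_factors=6, seed=0):
    """Greedy construction of a two-factor covering design over binary inputs."""
    # Every (factor pair, value pair) that must appear in at least one run.
    needed = {(i, j, vi, vj)
              for i, j in combinations(range(n_factors), 2)
              for vi, vj in product([0, 1], repeat=2)}
    rng = random.Random(seed)
    runs = []
    while needed:
        i, j, vi, vj = next(iter(needed))    # pin one missing pair for progress
        best, best_cov = None, -1
        for _ in range(50):                  # greedy over random candidates
            cand = [rng.randint(0, 1) for _ in range(n_factors)]
            cand[i], cand[j] = vi, vj
            cov = sum(1 for (a, b, va, vb) in needed
                      if cand[a] == va and cand[b] == vb)
            if cov > best_cov:
                best, best_cov = cand, cov
        runs.append(tuple(best))
        needed = {(a, b, va, vb) for (a, b, va, vb) in needed
                  if not (best[a] == va and best[b] == vb)}
    return runs
```

    For six binary inputs this covers all 60 pairwise combinations in a handful of runs, far fewer than the 64 runs of a full factorial, which is the economy the CD approach exploits.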

  17. A new adaptive algorithm for automated feature extraction in exponentially damped signals for health monitoring of smart structures

    NASA Astrophysics Data System (ADS)

    Qarib, Hossein; Adeli, Hojjat

    2015-12-01

    In this paper the authors introduce a new adaptive signal processing technique for feature extraction and parameter estimation in noisy exponentially damped signals. The iterative 3-stage method is based on the adroit integration of the strengths of parametric and nonparametric methods such as the multiple signal classification (MUSIC), matrix pencil, and empirical mode decomposition algorithms. The first stage is a new adaptive filtration or noise removal scheme. The second stage is a hybrid parametric-nonparametric signal parameter estimation technique based on an output-only system identification technique. The third stage is optimization of the estimated parameters using a combination of the primal-dual path-following interior point algorithm and a genetic algorithm. The methodology is evaluated using a synthetic signal and a signal obtained experimentally from transverse vibrations of a steel cantilever beam. The method estimates the frequencies accurately and also recovers the damping exponents. The proposed adaptive filtration method does not include any frequency-domain manipulation, so the time-domain signal is not distorted by forward and inverse transformations.
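
    For the exponentially damped model y(t) = A e^(-a t) cos(2*pi*f*t) underlying such signals, the frequency and damping exponent can be estimated with very simple time-domain tools. The following is a hedged pure-Python sketch (zero crossings for frequency, log peak decay for damping; no noise handling, not the authors' 3-stage method):

```python
import math

def estimate_damped_params(signal, dt):
    """Crude estimates for y(t) = A*exp(-a*t)*cos(2*pi*f*t):
    frequency from positive-going zero crossings, damping exponent
    from the logarithmic decay between the first and last peaks."""
    crossings = [i for i in range(1, len(signal))
                 if signal[i - 1] < 0 <= signal[i]]
    period = (crossings[-1] - crossings[0]) * dt / (len(crossings) - 1)
    freq = 1.0 / period
    peaks = [(i, signal[i]) for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1] and signal[i] > 0]
    (i1, p1), (i2, p2) = peaks[0], peaks[-1]
    damping = math.log(p1 / p2) / ((i2 - i1) * dt)
    return freq, damping
```

    On a clean synthetic signal the estimates land within a few percent; the hybrid parametric-nonparametric machinery in the paper exists precisely because noisy, multi-mode signals defeat such naive estimators.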

  18. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    PubMed

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
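
    The simulated comparison can be sketched in miniature. The model below is illustrative only (a Beta-distributed yearly detection probability and one cheap scouting visit as the adaptive trigger are assumptions, not the authors' simulation): concentrating effort in years that scout well raises success per survey.

```python
import random

def detection_rate(adaptive, n_years=2000, surveys_per_year=5, seed=7):
    """Survey successes per unit effort. Each year has a regional
    detection probability p; the adaptive strategy commits its full
    survey effort only in years a cheap scouting visit suggests are good."""
    rng = random.Random(seed)
    detections, effort = 0, 0
    for _ in range(n_years):
        p = rng.betavariate(2, 5)        # this year's detection probability
        scout_hit = rng.random() < p     # one low-cost scouting visit
        if adaptive and not scout_hit:
            continue                     # defer effort to a better year
        for _ in range(surveys_per_year):
            effort += 1
            detections += rng.random() < p
    return detections / effort
```

    Because the scouting visit succeeds more often in good years, the adaptive strategy's surveys are spent under conditionally higher detection probabilities, mirroring the kind of gain reported in the paper.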

  19. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  20. Conceptual design for a user-friendly adaptive optics system at Lick Observatory

    SciTech Connect

    Bissinger, H.D.; Olivier, S.; Max, C.

    1996-03-08

    In this paper, we present a conceptual design for a general-purpose adaptive optics system, usable with all Cassegrain facility instruments on the 3 meter Shane telescope at the University of California's Lick Observatory located on Mt. Hamilton near San Jose, California. The overall design goal for this system is to take the sodium-layer laser guide star adaptive optics technology out of the demonstration stage and to build a user-friendly astronomical tool. The emphasis will be on ease of calibration, improved stability and operational simplicity in order to allow the system to be run routinely by observatory staff. A prototype adaptive optics system and a 20 watt sodium-layer laser guide star system have already been built at Lawrence Livermore National Laboratory for use at Lick Observatory. The design presented in this paper is for a next-generation adaptive optics system that extends the capabilities of the prototype system into the visible with more degrees of freedom. When coupled with a laser guide star system that is upgraded to a power matching the new adaptive optics system, the combined system will produce diffraction-limited images for near-IR cameras. Atmospheric correction at wavelengths of 0.6-1 um will significantly increase the throughput of the most heavily used facility instrument at Lick, the Kast Spectrograph, and will allow it to operate with smaller slit widths and deeper limiting magnitudes. 8 refs., 2 figs.

  1. Covariate-adjusted response-adaptive designs for longitudinal treatment responses: PEMF trial revisited.

    PubMed

    Biswas, Atanu; Park, Eunsik; Bhattacharya, Rahul

    2012-08-01

    Response-adaptive designs have become popular for allocation of the entering patients among two or more competing treatments in a phase III clinical trial. Although there are many designs for binary treatment responses, the number of designs involving covariates is very small. Sometimes the patients give repeated responses. The only available response-adaptive allocation design for repeated binary responses is the urn design of Biswas and Dewanji [Biswas A and Dewanji A. A randomized longitudinal play-the-winner design for repeated binary data. ANZJS 2004; 46: 675-684; Biswas A and Dewanji A. Inference for a RPW-type clinical trial with repeated monitoring for the treatment of rheumatoid arthritis. Biometr J 2004; 46: 769-779], although it does not account for the covariates of the patients in the allocation design. In this article, a covariate-adjusted response-adaptive randomisation procedure is developed using the log-odds ratio within the Bayesian framework for longitudinal binary responses. The small sample performance of the proposed allocation procedure is assessed through a simulation study. The proposed procedure is illustrated using a real data set. PMID:20974667
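
    The randomized play-the-winner rule underlying the cited urn design is easy to state: draw a treatment ball from an urn; a success adds a ball of the same treatment, a failure adds a ball of the rival treatment, so allocation drifts toward the better arm. A minimal sketch (binary responses only; no covariates or longitudinal structure, unlike the paper's procedure):

```python
import random

def randomized_play_the_winner(p_a, p_b, n_patients, seed=0):
    """RPW(1,1) urn for two treatments with success probabilities p_a, p_b."""
    rng = random.Random(seed)
    urn = ["A", "B"]                 # start with one ball per treatment
    alloc = {"A": 0, "B": 0}
    for _ in range(n_patients):
        arm = rng.choice(urn)        # draw (with replacement) to allocate
        alloc[arm] += 1
        success = rng.random() < (p_a if arm == "A" else p_b)
        other = "B" if arm == "A" else "A"
        urn.append(arm if success else other)   # reinforce the winner
    return alloc
```

    With p_a = 0.8 and p_b = 0.2 the limiting allocation proportion to A is q_B/(q_A + q_B) = 0.8, so most patients end up on the better treatment; the covariate-adjusted procedure in the paper generalizes this idea.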

  2. Nonlinear adaptive control systems design of BTT missile based on fully tuned RBF neural networks

    NASA Astrophysics Data System (ADS)

    Hu, Yunan; Jin, Yuqiang; Li, Jing

    2003-09-01

    Based on fully tuned RBF neural networks and backstepping control techniques, a novel nonlinear adaptive control scheme is proposed for missile control systems with a general set of uncertainties. The effect of the uncertainties is synthesized into a single term in the design procedure, and RBF neural networks are used to eliminate its effect. The nonlinear adaptive controller is designed using backstepping control techniques, and the control problem is solved even though the control coefficient matrix is unknown. The adaptive tuning rules for updating all of the parameters of the fully tuned RBF neural networks are derived from the Lyapunov stability theorem. Finally, nonlinear 6-DOF numerical simulation results for a BTT missile model are presented to demonstrate the effectiveness of the proposed method.

  3. Adaptive design for digital nonlinear autopilot of ship-to-ship missiles

    NASA Astrophysics Data System (ADS)

    Im, Ki Hong; Chaw, Dongkyoung; Choi, Jin Young

    2005-12-01

    This paper proposes a practical design method for a ship-to-ship missile autopilot. When a pre-designed analogue autopilot is implemented digitally, it generally suffers from severe performance degradation and instability problems, even for a sufficiently small sampling time. Aerodynamic uncertainties can also affect the overall stability, and this happens more severely when the nonlinear autopilot is digitally implemented. In order to realize a practical autopilot, two main issues, the digital implementation problem and compensation for the aerodynamic uncertainties, are considered in this paper. A MIMO (multi-input multi-output) nonlinear autopilot is presented first, and the input and output of the missile are discretized for implementation. In this step, the discretization effect is compensated by designing an additional control input. Finally, we design a parameter adaptation law to recover the control performance in the presence of aerodynamic uncertainties. Stability analysis and 6-DOF (degree-of-freedom) simulations are presented to verify the proposed adaptive autopilot.

  4. Application of positive-real functions in hyperstable discrete model-reference adaptive system design.

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.

    1972-01-01

    Proposal of an algorithmic procedure, based on mathematical programming methods, to design compensators for hyperstable discrete model-reference adaptive systems (MRAS). The objective of the compensator is to render the MRAS insensitive to initial parameter estimates within a maximized hypercube in the model parameter space.

  5. Transient analysis of an adaptive system for optimization of design parameters

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.

    1992-01-01

    Averaging methods are applied to analyzing and optimizing the transient response associated with the direct adaptive control of an oscillatory second-order minimum-phase system. The analytical design methods developed for a second-order plant can be applied with some approximation to a MIMO flexible structure having a single dominant mode.

  6. Direct and Inverse Problems of Item Pool Design for Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Belov, Dmitry I.; Armstrong, Ronald D.

    2009-01-01

    The recent literature on computerized adaptive testing (CAT) has developed methods for creating CAT item pools from a large master pool. Each CAT pool is designed as a set of nonoverlapping forms reflecting the skill levels of an assumed population of test takers. This article presents a Monte Carlo method to obtain these CAT pools and discusses…

  7. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    ERIC Educational Resources Information Center

    Burston, Jack; Neophytou, Maro

    2014-01-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2…

  8. Can Approaches to Research in Art and Design Be Beneficially Adapted for Research into Higher Education?

    ERIC Educational Resources Information Center

    Trowler, Paul

    2013-01-01

    This paper examines the research practices in Art and Design that are distinctively different from those common in research into higher education outside those fields. It considers whether and what benefit could be derived from their adaptation by the latter. The paper also examines the factors that are conducive and obstructive to adaptive…

  9. The design and development of a two-dimensional adaptive truss structure

    NASA Technical Reports Server (NTRS)

    Kuwao, Fumihiro; Motohashi, Shoichi; Yoshihara, Makoto; Takahara, Kenichi; Natori, Michihiro

    1987-01-01

    The functional model of a two-dimensional adaptive truss structure that can purposefully change its geometrical configuration is introduced. Design and fabrication details, including kinematic analysis and dynamic characteristics analysis, are presented together with test results demonstrating this two-dimensional truss concept.

  10. An objective re-evaluation of adaptive sample size re-estimation: commentary on 'Twenty-five years of confirmatory adaptive designs'.

    PubMed

    Mehta, Cyrus; Liu, Lingyun

    2016-02-10

    Over the past 25 years, adaptive designs have gradually gained acceptance and are being used with increasing frequency in confirmatory clinical trials. Recent surveys of submissions to the regulatory agencies reveal that the most popular type of adaptation is unblinded sample size re-estimation. Concerns have nevertheless been raised that this type of adaptation is inefficient. We intend to show in our discussion that such concerns are greatly exaggerated in any practical setting and that the advantages of adaptive sample size re-estimation usually outweigh any minor loss of efficiency. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26757953

  11. A two-stage patient enrichment adaptive design in phase II oncology trials.

    PubMed

    Song, James X

    2014-01-01

    Illustrated is the use of a patient enrichment adaptive design in a randomized phase II trial which allows the evaluation of treatment benefits by the biomarker expression level and makes interim adjustments according to pre-specified rules. The design was applied to an actual phase II metastatic hepatocellular carcinoma (HCC) trial in which progression-free survival (PFS) in two biomarker-defined populations is evaluated at both interim and final analyses. As an extension, a short-term biomarker is used to predict the long-term PFS in a Bayesian model in order to improve the precision of the hazard ratio (HR) estimate at the interim analysis. The characteristics of the extended design are examined in a number of scenarios via simulations. The recommended adaptive design is shown to be useful in a phase II setting. When a short-term marker which correlates with the long-term PFS is available, the design can be applied in smaller early phase trials in which PFS requires longer follow-up. In summary, the adaptive design offers flexibility in randomized phase II patient enrichment trials and should be considered in an overall personalized healthcare (PHC) strategy. PMID:24342820

  12. An automated instrument for human STR identification: design, characterization, and experimental validation.

    PubMed

    Hurth, Cedric; Smith, Stanley D; Nordquist, Alan R; Lenigk, Ralf; Duane, Brett; Nguyen, David; Surve, Amol; Hopwood, Andrew J; Estes, Matthew D; Yang, Jianing; Cai, Zhi; Chen, Xiaojia; Lee-Edghill, John G; Moran, Nina; Elliott, Keith; Tully, Gillian; Zenhausern, Frederic

    2010-10-01

    The microfluidic integration of an entire DNA analysis workflow on a fully integrated miniaturized instrument is reported using lab-on-a-chip automation to perform DNA fingerprinting compatible with CODIS standard relevant to the forensic community. The instrument aims to improve the cost, duration, and ease of use to perform a "sample-to-profile" analysis with no need for human intervention. The present publication describes the operation of the three major components of the system: the electronic control components, the microfluidic cartridge and CE microchip, and the optical excitation/detection module. Experimental details are given to characterize the level of performance, stability, reliability, accuracy, and sensitivity of the prototype system. A typical temperature profile from a PCR amplification process and an electropherogram of a commercial size standard (GeneScan 500™, Applied Biosystems) separation are shown to assess the relevance of the instrument to forensic applications. Finally, we present a profile from an automated integrated run where lysed cells from a buccal swab were introduced in the system and no further human intervention was required to complete the analysis. PMID:20931618

  13. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    SciTech Connect

    Oxstrand, Johanna Helene; Ahmad Al Rashdan; Le Blanc, Katya Lee; Bly, Aaron Douglas; Agarwal, Vivek

    2015-07-01

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background on the need for AWP research is provided, followed by descriptions of two human factors field evaluation studies. These studies focus on the user experience of conducting a task (in this case, a preventive maintenance task and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real-time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  14. Analysis and design of a high power laser adaptive phased array transmitter

    NASA Technical Reports Server (NTRS)

    Mevers, G. E.; Soohoo, J. F.; Winocur, J.; Massie, N. A.; Southwell, W. H.; Brandewie, R. A.; Hayes, C. L.

    1977-01-01

    The feasibility of delivering substantial quantities of optical power to a satellite in low earth orbit from a ground based high energy laser (HEL) coupled to an adaptive antenna was investigated. Diffraction effects, atmospheric transmission efficiency, adaptive compensation for atmospheric turbulence effects, including the servo bandwidth requirements for this correction, and the adaptive compensation for thermal blooming were examined. To evaluate possible HEL sources, atmospheric investigations were performed for the CO2, (C-12)(O-18)2 isotope, CO and DF wavelengths using output antenna locations of both sea level and mountain top. Results indicate that excellent atmospheric and adaptation efficiency can be obtained for mountain top operation with a (C-12)(O-18)2 isotope laser operating at 9.1 um, or a CO laser operating single line (P10) at about 5.0 um, which was a close second in the evaluation. Four adaptive power transmitter system concepts were generated and evaluated, based on overall system efficiency, reliability, size and weight, advanced technology requirements and potential cost. A multiple source phased array was selected for detailed conceptual design. The system uses a unique adaptation technique of phase locking independent laser oscillators, which allows it to be both relatively inexpensive and most reliable, with a predicted overall power transfer efficiency of 53%.

  15. Design of artificial genetic regulatory networks with multiple delayed adaptive responses*

    NASA Astrophysics Data System (ADS)

    Kaluza, Pablo; Inoue, Masayo

    2016-06-01

    Genetic regulatory networks with adaptive responses are widely studied in biology. Usually, models consisting only of a few nodes have been considered. They present one input receptor for activation and one output node where the adaptive response is computed. In this work, we design genetic regulatory networks with many receptors and many output nodes able to produce delayed adaptive responses. This design is performed by using an evolutionary algorithm of mutations and selections that minimizes an error function defined by the adaptive response in signal shapes. We present several examples of network constructions with a predefined required set of adaptive delayed responses. We show that an output node can have different kinds of responses as a function of the activated receptor. Additionally, complex network structures are presented since processing nodes can be involved in several input-output pathways. Supplementary material in the form of one nets file available from the Journal web page at http://dx.doi.org/10.1140/epjb/e2016-70172-9
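
    The mutation-and-selection loop described can be written generically. The sketch below is a (1+1) evolutionary algorithm on a bit-string genome with an arbitrary error function, a toy stand-in for the paper's networks, where the genome would encode regulatory links and the error would measure delayed adaptive responses:

```python
import random

def evolve(error_fn, genome_len, n_steps=3000, seed=0):
    """(1+1) evolutionary algorithm: flip one random gene per step and
    keep the mutant whenever its error does not increase."""
    rng = random.Random(seed)
    genome = [rng.randint(0, 1) for _ in range(genome_len)]
    err = error_fn(genome)
    for _ in range(n_steps):
        mutant = genome[:]
        mutant[rng.randrange(genome_len)] ^= 1   # point mutation
        mutant_err = error_fn(mutant)
        if mutant_err <= err:                    # selection
            genome, err = mutant, mutant_err
    return genome, err
```

    Any target behaviour expressible as an error function can be plugged in, which is what lets the authors demand a predefined set of delayed responses from many receptors and outputs at once.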

  16. GASICA: generic automated stress induction and control application design of an application for controlling the stress state

    PubMed Central

    van der Vijgh, Benny; Beun, Robbert J.; van Rood, Maarten; Werkhoven, Peter

    2014-01-01

    In a multitude of research and therapy paradigms it is relevant to know, and often desirable to control, the stress state of a patient or participant. Examples include research paradigms in which the stress state is the dependent or independent variable, or therapy paradigms where this state indicates the boundaries of the therapy. To our knowledge, no application currently exists that focuses specifically on the automated control of the stress state while at the same time being generic enough to be used for various therapy and research purposes. Therefore, we introduce GASICA, an application aimed at the automated control of the stress state in a multitude of therapy and research paradigms. The application consists of three components: a digital stressor game, a set of measurement devices, and a feedback model. These three components form a closed loop (called a biocybernetic loop by Pope et al. (1995) and Fairclough (2009)) that continuously presents an acute psychological stressor, measures several physiological responses to this stressor, and adjusts the stressor intensity based on these measurements by means of the feedback model, thereby aiming to control the stress state. In this manner GASICA presents multidimensional and ecologically valid stressors while remaining in control of the form and intensity of the presented stressors, aiming at the automated control of the stress state. Furthermore, the application is designed as a modular open-source application to easily implement different therapy and research tasks using a high-level programming interface and configuration file, and allows for the addition of (existing) measurement equipment, making it usable for various paradigms. PMID:25538554
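
    The biocybernetic loop reduces to a single update rule: raise the stressor intensity when the measured stress state is below target, lower it when above. A hedged proportional-control sketch follows; the gain, bounds, and the linear "physiology" in the usage below are assumptions for illustration, not GASICA's actual feedback model.

```python
def biocybernetic_step(intensity, measured, target, gain=0.5,
                       lo=0.0, hi=10.0):
    """One pass of the loop: proportional adjustment of stressor
    intensity toward the target stress state, clamped to a valid range."""
    intensity += gain * (target - measured)
    return min(hi, max(lo, intensity))
```

    Iterating the step against any monotone physiological response drives the measured state to the target, which is exactly the closed-loop control of the stress state that GASICA automates.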

  17. IMPACT OF CANAL DESIGN LIMITATIONS ON WATER DELIVERY OPERATIONS AND AUTOMATION

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Irrigation canals are often designed for water transmission. The design engineer simply ensures that the canal will pass the maximum design discharge. However, irrigation canals are frequently operated far below design capacity. Because demands and the distribution of flow at bifurcations (branch points...

  18. Design and progress toward a multi-conjugate adaptive optics system for distributed aberration correction

    SciTech Connect

    Baker, K; Olivier, S; Tucker, J; Silva, D; Gavel, D; Lim, R; Gratrix, E

    2004-08-17

    This article investigates the use of a multi-conjugate adaptive optics system to improve the corrected field of view of the system. The emphasis of this research is to develop techniques to improve the performance of optical systems, with applications to horizontal imaging. The design and wave optics simulations of the proposed system are given, and preliminary results from the multi-conjugate adaptive optics system are presented. The experimental system utilizes a liquid-crystal spatial light modulator and an interferometric wave-front sensor for correction and sensing of the phase aberrations, respectively.

  19. Design and inference for the intent-to-treat principle using adaptive treatment.

    PubMed

    Dawson, Ree; Lavori, Philip W

    2015-04-30

    Nonadherence to assigned treatment jeopardizes the power and interpretability of intent-to-treat comparisons from clinical trial data and continues to be an issue for effectiveness studies, despite their pragmatic emphasis. We posit that new approaches to design need to complement developments in methods for causal inference to address nonadherence, in both experimental and practice settings. This paper considers the conventional study design for psychiatric research and other medical contexts, in which subjects are randomized to treatments that are fixed throughout the trial and presents an alternative that converts the fixed treatments into an adaptive intervention that reflects best practice. The key element is the introduction of an adaptive decision point midway into the study to address a patient's reluctance to remain on treatment before completing a full-length trial of medication. The clinical uncertainty about the appropriate adaptation prompts a second randomization at the new decision point to evaluate relevant options. Additionally, the standard 'all-or-none' principal stratification (PS) framework is applied to the first stage of the design to address treatment discontinuation that occurs too early for a midtrial adaptation. Drawing upon the adaptive intervention features, we develop assumptions to identify the PS causal estimand and to introduce restrictions on outcome distributions to simplify expectation-maximization calculations. We evaluate the performance of the PS setup, with particular attention to the role played by a binary covariate. The results emphasize the importance of collecting covariate data for use in design and analysis. We consider the generality of our approach beyond the setting of psychiatric research. PMID:25581413

  20. Tools for Designing, Evaluating, and Certifying NextGen Technologies and Procedures: Automation Roles and Responsibilities

    NASA Technical Reports Server (NTRS)

    Kanki, Barbara G.

    2011-01-01

    Barbara Kanki from NASA Ames Research Center will discuss research that focuses on the collaborations between pilots, air traffic controllers and dispatchers that will change in NextGen systems as automation increases and roles and responsibilities change. The approach taken by this NASA Ames team is to build a collaborative systems assessment template (CSAT) based on detailed task descriptions within each system to establish a baseline of the current operations. The collaborative content and context are delineated through the review of regulatory and advisory materials, policies, procedures and documented practices as augmented by field observations and interviews. The CSAT is developed to aid the assessment of key human factors and performance tradeoffs that result from considering different collaborative arrangements under NextGen system changes. In theory, the CSAT product may be applied to any NextGen application (such as Trajectory Based Operations) with specified ground and aircraft capabilities.

  1. Design and automated production of 11C-alpha-methyl-l-tryptophan (11C-AMT).

    PubMed

    Huang, Xuan; Xiao, Xia; Gillies, Robert J; Tian, Haibin

    2016-05-01

    (11)C-alpha-methyl-l-tryptophan ([(11)C]AMT), a tryptophan metabolism PET tracer, has successfully been employed for brain serotonin pathway and indoleamine 2,3-dioxygenase (IDO) pathway related tumor imaging. We here report a reliable, automated procedure for routine synthesis of [(11)C]AMT based on an Eckert and Ziegler Modular-Lab system. Semi-preparative HPLC was incorporated into the system to improve chemical purity and specific activity. The 6-step radiosynthesis followed by HPLC purification provided [(11)C]AMT in 5.3±1.2% (n=6, non-decay-corrected) overall radiochemical yield with radiochemical purity >99% and specific activity of 35-116 GBq/μmol. Typically, a patient-ready dose of 2.95±0.65 GBq (n=6, EOS) was produced from about 55.5 GBq of [(11)C]CO2 in 50 min. PMID:27150033

  2. Design and demonstration of automated data analysis algorithms for ultrasonic inspection of complex composite panels with bonds

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Forsyth, David S.; Welter, John T.

    2016-02-01

    To address the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms have been developed to make calls on indications that satisfy the detection criteria and minimize false calls. The original design followed standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. However, certain complex panels with varying shape, ply drops and the presence of bonds can complicate this interpretation process. In this paper, enhancements to the automated data analysis algorithms are introduced to address these challenges. To estimate the thickness of the part and presence of bonds without prior information, an algorithm tracks potential backwall or bond-line signals, and evaluates a combination of spatial, amplitude, and time-of-flight metrics to identify bonded sections. Once part boundaries, thickness transitions and bonded regions are identified, feature extraction algorithms are applied to multiple sets of through-thickness and backwall C-scan images, for evaluation of both first layer through thickness and layers under bonds. ADA processing results are presented for a variety of complex test specimens with inserted materials and other test discontinuities. Lastly, enhancements to the ADA software interface are presented, which improve the software usability for final data review by the inspectors and support the certification process.
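
    The core call logic (a time-of-flight indication ahead of the backwall, plus the backwall amplitude for a dropout test) can be sketched for a single A-scan. The interface below is hypothetical and for illustration only; the actual ADA algorithms add spatial tracking, bond-line identification, and C-scan feature extraction.

```python
def analyze_ascan(signal, dt, threshold, backwall_window):
    """Flag a time-of-flight indication if any echo crosses the
    threshold before the backwall gate; return the backwall amplitude
    so an amplitude-dropout criterion can be applied downstream."""
    t0, t1 = backwall_window
    i0, i1 = round(t0 / dt), round(t1 / dt)   # gate in sample indices
    backwall_amp = max(abs(s) for s in signal[i0:i1])
    early = [i * dt for i, s in enumerate(signal[:i0]) if abs(s) >= threshold]
    flaw_tof = early[0] if early else None    # earliest suspect echo
    return flaw_tof, backwall_amp
```

    Minimizing false calls then amounts to choosing the threshold and gate per region of the part, which is precisely what varying thickness, ply drops, and bonds complicate.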

  3. Adaptive sampling in two-phase designs: a biomarker study for progression in arthritis

    PubMed Central

    McIsaac, Michael A; Cook, Richard J

    2015-01-01

    Response-dependent two-phase designs are used increasingly often in epidemiological studies to ensure sampling strategies offer good statistical efficiency while working within resource constraints. Optimal response-dependent two-phase designs are difficult to implement, however, as they require specification of unknown parameters. We propose adaptive two-phase designs that exploit information from an internal pilot study to approximate the optimal sampling scheme for an analysis based on mean score estimating equations. The frequency properties of estimators arising from this design are assessed through simulation, and they are shown to be similar to those from optimal designs. The design procedure is then illustrated through application to a motivating biomarker study in an ongoing rheumatology research program. Copyright © 2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:25951124
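
    The mean-score-optimal designs discussed above require more machinery, but the core idea of letting pilot estimates set the phase-two sampling can be illustrated with a simpler Neyman-type allocation. This is a hypothetical stand-in for illustration, not the authors' method: each phase-one stratum is sampled in proportion to its size times its pilot standard-deviation estimate.

```python
import numpy as np

def pilot_allocation(strata_sd, strata_n, budget):
    """Neyman-type phase-two allocation: sample sizes proportional
    to stratum size times the pilot SD estimate for that stratum."""
    weights = np.asarray(strata_n, float) * np.asarray(strata_sd, float)
    alloc = budget * weights / weights.sum()
    return np.round(alloc).astype(int)

# Pilot SD estimates for three phase-one strata and the stratum sizes.
sd = [2.0, 1.0, 1.0]
n = [100, 100, 200]
print(pilot_allocation(sd, n, budget=100))  # -> [40 20 40]
```

    The high-variance stratum receives proportionally more of the phase-two budget, which is the intuition behind adapting the design to interim information.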

  4. A holistic environment for the design and execution of self-adaptive clinical pathways.

    PubMed

    Alexandrou, Dimitrios Al; Skitsas, Ioannis E; Mentzas, Gregoris N

    2011-01-01

    One of the main challenges confronting modern health care, in the effort to increase treatment quality, is the personalization of treatment. Treatment personalization requires continuous reconfiguration and adaptation of the selected treatment schemes according to the "current" clinical status of each patient, the "current" circumstances inside a health care organization (which change rapidly), and updated medical knowledge. In this paper, we present an innovative software environment that provides an integrated IT solution for adapting health care processes (clinical pathways) at execution time. The software comprises a health care process execution engine assisted by a semantic infrastructure for reconfiguring the clinical pathways. During the execution of clinical pathways, the system reasons over the rules and reconfigures the next steps of the treatment. A graphical designer interface is implemented for defining the rule set for clinical pathway adaptation in a user-friendly way. PMID:20876028
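
    A minimal sketch of the kind of condition/action rule evaluation such an engine performs during pathway execution is shown below. The rule names and thresholds are invented for illustration; the actual system uses a semantic (ontology- and rule-based) infrastructure, not Python lambdas.

```python
def reconfigure_pathway(patient, rules):
    """Evaluate condition/action rules against the current clinical
    status and return the adapted next steps of the pathway."""
    steps = []
    for condition, action in rules:
        if condition(patient):
            steps.append(action)
    return steps

# Hypothetical rule set; names and thresholds are illustrative only.
rules = [
    (lambda p: p["creatinine"] > 1.5, "reduce-dose"),
    (lambda p: p["fever"],            "order-blood-culture"),
]
print(reconfigure_pathway({"creatinine": 2.0, "fever": False}, rules))
# -> ['reduce-dose']
```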

  5. Evaluation of green infrastructure designs using the Automated Geospatial Watershed Assessment Tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In arid and semi-arid regions, green infrastructure (GI) designs can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwater, addressi...

  6. Adapting computational optimization concepts from aeronautics to nuclear fusion reactor design

    NASA Astrophysics Data System (ADS)

    Dekeyser, W.; Reiter, D.; Baelmans, M.

    2012-10-01

    Even on the most powerful supercomputers available today, computational nuclear fusion reactor divertor design is extremely CPU demanding, not least due to the large number of design variables and the hybrid micro-macro character of the flows. Therefore, automated design methods based on optimization can greatly assist current reactor design studies. Over the past decades, "adjoint methods" for shape optimization have proven their virtue in the field of aerodynamics. Applications include drag reduction for wing and wing-body configurations. Here we demonstrate that these optimization methods also hold large potential for divertor design. Specifically, we apply the continuous adjoint method to the optimization of the divertor geometry in a 2D poloidal cross section of an axisymmetric tokamak device (as, e.g., JET and ITER), using a simplified model for the plasma edge. The design objective is to spread the target material heat load as much as possible by controlling the shape of the divertor, while maintaining the full helium ash removal capabilities of the vacuum pumping system.

  7. An Integrated Systems Approach to Designing Climate Change Adaptation Policy in Water Resources

    NASA Astrophysics Data System (ADS)

    Ryu, D.; Malano, H. M.; Davidson, B.; George, B.

    2014-12-01

    Climate change projections are characterised by large uncertainties, with rainfall variability being the key challenge in designing adaptation policies. Climate change adaptation in water resources shows all the typical characteristics of 'wicked' problems: cognitive uncertainty as new scientific knowledge becomes available, problem instability, knowledge imperfection, and strategic uncertainty due to institutional changes that inevitably occur over time. Planning characterised by such uncertainty and instability requires an approach that can accommodate flexibility and adaptive capacity in decision-making: the ability to take corrective measures in the event that the scenarios and responses initially envisaged evolve into different forms at some future stage. We present an integrated, multidisciplinary, and comprehensive framework designed to interface and inform science and decision making in the formulation of water resource management strategies to deal with climate change in the Musi Catchment of Andhra Pradesh, India. At the core of this framework is a dialogue between stakeholders, decision makers and scientists to define a set of plausible responses to an ensemble of climate change scenarios derived from global climate modelling. The modelling framework used to evaluate the resulting combination of climate scenarios and adaptation responses includes surface and groundwater assessment models (SWAT & MODFLOW) and water allocation modelling (REALM) to determine the water security of each adaptation strategy. Three climate scenarios extracted from downscaled climate models were selected for evaluation, together with four agreed responses: changing cropping patterns, increasing watershed development, changing the volume of groundwater extraction, and improving irrigation efficiency. Water security in this context is represented by the combination of the level of water availability and its associated security of supply for three economic activities (agriculture

  8. Design Framework for an Adaptive MOOC Enhanced by Blended Learning: Supplementary Training and Personalized Learning for Teacher Professional Development

    ERIC Educational Resources Information Center

    Gynther, Karsten

    2016-01-01

    The research project has developed a design framework for an adaptive MOOC that complements the MOOC format with blended learning. The design framework consists of a design model and a series of learning design principles which can be used to design in-service courses for teacher professional development. The framework has been evaluated by…

  9. Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter

    NASA Technical Reports Server (NTRS)

    Aggarwal, Pravin; Hull, Patrick V.

    2015-01-01

    Designing and manufacturing space flight vehicle structures is a skill set that has grown considerably at NASA over the last several years. Beginning with the Ares program and followed by the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multi-Purpose Crew Vehicle (MPCV) spacecraft adapter. Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above-mentioned hardware. The experience of this in-house design work led to increased manufacturing infrastructure for both Marshall Space Flight Center (MSFC) and the Michoud Assembly Facility (MAF), improved skill sets in both analysis and design, and hands-on experience in building and testing full-scale (MSA) hardware. The hardware design and development processes, from initiation through CDR and finally flight, presented many challenges and experiences that produced valuable lessons. This paper builds on NASA's recent experience in designing and fabricating flight hardware and examines the design/development processes used, as well as the challenges and lessons learned, i.e., from the initial design, loads estimation, and mass constraints, to structural optimization/affordability, to release of production drawings for hardware manufacturing. While there are many documented design processes a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.

  10. Adaptive Blending of Model and Observations for Automated Short-Range Forecasting: Examples from the Vancouver 2010 Olympic and Paralympic Winter Games

    NASA Astrophysics Data System (ADS)

    Bailey, Monika E.; Isaac, George A.; Gultepe, Ismail; Heckman, Ivan; Reid, Janti

    2014-01-01

    An automated short-range forecasting system, adaptive blending of observations and model (ABOM), was tested in real time during the 2010 Vancouver Olympic and Paralympic Winter Games in British Columbia. Data at 1-min time resolution were available from a newly established, dense network of surface observation stations. Climatological data were not available at these new stations. This, combined with output from new high-resolution numerical models, provided a unique and exciting setting for testing nowcasting systems in mountainous terrain during winter weather conditions. The ABOM method blends extrapolations in time of recent local observations with numerical weather prediction (NWP) model forecasts to generate short-range point forecasts of surface variables out to 6 h. The relative weights of the model forecast and the observation extrapolation are based on performance over recent history. The average performance of ABOM nowcasts during February and March 2010 was evaluated using standard scores and thresholds important for Olympic events. Significant improvements over the model forecasts alone were obtained for continuous variables such as temperature, relative humidity, and wind speed. The small improvements to forecasts of variables such as visibility and ceiling, which are subject to discontinuous changes, are attributed to the persistence component of ABOM.
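
    The performance-based weighting of model forecast versus observation extrapolation can be sketched as inverse-error weighting over recent history. This is a plausible simplification for illustration; the exact ABOM weighting scheme is not specified in the abstract.

```python
def blend(model_fcst, extrap_fcst, model_err, extrap_err, eps=1e-6):
    """Weight each source by the inverse of its recent mean squared
    error, so the better-performing source dominates the nowcast."""
    w_m = 1.0 / (model_err + eps)
    w_e = 1.0 / (extrap_err + eps)
    return (w_m * model_fcst + w_e * extrap_fcst) / (w_m + w_e)

# Recent history: extrapolation has been 3x more accurate than the model,
# so the blended nowcast sits much closer to the extrapolation.
print(blend(model_fcst=10.0, extrap_fcst=14.0, model_err=3.0, extrap_err=1.0))
```

    As the observation extrapolation ages beyond a few hours its error grows, shifting the weight back toward the NWP forecast, which matches the 0-6 h nowcasting range described above.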

  11. A new adaptive merging and growing algorithm for designing artificial neural networks.

    PubMed

    Islam, Md Monirul; Sattar, Md Abdus; Amin, Md Faijul; Yao, Xin; Murase, Kazuyuki

    2009-06-01

    This paper presents a new algorithm, called the adaptive merging and growing algorithm (AMGA), for designing artificial neural networks (ANNs). The algorithm merges and adds hidden neurons during the training process of ANNs. The merge operation introduced in AMGA is a mixed-mode operation, equivalent to pruning two neurons and adding one neuron. Unlike most previous studies, AMGA puts emphasis on autonomous functioning in the design process of ANNs. This is the main reason why AMGA uses an adaptive rather than a predefined fixed strategy in designing ANNs. The adaptive strategy merges or adds hidden neurons based on the learning ability of hidden neurons or the training progress of ANNs. In order to reduce the amount of retraining after modifying ANN architectures, AMGA prunes hidden neurons by merging correlated hidden neurons and adds hidden neurons by splitting existing hidden neurons. The proposed AMGA has been tested on a number of benchmark problems in machine learning and ANNs, including breast cancer, Australian credit card assessment, and diabetes, gene, glass, heart, iris, and thyroid problems. The experimental results show that AMGA can design compact ANN architectures with good generalization ability compared to other algorithms. PMID:19203888
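
    The merge operation, pruning two correlated hidden neurons into one, might look like the following sketch. The array shapes and the weight-merge rule (average input weights, sum output weights) are assumptions for illustration, not AMGA's exact update.

```python
import numpy as np

def merge_most_correlated(W_in, W_out, H):
    """Merge the pair of hidden neurons whose activations (columns of H)
    are most correlated: average their input weights, sum their output
    weights, and drop one of the two neurons."""
    corr = np.corrcoef(H, rowvar=False)
    np.fill_diagonal(corr, 0.0)
    i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
    i, j = min(i, j), max(i, j)
    W_in[:, i] = (W_in[:, i] + W_in[:, j]) / 2.0
    W_out[i, :] = W_out[i, :] + W_out[j, :]
    return np.delete(W_in, j, axis=1), np.delete(W_out, j, axis=0)

rng = np.random.default_rng(0)
H = rng.normal(size=(50, 4))       # hidden activations over 50 samples
H[:, 3] = H[:, 0] * 1.01           # neurons 0 and 3 nearly identical
W_in = rng.normal(size=(5, 4))     # 5 inputs -> 4 hidden neurons
W_out = rng.normal(size=(4, 2))    # 4 hidden neurons -> 2 outputs
W_in2, W_out2 = merge_most_correlated(W_in, W_out, H)
print(W_in2.shape, W_out2.shape)   # one hidden neuron removed
```

    Merging correlated neurons keeps the network's function nearly unchanged, which is why the paper reports little retraining is needed after the operation.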

  12. Application-specific design of adaptive structures with piezoceramic patch actuators

    NASA Astrophysics Data System (ADS)

    Wierach, Peter; Monner, Hans P.; Schoenecker, Andreas; Duerr, Johannes K.

    2002-07-01

    The development of a new technology for the manufacturing of adaptive structures on the basis of thin monolithic piezoceramic wafers is an important goal of the German industrial project 'Adaptronik'. Partners from the automotive, space, medical, engineering, and optical industries participate in this project to enable new adaptive solutions for their applications. Due to the extreme brittleness of the piezoceramic material, the manufacturing of these structures is still very demanding. Very often, cracks in the piezoceramic material make the structure useless. This problem becomes serious when large-scale structures with many actuators and sensors are considered. To achieve more reliable results, the use of encapsulated piezoceramic actuators and sensors came into focus. With respect to the great variety of requirements given by the industrial partners, the use of standardized solutions was not feasible. The goal was to develop new elements with improved performance parameters that can easily be adapted to different applications. Owing to a modular concept, the developed multifunctional elements can be designed to suit a great variety of different structures. A first step toward adapting this technology to prototype structures has been taken with the development of special encapsulated patches for an adaptive lightweight satellite mirror.

  13. Automated a complex computer aided design concept generated using macros programming

    NASA Astrophysics Data System (ADS)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md

    2013-12-01

    Changing a complex computer-aided design (CAD) profile, such as car and aircraft surfaces, has always been difficult and challenging. The capabilities of CAD software such as AutoCAD and CATIA show that a simple CAD design configuration can be modified easily without hassle, but this is not the case for complex design configurations. Design changes help users test and explore various configurations of a design concept before a model is produced. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method in which design configurations are produced by recording a script of commands, editing the data values, and adding new command lines to create an element of parametric design. The steps and procedure for creating a macro program are discussed, along with some difficulties encountered during the process and the advantages of its usage. Generally, the advantages of macros programming as a parametric design method are that it allows flexibility for design exploration, increases the usability of the design solution, allows some parameters to be properly constrained by the model while restricting others, and provides real-time feedback on changes.

  14. Dynamic experiment design regularization approach to adaptive imaging with array radar/SAR sensor systems.

    PubMed

    Shkvarko, Yuriy; Tuxpan, José; Santos, Stewart

    2011-01-01

    We consider a problem of high-resolution array radar/SAR imaging formalized in terms of a nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) of the random wavefield scattered from a remotely sensed scene observed through a kernel signal formation operator and contaminated with random Gaussian noise. First, the Sobolev-type solution space is constructed to specify the class of consistent kernel SSP estimators with the reproducing kernel structures adapted to the metrics of that solution space. Next, the "model-free" variational analysis (VA)-based image enhancement approach and the "model-based" descriptive experiment design (DEED) regularization paradigm are unified into a new dynamic experiment design (DYED) regularization framework. Application of the proposed DYED framework to the adaptive array radar/SAR imaging problem leads to a class of two-level (DEED-VA) regularized SSP reconstruction techniques that aggregate the kernel adaptive anisotropic windowing with the projections onto convex sets to enforce the consistency and robustness of the overall iterative SSP estimators. We also show how the proposed DYED regularization method may be considered as a generalization of the MVDR, APES and other high-resolution nonparametric adaptive radar sensing techniques. A family of the DYED-related algorithms is constructed and their effectiveness is finally illustrated via numerical simulations. PMID:22163859

  15. Automated PET-only quantification of amyloid deposition with adaptive template and empirically pre-defined ROI

    NASA Astrophysics Data System (ADS)

    Akamatsu, G.; Ikari, Y.; Ohnishi, A.; Nishida, H.; Aita, K.; Sasaki, M.; Yamamoto, Y.; Sasaki, M.; Senda, M.

    2016-08-01

    Amyloid PET is useful for early and/or differential diagnosis of Alzheimer’s disease (AD). Quantification of amyloid deposition using PET has been employed to improve diagnosis and to monitor AD therapy, particularly in research. Although MRI is often used for segmentation of gray matter and for spatial normalization into standard Montreal Neurological Institute (MNI) space where region-of-interest (ROI) template is defined, 3D MRI is not always available in clinical practice. The purpose of this study was to examine the feasibility of PET-only amyloid quantification with an adaptive template and a pre-defined standard ROI template that has been empirically generated from typical cases. A total of 68 subjects who underwent brain 11C-PiB PET were examined. The 11C-PiB images were non-linearly spatially normalized to the standard MNI T1 atlas using the same transformation parameters of MRI-based normalization. The automatic-anatomical-labeling-ROI (AAL-ROI) template was applied to the PET images. All voxel values were normalized by the mean value of cerebellar cortex to generate the SUVR-scaled images. Eleven typical positive images and eight typical negative images were normalized and averaged, respectively, and were used as the positive and negative template. Positive and negative masks which consist of voxels with SUVR  ⩾1.7 were extracted from both templates. Empirical PiB-prone ROI (EPP-ROI) was generated by subtracting the negative mask from the positive mask. The 11C-PiB image of each subject was non-rigidly normalized to the positive and negative template, respectively, and the one with higher cross-correlation was adopted. The EPP-ROI was then inversely transformed to individual PET images. We evaluated differences of SUVR between standard MRI-based method and PET-only method. We additionally evaluated whether the PET-only method would correctly categorize 11C-PiB scans as positive or negative. Significant correlation was observed between the SUVRs
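
    The SUVR scaling and EPP-ROI construction described above reduce to simple array operations: divide each voxel by the mean cerebellar-cortex value, threshold the positive and negative templates at SUVR >= 1.7, and subtract the negative mask from the positive one. The toy 2x2 "images" and variable names below are illustrative only.

```python
import numpy as np

def suvr_and_epp_roi(pet, cereb_mask, pos_tmpl, neg_tmpl, thr=1.7):
    """Scale a PET image to SUVR by the cerebellar-cortex mean, then
    build the empirical PiB-prone ROI as (positive-template mask)
    minus (negative-template mask) at the given SUVR threshold."""
    suvr = pet / pet[cereb_mask].mean()
    epp_roi = (pos_tmpl >= thr) & ~(neg_tmpl >= thr)
    return suvr, epp_roi

pet = np.array([[2.0, 4.0], [1.0, 1.0]])
cereb = np.array([[False, False], [True, True]])   # cerebellar voxels
pos = np.array([[2.0, 2.0], [1.0, 1.0]])           # positive template (SUVR)
neg = np.array([[2.0, 1.0], [1.0, 1.0]])           # negative template (SUVR)
suvr, roi = suvr_and_epp_roi(pet, cereb, pos, neg)
print(suvr[0, 1], roi.sum())  # SUVR of one voxel, EPP-ROI voxel count
```

    The subtraction keeps only voxels that are high in typical positive cases but not in typical negative cases, which is what makes the ROI "PiB-prone".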

  16. Automated PET-only quantification of amyloid deposition with adaptive template and empirically pre-defined ROI.

    PubMed

    Akamatsu, G; Ikari, Y; Ohnishi, A; Nishida, H; Aita, K; Sasaki, M; Yamamoto, Y; Sasaki, M; Senda, M

    2016-08-01

    Amyloid PET is useful for early and/or differential diagnosis of Alzheimer's disease (AD). Quantification of amyloid deposition using PET has been employed to improve diagnosis and to monitor AD therapy, particularly in research. Although MRI is often used for segmentation of gray matter and for spatial normalization into standard Montreal Neurological Institute (MNI) space where region-of-interest (ROI) template is defined, 3D MRI is not always available in clinical practice. The purpose of this study was to examine the feasibility of PET-only amyloid quantification with an adaptive template and a pre-defined standard ROI template that has been empirically generated from typical cases. A total of 68 subjects who underwent brain (11)C-PiB PET were examined. The (11)C-PiB images were non-linearly spatially normalized to the standard MNI T1 atlas using the same transformation parameters of MRI-based normalization. The automatic-anatomical-labeling-ROI (AAL-ROI) template was applied to the PET images. All voxel values were normalized by the mean value of cerebellar cortex to generate the SUVR-scaled images. Eleven typical positive images and eight typical negative images were normalized and averaged, respectively, and were used as the positive and negative template. Positive and negative masks which consist of voxels with SUVR  ⩾1.7 were extracted from both templates. Empirical PiB-prone ROI (EPP-ROI) was generated by subtracting the negative mask from the positive mask. The (11)C-PiB image of each subject was non-rigidly normalized to the positive and negative template, respectively, and the one with higher cross-correlation was adopted. The EPP-ROI was then inversely transformed to individual PET images. We evaluated differences of SUVR between standard MRI-based method and PET-only method. We additionally evaluated whether the PET-only method would correctly categorize (11)C-PiB scans as positive or negative. Significant correlation was observed between the

  17. Automation and adaptation: Nurses' problem-solving behavior following the implementation of bar coded medication administration technology.

    PubMed

    Holden, Richard J; Rivera-Rodriguez, A Joy; Faye, Héléne; Scanlon, Matthew C; Karsh, Ben-Tzion

    2013-08-01

    The most common change facing nurses today is new technology, particularly bar coded medication administration technology (BCMA). However, there is a dearth of knowledge on how BCMA alters nursing work. This study investigated how BCMA technology affected nursing work, particularly nurses' operational problem-solving behavior. Cognitive systems engineering observations and interviews were conducted after the implementation of BCMA in three nursing units of a freestanding pediatric hospital. Problem-solving behavior, associated problems, and goals were specifically defined and extracted from observed episodes of care. Three broad themes regarding BCMA's impact on problem solving were identified. First, BCMA allowed nurses to invent new problem-solving behavior to deal with pre-existing problems. Second, BCMA made it difficult or impossible to apply some problem-solving behaviors that were commonly used pre-BCMA, often requiring nurses to use potentially risky workarounds to achieve their goals. Third, BCMA created new problems that nurses were either able to solve using familiar or novel problem-solving behaviors, or unable to solve effectively. Results from this study shed light on hidden hazards and suggest three critical design needs: (1) ecologically valid design; (2) anticipatory control; and (3) basic usability. Principled studies of the actual nature of clinicians' work, including problem solving, are necessary to uncover hidden hazards and to inform health information technology design and redesign. PMID:24443642

  18. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol

    PubMed Central

    Azar, Kristen MJ; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-01

    Background In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost effective and widely scalable are needed to prevent diabetes. Objective Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Methods Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. Results A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. Conclusions The randomized trial will provide rigorous evidence regarding the efficacy of

  19. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler☆

    PubMed Central

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  20. An expert system for choosing the best combination of options in a general purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Barthelemy, J.-F. M.

    1986-01-01

    An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general-purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose, and it is difficult for a nonexpert to make this choice. The expert system aids the user in choosing the best combination of options based on the user's knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories: constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The inference engine and rules are written in LISP; the system contains about 200 rules and executes on DEC VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.
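
    The kind of rule-based mapping from problem characteristics to option combinations that EXADS encodes can be sketched as follows. The option names and branching below are invented for illustration; the real system holds roughly 200 LISP rules over the three knowledge-base categories.

```python
def choose_options(problem):
    """Tiny rule-based sketch of EXADS-style option selection,
    branching on the three knowledge-base categories. The returned
    option names are illustrative, not the actual ADS option codes."""
    if not problem["constrained"]:
        return ("quasi-newton", "golden-section")
    if problem.get("treat_as_unconstrained"):
        return ("penalty-method", "golden-section")
    return ("feasible-directions", "polynomial-interpolation")

print(choose_options({"constrained": True}))
```

    A real inference engine would also weigh the user's answers about smoothness, gradient availability, and problem size before committing to a strategy/optimizer/search triple.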

  1. Design and construction of a medium-scale automated direct measurement respirometric system to assess aerobic biodegradation of polymers

    NASA Astrophysics Data System (ADS)

    Castro Aguirre, Edgar

    A medium-scale automated direct measurement respirometric (DMR) system was designed and built to assess the aerobic biodegradation of up to 30 materials in triplicate simultaneously. Likewise, a computer application was developed for rapid analysis of the data generated. The DMR system was able to simulate different testing conditions by varying temperature and relative humidity, which are the major exposure conditions affecting biodegradation. Two complete tests determining the aerobic biodegradation of polymers under composting conditions were performed to show the efficacy and efficiency of both the DMR system and the DMR data analyzer. In the two tests, cellulose reached 70% mineralization at 139 and 45 days, respectively. The difference in the time for cellulose to reach 70% mineralization was attributed to the composition of the compost and water availability, which strongly affect the biodegradation rate. Finally, among the tested materials, at least 60% of the organic carbon content of the biodegradable polymers was converted into carbon dioxide by the end of the test.
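
    Percent mineralization in such respirometric tests is commonly computed as the carbon evolved as CO2 divided by the sample's initial organic carbon content, using the 12/44 carbon-to-CO2 mass ratio. A minimal sketch (the example figures are invented):

```python
def percent_mineralization(co2_g, sample_organic_c_g):
    """Percent of the sample's organic carbon evolved as CO2.
    Carbon mass in the evolved CO2 is co2_g * 12/44 (C/CO2 molar masses)."""
    return 100.0 * (co2_g * 12.0 / 44.0) / sample_organic_c_g

# Example: 22 g of CO2 evolved from a sample containing 10 g organic C.
print(percent_mineralization(22.0, 10.0))  # -> 60.0
```

    In a cumulative test, the CO2 evolved from a blank (compost-only) reactor is subtracted first, so only the carbon attributable to the test material is counted.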

  2. Spacecraft automated operations. [for interplanetary missions

    NASA Technical Reports Server (NTRS)

    Bird, T. H.; Sharpe, B. L.

    1979-01-01

    Trends in the automation of planetary spacecraft are examined using data from missions as far back as Mariner '67 and up to the highly sophisticated Galileo. Nine design considerations that influence the degree of automation, such as protection against catastrophic failures, highly repetitive functions, loss of spacecraft communications, and the need for near-real-time adaptivity, are discussed. The rapid growth of automation is shown in terms of on-board hardware by plots of the number of processors on board, the average speed of processors, and total core memory. The number of commands transmitted from the ground grew to 5 million bits for Voyager, so increases in mission complexity have driven growth in both spacecraft automation and ground operations. Options for achieving greater automation by transferring ground operations to the spacecraft, given the current means of controlling missions, are considered, and proposed changes are noted. In the future, improved computer technology, more microprocessors, and increased core storage will be used, and the number of automated functions and their complexity will grow. It is concluded that using the growing computational capability of spacecraft to achieve more autonomy will reverse the trend of increasing mission complexity and cost.

  3. Automated divertor target design by adjoint shape sensitivity analysis and a one-shot method

    SciTech Connect

    Dekeyser, W.; Reiter, D.; Baelmans, M.

    2014-12-01

    As magnetic confinement fusion progresses towards the development of first reactor-scale devices, computational tokamak divertor design is a topic of high priority. Presently, edge plasma codes are used in a forward approach, where magnetic field and divertor geometry are manually adjusted to meet design requirements. Due to the complex edge plasma flows and large number of design variables, this method is computationally very demanding. On the other hand, efficient optimization-based design strategies have been developed in computational aerodynamics and fluid mechanics. Such an optimization approach to divertor target shape design is elaborated in the present paper. A general formulation of the design problems is given, and conditions characterizing the optimal designs are formulated. Using a continuous adjoint framework, design sensitivities can be computed at a cost of only two edge plasma simulations, independent of the number of design variables. Furthermore, by using a one-shot method the entire optimization problem can be solved at an equivalent cost of only a few forward simulations. The methodology is applied to target shape design for uniform power load, in simplified edge plasma geometry.
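
    The cost argument above, gradients of the objective from one forward and one adjoint solve regardless of the number of design variables, can be illustrated on a linear toy problem. All matrices here are synthetic; the real edge plasma model is nonlinear and far larger.

```python
import numpy as np

# Toy setting: state equation A u = B q, objective J(q) = c . u(q).
# The adjoint method recovers the full gradient dJ/dq from ONE extra
# linear solve (A^T lam = c), independent of the dimension of q.
rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4)) + 4 * np.eye(4)   # well-conditioned "state" operator
B = rng.normal(size=(4, 3))                   # how design q enters the source
c = rng.normal(size=4)                        # objective weights
q = rng.normal(size=3)                        # design variables

lam = np.linalg.solve(A.T, c)   # adjoint solve
grad_adjoint = B.T @ lam        # dJ/dq, all components at once

# Finite-difference check against the forward problem (3 extra solves
# per design variable; this is the cost the adjoint method avoids).
def J(q):
    return c @ np.linalg.solve(A, B @ q)

eps = 1e-6
grad_fd = np.array([(J(q + eps * e) - J(q - eps * e)) / (2 * eps)
                    for e in np.eye(3)])
print(np.allclose(grad_adjoint, grad_fd, atol=1e-5))  # True
```

    With thousands of shape parameters, the finite-difference column would need thousands of plasma simulations while the adjoint column still needs two, which is what makes one-shot divertor design tractable.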

  4. Local Laser Strengthening of Steel Sheets for Load Adapted Component Design in Car Body Structures

    NASA Astrophysics Data System (ADS)

    Jahn, Axel; Heitmanek, Marco; Standfuss, Jens; Brenner, Berndt; Wunderlich, Gerd; Donat, Bernd

    The current trend in car body construction concerning lightweight design and improved car safety increasingly requires adapting local material properties to the component load. Martensitic hardenable steels, which are typically used in car body components, show a significant hardening effect, for instance in laser-welded seams. This effect can be purposefully used as a local strengthening method. For several steel grades, the local strengthening resulting from a laser remelting process was investigated. The strength in the treated zone was determined at crash-relevant strain rates. A load-adapted design of complex reinforcement structures was developed for compression- and bending-loaded tube samples, using numerical simulation of the deformation behavior. Especially for bending-loaded parts, the crash energy absorption can be increased significantly by local laser strengthening.

  5. Reaction jet and aerodynamics compound control missile autopilot design based on adaptive fuzzy sliding mode control

    NASA Astrophysics Data System (ADS)

    Wu, Zhenhui; Dong, Chaoyang

    2006-11-01

    Because of the nonlinearity and strong coupling of a reaction-jet and aerodynamics compound control missile, a missile autopilot design method based on adaptive fuzzy sliding mode control (AFSMC) is proposed in this paper. The universal approximation ability of an adaptive fuzzy system is used to approximate the nonlinear function in the missile dynamics equation during flight at a high angle of attack. Because sliding mode control is strongly robust to external disturbances, the sliding mode surface of the error system is constructed to overcome the influence of approximation error and external disturbance, so that the actual overload can track the maneuvering command with high precision. Simulation results show that the missile autopilot designed in this paper not only tracks large overload commands with higher precision than the traditional method, but is also strongly robust to model uncertainty and external disturbance.
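    As a minimal, hypothetical illustration of the robustness mechanism this abstract relies on, the sketch below simulates plain sliding mode control of a first-order uncertain plant with a smooth boundary layer (tanh in place of sign to reduce chattering). It omits the adaptive fuzzy approximator of the actual AFSMC design, and all plant parameters are invented.

```python
import numpy as np

# Plant: x_dot = f(x) + u + d(t), where the controller only knows the
# estimate f_hat. Sliding surface s = e = x - x_ref; control
#   u = -f_hat(x) + x_ref_dot - k * tanh(s / phi)
# keeps |s| inside a boundary layer despite the unmodeled disturbance.

dt = 1e-3
k, phi = 2.0, 0.05
x = 0.0
errs = []
for i in range(5000):                       # 5 s of simulated time
    t = i * dt
    x_ref, x_ref_dot = np.sin(t), np.cos(t)  # maneuvering command
    f_true = -0.5 * x + 0.3 * np.sin(2 * t)  # true dynamics + disturbance
    f_hat = -0.5 * x                         # controller's model
    s = x - x_ref
    u = -f_hat + x_ref_dot - k * np.tanh(s / phi)
    x += (f_true + u) * dt                   # forward-Euler integration
    errs.append(abs(s))

print(max(errs[-1000:]))  # steady-state tracking error stays small
```

In the abstract's scheme, the fuzzy system would replace the fixed model f_hat and adapt online, with the sliding term absorbing the residual approximation error.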

  6. Design of Unstructured Adaptive (UA) NAS Parallel Benchmark Featuring Irregular, Dynamic Memory Accesses

    NASA Technical Reports Server (NTRS)

    Feng, Hui-Yu; VanderWijngaart, Rob; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We describe the design of a new method for the measurement of the performance of modern computer systems when solving scientific problems featuring irregular, dynamic memory accesses. The method involves the solution of a stylized heat transfer problem on an unstructured, adaptive grid. A Spectral Element Method (SEM) with an adaptive, nonconforming mesh is selected to discretize the transport equation. The relatively high order of the SEM lowers the fraction of wall clock time spent on inter-processor communication, which eases the load balancing task and allows us to concentrate on the memory accesses. The benchmark is designed to be three-dimensional. Parallelization and load balance issues of a reference implementation will be described in detail in future reports.

  7. An introduction to the BANNING design automation system for shuttle microelectronic hardware development

    NASA Technical Reports Server (NTRS)

    Mcgrady, W. J.

    1979-01-01

    The BANNING MOS design system is presented. It complements rather than supplants the normal design activities associated with the design and fabrication of low-power digital electronic equipment. BANNING is user-oriented and requires no programming experience to use effectively. It provides the user with a simulation capability to aid circuit design, and it eliminates most of the manual operations involved in the layout and artwork generation of integrated circuits. An example of its operation is given and some additional background reading is provided.

  8. Building Adaptive Game-Based Learning Resources: The Integration of IMS Learning Design and

    ERIC Educational Resources Information Center

    Burgos, Daniel; Moreno-Ger, Pablo; Sierra, Jose Luis; Fernandez-Manjon, Baltasar; Specht, Marcus; Koper, Rob

    2008-01-01

    IMS Learning Design (IMS-LD) is a specification to create units of learning (UoLs), which express a certain pedagogical model or strategy (e.g., adaptive learning with games). However, the authoring process of a UoL remains difficult because of the lack of high-level authoring tools for IMS-LD, even more so when the focus is on specific topics,…

  9. A varying-stage adaptive phase II/III clinical trial design.

    PubMed

    Dong, Gaohong

    2014-04-15

    Currently, adaptive phase II/III clinical trials are typically carried out with a strict two-stage design. The first stage is a learning stage called phase II, and the second stage is a confirmatory stage called phase III. Following the phase II analysis, inefficacious or harmful dose arms are dropped, and one or two promising dose arms are selected for the second stage. However, there are often situations in which researchers face a dilemma in making the 'go or no-go' decision and/or selecting the 'best' dose arm(s), as data from the first stage may not provide sufficient information for their decision making. In this case, it is challenging to follow a strict two-stage plan. Therefore, we propose a varying-stage adaptive phase II/III clinical trial design, in which we consider whether there is a need for an intermediate stage to obtain more data, so that a more informative decision can be made. Hence, the number of further investigational stages in our design is determined on the basis of the data accumulated up to the interim analysis. With respect to adaptations, we consider dropping dose arm(s), switching to another plausible endpoint as the primary study endpoint, re-estimating the sample size, and early stopping for futility. We use an adaptive combination test to perform the final analyses. By applying a closed testing procedure, we control the family-wise type I error rate at the nominal level of α in the strong sense. We delineate other essential design considerations, including the threshold parameters and the proportion of alpha allocated in the two-stage versus three-stage setting. PMID:24273128
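    The adaptive combination test mentioned in the abstract is commonly implemented as an inverse-normal combination of stage-wise p-values with prespecified weights. The sketch below shows this standard building block only, not the paper's full varying-stage procedure with closed testing.

```python
from math import sqrt
from statistics import NormalDist

# Inverse-normal combination of independent stage-wise p-values with
# prespecified weights w1, w2 satisfying w1^2 + w2^2 = 1. The combined
# test preserves the type I error level even when the stage-2 design
# (e.g., sample size) was adapted after looking at stage-1 data.

def inverse_normal_combination(p1, p2, w1=sqrt(0.5), w2=sqrt(0.5)):
    nd = NormalDist()
    z = w1 * nd.inv_cdf(1 - p1) + w2 * nd.inv_cdf(1 - p2)
    return 1 - nd.cdf(z)   # combined p-value

p = inverse_normal_combination(0.04, 0.03)
print(round(p, 4))  # combined p-value, smaller than either stage alone
```

In a multi-arm setting like the one in the abstract, this combination test would be applied to each intersection hypothesis inside a closed testing procedure to control the family-wise error rate.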

  10. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to

  11. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering.

    PubMed

    Shanechi, Maryam M; Orsborn, Amy L; Carmena, Jose M

    2016-04-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain's behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user's motor intention during CLDA-a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. 
Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter

  12. Genomic Measures to Predict Adaptation to Novel Sensorimotor Environments and Improve Personalization of Countermeasure Design

    NASA Technical Reports Server (NTRS)

    Kreutzberg, G. A.; Zanello, S.; Seidler, R. D.; Peters, B.; De Dios, Y. E.; Gadd, N. E.; Bloomberg, J. J.; Mulavara, A. P.

    2016-01-01

    Introduction. Astronauts experience sensorimotor disturbances during their initial exposure to microgravity and during the re-adaptation phase following a return to an Earth-gravitational environment. These alterations may affect crewmembers' ability to perform mission-critical functional tasks. Interestingly, astronauts have shown significant inter-subject variation in adaptive capability during gravitational transitions. The ability to predict the manner and degree to which individual astronauts would be affected would improve the efficacy of personalized countermeasure training programs designed to enhance sensorimotor adaptability. The success of such an approach depends on the development of predictive measures of sensorimotor adaptation, which would ascertain each crewmember's adaptive capacity. The goal of this study is to determine whether specific genetic polymorphisms have significant influence on sensorimotor adaptability, which can help inform the design of personalized training countermeasures. Methods. Subjects (n=15) were tested on their ability to negotiate a complex obstacle course for ten test trials while wearing up-down vision-displacing goggles. This presented a visuomotor challenge while doing a full body task. The first test trial time and the recovery rate over the ten trials were used as adaptability performance metrics. Four single nucleotide polymorphisms (SNPs) were selected for their role in neural pathways underlying sensorimotor adaptation and were identified in subjects' DNA extracted from saliva samples: catechol-O-methyl transferase (COMT, rs4680), dopamine receptor D2 (DRD2, rs1076560), brain-derived neurotrophic factor genes (BDNF, rs6265), and the DraI polymorphism of the alpha-2 adrenergic receptor. The relationship between the SNPs and test performance was assessed by assigning subjects a rank score based on their adaptability performance metrics and comparing gene expression between the top half and bottom half performers

  13. Cascade direct adaptive fuzzy control design for a nonlinear two-axis inverted-pendulum servomechanism.

    PubMed

    Wai, Rong-Jong; Kuo, Meng-An; Lee, Jeng-Dao

    2008-04-01

    This paper presents and analyzes a cascade direct adaptive fuzzy control (DAFC) scheme for a two-axis inverted-pendulum servomechanism. Because the dynamic characteristic of the two-axis inverted-pendulum servomechanism is a nonlinear unstable nonminimum-phase underactuated system, it is difficult to design a suitable control scheme that simultaneously realizes real-time stabilization and accurate tracking control, and it is not easy to directly apply conventional computed torque strategies to this underactuated system. Therefore, the cascade DAFC scheme including inner and outer control loops is investigated for the stabilizing and tracking control of a nonlinear two-axis inverted-pendulum servomechanism. The goal of the inner control loop is to design a DAFC law so that the stick angle vector can fit the stick angle command vector derived from the stick angle reference model. In the outer loop, the reference signal vector is designed via an adaptive path planner so that the cart position vector tracks the cart position command vector. Moreover, all adaptive algorithms in the cascade DAFC system are derived using the Lyapunov stability analysis, so that system stability can be guaranteed in the entire closed-loop system. Relying on this cascade structure, the stick angle and cart position tracking-error vectors will simultaneously converge to zero. Numerical simulations and experimental results are given to verify that the proposed cascade DAFC system can achieve favorable stabilizing and tracking performance and is robust with regard to system uncertainties. PMID:18348926

  14. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and the archiving of results, are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service aiming to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  15. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone, or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures, and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations, and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following in the NRU reactor inspection at Chalk River.

  16. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

    The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the coming years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. But integrating various equipment in a production plant requires unified handling of data flow and interfaces. Only agile vision systems can reconcile these conflicting demands: fast, reliable, adaptable, scalable, and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment, and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  17. Code division controlled-MAC in wireless sensor network by adaptive binary signature design

    NASA Astrophysics Data System (ADS)

    Wei, Lili; Batalama, Stella N.; Pados, Dimitris A.; Suter, Bruce

    2007-04-01

    We consider the problem of signature waveform design for code division medium-access-control (MAC) of wireless sensor networks (WSN). In contrast to conventional randomly chosen orthogonal codes, an adaptive signature design strategy is developed under the maximum pre-detection SINR (signal-to-interference-plus-noise ratio) criterion. The proposed algorithm utilizes slowest-descent cords of the optimization surface to move toward the optimum solution and exhibits, upon eigenvector decomposition, linear computational complexity with respect to the signature length. Numerical and simulation studies demonstrate the performance of the proposed method and offer comparisons with conventional signature code sets.
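    A common baseline for the maximum pre-detection SINR criterion described here is to solve the real-valued relaxation in closed form and then sign-quantize to a binary code. The sketch below illustrates only that baseline (the paper's actual algorithm is a slowest-descent search, and all numbers here are invented):

```python
import numpy as np

# Relax-and-quantize sketch: maximize the pre-detection SINR
#   SINR(s) = (s^T h)^2 / (s^T R s)
# over binary signatures s in {-1,+1}^L, where R is the interference-plus-
# noise covariance. Over real vectors this is a generalized Rayleigh
# quotient maximized by s* proportional to R^{-1} h; we quantize its sign.

rng = np.random.default_rng(1)
L = 16
h = np.ones(L)                                  # desired user's effective channel
interferers = rng.choice([-1.0, 1.0], size=(3, L))
R = 0.1 * np.eye(L) + interferers.T @ interferers / 3.0

def sinr(s):
    return (s @ h) ** 2 / (s @ R @ s)

s_star = np.sign(np.linalg.solve(R, h))          # quantized MMSE-type signature
s_rand = rng.choice([-1.0, 1.0], size=L)         # conventional random binary code

print(sinr(s_star), sinr(s_rand))
```

The real-valued optimum upper-bounds every binary candidate, so the quantization loss can be measured directly; the slowest-descent approach in the paper is a more refined way of searching the binary set near that relaxed optimum.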

  18. Aluminum reference plate, heat sink, and actuator design for an adaptive secondary mirror

    NASA Astrophysics Data System (ADS)

    del Vecchio, Ciro

    1998-09-01

    The design of an adaptive secondary mirror has to satisfy many requirements coming from different fields. The thin mirror must be actuated very precisely with a large bandwidth. The reference plate has to provide a high stability reference for the optical surfaces. The local seeing is not to be degraded by any significant thermal perturbation. In this article, the performances of a configuration with a single aluminum reference plate, that also provides the heat sink, are computed starting from the input power coming from the magnetic actuators, whose magnetic design has been revised.

  19. Object-oriented software design for the Mt. Wilson 100-inch Hooker telescope adaptive optics system

    NASA Astrophysics Data System (ADS)

    Schneider, Thomas G.

    2000-06-01

    The object-oriented software design paradigm was instrumental in the development of the Adoptics software used in the Hooker telescope's ADOPT adaptive optics system. The software runs on a Pentium-class PC host and eight DSP processors connected to the host's motherboard bus. C++ classes were created to implement most of the host software's functionality, with the object-oriented features of inheritance, encapsulation, and abstraction being the most useful. Careful class design at the inception of the project allowed for the rapid addition of features without compromising the integrity of the software. Base class implementations include the DSP system, real-time graphical displays, and opto-mechanical actuator control.
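    The inheritance/encapsulation/abstraction pattern the abstract describes can be sketched generically. The class and method names below are invented for illustration, and the actual system is written in C++; Python is used here only for brevity.

```python
from abc import ABC, abstractmethod

class Subsystem(ABC):
    """Common interface each AO subsystem exposes to the host control loop."""
    def __init__(self, name):
        self._name = name            # encapsulated state, hidden from callers
    @abstractmethod
    def update(self):
        ...

class DspChannel(Subsystem):
    def __init__(self, channel_id):
        super().__init__(f"dsp{channel_id}")
        self._frames = 0
    def update(self):
        self._frames += 1            # stand-in for polling a DSP mailbox
        return self._frames

class ActuatorControl(Subsystem):
    def update(self):
        return 0.0                   # stand-in for commanding an actuator

# The host iterates over subsystems through the abstract base class, so
# new device classes can be added without touching the main loop.
subsystems = [DspChannel(0), DspChannel(1), ActuatorControl("tip_tilt")]
print([s.update() for s in subsystems])  # → [1, 1, 0.0]
```

This is the sense in which careful base-class design allows rapid feature addition: each new device only has to implement the abstract interface.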

  20. Investigating the intrinsic cleanliness of automated handling designed for EUV mask pod-in-pod systems

    NASA Astrophysics Data System (ADS)

    Brux, O.; van der Walle, P.; van der Donck, J. C. J.; Dress, P.

    2011-11-01

    Extreme Ultraviolet Lithography (EUVL) is the most promising solution for technology nodes of 16nm (hp) and below. However, several unique EUV mask challenges must be resolved for a successful launch of the technology into the market. Uncontrolled introduction of particles and/or contamination into the EUV scanner significantly increases the risk of device yield loss and potentially of scanner down-time. With the absence of a pellicle to protect the surface of the EUV mask, a zero-particle-adder regime between final clean and the point of exposure is critical for the active areas of the mask. A Dual Pod concept for handling EUV masks has been proposed by the industry as a means to minimize the risk of mask contamination during transport and storage. SuSS-HamaTech introduces MaskTrackPro InSync as a fully automated solution for the handling of EUV masks in and out of this Dual Pod System, which therefore constitutes an interface between various tools inside the fab. The intrinsic cleanliness of each individual handling and storage step of the inner shell (EIP) of this Dual Pod and the EUV mask inside the InSync tool has been investigated to confirm the capability for minimizing the risk of cross-contamination. An Entegris Dual Pod EUV-1000A-A110 was used for the qualification. Particle detection for the qualification procedure was performed with TNO's RapidNano Particle Scanner, qualified for particle sizes down to 50nm (PSL equivalent). It has been shown that the target specification of < 2 particles @ 60nm per 25 cycles has been achieved. In cases where added particles were measured, the EIP has been identified as a potential root cause for Ni particle generation. Any direct Ni-Al contact has to be avoided to mitigate the risk of material abrasion.