Science.gov

Sample records for adaptive automation design

  1. Team-Centered Perspective for Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2003-01-01

    Automation represents a very active area of human factors research. The journal Human Factors published a special issue on automation in 1985. Since then, hundreds of scientific studies have been published examining the nature of automation and its interaction with human performance. However, despite a dramatic increase in research investigating human factors issues in aviation automation, there remain areas that need further exploration. This NASA Technical Memorandum describes a new area of automation design and research, called adaptive automation. It discusses the concepts and outlines the human factors issues associated with the new method of adaptive function allocation. The primary focus is on human-centered design, and specifically on ensuring that adaptive automation is designed from a team-centered perspective. The document shows that adaptive automation has many human factors issues in common with traditional automation design. Much like the introduction of other new technologies and paradigm shifts, adaptive automation presents an opportunity to remediate current problems but poses new ones for human-automation interaction in aerospace operations. The review here is intended to communicate the philosophical perspective and direction of adaptive automation research conducted under the Aerospace Operations Systems (AOS), Physiological and Psychological Stressors and Factors (PPSF) project.

  2. Adaptive and Adaptable Automation Design: A Critical Review of the Literature and Recommendations for Future Research

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kaber, David B.

    2006-01-01

    This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload and situation awareness, the effectiveness of various automation invocation philosophies and strategies to function allocation in adaptive systems, and the role of user modeling in adaptive interface design and the performance implications of adaptive interface technology.

  3. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    PubMed

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  5. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles

    PubMed Central

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  6. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems.

  7. Designing Automated Adaptive Support to Improve Student Helping Behaviors in a Peer Tutoring Activity

    ERIC Educational Resources Information Center

    Walker, Erin; Rummel, Nikol; Koedinger, Kenneth R.

    2011-01-01

    Adaptive collaborative learning support systems analyze student collaboration as it occurs and provide targeted assistance to the collaborators. Too little is known about how to design adaptive support to have a positive effect on interaction and learning. We investigated this problem in a reciprocal peer tutoring scenario, where two students take…

  8. Three Experiments Examining the Use of Electroencephalogram,Event-Related Potentials, and Heart-Rate Variability for Real-Time Human-Centered Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    Adaptive automation represents an advanced form of human-centered automation design. The approach provides for real-time and model-based assessments of human-automation interaction, determines whether the human has entered into a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not matured, numerous challenges remain, including the criteria for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.

  9. Effects of adaptive task allocation on monitoring of automated systems

    NASA Technical Reports Server (NTRS)

    Parasuraman, R.; Mouloua, M.; Molloy, R.

    1996-01-01

    The effects of adaptive task allocation on monitoring for automation failure during multitask flight simulation were examined. Participants monitored an automated engine status task while simultaneously performing tracking and fuel management tasks over three 30-min sessions. Two methods of adaptive task allocation, both involving temporary return of the automated engine status task to the human operator ("human control"), were examined as a possible countermeasure to monitoring inefficiency. For the model-based adaptive group, the engine status task was allocated to all participants in the middle of the second session for 10 min, following which it was again returned to automation control. The same occurred for the performance-based adaptive group, but only if an individual participant's monitoring performance up to that point did not meet a specified criterion. For the nonadaptive control groups, the engine status task remained automated throughout the experiment. All groups had low probabilities of detection of automation failures for the first 40 min spent with automation. However, following the 10-min intervening period of human control, both adaptive groups detected significantly more automation failures during the subsequent blocks under automation control. The results show that adaptive task allocation can enhance monitoring of automated systems. Both model-based and performance-based allocation improved monitoring of automation. Implications for the design of automated systems are discussed.
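
    The two invocation strategies above can be sketched as a small allocation policy; the block number, criterion, and detection rates below are illustrative placeholders, not the study's parameters.

```python
# Hedged sketch of model-based vs. performance-based adaptive task
# allocation. Threshold and block structure are invented for illustration.

def allocate(policy, block, detection_rate, criterion=0.6,
             handover_block=4):
    """Return 'manual' or 'auto' for the monitored engine-status task.

    model-based: every operator gets the mid-session manual block.
    performance-based: manual only if detection so far misses criterion.
    """
    if block == handover_block:
        if policy == "model":
            return "manual"
        if policy == "performance" and detection_rate < criterion:
            return "manual"
    return "auto"

print(allocate("model", 4, 0.9))        # manual for everyone
print(allocate("performance", 4, 0.9))  # met criterion: stays automated
print(allocate("performance", 4, 0.3))  # poor monitoring: handed back
```

    The nonadaptive control condition corresponds to never reaching the manual branch at all.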

  10. Compact reactor design automation

    NASA Technical Reports Server (NTRS)

    Nassersharif, Bahram; Gaeta, Michael J.

    1991-01-01

    A conceptual compact reactor design automation experiment was performed using the real-time expert system G2. The purpose of this experiment was to investigate the utility of an expert system in design; in particular, reactor design. The experiment consisted of the automation and integration of two design phases: reactor neutronic design and fuel pin design. The utility of this approach is shown using simple examples of formulating rules to ensure design parameter consistency between the two design phases. G2's ability to communicate with external programs, even across networks, allows its knowledge-processing features to be supplemented with conventional canned programs, with possible applications to realistic iterative design tools.
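
    The rule-based consistency checking described above might be sketched as follows; the parameter names and limits are invented for illustration and are not the experiment's actual rules.

```python
# Hedged sketch: consistency rules between two design phases, in the
# spirit of the G2 experiment. Parameters and limits are illustrative.

RULES = [
    ("pin diameter fits lattice pitch",
     lambda d: d["fuel"]["pin_diameter_cm"] < d["core"]["pitch_cm"]),
    ("enrichment consistent across phases",
     lambda d: abs(d["core"]["enrichment_pct"]
                   - d["fuel"]["enrichment_pct"]) < 1e-6),
]

def check_consistency(design):
    """Return the names of all violated rules."""
    return [name for name, rule in RULES if not rule(design)]

design = {
    "core": {"pitch_cm": 1.26, "enrichment_pct": 4.5},
    "fuel": {"pin_diameter_cm": 0.95, "enrichment_pct": 4.2},
}
print(check_consistency(design))
# -> ['enrichment consistent across phases']
```

    In an iterative design loop, any returned rule name would send the offending parameter back to the appropriate design phase.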

  11. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.
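
    A toy version of evolutionary circuit search, simplified to mutation-only (1+1) search over a small two-qubit gate set; the paper's genetic-programming operators and gate vocabulary differ. Fitness is the fidelity between the circuit's output on |00> and a target Bell state.

```python
# Hedged sketch of evolutionary search over quantum gate sequences.
# Simplified to mutation-only hill climbing; not the paper's algorithm.
import random
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])

# Gate vocabulary on 2 qubits; each entry is a 4x4 unitary.
GATES = {
    "H0": np.kron(H, I2), "H1": np.kron(I2, H),
    "X0": np.kron(X, I2), "X1": np.kron(I2, X),
    "CNOT": CNOT,
}

def run(circuit, state):
    for name in circuit:          # gates applied left to right
        state = GATES[name] @ state
    return state

def fitness(circuit, target):
    out = run(circuit, np.array([1, 0, 0, 0], dtype=complex))
    return abs(np.vdot(target, out)) ** 2   # state fidelity

def evolve(target, length=3, gens=200, seed=0):
    rng = random.Random(seed)
    best = [rng.choice(list(GATES)) for _ in range(length)]
    for _ in range(gens):
        child = list(best)
        child[rng.randrange(length)] = rng.choice(list(GATES))
        if fitness(child, target) >= fitness(best, target):
            best = child          # accept improvements and neutral moves
    return best

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
best = evolve(bell)
print(best, round(fitness(best, bell), 3))
```

    The hand-crafted solution H on qubit 0 followed by CNOT reaches fidelity 1; the search only has to rediscover it (or an equivalent) inside a space of 5^3 candidate circuits, far fewer than naive enumeration over longer sequences.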

  12. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode that had a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
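
    The closed loop described above can be sketched roughly as follows. The index beta/(alpha+theta) follows Pope and colleagues' NASA engagement-index work cited by this line of research; the thresholds and the mapping to the three task modes are illustrative.

```python
# Hedged sketch of EEG-driven task allocation. Band powers would come
# from a real-time spectral pipeline; thresholds below are invented.

def engagement_index(beta, alpha, theta):
    return beta / (alpha + theta)

def next_mode(index, low=0.4, high=0.7):
    """Negative-feedback allocation: a falling index (disengagement)
    hands the task back to the operator; a high index allows automation."""
    if index < low:
        return "manual"           # re-engage the operator
    if index > high:
        return "automatic"        # operator engaged; offload the task
    return "adaptive_aiding"

print(next_mode(engagement_index(2.0, 8.0, 4.0)))  # 2/12 ~ 0.17 -> manual
print(next_mode(engagement_index(9.0, 6.0, 4.0)))  # 0.9 -> automatic
```

    The study's self-regulation training can be read as teaching operators to move this index deliberately rather than being passively cycled by it.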

  13. Adaptive function allocation reduces performance costs of static automation

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian

    1993-01-01

    Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.

  14. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization.
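
    For context, a Cello user describes the desired logic in Verilog, and the compiled circuits are built from repressor-based NOT/NOR gates. A minimal illustration of scoring such a netlist on every output state (the netlist below is a textbook five-NOR XOR, not a Cello output):

```python
# Hedged sketch: evaluate a NOR-only netlist over all input states,
# mirroring the per-output-state scoring reported above.
from itertools import product

def nor(a, b):
    return int(not (a or b))

def xor_from_nors(a, b):
    """Textbook five-NOR construction of XOR."""
    n1 = nor(a, b)
    n2 = nor(a, n1)
    n3 = nor(b, n1)
    xnor = nor(n2, n3)
    return nor(xnor, xnor)   # final NOR-as-NOT inverts XNOR to XOR

for a, b in product((0, 1), repeat=2):
    print(a, b, "->", xor_from_nors(a, b))
# 0 0 -> 0
# 0 1 -> 1
# 1 0 -> 1
# 1 1 -> 0
```

    In Cello's setting each logical NOR is realized by a genetic gate, and a circuit "performs correctly in every output state" when the measured expression matches this truth table at each row.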

  16. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  17. Design automation for integrated circuits

    NASA Astrophysics Data System (ADS)

    Newell, S. B.; de Geus, A. J.; Rohrer, R. A.

    1983-04-01

    Consideration is given to the development status of the use of computers in automated integrated circuit design methods, which promise the minimization of both design time and design error incidence. Integrated circuit design encompasses two major tasks: error specification, in which the goal is a logic diagram that accurately represents the desired electronic function, and physical specification, in which the goal is an exact description of the physical locations of all circuit elements and their interconnections on the chip. Design automation not only saves money by reducing design and fabrication time, but also helps the community of systems and logic designers to work more innovatively. Attention is given to established design automation methodologies, programmable logic arrays, and design shortcuts.

  18. Automated Core Design

    SciTech Connect

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2005-07-15

    Multistate searching methods are a subfield of distributed artificial intelligence that aims to provide both principles for construction of complex systems involving multiple states and mechanisms for coordination of independent agents' actions. This paper proposes a multistate searching algorithm with reinforcement learning for the automatic core design of a boiling water reactor. The characteristics of this algorithm are that the coupling structure and the coupling operation suitable for the assigned problem are assumed and an optimal solution is obtained by mutual interference in multistate transitions using multiagents. Calculations in an actual plant confirmed that the proposed algorithm increased the convergence ability of the optimization process.

  19. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would provide other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs may be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in its early stages was that of cost. It took a significant investment in software and hardware for the prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to someone going for radiograph or ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard. 

  20. Automation design and crew coordination

    NASA Technical Reports Server (NTRS)

    Segal, Leon D.

    1993-01-01

    Advances in technology have greatly impacted the appearance of the modern aircraft cockpit, where rows upon rows of instruments once dominated. The introduction of automation has greatly altered the demands on the pilots and the dynamics of aircrew task performance. While engineers and designers continue to implement the latest technological innovations in the cockpit - claiming higher reliability and decreased workload - a large percentage of aircraft accidents are still attributed to human error. Rather than being the main instigators of accidents, operators tend to be the inheritors of system defects created by poor design, incorrect installation, faulty maintenance and bad management decisions. This paper looks at some of the variables that need to be considered if we are to eliminate at least one of these inheritances - poor design. Specifically, this paper describes the first part of a comprehensive study aimed at identifying the effects of automation on crew coordination.

  1. Adaptation as organism design

    PubMed Central

    Gardner, Andy

    2009-01-01

    The problem of adaptation is to explain the apparent design of organisms. Darwin solved this problem with the theory of natural selection. However, population geneticists, whose responsibility it is to formalize evolutionary theory, have long neglected the link between natural selection and organismal design. Here, I review the major historical developments in theory of organismal adaptation, clarifying what adaptation is and what it is not, and I point out future avenues for research. PMID:19793739

  2. System reliability, performance and trust in adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability.

  3. Automated design of flexible linkers.

    PubMed

    Manion, Charles; Arlitt, Ryan; Campbell, Matthew I; Tumer, Irem; Stone, Rob; Greaney, P Alex

    2016-03-14

    This paper presents a method for the systematic and automated design of flexible organic linkers for construction of metal-organic frameworks (MOFs) in which flexibility, compliance, or other mechanically exotic properties originate at the linker level rather than from the framework kinematics. Our method couples a graph grammar method for systematically generating linker-like molecules with molecular dynamics modeling of the linkers' mechanical response. Using this approach we have generated a candidate pool of >59,000 hypothetical linkers. We screen linker candidates according to their mechanical behaviors under large deformation, and extract fragments common to the most performant candidate materials. To demonstrate the general approach to MOF design we apply our system to designing linkers for pressure-switching MOFs: MOFs that undergo reversible structural collapse once a stress threshold is exceeded. PMID:26687337

  4. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  5. Automation of Design Engineering Processes

    NASA Technical Reports Server (NTRS)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that all the written records as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
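
    A minimal sketch of the single-database idea above, using SQLite rather than Microsoft Access; the schema, tracking numbers, and records are invented for illustration.

```python
# Hedged sketch: one relational table relates every design document to
# its tracking number, so a sustaining engineer can retrace a decision.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE design_record (
    tracking_no TEXT,
    doc_type    TEXT,   -- drawing, memo, change request, test report...
    title       TEXT
)""")
records = [
    ("TRK-0042", "memo", "Minutes: thermal margin review"),
    ("TRK-0042", "change_request", "Relax fin spacing tolerance"),
    ("TRK-0042", "test_report", "Vibration qualification results"),
    ("TRK-0099", "drawing", "Bracket rev B"),
]
db.executemany("INSERT INTO design_record VALUES (?, ?, ?)", records)

# Pull every document behind one design decision via its tracking number.
rows = db.execute(
    "SELECT doc_type, title FROM design_record "
    "WHERE tracking_no = ? ORDER BY doc_type", ("TRK-0042",)).fetchall()
for doc_type, title in rows:
    print(doc_type, "|", title)
```

    The key design choice mirrored here is the single shared key (the tracking number) that lets minutes, change requests, and test reports all be retrieved together from one archive.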

  6. Assessing Working Memory in Spanish-Speaking Children: Automated Working Memory Assessment Battery Adaptation

    ERIC Educational Resources Information Center

    Injoque-Ricle, Irene; Calero, Alejandra D.; Alloway, Tracy P.; Burin, Debora I.

    2011-01-01

    The Automated Working Memory Assessment battery was designed to assess verbal and visuospatial passive and active working memory processing in children and adolescents. The aim of this paper is to present the adaptation and validation of the AWMA battery to Argentinean Spanish-speaking children aged 6 to 11 years. Verbal subtests were adapted and…

  7. Rebound: A Framework for Automated Component Adaptation

    NASA Technical Reports Server (NTRS)

    Penix, John; Lau, Sonie (Technical Monitor)

    1998-01-01

    The REBOUND adaptation framework organizes a collection of adaptation tactics in a way that they can be selected based on the components available for adaptation. Adaptation tactics are specified formally in terms of the relationship between the component to be adapted and the resulting adapted component. The tactic specifications are used as matching conditions for specification-based component retrieval, creating a 'retrieval for adaptation' scenario. The results of specification matching are used to guide component adaptation. Several examples illustrate how the framework guides component and tactic selection and how basic tactics are composed to form more powerful tactics.

  8. Adaptive Sampling Designs.

    ERIC Educational Resources Information Center

    Flournoy, Nancy

    Designs for sequential sampling procedures that adapt to cumulative information are discussed. A familiar illustration is the play-the-winner rule in which there are two treatments; after a random start, the same treatment is continued as long as each successive subject registers a success. When a failure occurs, the other treatment is used until…
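
The play-the-winner rule lends itself to a few lines of simulation; the success probabilities below are made up for illustration:

```python
import random

def play_the_winner(success_prob, n_subjects, seed=0):
    """Simulate the play-the-winner rule for two treatments A and B.

    After a random start, the same treatment is repeated as long as it
    succeeds; on a failure, allocation switches to the other treatment.
    """
    rng = random.Random(seed)
    current = rng.choice(["A", "B"])          # random start
    assignments = []
    for _ in range(n_subjects):
        assignments.append(current)
        success = rng.random() < success_prob[current]
        if not success:                        # failure: switch treatment
            current = "B" if current == "A" else "A"
    return assignments

# The design adapts to cumulative information: the better treatment
# accumulates longer runs, hence more subjects overall.
alloc = play_the_winner({"A": 0.8, "B": 0.3}, n_subjects=1000)
```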

  9. Explicit control of adaptive automation under different levels of environmental stress.

    PubMed

    Sauer, Jürgen; Kao, Chung-Shan; Wastell, David; Nickel, Peter

    2011-08-01

    This article examines the effectiveness of three different forms of explicit control of adaptive automation under low- and high-stress conditions, operationalised by different levels of noise. In total, 60 participants were assigned to one of three types of automation design (free, prompted and forced choice). They were trained for 4 h on a highly automated simulation of a process control environment, called AutoCAMS. This was followed by a 4-h testing session under noise exposure and quiet conditions. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that all three forms of explicit control of adaptive automation were able to attenuate the negative effects of noise. This was partly due to the fact that operators opted for higher levels of automation under noise. It also emerged that forced choice showed marginal advantages over the two other automation modes. Statement of Relevance: This work is relevant to the design of adaptive automation since it emphasises the need to consider the impact of work-related stressors during task completion. During the presence of stressors, different forms of operator support through automation may be required than under more favourable working conditions. PMID:21846313

  11. Trust in automation: designing for appropriate reliance.

    PubMed

    Lee, John D; See, Katrina A

    2004-01-01

    Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

  12. Computer automation for feedback system design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Mathematical techniques and explanations of various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmission and the translation of time-domain to frequency-domain specifications.

  13. Tools for Automating Instructional Design. ERIC Digest.

    ERIC Educational Resources Information Center

    Kasowitz, Abby

    The instructional design process encompasses a set of interdependent phases including analysis of learners, contexts and goals; design of objectives, strategies and assessment tools; production of instructional materials; and evaluation of learner performance and overall instructional design effort. Automated instructional design (AID) tools…

  14. Automated Hardware Design via Evolutionary Search

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.

    2000-01-01

    The goal of this research is to investigate the application of evolutionary search to the process of automated engineering design. Evolutionary search techniques involve the simulation of Darwinian mechanisms by computer algorithms. In recent years, such techniques have attracted much attention because they are able to tackle a wide variety of difficult problems and frequently produce acceptable solutions. The results obtained are usually functional, often surprising, and typically "messy" because the algorithms are told to concentrate on the overriding objective and not elegance or simplicity. Automated design techniques offer several advantages. First, faster design cycles translate into time and, hence, cost savings. Second, automated design techniques can be made to scale well and hence better deal with increasing amounts of design complexity. Third, design quality can increase because design properties can be specified a priori. For example, the automated design technique might optimize a device against size and weight specifications, yielding a design smaller and lighter than the best known design. The domain of electronic circuit design is an advantageous platform in which to study automated design techniques because it is a rich design space that is well understood, permitting human-created designs to be compared to machine-generated designs. The evolutionary system developed for circuit design was intended to automatically produce high-level integrated electronic circuit designs whose properties permit physical implementation in silicon. This process entailed designing an effective evolutionary algorithm and solving a difficult multiobjective optimization problem. FY 99 saw many accomplishments in this effort.
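
The record above names the technique without detailing it, so the following is a generic evolutionary-search sketch on a toy bit-string objective, not NASA's circuit-design system: tournament selection, one-point crossover, and bit-flip mutation drive a population toward the overriding objective:

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60, seed=1):
    """Minimal evolutionary search over fixed-length bit strings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                       # tournament selection of size 2
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(genome_len):             # bit-flip mutation
                if rng.random() < 1.0 / genome_len:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy objective ("OneMax"): maximize the number of 1 bits.
best = evolve(sum)
```

Nothing in the loop rewards elegance or simplicity, only the fitness score, which is why evolved solutions tend to be functional but "messy".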

  15. INITIATORS AND TRIGGERING CONDITIONS FOR ADAPTIVE AUTOMATION IN ADVANCED SMALL MODULAR REACTORS

    SciTech Connect

    Katya L. Le Blanc; Johanna H. Oxstrand

    2014-04-01

    It is anticipated that Advanced Small Modular Reactors (AdvSMRs) will employ high degrees of automation. High levels of automation can enhance system performance, but often at the cost of reduced human performance. Automation can lead to human out-of-the-loop issues, unbalanced workload, complacency, and other problems if it is not designed properly. Researchers have proposed adaptive automation (defined as dynamic or flexible allocation of functions) as a way to get the benefits of higher levels of automation without the human performance costs. Adaptive automation has the potential to balance operator workload and enhance operator situation awareness by allocating functions to the operators in a way that is sensitive to overall workload and capabilities at the time of operation. However, there are still a number of questions regarding how to effectively design adaptive automation to achieve that potential. One of those questions is related to how to initiate (or trigger) a shift in automation in order to provide maximal sensitivity to operator needs without introducing undesirable consequences (such as unpredictable mode changes). Several triggering mechanisms for shifts in adaptive automation have been proposed, including: operator initiated, critical events, performance-based, physiological measurement, model-based, and hybrid methods. As part of a larger project to develop design guidance for human-automation collaboration in AdvSMRs, researchers at Idaho National Laboratory have investigated the effectiveness and applicability of each of these triggering mechanisms in the context of AdvSMR. Researchers reviewed the empirical literature on adaptive automation and assessed each triggering mechanism based on the human-system performance consequences of employing that mechanism. Researchers also assessed the practicality and feasibility of using the mechanism in the context of an AdvSMR control room. Results indicate that there are tradeoffs associated with each

  16. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  17. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 adds significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.

  18. Adaptive automation of human-machine system information-processing functions.

    PubMed

    Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P

    2005-01-01

    The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.
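
The allocation logic studied in work like this can be sketched as a simple workload-triggered loop. The thresholds and the hysteresis rule below are illustrative assumptions, not the paper's actual secondary-task measure:

```python
def allocate(workload_history, on_threshold=0.7, off_threshold=0.4):
    """Toy workload-triggered adaptive allocation with hysteresis:
    engage automation when a (secondary-task-derived) workload index
    exceeds on_threshold; return the function to the operator only when
    it drops below off_threshold, avoiding rapid mode oscillation."""
    automated = False
    modes = []
    for w in workload_history:
        if not automated and w > on_threshold:
            automated = True
        elif automated and w < off_threshold:
            automated = False
        modes.append("automation" if automated else "manual")
    return modes

modes = allocate([0.3, 0.5, 0.8, 0.75, 0.6, 0.35, 0.2])
# → ['manual', 'manual', 'automation', 'automation', 'automation',
#    'manual', 'manual']
```

The two-threshold hysteresis is one simple guard against the unpredictable mode changes that adaptive-automation designers try to avoid.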

  19. Design considerations for automated packaging operations

    SciTech Connect

    Fahrenholtz, J.; Jones, J.; Kincy, M.

    1993-12-31

    The paper is based on work performed at Sandia National Laboratories to automate DOE packaging operations. It is a general summary of work from several projects which may be applicable to other packaging operations. Examples are provided of robotic operations which have been demonstrated as well as operations that are currently being developed. General design considerations for packages and for automated handling systems are described.

  20. Automated database design technology and tools

    NASA Technical Reports Server (NTRS)

    Shen, Stewart N. T.

    1988-01-01

    The Automated Database Design Technology and Tools research project results are summarized in this final report. Comments on the state of the art in various aspects of database design are provided, and recommendations made for further research for SNAP and NAVMASSO future database applications.

  1. An Automated Approach to Instructional Design Guidance.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    This paper describes the Guided Approach to Instructional Design Advising (GAIDA), an automated instructional design tool that incorporates techniques of artificial intelligence. GAIDA was developed by the U.S. Air Force Armstrong Laboratory to facilitate the planning and production of interactive courseware and computer-based training materials.…

  2. Automating Relational Database Design for Microcomputer Users.

    ERIC Educational Resources Information Center

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  3. Automated adaptive inference of phenomenological dynamical models

    PubMed Central

    Daniels, Bryan C.; Nemenman, Ilya

    2015-01-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved. PMID:26293508

  4. Automated solar collector installation design

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-08-26

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, and to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties of these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives.

  5. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  6. Automated Tract Extraction via Atlas Based Adaptive Clustering

    PubMed Central

    Tunç, Birkan; Parker, William A.; Ingalhalikar, Madhura; Verma, Ragini

    2014-01-01

    Advancements in imaging protocols such as the high angular resolution diffusion-weighted imaging (HARDI) and in tractography techniques are expected to cause an increase in the tract-based analyses. Statistical analyses over white matter tracts can contribute greatly towards understanding structural mechanisms of the brain since tracts are representative of the connectivity pathways. The main challenge with tract-based studies is the extraction of the tracts of interest in a consistent and comparable manner over a large group of individuals without drawing the inclusion and exclusion regions of interest. In this work, we design a framework for automated extraction of white matter tracts. The framework introduces three main components, namely a connectivity based fiber representation, a fiber clustering atlas, and a clustering approach called Adaptive Clustering. The fiber representation relies on the connectivity signatures of fibers to establish an easy correspondence between different subjects. A group-wise clustering of these fibers that are represented by the connectivity signatures is then used to generate a fiber bundle atlas. Finally, Adaptive Clustering incorporates the previously generated clustering atlas as a prior, to cluster the fibers of a new subject automatically. Experiments on the HARDI scans of healthy individuals acquired repeatedly, demonstrate the applicability, the reliability and the repeatability of our approach in extracting white matter tracts. By alleviating the seed region selection or the inclusion/exclusion ROI drawing requirements that are usually handled by trained radiologists, the proposed framework expands the range of possible clinical applications and establishes the ability to perform tract-based analyses with large samples. PMID:25134977
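
A heavily simplified sketch of the atlas-as-prior idea (the distance metric, threshold, and data are all illustrative, not the paper's method): fibers represented by connectivity signatures are assigned to the nearest atlas cluster, and fibers far from every atlas cluster adaptively seed new ones:

```python
import math

def adaptive_cluster(signatures, atlas, new_cluster_threshold=1.0):
    """Assign each fiber's connectivity signature to the nearest atlas
    cluster centroid; signatures far from every centroid start a new
    cluster, so the clustering adapts beyond the prior."""
    centroids = [list(c) for c in atlas]
    labels = []
    for sig in signatures:
        dists = [math.dist(sig, c) for c in centroids]
        best = min(range(len(centroids)), key=lambda i: dists[i])
        if dists[best] > new_cluster_threshold:
            centroids.append(list(sig))   # adaptive: seed a new cluster
            best = len(centroids) - 1
        labels.append(best)
    return labels, centroids

# Two fibers near the single atlas cluster, one far away:
labels, cents = adaptive_cluster(
    [(0.1, 0.0), (0.0, 0.1), (5.0, 5.0)],
    atlas=[(0.0, 0.0)])
```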

  7. Automated mixed traffic vehicle design AMTV 2

    NASA Technical Reports Server (NTRS)

    Johnston, A. R.; Marks, R. A.; Cassell, P. L.

    1982-01-01

    The design of an improved and enclosed Automated Mixed Traffic Transit (AMTT) vehicle is described. AMTT is an innovative concept for low-speed tram-type transit in which suitable vehicles are equipped with sensors and controls to permit them to operate in an automated mode on existing road or walkway surfaces. The vehicle chassis and body design are presented in terms of sketches and photographs. The functional design of the sensing and control system is presented, and modifications which could be made to the baseline design for improved performance, in particular to incorporate a 20-mph capability, are also discussed. The vehicle system is described at the block-diagram-level of detail. Specifications and parameter values are given where available.

  8. Automated lower limb prosthesis design

    NASA Astrophysics Data System (ADS)

    Bhatia, Gulab H.; Commean, Paul K.; Smith, Kirk E.; Vannier, Michael W.

    1994-09-01

    The design of lower limb prostheses requires definitive geometric data to customize socket shape. Optical surface imaging and spiral x-ray computed tomography were applied to geometric analysis of limb residua in below knee (BK) amputees. Residua (limb remnants after amputation) of BK amputees were digitized and measured. Surface (optical) and volumetric (CT) data of the residuum were used to generate solid models and specify socket shape in (SDRC I-DEAS) CAD software. Volume measurements on the solid models were found to correspond within 2% of surface models and direct determinations made using Archimedean weighing. Anatomic 3D reconstruction of the residuum by optical surface and spiral x-ray computed tomography imaging are feasible modalities for prosthesis design.

  9. Generative Representations for Automated Design of Robots

    NASA Technical Reports Server (NTRS)

    Homby, Gregory S.; Lipson, Hod; Pollack, Jordan B.

    2007-01-01

    A method of automated design of complex, modular robots involves an evolutionary process in which generative representations of designs are used. The term generative representations as used here signifies, loosely, representations that consist of or include algorithms, computer programs, and the like, wherein encoded designs can reuse elements of their encoding and thereby evolve toward greater complexity. Automated design of robots through synthetic evolutionary processes has already been demonstrated, but it is not clear whether genetically inspired search algorithms can yield designs that are sufficiently complex for practical engineering. The ultimate success of such algorithms as tools for automation of design depends on the scaling properties of representations of designs. A nongenerative representation (one in which each element of the encoded design is used at most once in translating to the design) scales linearly with the number of elements. Search algorithms that use nongenerative representations quickly become intractable (search times vary approximately exponentially with numbers of design elements), and thus are not amenable to scaling to complex designs. Generative representations are compact representations and were devised as means to circumvent the above-mentioned fundamental restriction on scalability. In the present method, a robot is defined by a compact programmatic form (its generative representation) and the evolutionary variation takes place on this form. The evolutionary process is an iterative one, wherein each cycle consists of the following steps: 1. Generative representations are generated in an evolutionary subprocess. 2. Each generative representation is a program that, when compiled, produces an assembly procedure. 3. In a computational simulation, a constructor executes an assembly procedure to generate a robot. 4. A physical-simulation program tests the performance of a simulated constructed robot, evaluating the performance
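
The scaling argument can be illustrated with a classic generative encoding, an L-system, used here as an analogy rather than the authors' robot representation: a few rewrite rules that reuse their own symbols expand into a design far longer than the encoding itself:

```python
def expand(rules, axiom, depth):
    """Expand a generative (L-system-like) encoding: each symbol that
    names a rule is rewritten by that rule's body, so encoded elements
    are reused many times in the final design."""
    s = axiom
    for _ in range(depth):
        s = "".join(rules.get(c, c) for c in s)
    return s

# A 2-rule encoding (3 body symbols plus a 1-symbol axiom) yields a
# design of 144 symbols after 10 expansions: the design grows
# exponentially while the encoding stays fixed.
rules = {"A": "AB", "B": "A"}
design = expand(rules, "A", depth=10)
```

By contrast, a nongenerative representation would need one encoded element per design element, which is the linear scaling the abstract argues against.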

  10. Psychophysiological Control of a Cognitive Task Using Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Freeman, Frederick; Pope, Alan T. (Technical Monitor)

    2001-01-01

    The major focus of the present proposal was to examine psychophysiological variables related to hazardous states of awareness induced by monitoring automated systems. With the increased use of automation in today's work environment, people's roles in the workplace are being redefined from that of active participant to one of passive monitor. Although the introduction of automated systems has a number of benefits, there are also a number of disadvantages regarding worker performance. Byrne and Parasuraman have argued for the use of psychophysiological measures in the development and the implementation of adaptive automation. While both performance-based and model-based adaptive automation have been studied, the use of psychophysiological measures, especially EEG, offers the advantage of real-time evaluation of the state of the subject. The current study used the closed-loop system, developed at NASA Langley Research Center, to control the state of awareness of subjects while they performed a cognitive vigilance task. Previous research in our laboratory, supported by NASA, has demonstrated that, in an adaptive automation closed-loop environment, subjects perform a tracking task better under a negative than under a positive feedback condition. In addition, this condition produces less subjective workload and larger P300 event-related potentials to auditory stimuli presented in a concurrent oddball task. We have also recently shown that the closed-loop system used to control the level of automation in a tracking task can also be used to control the event rate of stimuli in a vigilance monitoring task. By changing the event rate based on the subject's index of arousal, we have been able to produce improved monitoring, relative to various control groups. We have demonstrated in our initial closed-loop experiments with the vigilance paradigm that using a negative feedback contingency (i.e. increasing event rates when the EEG index is low and decreasing event rates when
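
The negative-feedback contingency described at the end of the abstract can be sketched as a one-line control law; the set point, gain, and rate limits are illustrative assumptions, not the NASA Langley implementation:

```python
def update_event_rate(rate, eeg_index, setpoint=0.5, gain=4.0,
                      min_rate=2.0, max_rate=30.0):
    """One step of a toy negative-feedback closed loop: when the
    EEG-derived engagement index is low, the stimulus event rate is
    raised; when the index is high, the rate is lowered."""
    rate += gain * (setpoint - eeg_index)   # low index -> raise rate
    return max(min_rate, min(max_rate, rate))

r1 = update_event_rate(10.0, eeg_index=0.2)   # low arousal: rate rises
r2 = update_event_rate(10.0, eeg_index=0.8)   # high arousal: rate falls
```

A positive-feedback contingency would simply flip the sign of the gain, which is the comparison condition the studies above evaluate.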

  11. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  12. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  13. j5 DNA assembly design automation software.

    PubMed

    Hillson, Nathan J; Rosengarten, Rafael D; Keasling, Jay D

    2012-01-20

    Recent advances in Synthetic Biology have yielded standardized and automatable DNA assembly protocols that enable a broad range of biotechnological research and development. Unfortunately, the experimental design required for modern scar-less multipart DNA assembly methods is frequently laborious, time-consuming, and error-prone. Here, we report the development and deployment of a web-based software tool, j5, which automates the design of scar-less multipart DNA assembly protocols including SLIC, Gibson, CPEC, and Golden Gate. The key innovations of the j5 design process include cost optimization, leveraging DNA synthesis when cost-effective to do so, the enforcement of design specification rules, hierarchical assembly strategies to mitigate likely assembly errors, and the instruction of manual or automated construction of scar-less combinatorial DNA libraries. Using a GFP expression testbed, we demonstrate that j5 designs can be executed with the SLIC, Gibson, or CPEC assembly methods, used to build combinatorial libraries with the Golden Gate assembly method, and applied to the preparation of linear gene deletion cassettes for E. coli. The DNA assembly design algorithms reported here are generally applicable to broad classes of DNA construction methodologies and could be implemented to supplement other DNA assembly design tools. Taken together, these innovations save researchers time and effort, reduce the frequency of user design errors and off-target assembly products, decrease research costs, and enable scar-less multipart and combinatorial DNA construction at scales unfeasible without computer-aided design.
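
The cost-benefit step can be caricatured in a few lines; all prices below are invented for illustration and are not j5's actual cost model:

```python
def cheaper_route(fragment_len_bp, synthesis_cost_per_bp=0.09,
                  assembly_fixed_cost=15.0):
    """Toy per-fragment decision in the spirit of a cost-benefit
    analysis: outsource a fragment to a DNA synthesis provider when
    direct synthesis undercuts the fixed cost of assembling it from
    existing template DNA (both prices are illustrative assumptions)."""
    synthesis = fragment_len_bp * synthesis_cost_per_bp
    return "synthesize" if synthesis < assembly_fixed_cost else "assemble"

route_short = cheaper_route(100)   # short fragment: synthesis is cheaper
route_long = cheaper_route(500)    # long fragment: assembly is cheaper
```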

  14. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  15. Design of the hybrid automated reliability predictor

    NASA Technical Reports Server (NTRS)

    Geist, R.; Trivedi, K.; Dugan, J. B.; Smotherman, M.

    1983-01-01

    The design of the Hybrid Automated Reliability Predictor (HARP), now under development at Duke University, is presented. The HARP approach to reliability prediction is characterized by a decomposition of the overall model into fault-occurrence and fault-handling sub-models. The fault-occurrence model is a non-homogeneous Markov chain which is solved analytically, while the fault-handling model is a Petri Net which is simulated. HARP provides automated analysis of sensitivity to uncertainties in the input parameters and in the initial state specifications. It then produces a predicted reliability band as a function of mission time, as well as estimates of the improvement (narrowing of the band) to be gained by a specified amount of reduction in uncertainty.
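
    The fault-occurrence side of such a decomposed model can be illustrated for a single component whose failure rate grows linearly in time, lambda(t) = a + b*t: the analytic solution of the resulting non-homogeneous process is checked against a simple numerical integration. The rate constants are invented for illustration and this is far simpler than HARP's full Markov chain.

```python
import math

def reliability_analytic(a, b, t):
    # R(t) = exp(-integral_0^t (a + b*s) ds) = exp(-(a*t + b*t**2 / 2))
    return math.exp(-(a * t + b * t * t / 2))

def reliability_numeric(a, b, t, steps=100_000):
    # Midpoint-rule integration of dR/dt = -lambda(t) * R as a cross-check.
    dt = t / steps
    r = 1.0
    for i in range(steps):
        s = (i + 0.5) * dt
        r -= (a + b * s) * r * dt
    return r
```

    Sweeping a or b over an uncertainty interval and re-evaluating R(t) gives a band of predicted reliabilities, which is the spirit of HARP's sensitivity analysis.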

  16. Automated design of ligands to polypharmacological profiles

    PubMed Central

    Besnard, Jérémy; Ruda, Gian Filippo; Setola, Vincent; Abecassis, Keren; Rodriguiz, Ramona M.; Huang, Xi-Ping; Norval, Suzanne; Sassano, Maria F.; Shin, Antony I.; Webster, Lauren A.; Simeons, Frederick R.C.; Stojanovski, Laste; Prat, Annik; Seidah, Nabil G.; Constam, Daniel B.; Bickerton, G. Richard; Read, Kevin D.; Wetsel, William C.; Gilbert, Ian H.; Roth, Bryan L.; Hopkins, Andrew L.

    2012-01-01

    The clinical efficacy and safety of a drug is determined by its activity profile across multiple proteins in the proteome. However, designing drugs with a specific multi-target profile is both complex and difficult. Therefore methods to rationally design drugs a priori against profiles of multiple proteins would have immense value in drug discovery. We describe a new approach for the automated design of ligands against profiles of multiple drug targets. The method is demonstrated by the evolution of an approved acetylcholinesterase inhibitor drug into brain penetrable ligands with either specific polypharmacology or exquisite selectivity profiles for G-protein coupled receptors. Overall, 800 ligand-target predictions of prospectively designed ligands were tested experimentally, of which 75% were confirmed correct. We also demonstrate target engagement in vivo. The approach can be a useful source of drug leads where multi-target profiles are required to achieve either selectivity over other drug targets or a desired polypharmacology. PMID:23235874

  17. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
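
    The identification-tree idea can be sketched as a lookup from design-element type to threat predicates. The element fields, threat names, and tree contents below are invented for illustration and are not AutSEC's actual rule set.

```python
# Hypothetical miniature of an "identification tree": each leaf names a
# threat, and a predicate on a design element decides whether it applies.
THREAT_TREE = {
    "data_flow": [
        ("eavesdropping", lambda e: not e.get("encrypted", False)),
        ("tampering",     lambda e: not e.get("integrity_checked", False)),
    ],
    "data_store": [
        ("information_disclosure", lambda e: not e.get("access_controlled", False)),
    ],
}

def identify_threats(element):
    """Return the threats whose predicates fire for one DFD element."""
    rules = THREAT_TREE.get(element["kind"], [])
    return [name for name, applies in rules if applies(element)]
```

    A companion mitigation tree would map each identified threat to candidate countermeasures ranked by cost, mirroring the paper's two data structures.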

  18. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  19. Adaptive position estimation for an automated guided vehicle

    NASA Astrophysics Data System (ADS)

    Lapin, Brett D.

    1993-05-01

    In a mobile robotic system, complexities in positioning arise due to the motion. An adaptive position estimation scheme has been developed for an automated guided vehicle (AGV) to overcome these complexities. The scheme's purpose is to minimize the position error--the difference between the estimated position and the actual position. The method to achieve this is to adapt the system model by incorporating a parameter vector and using a maximum likelihood algorithm to estimate the parameters after an accurate position determination is made. A simulation of the vehicle's guidance system was developed and the estimator tested on an oval-shaped path. Upon injecting biases into the system, initial position errors were 10 centimeters or more. After the estimator converged, the maximum final errors were on the order of 1 to 2 centimeters (prior to measurement update). After each measurement update, after the estimator had converged, errors were on the order of 1 to 2 millimeters.
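
    The core of such a scheme can be illustrated for the simplest case: a constant additive bias observed through Gaussian measurement noise, for which the maximum-likelihood estimate reduces to the sample mean of the residuals. The bias magnitude and noise level below are invented numbers, not the paper's simulation values.

```python
import random

def estimate_bias(residuals):
    # ML estimate of a constant additive bias under Gaussian noise:
    # the sample mean of (measured - predicted) residuals.
    return sum(residuals) / len(residuals)

# Simulate residuals collected at accurate position fixes (landmarks).
rng = random.Random(1)
true_bias = 0.10   # metres of systematic drift per segment (illustrative)
residuals = [true_bias + rng.gauss(0, 0.02) for _ in range(200)]
estimated = estimate_bias(residuals)
```

    Feeding the estimated bias back into the vehicle's system model is what shrinks the prediction error between measurement updates.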

  20. Automated Design Space Exploration with Aspen

    DOE PAGES

    Spafford, Kyle L.; Vetter, Jeffrey S.

    2015-01-01

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
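
    The essence of turning a performance model into a design-space search can be sketched with a toy compute-versus-communication cost model minimized over processor counts. The model form and all constants are invented, and Aspen's real workflow feeds a nonlinear solver rather than the exhaustive evaluation shown here.

```python
import math

def modeled_time(p, n=2**20, t_flop=1e-9, t_comm=1e-4):
    # Toy Aspen-style cost model: compute work (~5 n log2 n flops) shrinks
    # with processor count p, while communication cost grows linearly in p.
    compute = 5 * n * math.log2(n) * t_flop / p
    comm = t_comm * p
    return compute + comm

def explore(candidates):
    # Exhaustively evaluate the model over a discrete design space.
    return min(candidates, key=modeled_time)
```

    The compute/communication trade-off produces an interior optimum: too few processors leaves compute dominant, too many makes communication dominant.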

  1. Automated Antenna Design with Evolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Linden, Derek; Hornby, Greg; Lohn, Jason; Globus, Al; Krishunkumor, K.

    2006-01-01

    Current methods of designing and optimizing antennas by hand are time and labor intensive, and limit complexity. Evolutionary design techniques can overcome these limitations by searching the design space and automatically finding effective solutions. In recent years, evolutionary algorithms have shown great promise in finding practical solutions in large, poorly understood design spaces. In particular, spacecraft antenna design has proven tractable to evolutionary design techniques. Researchers have been investigating evolutionary antenna design and optimization since the early 1990s, and the field has grown in recent years as computer speed has increased and electromagnetic simulators have improved. Two requirements-compliant antennas, one for ST5 and another for TDRS-C, have been automatically designed by evolutionary algorithms. The ST5 antenna is slated to fly this year, and a TDRS-C phased array element has been fabricated and tested. Such automated evolutionary design is enabled by medium-to-high quality simulators and fast modern computers to evaluate computer-generated designs. Evolutionary algorithms automate cut-and-try engineering, substituting automated search though millions of potential designs for intelligent search by engineers through a much smaller number of designs. For evolutionary design, the engineer chooses the evolutionary technique, parameters and the basic form of the antenna, e.g., single wire for ST5 and crossed-element Yagi for TDRS-C. Evolutionary algorithms then search for optimal configurations in the space defined by the engineer. NASA's Space Technology 5 (ST5) mission will launch three small spacecraft to test innovative concepts and technologies. Advanced evolutionary algorithms were used to automatically design antennas for ST5. The combination of wide beamwidth for a circularly-polarized wave and wide impedance bandwidth made for a challenging antenna design problem. From past experience in designing wire antennas, we chose to
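
    The cut-and-try loop that evolutionary design automates can be sketched with a toy elitist search over wire segment lengths, using a stand-in fitness function in place of an electromagnetic simulator (a real run would score designs with a code such as NEC2). Every constant here is invented for illustration.

```python
import random

def fitness(design):
    # Stand-in for an electromagnetic simulator score: reward segment
    # lengths near an arbitrary "ideal" of 0.7 (higher is better).
    return -sum((x - 0.7) ** 2 for x in design)

def evolve(pop_size=30, genes=5, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # elitist selection
        children = [[g + rng.gauss(0, 0.05) for g in parent]
                    for parent in survivors]        # Gaussian mutation
        pop = survivors + children
    return max(pop, key=fitness)
```

    Because survivors are carried over unchanged, the best fitness never decreases from one generation to the next; the search only spends simulator evaluations on mutated candidates.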

  2. Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.

    ERIC Educational Resources Information Center

    Jonassen, David H.; Wilson, Brent G.

    Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisor systems; and (3)…

  3. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  4. Designing automation for complex work environments under different levels of stress.

    PubMed

    Sauer, Juergen; Nickel, Peter; Wastell, David

    2013-01-01

    This article examines the effectiveness of different forms of static and adaptable automation under low- and high-stress conditions. Forty participants were randomly assigned to one of four experimental conditions, comparing three levels of static automation (low, medium and high) and one level of adaptable automation, with the environmental stressor (noise) being varied as a within-subjects variable. Participants were trained for 4 h on a simulation of a process control environment, called AutoCAMS, followed by a 2.5-h testing session. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that operators preferred higher levels of automation under noise than under quiet conditions. A number of parameters indicated negative effects of noise exposure, such as performance impairments, physiological stress reactions and higher mental workload. It also emerged that adaptable automation provided advantages over low and intermediate static automation, with regard to mental workload, effort expenditure and diagnostic performance. The article concludes that for the design of automation a wider range of operational scenarios reflecting adverse as well as ideal working conditions needs to be considered.

  5. Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.

    2002-01-01

    Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early stage sensory and information acquisition functions. The present research compares the effects of AA across the entire range of information processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA is dependent on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, as compared to AA applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA, as compared to completely manual control. These results are discussed in terms of implications for AA design for aviation.

  6. A Comparison of a Brain-Based Adaptive System and a Manual Adaptable System for Invoking Automation

    NASA Technical Reports Server (NTRS)

    Bailey, Nathan R.; Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Scott, Lorissa A.

    2004-01-01

    Two experiments are presented that examine alternative methods for invoking automation. In each experiment, participants were asked to perform simultaneously a monitoring task and a resource management task as well as a tracking task that changed between automatic and manual modes. The monitoring task required participants to detect failures of an automated system to correct aberrant conditions under either high or low system reliability. Performance on each task was assessed as well as situation awareness and subjective workload. In the first experiment, half of the participants worked with a brain-based system that used their EEG signals to switch the tracking task between automatic and manual modes. The remaining participants were yoked to participants from the adaptive condition and received the same schedule of mode switches, but their EEG had no effect on the automation. Within each group, half of the participants were assigned to either the low or high reliability monitoring task. In addition, within each combination of automation invocation and system reliability, participants were separated into high and low complacency potential groups. The results revealed no significant effects of automation invocation on the performance measures; however, the high complacency individuals demonstrated better situation awareness when working with the adaptive automation system. The second experiment was the same as the first with one important exception. Automation was invoked manually. Thus, half of the participants pressed a button to invoke automation for 10 s. The remaining participants were yoked to participants from the adaptable condition and received the same schedule of mode switches, but they had no control over the automation. The results showed that participants who could invoke automation performed more poorly on the resource management task and reported higher levels of subjective workload. Further, those who invoked automation more frequently performed
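
    The brain-based invocation logic in this line of work typically compares an EEG engagement index (commonly beta / (alpha + theta)) against a running baseline with a dead band; under a negative-feedback policy, low engagement returns the tracking task to manual to re-engage the operator, and high engagement hands it to automation. The band powers, baseline, and dead-band width below are invented numbers, not values from these experiments.

```python
def choose_mode(beta, alpha, theta, baseline, band=0.15):
    """Pick a tracking-task mode from EEG band powers (illustrative)."""
    index = beta / (alpha + theta)
    if index < baseline * (1 - band):
        return "manual"      # operator disengaged: give the task back
    if index > baseline * (1 + band):
        return "automatic"   # operator engaged/loaded: offload the task
    return "no_change"       # within the dead band: keep current mode
```

    The dead band prevents rapid mode oscillation when the index hovers near the baseline, a practical concern for any physiologically driven adaptive system.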

  7. Adaptive clinical trial designs in oncology

    PubMed Central

    Zang, Yong; Lee, J. Jack

    2015-01-01

    Adaptive designs have become popular in clinical trial and drug development. Unlike traditional trial designs, adaptive designs use accumulating data to modify the ongoing trial without undermining the integrity and validity of the trial. As a result, adaptive designs provide a flexible and effective way to conduct clinical trials. The designs have potential advantages of improving the study power, reducing sample size and total cost, treating more patients with more effective treatments, identifying efficacious drugs for specific subgroups of patients based on their biomarker profiles, and shortening the time for drug development. In this article, we review adaptive designs commonly used in clinical trials and investigate several aspects of the designs, including the dose-finding scheme, interim analysis, adaptive randomization, biomarker-guided randomization, and seamless designs. For illustration, we provide examples of real trials conducted with adaptive designs. We also discuss practical issues from the perspective of using adaptive designs in oncology trials. PMID:25811018

  8. Design of automated system for management of arrival traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1989-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. The focus is on the design of two of these tools: the Descent Advisor, which provides automation tools for managing descent traffic, and the Traffic Management Advisor, which generates optimum landing schedules. The algorithms, automation modes, and graphical interfaces incorporated in the design are described.

  9. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. Because of the general purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC. The ADS program is also available from COSMIC.) The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining, that is, it works from hypothesis to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed as the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K of 8 bit bytes. This program was developed in 1986.
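
    The accept-at-90% logic can be sketched as scoring each hypothesis from weighted user answers. The rule names, facts, and weights below are invented and are far simpler than EXADS's roughly 200-rule backward-chaining base.

```python
# Hypothetical miniature rule base: each hypothesis (an ADS option choice)
# accumulates confidence from weighted facts supplied by the user.
RULES = {
    "use_penalty_method": [("constrained", 0.5), ("smooth_functions", 0.5)],
    "use_conjugate_gradient": [("unconstrained", 0.6), ("large_problem", 0.4)],
}

def evaluate(answers, threshold=0.9):
    """answers maps a fact name to a certainty in [0, 1]; return the
    hypotheses whose weighted score reaches the confidence threshold."""
    accepted = []
    for hypothesis, conditions in RULES.items():
        score = sum(w * answers.get(fact, 0.0) for fact, w in conditions)
        if score >= threshold:
            accepted.append(hypothesis)
    return accepted
```

    A backward-chaining engine like EXADS would instead start from each hypothesis and ask only the questions needed to confirm or reject it, but the 90% acceptance criterion is the same.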

  10. Adaptive automation, trust, and self-confidence in fault management of time-critical tasks.

    PubMed

    Moray, N; Inagaki, T; Itoh, M

    2000-03-01

    An experiment on adaptive automation is described. Reliability of automated fault diagnosis, mode of fault management (manual vs. automated), and fault dynamics affect variables including root mean square error, avoidance of accidents and false shutdowns, subjective trust in the system, and operator self-confidence. Results are discussed in relation to levels of automation, models of trust and self-confidence, and theories of human-machine function allocation. Trust in automation but not self-confidence was strongly affected by automation reliability. Operators controlled a continuous process with difficulty only while performing fault management but could prevent unnecessary shutdowns. Final authority for decisions and action must be allocated to automation in time-critical situations.

  11. An Intelligent Automation Platform for Rapid Bioprocess Design.

    PubMed

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity.

  12. Designing Domain-Specific HUMS Architectures: An Automated Approach

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban

    2004-01-01

    The HUMS automation system automates the design of HUMS architectures. The automated design process involves selection of solutions from a large space of designs as well as pure synthesis of designs. Hence the whole objective is to efficiently search for or synthesize designs or parts of designs in the database and to integrate them to form the entire system design. The automation system adopts two approaches in order to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are endowed with a suite of qualitative and quantitative techniques that enable (a) the selection of matching component instances, (b) the determination of design parameters, (c) the evaluation of candidate designs at component-level and at system-level, (d) the performance of cost-benefit analyses, (e) the performance of trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design and operation of the HUMS systems in order to economically produce optimal, domain-specific designs.

  13. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.
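
    The error-estimation-plus-adaptivity loop can be illustrated in one dimension: split any interval whose local error indicator exceeds a tolerance until every interval passes. The indicator here (midpoint-versus-endpoint-average discrepancy) is a crude stand-in for the finite-element error estimators used in the milestone.

```python
def refine(f, a, b, tol=1e-3, max_depth=20):
    """Adaptively subdivide [a, b] until each interval's local error
    estimate for f falls below tol; return the sorted mesh intervals."""
    intervals = [(a, b, 0)]
    mesh = []
    while intervals:
        lo, hi, depth = intervals.pop()
        mid = (lo + hi) / 2
        # Discrepancy between f at the midpoint and the endpoint average,
        # scaled by interval width, as a cheap local error indicator.
        err = abs(f(mid) - (f(lo) + f(hi)) / 2) * (hi - lo)
        if err > tol and depth < max_depth:
            intervals += [(lo, mid, depth + 1), (mid, hi, depth + 1)]
        else:
            mesh.append((lo, hi))
    return sorted(mesh)
```

    Running an uncertainty or design study on such self-refining meshes is what makes the verification "automated and parameter-adaptive": each parameter sample re-refines until its own error estimate passes.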

  14. TARDIS: An Automation Framework for JPL Mission Design and Navigation

    NASA Technical Reports Server (NTRS)

    Roundhill, Ian M.; Kelly, Richard M.

    2014-01-01

    Mission Design and Navigation at the Jet Propulsion Laboratory has implemented an automation framework tool to assist in orbit determination and maneuver design analysis. This paper describes the lessons learned from previous automation tools and how they have been implemented in this tool. In addition this tool has revealed challenges in software implementation, testing, and user education. This paper describes some of these challenges and invites others to share their experiences.

  15. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  16. A Toolset for Supporting Iterative Human Automation: Interaction in Design

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2010-01-01

    The addition of automation has greatly extended humans' capability to accomplish tasks, including those that are difficult, complex and safety critical. The majority of Human-Automation Interaction (HAI) results in more efficient and safe operations; however, certain unexpected automation behaviors or "automation surprises" can be frustrating and, in certain safety critical operations (e.g. transportation, manufacturing control, medicine), may result in injuries or the loss of life (Mellor, 1994; Leveson, 1995; FAA, 1995; BASI, 1998; Sheridan, 2002). This paper describes the development of a design tool that enables the rapid development and evaluation of automation prototypes. The ultimate goal of the work is to provide a design platform upon which automation surprise vulnerability analyses can be integrated.

  17. Parameter Studies, time-dependent simulations and design with automated Cartesian methods

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael

    2005-01-01

    Over the past decade, NASA has made a substantial investment in developing adaptive Cartesian grid methods for aerodynamic simulation. Cartesian-based methods played a key role in both the Space Shuttle Accident Investigation and in NASA's return to flight activities. The talk will provide an overview of recent technological developments focusing on the generation of large-scale aerodynamic databases, automated CAD-based design, and time-dependent simulations of bodies in relative motion. Automation, scalability and robustness underlie all of these applications, and research in each of these topics will be presented.

  18. Automating the design process - Progress, problems, prospects, potential.

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1973-01-01

    The design process for large aerospace vehicles is discussed, with particular emphasis on structural design. Problems with current procedures are identified. Then, the contributions possible from automating the design process (defined as the best combination of men and computers) are considered. Progress toward automated design in the aerospace and other communities is reviewed, including NASA studies of the potential development of Integrated Programs for Aerospace-Vehicle Design (IPAD). The need for and suggested directions of future research on the design process, both technical and social, are discussed. Although much progress has been made to exploit the computer in design, it is concluded that technology is available to begin using the computer to speed communications and management as well as calculations in the design process and thus build man-computer teams that can design better, faster and cheaper.

  19. Automated AI-based designer of electrical distribution systems

    NASA Astrophysics Data System (ADS)

    Sumic, Zarko

    1992-03-01

    Designing the electrical supply system for new residential developments (plat design) is an everyday task for electric utility engineers. Presently this task is carried out manually, resulting in an overdesigned, costly, and nonstandardized solution. As an ill-structured and open-ended problem, plat design is difficult to automate with conventional approaches such as operational research or CAD. Additional complexity in automating plat design is imposed by the need to process spatial data such as circuit maps, records, and construction plans. The intelligent decision support system for automated electrical plat design (IDSS for AEPD) is an engineering tool aimed at automating plat design. IDSS for AEPD combines the functionality of geographic information systems (GIS), a geographically referenced database, with the sophistication of artificial intelligence (AI) to deal with the complexity inherent in design problems. A blackboard problem-solving architecture, built around an INGRES relational database and the NEXPERT Object expert system shell, has been chosen to accommodate the diverse knowledge sources and data models. The GIS's principal task is to create, structure, and formalize the real world representation required by the rule-based reasoning portion of the AEPD. The IDSS's capability to support and enhance the engineer's design, rather than only automate the design process through a prescribed computation, makes it a preferred choice among the possible techniques for AEPD. This paper presents the results of knowledge acquisition and the knowledge engineering process, along with AEPD tool conceptual design issues. To verify the proposed concept, a comparison of results obtained by the AEPD tool with the design obtained by an experienced human designer is given.

  20. Adapting Assessment Procedures for Delivery via an Automated Format.

    ERIC Educational Resources Information Center

    Kelly, Karen L.; And Others

    The Office of Personnel Management (OPM) decided to explore alternative examining procedures for positions covered by the Administrative Careers with America (ACWA) examination. One requirement for new procedures was that they be automated for use with OPM's recently developed Microcomputer Assisted Rating System (MARS), a highly efficient system…

  1. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    SciTech Connect

    Williams, Joshua M.

    2012-06-12

Manufacturing tasks deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and depending on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. Other factors, such as cost, ergonomics, maintenance, and efficiency, also affect task allocation and other design choices. Handling the tradeoffs among these factors can be complex, and lack of experience can make it difficult to determine whether and which feasible automation/robotics options exist. To address this problem, we adapt common engineering design approaches to manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. For concept generation, we first develop the functional model, or function structure, for the process, using the common 'verb-noun' format for describing function. A common language, or functional basis, for manufacturing was developed and used to formalize function descriptions and to guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated on task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements and constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher-level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates.
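The decompose-then-allocate loop described above can be sketched as follows. This is our illustration, not the paper's code: the decomposition rules, component entries, and criteria values are all invented for the example.

```python
# Sketch: rule-based function decomposition plus component allocation.
# 'verb-noun' functions expand via a rule set; candidate embodiments are
# pulled from a database and filtered by a hazard-tolerance constraint.

# Decomposition rules: a function expands into ordered sub-functions.
DECOMPOSITION_RULES = {
    ("transfer", "material"): [("grasp", "material"),
                               ("move", "material"),
                               ("release", "material")],
}

# Component database: function -> candidate embodiments with criteria scores.
COMPONENT_DB = {
    ("grasp", "material"): [{"name": "parallel gripper", "cost": 2, "rad_tolerance": 3}],
    ("move", "material"): [{"name": "6-axis arm", "cost": 5, "rad_tolerance": 2},
                           {"name": "human operator", "cost": 1, "rad_tolerance": 0}],
    ("release", "material"): [{"name": "parallel gripper", "cost": 2, "rad_tolerance": 3}],
}

def decompose(function):
    """Recursively expand a function until no decomposition rule applies."""
    if function in DECOMPOSITION_RULES:
        leaves = []
        for sub in DECOMPOSITION_RULES[function]:
            leaves.extend(decompose(sub))
        return leaves
    return [function]

def generate_concept(function, min_rad_tolerance):
    """Allocate each leaf function to the cheapest component meeting the hazard constraint."""
    concept = {}
    for leaf in decompose(function):
        feasible = [c for c in COMPONENT_DB.get(leaf, [])
                    if c["rad_tolerance"] >= min_rad_tolerance]
        if not feasible:
            return None  # no feasible embodiment -> concept rejected
        concept[leaf] = min(feasible, key=lambda c: c["cost"])
    return concept

concept = generate_concept(("transfer", "material"), min_rad_tolerance=2)
```

In this toy run the hazard constraint rules out the human operator for the "move" function, so the robotic arm is selected despite its higher cost, mirroring the task-allocation tradeoff the abstract describes.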

  2. Designing for Productive Adaptations of Curriculum Interventions

    ERIC Educational Resources Information Center

    Debarger, Angela Haydel; Choppin, Jeffrey; Beauvineau, Yves; Moorthy, Savitha

    2013-01-01

    Productive adaptations at the classroom level are evidence-based curriculum adaptations that are responsive to the demands of a particular classroom context and still consistent with the core design principles and intentions of a curriculum intervention. The model of design-based implementation research (DBIR) offers insights into complexities and…

  3. A Fully Automated Method for CT-on-Rails-Guided Online Adaptive Planning for Prostate Cancer Intensity Modulated Radiation Therapy

    SciTech Connect

    Li, Xiaoqiang; Quan, Enzhuo M.; Li, Yupeng; Pan, Xiaoning; Zhou, Yin; Wang, Xiaochun; Du, Weiliang; Kudchadker, Rajat J.; Johnson, Jennifer L.; Kuban, Deborah A.; Lee, Andrew K.; Zhang, Xiaodong

    2013-08-01

Purpose: This study was designed to validate a fully automated adaptive planning (AAP) method which integrates automated recontouring and automated replanning to account for interfractional anatomical changes in prostate cancer patients receiving adaptive intensity modulated radiation therapy (IMRT) based on daily repeated computed tomography (CT)-on-rails images. Methods and Materials: Nine prostate cancer patients treated at our institution were randomly selected. For the AAP method, contours on each repeat CT image were automatically generated by mapping the contours from the simulation CT image using deformable image registration. An in-house automated planning tool incorporated into the Pinnacle treatment planning system was used to generate the original and the adapted IMRT plans. The cumulative dose–volume histograms (DVHs) of the target and critical structures were calculated based on the manual contours for all plans and compared with those of plans generated by the conventional method, that is, shifting the isocenters by aligning the images based on the center of the volume (COV) of the prostate (prostate COV-aligned). Results: The target coverage from our AAP method was acceptable for every patient, while 1 of the 9 patients showed target underdosing with the prostate COV-aligned plans. Compared with the prostate COV-aligned plans, the AAP method reduced the normalized volume receiving at least 70 Gy (V70) and the mean dose of the rectum by 8.9% and 6.4 Gy, and those of the bladder by 4.3% and 5.3 Gy, respectively. Conclusions: The AAP method, which is fully automated, is effective for online replanning to compensate for target dose deficits and critical organ overdosing caused by interfractional anatomical changes in prostate cancer.
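The rectum and bladder metrics reported above are standard DVH quantities. A minimal sketch of how such values are computed from per-voxel doses (the dose values below are illustrative, not the study's data):

```python
# V70 is the normalized volume (percent of voxels) receiving at least 70 Gy,
# i.e., a single point on the cumulative dose-volume histogram (DVH);
# the mean dose is averaged over the same voxels.

def v_at_least(doses, threshold_gy=70.0):
    """Normalized volume (%) receiving at least `threshold_gy` (a DVH point)."""
    n = sum(1 for d in doses if d >= threshold_gy)
    return 100.0 * n / len(doses)

def mean_dose(doses):
    """Mean dose over the structure's voxels."""
    return sum(doses) / len(doses)

rectum_doses = [45.0, 62.0, 71.5, 73.0, 30.0]  # Gy, one value per voxel
print(v_at_least(rectum_doses))   # 40.0 (% of voxels at >= 70 Gy)
print(mean_dose(rectum_doses))    # 56.3 (Gy)
```

The study's comparison amounts to evaluating these quantities for both the AAP and COV-aligned plans and differencing them.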

  4. Drought Adaptation Mechanisms Should Guide Experimental Design.

    PubMed

    Gilbert, Matthew E; Medina, Viviana

    2016-08-01

    The mechanism, or hypothesis, of how a plant might be adapted to drought should strongly influence experimental design. For instance, an experiment testing for water conservation should be distinct from a damage-tolerance evaluation. We define here four new, general mechanisms for plant adaptation to drought such that experiments can be more easily designed based upon the definitions. A series of experimental methods are suggested together with appropriate physiological measurements related to the drought adaptation mechanisms. The suggestion is made that the experimental manipulation should match the rate, length, and severity of soil water deficit (SWD) necessary to test the hypothesized type of drought adaptation mechanism. PMID:27090148

  5. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  6. Conceptual design of an aircraft automated coating removal system

    SciTech Connect

    Baker, J.E.; Draper, J.V.; Pin, F.G.; Primm, A.H.; Shekhar, S.

    1996-05-01

Paint stripping of the U.S. Air Force's large transport aircraft is currently a labor-intensive, manual process. Significant reductions in cost, personnel, and turnaround time can be accomplished by the judicious use of automation in some process tasks. This paper presents the conceptual design of a coating removal system for the tail surfaces of the C-5 aircraft. Emphasis is placed on technology selection to optimize human-automation synergy with respect to overall cost, throughput, quality, safety, and reliability. Trade-offs between field-proven and research-requiring technologies, and between expected gain and cost and complexity, have led to a conceptual design that is semi-autonomous (relying on the human for task specification and disturbance handling) yet incorporates sensor-based automation (for sweep path generation and tracking, surface following, stripping quality control, and tape/breach handling).

  7. Automating expert role to determine design concept in Kansei Engineering

    NASA Astrophysics Data System (ADS)

    Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd

    2016-02-01

Affect has become imperative to product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables discovery of consumers' emotions and the formulation of guidelines for designing products that win consumers in the competitive market. Although it is a powerful technology, there is no rule of thumb for its analysis and interpretation process: KE expertise is required to determine the sets of related Kansei and the significant concepts of emotion. Many research endeavors are hampered by the limited number of available and accessible KE experts. This work simulates the role of the expert using the Natphoric algorithm, providing a sound solution to the complexity and flexibility of KE. The algorithm is designed to learn the process from training datasets taken from previous KE research. A framework for automated KE is then designed to guide the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype for automating the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE researchers will benefit from this system's ability to automatically determine significant design concepts.

  8. Generative Representations for Computer-Automated Design Systems

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design programs is the representation with which they encode designs. If the representation cannot encode a certain design, then the design program cannot produce it. Similarly, a poor representation makes some types of designs extremely unlikely to be created. Here we define generative representations as those representations which can create and reuse organizational units within a design, and argue that reuse is necessary for design systems to scale to more complex and interesting designs. To support our argument we describe GENRE, an evolutionary design program that uses both a generative and a non-generative representation, and compare the results of evolving designs with both types of representations.

  9. A Case Study in CAD Design Automation

    ERIC Educational Resources Information Center

    Lowe, Andrew G.; Hartman, Nathan W.

    2011-01-01

    Computer-aided design (CAD) software and other product life-cycle management (PLM) tools have become ubiquitous in industry during the past 20 years. Over this time they have continuously evolved, becoming programs with enormous capabilities, but the companies that use them have not evolved their design practices at the same rate. Due to the…

  10. Automated Design of the Europa Orbiter Tour

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Strange, Nathan J.; Longuski, James M.; Bonfiglio, Eugene P.; Taylor, Irene (Technical Monitor)

    2000-01-01

In this paper we investigate tours of the Jovian satellites Europa, Ganymede, and Callisto for the Europa Orbiter mission. The principal goal of the tour design is to lower the arrival V∞ for the final Europa encounter while meeting all of the design constraints. Key constraints arise from the total time of the tour and its radiation dosage. These tours may employ 14 or more encounters with the Jovian satellites; hence there is an enormous number of possible satellite sequences to investigate. We develop a graphical method that greatly aids the design process.

  11. DESIGN OF SMALL AUTOMATION WORK CELL SYSTEM DEMONSTRATIONS

    SciTech Connect

    C. TURNER; J. PEHL; ET AL

    2000-12-01

The introduction of automation systems into many of the facilities dealing with the production, use, and disposition of nuclear materials has been an ongoing objective. Many previous attempts have been made, using a variety of monolithic and, in some cases, modular technologies. Many of these attempts were less than successful, owing to the difficulty of the problem, the lack of maturity of the technology, and overoptimism about the capabilities of a particular system. Consequently, it is not surprising that suggestions that automation can reduce worker Occupational Radiation Exposure (ORE) levels are often met with skepticism and caution. The development of effective demonstrations of these technologies is of vital importance if automation is to become an acceptable option for nuclear material processing environments. The University of Texas Robotics Research Group (UTRRG) has been pursuing the development of technologies to support modular small automation systems (each of fewer than 5 degrees of freedom) and the design of those systems for more than two decades. Properly designed and implemented, these technologies have the potential to reduce the worker ORE associated with work in nuclear materials processing facilities. Successful development of systems for these applications requires the development of technologies that meet the requirements of the applications. These application requirements form a general set of rules that applicable technologies and approaches need to adhere to, but they are in and of themselves generally insufficient for the design of a specific automation system. For the design of an appropriate system, the associated task specifications and relationships need to be defined. These task specifications also provide a means by which appropriate technology demonstrations can be defined. Based on the requirements and specifications of the operations of the Advanced Recovery and Integrated Extraction System (ARIES) pilot line at Los Alamos National

  12. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
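The contrast between non-generative and generative encodings can be illustrated with a toy sketch (ours, not GENRE's actual representation): the generative genome is a small assembly procedure that defines and reuses an organizational unit, and it expands into the same design that the direct encoding must list element by element.

```python
# A non-generative encoding lists every design element directly; a generative
# one is an assembly procedure. Here ('repeat', n, sub) reuses the unit `sub`
# n times, giving the modularity and regularity discussed in the abstract.

def expand(genome):
    """Expand a generative genome into a flat list of design primitives."""
    out = []
    for gene in genome:
        if isinstance(gene, tuple) and gene[0] == "repeat":
            _, n, sub = gene
            for _ in range(n):
                out.extend(expand(sub))  # reuse of an organizational unit
        else:
            out.append(gene)  # a primitive design element
    return out

# Non-generative: every element encoded explicitly.
direct = ["strut", "joint", "strut", "joint", "strut", "joint"]

# Generative: the same design expressed as a reused unit.
generative = [("repeat", 3, ["strut", "joint"])]

assert expand(generative) == direct
```

Because a single mutation of the repeat count rescales the whole structure, variation in the generative genome explores regular, modular designs that the direct encoding reaches only by many coordinated changes.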

  13. Automated radiation hard ASIC design tool

    NASA Technical Reports Server (NTRS)

    White, Mike; Bartholet, Bill; Baze, Mark

    1993-01-01

A commercially based, foundry-independent compiler design tool (ChipCrafter) with custom radiation-hardened library cells is described. A unique analysis approach allows low hardness risk for application-specific ICs (ASICs). Accomplishments, radiation test results, and applications are described.

  14. CMOS-array design-automation techniques

    NASA Technical Reports Server (NTRS)

    Feller, A.; Lombardt, T.

    1979-01-01

This thirty-four-page report discusses the design of a 4,096-bit complementary metal oxide semiconductor (CMOS) read-only memory (ROM). The CMOS ROM is either mask- or laser-programmable. The report is divided into six sections: section one describes the background of ROM chips; section two presents the design goals for the chip; section three discusses chip implementation and chip statistics; and conclusions and recommendations are given in sections four through six.

  15. Automated database design from natural language input

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Segami, Carlos; Delaune, Carl

    1995-01-01

    Users and programmers of small systems typically do not have the skills needed to design a database schema from an English description of a problem. This paper describes a system that automatically designs databases for such small applications from English descriptions provided by end-users. Although the system has been motivated by the space applications at Kennedy Space Center, and portions of it have been designed with that idea in mind, it can be applied to different situations. The system consists of two major components: a natural language understander and a problem-solver. The paper describes briefly the knowledge representation structures constructed by the natural language understander, and, then, explains the problem-solver in detail.

  16. Library Automation Design for Visually Impaired People

    ERIC Educational Resources Information Center

    Yurtay, Nilufer; Bicil, Yucel; Celebi, Sait; Cit, Guluzar; Dural, Deniz

    2011-01-01

    Speech synthesis is a technology used in many different areas in computer science. This technology can bring a solution to reading activity of visually impaired people due to its text to speech conversion. Based on this problem, in this study, a system is designed needed for a visually impaired person to make use of all the library facilities in…

  17. Towards automation of user interface design

    NASA Technical Reports Server (NTRS)

    Gastner, Rainer; Kraetzschmar, Gerhard K.; Lutz, Ernst

    1992-01-01

    This paper suggests an approach to automatic software design in the domain of graphical user interfaces. There are still some drawbacks in existing user interface management systems (UIMS's) which basically offer only quantitative layout specifications via direct manipulation. Our approach suggests a convenient way to get a default graphical user interface which may be customized and redesigned easily in further prototyping cycles.

  18. Automated Design of Complex Dynamic Systems

    PubMed Central

    Hermans, Michiel; Schrauwen, Benjamin; Bienstman, Peter; Dambre, Joni

    2014-01-01

Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent nonlinear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure and its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied to abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of machine learning and optimal control have a largely unexplored overlapping area which encompasses a novel design methodology for smart and highly complex physical systems. PMID:24497969
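A toy illustration of the theme above, treating a physical system's parameter as trainable (our sketch, not the authors' code): a damped oscillator is integrated with forward Euler, and its damping coefficient is tuned by finite-difference gradient descent so the state is driven toward rest at a fixed horizon.

```python
# Optimize a parameter of a differential-equation model by gradient descent
# on a loss evaluated through the simulation, as in the abstract's approach.

def simulate(c, x0=1.0, v0=0.0, dt=0.01, steps=500):
    """Forward-Euler integration of x'' = -x - c*x' (a damped oscillator)."""
    x, v = x0, v0
    for _ in range(steps):
        a = -x - c * v
        x, v = x + dt * v, v + dt * a
    return x, v

def loss(c):
    """Distance of the final state from rest: small when damping is effective."""
    x, v = simulate(c)
    return x * x + v * v

c, lr, eps = 0.1, 0.5, 1e-4  # initial damping, learning rate, FD step
for _ in range(200):
    grad = (loss(c + eps) - loss(c - eps)) / (2 * eps)  # finite-difference gradient
    c -= lr * grad

# the tuned damping should decay the oscillation better than the initial value
assert loss(c) < loss(0.1)
```

In the paper the same idea is applied with far more parameters and proper training methodologies; the point of the sketch is only that the loss is defined through the system's dynamics, so "training" directly shapes physical behavior.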

  19. Dynamic optimization and adaptive controller design

    NASA Astrophysics Data System (ADS)

    Inamdar, S. R.

    2010-10-01

In this work I present a new type of controller: an adaptive tracking controller that employs dynamic optimization to optimize the current value of the controller action for temperature control of a nonisothermal continuously stirred tank reactor (CSTR). We begin with a two-state model of the nonisothermal CSTR, comprising the mass and heat balance equations, and then add cooling system dynamics to eliminate input multiplicity. The initial design value is obtained using local stability of steady states, where the approach temperature for cooling action is specified as a steady state and a design specification. Later we make a correction in the dynamics, where the material balance is manipulated to use feed concentration as a system parameter, as an adaptive control measure to avoid actuator saturation in the main control loop. The analysis leading to the design of a dynamic-optimization-based parameter-adaptive controller is presented. An important component of this mathematical framework is reference trajectory generation, which forms the adaptive control measure.

  20. Design of automation tools for management of descent traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Nedell, William

    1988-01-01

The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller-system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor, which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan-view traffic display presented on a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively, in combination with a mouse input device, to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer-generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system, consisting of the Descent Advisor algorithm, a library of aircraft performance models, national airspace system databases, and interactive display software, has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational
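One scheduling idea from the description above can be sketched in a few lines. This is a hedged simplification under our own assumptions (the real Descent Advisor logic is far more elaborate): estimated arrival times at the feeder gate are placed on a time line and scheduled so that consecutive aircraft keep a minimum separation, delaying later arrivals as needed.

```python
# Greedy time-line scheduling: preserve ETA order, enforce a minimum
# time separation at the feeder gate by pushing conflicting arrivals back.

def schedule_arrivals(etas, min_separation=90.0):
    """Return scheduled feeder-gate times (seconds) with enforced separation."""
    scheduled = []
    for eta in sorted(etas):
        if scheduled and eta - scheduled[-1] < min_separation:
            scheduled.append(scheduled[-1] + min_separation)  # delay to separate
        else:
            scheduled.append(eta)  # ETA already conflict-free
    return scheduled

etas = [1000.0, 1030.0, 1200.0]   # estimated feeder-gate arrivals
print(schedule_arrivals(etas))     # [1000.0, 1090.0, 1200.0]
```

The delta between an aircraft's ETA and its scheduled slot is what the advisory tools would convert into speed or descent clearances for the controller to issue.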

  1. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

This paper describes initial progress toward the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high-fidelity, geometry-based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. The adaptive modeling tool was used for generating vehicle parametric geometry, the outer mold line, and the detailed internal structural layout of the wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing studies, and weight optimization. The high-quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for the design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with high-fidelity analysis codes could also be applied toward the design of vehicles for the NASA Exploration and Space Science Mission projects.

  2. Automated Adaptive Brightness in Wireless Capsule Endoscopy Using Image Segmentation and Sigmoid Function.

    PubMed

    Shrestha, Ravi; Mohammed, Shahed K; Hasan, Md Mehedi; Zhang, Xuechao; Wahid, Khan A

    2016-08-01

Wireless capsule endoscopy (WCE) plays an important role in the diagnosis of gastrointestinal (GI) diseases by capturing images of the human small intestine. Accurate diagnosis of endoscopic images depends heavily on the quality of the captured images. Along with frame rate, the brightness of the image is an important parameter that influences image quality and drives the design of an efficient illumination system. Such a design involves the choice and placement of a proper light source and its ability to illuminate the GI surface with proper brightness. Light-emitting diodes (LEDs) are normally used as sources, with modulated pulses controlling LED brightness. In practice, under- and over-illumination are very common in WCE: the former produces dark images, and the latter produces bright images with high power consumption. In this paper, we propose a low-power, efficient illumination system based on an automated brightness algorithm. The scheme is adaptive in nature, i.e., the brightness level is controlled automatically in real time while the images are being captured. The captured images are segmented into four equal regions, and the brightness level of each region is calculated. An adaptive sigmoid function is then used to find the optimized brightness level, and accordingly a new duty cycle of the modulated pulse is generated for capturing future images. The algorithm is fully implemented in a capsule prototype and tested with endoscopic images. Commercial capsules such as Pillcam and Mirocam were also used in the experiment. The results show that the proposed algorithm works well in controlling the brightness level according to the environmental conditions; as a result, good-quality images are captured with an average 40% brightness level, which saves capsule power consumption. PMID:27333609
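The adaptive-brightness idea described above can be sketched as follows. This is a hedged reconstruction, not the capsule firmware: the target level, sigmoid gain, and frame layout are our assumptions.

```python
# Sketch: split the frame into four equal regions, average the per-region
# brightness, and map the error from a target level through a sigmoid to a
# new LED duty cycle in [0, 1] for the next frame.

import math

def region_means(frame):
    """Mean pixel value of the four quadrants of a square grayscale frame."""
    n = len(frame)
    h = n // 2
    quads = [(0, 0), (0, h), (h, 0), (h, h)]
    means = []
    for r0, c0 in quads:
        vals = [frame[r][c] for r in range(r0, r0 + h) for c in range(c0, c0 + h)]
        means.append(sum(vals) / len(vals))
    return means

def next_duty_cycle(frame, target=128.0, k=0.05):
    """Sigmoid of the brightness error -> duty cycle for the modulated pulse."""
    brightness = sum(region_means(frame)) / 4.0
    error = target - brightness          # positive when the image is too dark
    return 1.0 / (1.0 + math.exp(-k * error))

dark = [[40] * 8 for _ in range(8)]      # under-illuminated frame
bright = [[220] * 8 for _ in range(8)]   # over-illuminated frame
assert next_duty_cycle(dark) > 0.9       # drive the LEDs harder
assert next_duty_cycle(bright) < 0.1     # back off and save power
```

The sigmoid's saturation is what makes the control gentle near the target and aggressive only for strongly under- or over-illuminated frames, which is consistent with the power-saving behavior the abstract reports.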

  3. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life-cycle costs and ease the program development effort. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between object states and real-world states. The simulation design process was automated by the Problem Statement Language (PSL)/Problem Statement Analyzer (PSA). The PSL/PSA system accessed the problem database directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness, and efficiency goals are discussed.

  4. Automated reasoning applications to design validation and sneak function analysis

    SciTech Connect

    Stratton, R.C.

    1984-01-01

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System (ACS).

  5. Automated design of multiple encounter gravity-assist trajectories

    NASA Technical Reports Server (NTRS)

    Longuski, James M.; Williams, Steve N.

    1990-01-01

Given a range of initial launch dates and a set of target planets, a new approach to planetary mission design is developed, using an automated method for finding all conic solutions. Each point on the diagrams reproduced represents a single ballistic trajectory and is computed by modeling the trajectory as a conic section and solving the corresponding Lambert problem for each set of launch and arrival dates. An example prescribing a launch period of 1975-2050 and the target planets Uranus, Saturn, Jupiter, Neptune, and Pluto is described, whereby all possible grand tour missions of this class are found, including the Voyager II trajectory. It is determined that this automated design tool may be applied to a variety of multiple-encounter gravity-assist trajectories being considered for future missions.
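The structure of the automated search can be sketched as a date-grid scan. This is our illustration only: a real implementation would call a Lambert solver for each launch/arrival pair, whereas here a clearly labeled toy cost function stands in so the scan itself is runnable.

```python
# Each (launch, arrival) date pair defines one Lambert problem; every grid
# point is one ballistic trajectory. `transfer_cost` is a placeholder, NOT a
# Lambert solver: it just rewards a notional 1000-day time of flight.

def transfer_cost(launch_day, arrival_day):
    """Toy stand-in for solving Lambert's problem between two epochs."""
    tof = arrival_day - launch_day
    if tof <= 0:
        return float("inf")  # arrival before launch is infeasible
    return abs(tof - 1000) / 100.0  # illustrative cost model only

def scan(launch_days, arrival_days):
    """Evaluate every launch/arrival pair; keep the feasible solutions."""
    results = {}
    for ld in launch_days:
        for ad in arrival_days:
            cost = transfer_cost(ld, ad)
            if cost != float("inf"):
                results[(ld, ad)] = cost
    return results

grid = scan(range(0, 300, 100), range(800, 1400, 200))
best = min(grid, key=grid.get)  # the cheapest trajectory in the scanned window
```

Plotting `grid` as cost contours over launch and arrival dates gives exactly the kind of diagram the abstract describes, with each point one Lambert solution.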

  6. Automated and Adaptive Mission Planning for Orbital Express

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Knight, Russell; Jones, Grailing; Tran, Daniel; Koblick, Darin

    2008-01-01

The Orbital Express space mission was a Defense Advanced Research Projects Agency (DARPA)-led demonstration of on-orbit satellite servicing scenarios: autonomous rendezvous, fluid transfers of hydrazine propellant, and robotic-arm transfers of Orbital Replacement Unit (ORU) components. Boeing's Autonomous Space Transport Robotic Operations (ASTRO) vehicle provided the servicing to Ball Aerospace's Next Generation Serviceable Satellite (NextSat) client. For communication opportunities, operations used the high-bandwidth ground-based Air Force Satellite Control Network (AFSCN) along with the relatively low-bandwidth geosynchronous space-borne Tracking and Data Relay Satellite System (TDRSS) network. Mission operations were conducted out of the RDT&E Support Complex (RSC) at Kirtland Air Force Base in New Mexico. All mission objectives were met successfully: the first of several autonomous rendezvous was demonstrated on May 5, 2007; autonomous free-flyer capture was demonstrated on June 22, 2007; and the fluid and ORU transfers throughout the mission were successful. Planning operations for the mission were conducted by a team of personnel including the Flight Directors, who were responsible for verifying the steps and contacts within the procedures; the Rendezvous Planners, who computed the locations and visibilities of the spacecraft; the Scenario Resource Planners (SRPs), who were concerned with the assignment of communication windows, the monitoring of resources, and the sending of commands to the ASTRO spacecraft; and the Mission Planners, who interfaced with the real-time operations environment, processed planning products, and coordinated activities with the SRP. The SRP position was staffed by JPL personnel who used the Automated Scheduling and Planning ENvironment (ASPEN) to model and enforce mission and satellite constraints. The lifecycle of a plan began three weeks before its on-board execution. During the planning timeframe, many aspects could change the plan

  7. Crew aiding and automation: A system concept for terminal area operations, and guidelines for automation design

    NASA Technical Reports Server (NTRS)

    Dwyer, John P.

    1994-01-01

    This research and development program comprised two efforts: the development of guidelines for the design of automated systems, with particular emphasis on automation design that takes advantage of contextual information, and the concept-level design of a crew aiding system, the Terminal Area Navigation Decision Aiding Mediator (TANDAM). This concept outlines a system capable of organizing navigation and communication information and assisting the crew in executing the operations required in descent and approach. In service of this endeavor, problem definition activities were conducted, including operational familiarization exercises addressing the terminal area navigation problem. Both airborne and ground-based (ATC) elements of aircraft control were extensively researched. The TANDAM system concept was then specified, and the crew interface and associated systems described. Additionally, three descent and approach scenarios were devised in order to illustrate the principal functions of the TANDAM system concept in relation to the crew, the aircraft, and ATC. A plan for the evaluation of the TANDAM system was established. The guidelines were developed based on reviews of relevant literature and on experience gained in the design effort.

  8. Applying Utility Functions to Adaptation Planning for Home Automation Applications

    NASA Astrophysics Data System (ADS)

    Bratskas, Pyrros; Paspallis, Nearchos; Kakousis, Konstantinos; Papadopoulos, George A.

    A pervasive computing environment typically comprises multiple embedded devices that may interact together and with mobile users. These users are part of the environment, and they experience it through a variety of devices embedded in the environment. This perception involves technologies which may be heterogeneous, pervasive, and dynamic. Due to the highly dynamic properties of such environments, the software systems running on them have to face problems such as user mobility, service failures, or resource and goal changes, which may happen in an unpredictable manner. To cope with these problems, such systems must be autonomous and self-managed. In this chapter we deal with a special kind of ubiquitous environment, a smart home environment, and introduce a user-preference-based model for adaptation planning. The model, which dynamically forms a set of configuration plans for resources, automatically and autonomously reasons, based on utility functions, about which plan is likely to best achieve the user's goals with respect to resource availability and user needs.
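
    The plan-selection step lends itself to a compact illustration. Below is a hypothetical Python sketch, in which the plan fields, weights, and feasibility rule are all invented for illustration: each candidate configuration plan is scored by a utility function that weighs predicted user-goal satisfaction against current resource availability, and the highest-utility feasible plan is selected.

```python
def plan_utility(plan, resources, weights):
    """Weighted sum of per-dimension utilities; 0 if the plan is infeasible."""
    if plan["cpu"] > resources["cpu"] or plan["bandwidth"] > resources["bandwidth"]:
        return 0.0  # plan cannot run with the resources currently available
    return (weights["comfort"] * plan["comfort"]
            + weights["energy"] * (1.0 - plan["energy_use"]))

def select_plan(plans, resources, weights):
    """Return the feasible plan with the highest utility."""
    return max(plans, key=lambda p: plan_utility(p, resources, weights))

plans = [
    {"name": "full_hd_stream", "cpu": 0.8, "bandwidth": 0.9, "comfort": 1.0, "energy_use": 0.9},
    {"name": "sd_stream",      "cpu": 0.3, "bandwidth": 0.4, "comfort": 0.6, "energy_use": 0.4},
    {"name": "audio_only",     "cpu": 0.1, "bandwidth": 0.1, "comfort": 0.3, "energy_use": 0.1},
]
resources = {"cpu": 0.5, "bandwidth": 0.5}   # the full-HD plan is infeasible here
weights = {"comfort": 0.7, "energy": 0.3}    # user preference weighting

best = select_plan(plans, resources, weights)
print(best["name"])
```

    When resources shrink or user preferences change, only the inputs to the utility function change; re-running the selector yields the new plan, which is the essence of the autonomous re-planning described above.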

  9. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Long, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris

    2000-01-01

    Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm; typical GA applications are very compute intensive; and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time-consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency
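
    The master-slave scheme described above can be sketched in a few lines. In this illustrative example (all names are ours; the toy OneMax problem of counting 1s stands in for the expensive circuit-fitness evaluation), a worker pool scores candidates in parallel while the master applies selection, crossover, and mutation.

```python
# Master-slave parallel GA sketch: workers evaluate fitness, master evolves.
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(bits):                     # "slave" work: score one candidate
    return sum(bits)

def evolve(pop_size=20, genome=16, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome)] for _ in range(pop_size)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(generations):
            scores = list(pool.map(fitness, pop))       # farmed-out evaluations
            def tournament():
                a, b = rng.randrange(pop_size), rng.randrange(pop_size)
                return pop[a] if scores[a] >= scores[b] else pop[b]
            nxt = []
            for _ in range(pop_size):
                p1, p2 = tournament(), tournament()
                cut = rng.randrange(1, genome)          # one-point crossover
                child = p1[:cut] + p2[cut:]
                if rng.random() < 0.1:                  # small random mutation
                    i = rng.randrange(genome)
                    child[i] ^= 1
                nxt.append(child)
            pop = nxt
        scores = list(pool.map(fitness, pop))
    return max(scores)

print(evolve())
```

    A real processor-farm implementation would replace the thread pool with slave nodes on the cluster network; the control flow in the master is unchanged.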

  10. A new design concept for an automated peanut processing facility

    SciTech Connect

    Ertas, A.; Tanju, B.T.; Fair, W.T.; Butts, C.

    1996-12-31

    Peanut quality is a major concern in all phases of the peanut industry, from production to manufacturing. Postharvest processing of peanuts can have profound effects on the quality and safety of peanut food products. Curing is a key step in postharvest processing. Curing peanuts improperly can significantly reduce quality and result in significant losses to both farmers and processors. The conventional drying system designed in the 1960s is still being used in the processing of peanuts today. The objective of this paper is to design and develop a new automated peanut drying system for dry climates capable of handling approximately 20 million lbm of peanuts per harvest season.

  11. The CADSS design automation system. [computerized design language for small digital systems

    NASA Technical Reports Server (NTRS)

    Franke, E. A.

    1973-01-01

    This research was designed to implement and extend a previously defined design automation system for the design of small digital structures. A description is included of the higher-level language developed to describe systems as a sequence of register transfer operations. The system simulator, which is used to determine whether the original description is correct, is also discussed. The design automation system produces tables describing the state transitions of the system and the operation of all registers. In addition, all Boolean equations specifying system operation are minimized and converted to NAND gate structures. Suggestions for further extensions to the system are also given.

  12. Adaptive Strategies for Materials Design using Uncertainties.

    PubMed

    Balachandran, Prasanna V; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-21

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young's (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don't. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
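
    The iterate-train-select loop can be illustrated on toy data. In this hypothetical sketch the "regressor" is a nearest-neighbour predictor with a distance-based uncertainty, and the selector uses an upper-confidence-bound rule that rewards uncertainty; the real study used DFT-computed elastic moduli and proper machine-learning regressors, so everything below is illustrative.

```python
def true_property(x):                 # stands in for an expensive DFT run
    return -(x - 0.63) ** 2

def predict(x, observed):
    """Nearest-neighbour prediction with a crude distance-based uncertainty."""
    xn = min(observed, key=lambda xi: abs(xi - x))
    return true_property(xn), abs(xn - x)   # (mean, uncertainty)

def adaptive_search(candidates, n_iter=6, kappa=1.0):
    observed = [candidates[0], candidates[-1]]     # small seed data set
    for _ in range(n_iter):
        pool = [x for x in candidates if x not in observed]
        def ucb(x):                                # selector: mean + k * sigma
            m, s = predict(x, observed)
            return m + kappa * s
        observed.append(max(pool, key=ucb))        # "measure" the selection
    return max(observed, key=true_property)

candidates = [i / 100 for i in range(101)]
print(adaptive_search(candidates))
```

    Setting kappa to zero turns the selector purely exploitative; the paper's finding is that keeping the uncertainty term (kappa > 0) tends to reach the target property in fewer iterations.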

  13. Flexible receiver adapter formal design review

    SciTech Connect

    Krieg, S.A.

    1995-06-13

    This memo summarizes the results of the Formal (90%) Design Review process and meetings held to evaluate the design of the Flexible Receiver Adapters, support platforms, and associated equipment. The equipment is part of the Flexible Receiver System used to remove, transport, and store long-length contaminated equipment and components from both the double- and single-shell underground storage tanks at the 200 Area tank farms.

  14. Computerized Adaptive Testing System Design: Preliminary Design Considerations.

    ERIC Educational Resources Information Center

    Croll, Paul R.

    A functional design model for a computerized adaptive testing (CAT) system was developed and presented through a series of hierarchy plus input-process-output (HIPO) diagrams. System functions were translated into system structure: specifically, into 34 software components. Implementation of the design in a physical system was addressed through…

  15. Using digital electronic design flow to create a Genetic Design Automation tool.

    PubMed

    Gendrault, Y; Madec, M; Wlotzko, V; Andraud, M; Lallement, C; Haiech, J

    2012-01-01

    Synthetic bio-systems are becoming increasingly complex, and their development is lengthy and expensive. In the same way, in microelectronics, the design process of very complex circuits has benefited from many years of experience. It is now partly automated through Electronic Design Automation (EDA) tools. Both areas present analogies that can be used to create a Genetic Design Automation tool inspired by the EDA tools used in digital electronics. This tool would allow moving away from a totally manual design of bio-systems to assisted design. This ambitious project is presented in this paper, with a deep focus on the tool that automatically generates models of bio-systems directly usable in electronic simulators.

  16. Design considerations for CELT adaptive optics

    NASA Astrophysics Data System (ADS)

    Dekany, Richard G.; Nelson, Jerry E.; Bauman, Brian J.

    2000-07-01

    California Institute of Technology and University of California have begun conceptual design studies for a new telescope for astronomical research at visible and infrared wavelengths. The California Extremely Large Telescope (CELT) is currently envisioned as a filled-aperture, steerable, segmented telescope of approximately 30 m diameter. The key to satisfying many of the science goals of this observatory is the availability of diffraction-limited wavefront control. We describe potential observing modes of CELT, including a discussion of the several major outstanding AO system architectural design issues to be resolved prior to the initiation of the detailed design of the adaptive optics capability.

  17. An automated approach to magnetic divertor configuration design

    NASA Astrophysics Data System (ADS)

    Blommaert, M.; Dekeyser, W.; Baelmans, M.; Gauger, N. R.; Reiter, D.

    2015-01-01

    Automated methods based on optimization can greatly assist computational engineering design in many areas. In this paper an optimization approach to the magnetic design of a nuclear fusion reactor divertor is proposed and applied to a tokamak edge magnetic configuration in a first feasibility study. The approach is based on reduced models for magnetic field and plasma edge, which are integrated with a grid generator into one sensitivity code. The design objective chosen here for demonstrative purposes is to spread the divertor target heat load as much as possible over the entire target area. Constraints on the separatrix position are introduced to eliminate physically irrelevant magnetic field configurations during the optimization cycle. A gradient projection method is used to ensure stable cost function evaluations during optimization. The concept is applied to a configuration with typical Joint European Torus (JET) parameters and it automatically provides plausible configurations with reduced heat load.
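
    The gradient-projection idea can be shown generically. This illustrative sketch (the quadratic cost and the single equality constraint are both invented, not the divertor code's) projects the cost gradient onto the feasible plane at each step, so the iterates never leave the constraint set, mirroring how the separatrix-position constraints keep the optimizer away from physically irrelevant configurations.

```python
def project(g, n):
    """Remove from gradient g its component along constraint normal n."""
    dot = sum(gi * ni for gi, ni in zip(g, n))
    nn = sum(ni * ni for ni in n)
    return [gi - dot * ni / nn for gi, ni in zip(g, n)]

def minimize_on_plane(x, target, n, step=0.2, iters=200):
    """Minimize sum((x - target)^2) while keeping n . x constant."""
    for _ in range(iters):
        g = [2 * (xi - ti) for xi, ti in zip(x, target)]   # cost gradient
        pg = project(g, n)                                  # feasible direction
        x = [xi - step * pgi for xi, pgi in zip(x, pg)]
    return x

n = [1.0, 1.0, 1.0]                  # constraint: x1 + x2 + x3 stays constant
x0 = [1.0, 0.0, 0.0]                 # feasible start (sum = 1)
target = [0.8, 0.1, 0.7]             # unconstrained optimum (sum = 1.6)
x = minimize_on_plane(x0, target, n)
print([round(v, 3) for v in x])
```

    Because the projected gradient is orthogonal to the constraint normal, every cost-function evaluation happens at a feasible point, which is the stability property the abstract attributes to the method.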

  18. Automated detection scheme of architectural distortion in mammograms using adaptive Gabor filter

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Ruriha; Teramoto, Atsushi; Matsubara, Tomoko; Fujita, Hiroshi

    2013-03-01

    Breast cancer is a serious health concern for all women. Computer-aided detection for mammography has been used for detecting masses and micro-calcifications. However, the automated detection of architectural distortion remains a challenge with regard to sensitivity. In this study, we propose a novel automated method for detecting architectural distortion. Our method consists of the analysis of the mammary gland structure, detection of the distorted region, and reduction of false positive results. We developed an adaptive Gabor filter for analyzing the mammary gland structure that decides filter parameters depending on the thickness of the gland structure. As for post-processing, healthy mammary glands that run from the nipple to the chest wall are eliminated by angle analysis. Moreover, background mammary glands are removed based on the intensity output image obtained from the adaptive Gabor filter. The distorted region of the mammary gland is then detected as an initial candidate using a concentration index, followed by binarization and labeling. False positives in the initial candidates are eliminated using 23 types of characteristic features and a support vector machine. In the experiments, we compared the automated detection results with interpretations by a radiologist using 50 cases (200 images) from the Digital Database for Screening Mammography (DDSM). As a result, the true positive rate was 82.72%, and the number of false positives per image was 1.39. These results indicate that the proposed method may be useful for detecting architectural distortion in mammograms.
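
    The adaptive Gabor filter idea can be sketched as follows: the filter wavelength is tied to an estimated gland thickness so thin and thick linear structures each get a matched filter. The thickness-to-wavelength rule and all parameter values here are illustrative, not the paper's.

```python
import math

def gabor_kernel(size, theta, wavelength, sigma):
    """Real Gabor kernel: a cosine carrier under a Gaussian envelope."""
    half = size // 2
    k = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)   # rotate coordinates
            yr = -x * math.sin(theta) + y * math.cos(theta)
            env = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            row.append(env * math.cos(2 * math.pi * xr / wavelength))
        k.append(row)
    return k

def adaptive_kernel(thickness_px, theta=0.0, size=9):
    # heuristic: wavelength tracks structure thickness, envelope slightly wider
    return gabor_kernel(size, theta, wavelength=2.0 * thickness_px,
                        sigma=1.5 * thickness_px)

k = adaptive_kernel(thickness_px=2.0)
print(k[4][4])   # envelope and carrier both peak at the kernel centre
```

    A filter bank over several orientations of such kernels, with the thickness estimated locally from the image, is one plausible reading of the "adaptive" step described in the abstract.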

  19. Automation of a portable extracorporeal circulatory support system with adaptive fuzzy controllers.

    PubMed

    Mendoza García, A; Krane, M; Baumgartner, B; Sprunk, N; Schreiber, U; Eichhorn, S; Lange, R; Knoll, A

    2014-08-01

    The presented work relates to the procedure followed for the automation of a portable extracorporeal circulatory support system. Such a device may help increase the chances of survival after cardiogenic shock suffered outside the hospital; additionally, a controller can provide optimal organ perfusion while reducing the workload of the operator. Animal experiments were carried out for the acquisition of the haemodynamic behaviour of the body under extracorporeal circulation. A mathematical model was constructed based on the experimental data, including a cardiovascular model, gas exchange, and the administration of medication. Fuzzy logic was used as the basis of the controller, allowing the easy integration of knowledge from trained perfusionists; an adaptive mechanism was included to adapt to the patient's individual response. Initial simulations show the effectiveness of the controller and the improvement of perfusion after adaptation.
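
    The fuzzy-control idea can be illustrated with a toy Mamdani-style rule base for adjusting pump speed from mean arterial pressure (MAP). The membership shapes, rule table, and the adaptable output gains below are invented for illustration and are not the authors' clinical values.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def pump_speed_delta(map_mmhg, gains=(+300.0, 0.0, -300.0)):
    """Fuzzy rules: MAP low -> speed up, MAP ok -> hold, MAP high -> slow down.
    `gains` (rpm) is the adaptable part a patient-specific mechanism could tune."""
    mu = (tri(map_mmhg, 40, 55, 70),     # membership: MAP is "low"
          tri(map_mmhg, 60, 75, 90),     # membership: MAP is "ok"
          tri(map_mmhg, 80, 95, 110))    # membership: MAP is "high"
    total = sum(mu)
    if total == 0.0:
        return 0.0
    return sum(m * g for m, g in zip(mu, gains)) / total   # weighted defuzzification

print(pump_speed_delta(55.0))   # clearly low MAP
print(pump_speed_delta(75.0))   # in the target band
```

    An adaptive mechanism of the kind described would adjust `gains` (or the membership breakpoints) online as the patient's individual response is observed, leaving the rule structure from the perfusionists intact.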

  20. Numerical design of an adaptive aileron

    NASA Astrophysics Data System (ADS)

    Amendola, Gianluca; Dimino, Ignazio; Concilio, Antonio; Magnifico, Marco; Pecora, Rosario

    2016-04-01

    The study herein described is aimed at investigating the feasibility of an innovative full-scale camber-morphing aileron device. In the framework of the "Adaptive Aileron" project, an international cooperation between Italy and Canada, this goal was pursued through the integration of different morphing concepts in a wing-tip prototype. As widely demonstrated in recent European projects such as Clean Sky JTI and SARISTU, wing trailing edge morphing may lead to significant drag reduction (up to 6%) in off-design flight points by adapting chord-wise camber variations in cruise to compensate for A/C weight reduction following fuel consumption. Those projects focused on the flap region as the most immediate solution for implementing structural adaptations. However, there is also a growing interest in extending morphing functionalities to the aileron region while preserving its main functionality in controlling aircraft directional stability. In fact, the external region of the wing seems to be the most effective in producing "lift over drag" improvements by morphing. Thus, the objective of the presented research is to achieve a certain drag reduction in off-design flight points by adapting wing shape and lift distribution following static deflections. In perspective, the developed device could also be used as a load alleviation system to reduce gust effects, augmenting its frequency bandwidth. In this paper, the preliminary design of the adaptive aileron is first presented, assessed on the basis of the external aerodynamic loads. The primary structure is made of 5 segmented ribs, distributed along 4 bays, each split into three consecutive parts, connected with spanwise stringers. The aileron shape modification is then implemented by means of an actuation system, based on a classical quick-return mechanism, suitably adapted for the presented application. Finite element analyses were performed for properly sizing the load-bearing structure and actuation systems and for

  1. Adaptive control design for hysteretic smart systems

    NASA Astrophysics Data System (ADS)

    McMahan, Jerry A.; Smith, Ralph C.

    2011-04-01

    Ferroelectric and ferromagnetic actuators are being considered for a range of industrial, aerospace, aeronautic and biomedical applications due to their unique transduction capabilities. However, they also exhibit hysteretic and nonlinear behavior that must be accommodated in models and control designs. If uncompensated, these effects can yield reduced system performance and, in the worst case, can produce unpredictable behavior of the control system. In this paper, we address the development of adaptive control designs for hysteretic systems. We review an MRAC-like adaptive control algorithm used to track a reference trajectory while computing online estimates for certain model parameters. This method is incorporated in a composite control algorithm to improve the tracking capabilities of the system. Issues arising in the implementation of these algorithms are addressed, and a numerical example is presented, comparing the results of each method.
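
    The flavor of the MRAC-like scheme discussed above can be sketched for a first-order plant: the plant is driven to follow a first-order reference model while the MIT rule adapts a feedforward and a feedback gain online. Plant and model parameters below are illustrative, and the hysteresis compensation that is central to the paper is not modelled here.

```python
def simulate(a=1.0, b=2.0, am=4.0, bm=4.0, gamma=0.5, dt=0.001, steps=40000):
    x = xm = 0.0          # plant and reference-model states
    th1 = th2 = 0.0       # adaptive gains (ideal values: bm/b, (am-a)/b)
    errs = []
    for k in range(steps):
        r = 1.0 if (k * dt) % 4.0 < 2.0 else -1.0     # square-wave reference
        u = th1 * r - th2 * x                          # adaptive control law
        e = x - xm                                     # model-tracking error
        th1 += dt * (-gamma * e * r)                   # MIT-rule gradient updates
        th2 += dt * (gamma * e * x)
        x += dt * (-a * x + b * u)                     # Euler step: plant
        xm += dt * (-am * xm + bm * r)                 # Euler step: reference model
        errs.append(abs(e))
    early = sum(errs[:4000]) / 4000                    # mean |e| in first 4 s
    late = sum(errs[-4000:]) / 4000                    # mean |e| in last 4 s
    return early, late

early, late = simulate()
print(late < early)
```

    As the online parameter estimates approach their ideal values, the tracking error shrinks, which is the behavior the composite control algorithm in the paper builds upon.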

  2. Design, Development, and Commissioning of a Substation Automation Laboratory to Enhance Learning

    ERIC Educational Resources Information Center

    Thomas, M. S.; Kothari, D. P.; Prakash, A.

    2011-01-01

    Automation of power systems is gaining momentum across the world, and there is a need to expose graduate and undergraduate students to the latest developments in hardware, software, and related protocols for power automation. This paper presents the design, development, and commissioning of an automation lab to facilitate the understanding of…

  3. Automated Microscopy: Macro Language Controlling a Confocal Microscope and its External Illumination: Adaptation for Photosynthetic Organisms.

    PubMed

    Steinbach, Gábor; Kaňa, Radek

    2016-04-01

    Photosynthesis research employs several biophysical methods, including the detection of fluorescence. Even though fluorescence is a key method to detect photosynthetic efficiency, it has not been applied/adapted to single-cell confocal microscopy measurements to examine photosynthetic microorganisms. Experiments with photosynthetic cells may require automation to perform a large number of measurements with different parameters, especially concerning light conditions. However, commercial microscopes support custom protocols (through Time Controller offered by Olympus or Experiment Designer offered by Zeiss) that are often unable to provide special set-ups and connection to external devices (e.g., for irradiation). Our new system combining an Arduino microcontroller with the Cell⊕Finder software was developed for controlling Olympus FV1000 and FV1200 confocal microscopes and the attached hardware modules. Our software/hardware solution offers (1) a text file-based macro language to control the imaging functions of the microscope; (2) programmable control of several external hardware devices (light sources, thermal controllers, actuators) during imaging via the Arduino microcontroller; (3) the Cell⊕Finder software with ergonomic user environment, a fast selection method for the biologically important cells and precise positioning feature that reduces unwanted bleaching of the cells by the scanning laser. Cell⊕Finder can be downloaded from http://www.alga.cz/cellfinder. The system was applied to study changes in fluorescence intensity in Synechocystis sp. PCC6803 cells under long-term illumination. Thus, we were able to describe the kinetics of phycobilisome decoupling. Microscopy data showed that phycobilisome decoupling appears slowly after long-term (>1 h) exposure to high light. PMID:27050040

  5. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    NASA Technical Reports Server (NTRS)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  6. Guideline For Design Of Adaptive Structures

    NASA Technical Reports Server (NTRS)

    Utku, Senol; Wada, Ben K.

    1994-01-01

    The guideline for design of adaptive structures specifies that active members be located at positions of maximum strain energy. The equations of motion of flexible structures are formulated in terms of kinetic energies, strain energies, and direct measurements of forces. Maintaining precise dimensional control during assembly is essential to assembly without large external loads and to preventing jamming of substructures that would prevent successful deployment. Active members are used to prevent "binding" during deployment of the structure. The structure is then adjusted to the precision shape requirement and readjusted during operation as required.

  7. Modular implementation of a digital hardware design automation system

    NASA Astrophysics Data System (ADS)

    Masud, M.

    An automation system based on AHPL (A Hardware Programming Language) was developed. The project may be divided into three distinct phases: (1) upgrading of AHPL to make it more universally applicable; (2) implementation of a compiler for the language; and (3) illustration of how the compiler may be used to support several phases of design activities. Several new features were added to AHPL. These include application-dependent parameters, multiple clocks, asynchronous results, functional registers, and primitive functions. The new language, called Universal AHPL, has been defined rigorously. The compiler design is modular. The parsing is done by an automatic parser generated from the SLR(1) BNF grammar of the language. The compiler produces two data bases from the AHPL description of a circuit. The first one is a tabular representation of the circuit, and the second one is a detailed interconnection linked list. The two data bases provide a means to interface the compiler to application-dependent CAD systems.

  8. Design automation for integrated nonlinear logic circuits (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Van Vaerenbergh, Thomas; Pelc, Jason; Santori, Charles; Bose, Ranojoy; Kielpinski, Dave; Beausoleil, Raymond G.

    2016-05-01

    A key enabler of the IT revolution of the late 20th century was the development of electronic design automation (EDA) tools allowing engineers to manage the complexity of electronic circuits with transistor counts now reaching into the billions. Recently, we have been developing large-scale nonlinear photonic integrated logic circuits for next-generation all-optical information processing. At this time a sufficiently powerful EDA-style software tool chain to design this type of complex circuit does not yet exist. Here we describe a hierarchical approach to automating the design and validation of photonic integrated circuits, which can scale to several orders of magnitude higher complexity than the state of the art. Most photonic integrated circuits developed today consist of a small number of components and only limited hierarchy. For example, a simple photonic transceiver may contain on the order of 10 building-block components, consisting of grating couplers for photonic I/O, modulators, and signal splitters/combiners. Because this is relatively easy to lay out by hand (or with a simple script), existing photonic design tools have relatively little automation in comparison to electronics tools. But demonstrating all-optical logic will require significantly more complex photonic circuits containing up to 1,000 components, hence becoming infeasible to design manually. Our design framework is based on Python-based software from Luceda Photonics which provides an environment to describe components, simulate their behavior, and export design files (GDS) to foundries for fabrication. At a fundamental level, a photonic component is described as a parametric cell (PCell), similarly to electronics design. PCells are described by geometric characteristics of their layout. A critical part of the design framework is the implementation of PCells as Python objects. PCell objects can then use inheritance to simplify design, and hierarchical designs can be made by creating composite
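
    The PCell-as-Python-object idea can be mimicked in plain Python. This is an invented sketch, not Luceda's actual API: a parametric cell captures layout parameters, a subclass specializes it through inheritance, and a composite cell builds a hierarchy from child cells.

```python
class PCell:
    """Base parametric cell: holds layout parameters, defines the interface."""
    def __init__(self, **params):
        self.params = params
    def area(self):
        raise NotImplementedError

class Waveguide(PCell):
    def area(self):
        return self.params["length"] * self.params["width"]

class WideWaveguide(Waveguide):          # inheritance specializes a PCell
    def __init__(self, **params):
        params.setdefault("width", 2.0)  # override a default parameter
        super().__init__(**params)

class Composite(PCell):                  # hierarchy: a cell made of cells
    def __init__(self, children):
        super().__init__()
        self.children = children
    def area(self):
        return sum(c.area() for c in self.children)

circuit = Composite([Waveguide(length=10.0, width=0.5),
                     WideWaveguide(length=3.0)])
print(circuit.area())
```

    In a real framework the cells would emit geometry (ultimately GDS) rather than a scalar, but the composition and inheritance pattern is the same.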

  9. The design of an automated electrolytic enrichment apparatus for tritium

    SciTech Connect

    Myers, J.L.

    1994-12-01

    The Radiation Analytical Sciences (RAS) Section at Lawrence Livermore National Laboratory performs analysis of low-level tritium concentrations in various natural water samples from the Tri-Valley Area, the DOE Nevada Test Site, Site 300 in Tracy, CA, and various other places around the world. Tritium, a radioactive isotope of hydrogen, is pre-concentrated in the RAS laboratory using an electrolytic enrichment apparatus. Later these enriched waters are analyzed by liquid scintillation counting to determine the activity of tritium. The enrichment procedure and the subsequent purification process by vacuum distillation are currently undertaken manually, and are hence highly labor-intensive. The whole process typically takes about 2 to 3 weeks to complete a batch of 30 samples, with dedicated personnel operating the process. The goal is to automate the entire process, specifically having the operation controlled by PC-based LabVIEW™ with real-time monitoring capability. My involvement was in the design and fabrication of a prototypical automated electrolytic enrichment cell. Work will be done on optimizing the electrolytic process by assessing the different parameters of the enrichment procedure. Hardware and software development has also been an integral component of this project.

  10. Automated optimum design of wing structures. Deterministic and probabilistic approaches

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1982-01-01

    The automated optimum design of airplane wing structures subjected to multiple behavior constraints is described. The structural mass of the wing is considered the objective function. The maximum stress, wing tip deflection, root angle of attack, and flutter velocity during the pull-up maneuver (static load), the natural frequencies of the wing structure, and the stresses induced in the wing structure due to landing and gust loads are suitably constrained. Both deterministic and probabilistic approaches are used for finding the stresses induced in the airplane wing structure due to landing and gust loads. A wing design is represented by a uniform beam with a cross section in the form of a hollow symmetric double wedge. The airfoil thickness and chord length are the design variables, and a graphical procedure is used to find the optimum solutions. A supersonic wing design is represented by finite elements. The thicknesses of the skin and the web and the cross-sectional areas of the flanges are the design variables, and nonlinear programming techniques are used to find the optimum solution.

  11. Automated Design of Restraint Layer of an Inflatable Vessel

    NASA Technical Reports Server (NTRS)

    Spexarth, Gary

    2007-01-01

    A Mathcad computer program largely automates the design and analysis of the restraint layer (the primary load-bearing layer) of an inflatable vessel that consists of one or more sections having cylindrical, toroidal, and/or spherical shape(s). A restraint layer typically comprises webbing in the form of multiple straps. The design task includes choosing indexing locations along the straps, computing the load at every location in each strap, computing the resulting stretch at each location, and computing the amount of undersizing required of each strap so that, once the vessel is inflated and the straps thus stretched, the vessel can be expected to assume the desired shape. Prior to the development of this program, the design task was performed by use of a difficult-to-use spreadsheet program that required manual addition of rows and columns depending on the numbers of strap rows and columns of a given design. In contrast, this program is completely parametric and includes logic that automatically adds or deletes rows and columns as needed. With minimal input from the user, this program automatically computes indexing locations, strap lengths, undersizing requirements, and all design data required to produce detailed drawings and assembly procedures. It also generates textual comments that help the user understand the calculations.
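
    The undersizing computation at the heart of the design task can be illustrated with a toy linear-elastic model: given each strap's design (inflated) length, predicted load, and webbing stiffness, the relaxed (manufactured) length is chosen so the strap stretches exactly to the design length under load. The numbers and the linear-stiffness model are invented for illustration, not the program's actual data.

```python
def undersized_length(design_len_m, load_n, stiffness_n):
    """Relaxed length so the strap reaches design_len_m under load_n.
    stiffness_n is the axial stiffness EA (newtons per unit strain)."""
    strain = load_n / stiffness_n          # linear elastic webbing model
    return design_len_m / (1.0 + strain)   # relaxed * (1 + strain) = design

straps = [                                  # (name, design length m, load N)
    ("hoop_1", 6.28, 40000.0),
    ("hoop_2", 6.28, 52000.0),
    ("longitudinal", 4.10, 21000.0),
]
EA = 900000.0                               # assumed webbing stiffness, N
for name, length, load in straps:
    print(name, round(undersized_length(length, load, EA), 4))
```

    The parametric program described above effectively tabulates this calculation per indexing location and per strap, with the row/column bookkeeping handled automatically rather than by hand-edited spreadsheet cells.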

  12. Expert system approach to design an automated guided vehicle

    NASA Astrophysics Data System (ADS)

    Kumaraguru, Karthikeyan; Hall, Ernest L.

    1998-10-01

    The purpose of this paper is to describe an expert system for designing the base of an automated guided vehicle. The components of the expert system include: (1) a user-friendly graphical user interface, where the user enters specifications such as the operating environment and the application of the robot; (2) an engine that converts the managerial requirements into technical parameters and designs the robot, initially assuming some parameters and confirming its assumptions during the course of the design; when unable to confirm them, it iterates with different assumptions until they are met; the engine also selects the materials to be used from a corresponding database; (3) a database of materials from their manufacturers/suppliers; (4) output data interfaced with a CAD engine, which generates a 3D solid model of the vehicle; and (5) a `Bill of Materials' file generated as output, together with suggestions for assembly. The method has been tested by designing a small mobile robot. The software provides an excellent tool for developing a mobile robot based on performance specifications. Modeling helps the user understand the constraints on the design of the robot, and the bill of materials, along with vendor addresses, helps the user buy the components needed to assemble the robot.

  13. Adaptive Strategies for Materials Design using Uncertainties

    NASA Astrophysics Data System (ADS)

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-01

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
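    The regressor-selector loop described in the abstract can be sketched in a few lines. This is a toy illustration, not the authors' code: the one-dimensional "descriptor", the bootstrap-ensemble regressor, and the upper-confidence-bound selector are all invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy search space: one descriptor per candidate "material" and an
# unknown property to maximize (a stand-in for, e.g., bulk modulus).
X = np.linspace(-2, 2, 41)
y = -(X - 0.7) ** 2 + 3.0

measured = list(rng.choice(len(X), size=5, replace=False))

def predict(train):
    """Regressor: bootstrap ensemble of quadratic fits -> mean, uncertainty."""
    preds = []
    for _ in range(30):
        idx = rng.choice(train, size=len(train), replace=True)
        if len(set(X[idx])) < 3:        # need 3 distinct points for deg=2
            continue
        preds.append(np.polyval(np.polyfit(X[idx], y[idx], 2), X))
    preds = np.array(preds)
    return preds.mean(axis=0), preds.std(axis=0)

for _ in range(10):                      # the iterative design loop
    mu, sigma = predict(measured)
    score = mu + 1.96 * sigma            # selector: exploit + explore
    score[measured] = -np.inf            # never re-measure a candidate
    measured.append(int(np.argmax(score)))

best = X[max(measured, key=lambda i: y[i])]
print(round(best, 1))
```

    A selector that ignored sigma would be purely exploitative; including it is what lets uncertainty-aware selectors outperform, as the study reports.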

  14. Adaptive Strategies for Materials Design using Uncertainties

    PubMed Central

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-01

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties. PMID:26792532

  15. Adaptive strategies for materials design using uncertainties

    DOE PAGES

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-21

    Here, we compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.

  16. Adaptive Mesh Refinement for Microelectronic Device Design

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Lou, John; Norton, Charles

    1999-01-01

    Finite element and finite volume methods are used in a variety of design simulations when it is necessary to compute fields throughout regions that contain varying materials or geometry. Convergence of the simulation can be assessed by uniformly increasing the mesh density until an observable quantity stabilizes. Depending on the electrical size of the problem, uniform refinement of the mesh may be computationally infeasible due to memory limitations. Similarly, depending on the geometric complexity of the object being modeled, uniform refinement can be inefficient since regions that do not need refinement add to the computational expense. In either case, convergence to the correct (measured) solution is not guaranteed. Adaptive mesh refinement methods attempt to selectively refine the region of the mesh that is estimated to contain proportionally higher solution errors. The refinement may be obtained by decreasing the element size (h-refinement), by increasing the order of the element (p-refinement) or by a combination of the two (h-p refinement). A successful adaptive strategy refines the mesh to produce an accurate solution measured against the correct fields without undue computational expense. This is accomplished by the use of a) reliable a posteriori error estimates, b) hierarchal elements, and c) automatic adaptive mesh generation. Adaptive methods are also useful when problems with multi-scale field variations are encountered. These occur in active electronic devices that have thin doped layers and also when mixed physics is used in the calculation. The mesh needs to be fine at and near the thin layer to capture rapid field or charge variations, but can coarsen away from these layers where field variations smoothen and charge densities are uniform. This poster will present an adaptive mesh refinement package that runs on parallel computers and is applied to specific microelectronic device simulations. 
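    The h-refinement strategy the abstract describes can be illustrated with a one-dimensional toy: bisect any interval whose midpoint value is poorly predicted by linear interpolation, so the mesh densifies only inside the thin, rapidly varying layer. The test function, tolerance, and error estimator here are illustrative assumptions, not the poster's actual package.

```python
import numpy as np

def refine(nodes, f, tol, max_passes=15):
    """1-D h-refinement: bisect intervals whose midpoint value deviates
    from linear interpolation by more than tol (a posteriori estimate)."""
    nodes = sorted(nodes)
    for _ in range(max_passes):
        mids = [0.5 * (a + b) for a, b in zip(nodes, nodes[1:])
                if abs(f(0.5 * (a + b)) - 0.5 * (f(a) + f(b))) > tol]
        if not mids:
            break
        nodes = sorted(nodes + mids)
    return np.array(nodes)

# Field with a thin, rapidly varying layer near x = 0 (like a doped layer)
f = lambda x: np.tanh(50.0 * x)
mesh = refine([-1.0, 0.0, 1.0], f, tol=1e-5)

h = np.diff(mesh)
print(len(mesh), h.min(), h.max())  # fine near the layer, coarse away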

  17. A Tutorial on Adaptive Design Optimization

    PubMed Central

    Myung, Jay I.; Cavagnaro, Daniel R.; Pitt, Mark A.

    2013-01-01

    Experimentation is ubiquitous in the field of psychology and fundamental to the advancement of its science, and one of the biggest challenges for researchers is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. The recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial paper introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists to conduct “smart” experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond. PMID:23997275

  18. [Design of an Incremental and Open Laboratory Automation System].

    PubMed

    Xie, Chuanfen; Chen, Yueping; Wang, Zhihong

    2015-07-01

    Recent years have witnessed great development of TLA (Total Laboratory Automation) technology, however, its application hit the bottleneck of high cost and openess to other parties' instruments. Specifically speaking, the initial purchase of the medical devices requires large sum of money and the new system can hardly be compatible with existing equipment. This thesis proposes a new thought for system implementation that through incremental upgrade, the initial capital investment can be reduced and through open architecture and interfaces, the seamless connection of different devices can be achieved. This thesis elaborates on the standards that open architecture design should follow in aspect of mechanics, electro-communication and information interaction and the key technology points in system implementation. PMID:26665947

  19. Effects of extended lay-off periods on performance and operator trust under adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-03-01

    Little is known about the long-term effects of system reliability when operators do not use a system during an extended lay-off period. To examine threats to skill maintenance, 28 participants operated twice a simulation of a complex process control system for 2.5 h, with an 8-month retention interval between sessions. Operators were provided with an adaptable support system, which operated at one of the following reliability levels: 60%, 80% or 100%. Results showed that performance, workload, and trust remained stable at the second testing session, but operators lost self-confidence in their system management abilities. Finally, the effects of system reliability observed at the first testing session were largely found again at the second session. The findings overall suggest that adaptable automation may be a promising means to support operators in maintaining their performance at the second testing session.

  20. Effects of extended lay-off periods on performance and operator trust under adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-03-01

    Little is known about the long-term effects of system reliability when operators do not use a system during an extended lay-off period. To examine threats to skill maintenance, 28 participants operated twice a simulation of a complex process control system for 2.5 h, with an 8-month retention interval between sessions. Operators were provided with an adaptable support system, which operated at one of the following reliability levels: 60%, 80% or 100%. Results showed that performance, workload, and trust remained stable at the second testing session, but operators lost self-confidence in their system management abilities. Finally, the effects of system reliability observed at the first testing session were largely found again at the second session. The findings overall suggest that adaptable automation may be a promising means to support operators in maintaining their performance at the second testing session. PMID:26603139

  1. Automated design of multiphase space missions using hybrid optimal control

    NASA Astrophysics Data System (ADS)

    Chilan, Christian Miguel

    A modern space mission is assembled from multiple phases or events such as impulsive maneuvers, coast arcs, thrust arcs and planetary flybys. Traditionally, a mission planner would resort to intuition and experience to develop a sequence of events for the multiphase mission and to find the space trajectory that minimizes propellant use by solving the associated continuous optimal control problem. This strategy, however, will most likely yield a sub-optimal solution, as the problem is sophisticated for several reasons. For example, the number of events in the optimal mission structure is not known a priori and the system equations of motion change depending on what event is current. In this work a framework for the automated design of multiphase space missions is presented using hybrid optimal control (HOC). The method developed uses two nested loops: an outer-loop that handles the discrete dynamics and finds the optimal mission structure in terms of the categorical variables, and an inner-loop that performs the optimization of the corresponding continuous-time dynamical system and obtains the required control history. Genetic algorithms (GA) and direct transcription with nonlinear programming (NLP) are introduced as methods of solution for the outer-loop and inner-loop problems, respectively. Automation of the inner-loop, continuous optimal control problem solver, required two new technologies. The first is a method for the automated construction of the NLP problems resulting from the use of a direct solver for systems with different structures, including different numbers of categorical events. The method assembles modules, consisting of parameters and constraints appropriate to each event, sequentially according to the given mission structure. The other new technology is for a robust initial guess generator required by the inner-loop NLP problem solver. Two new methods were developed for cases including low-thrust trajectories. The first method, based on GA

  2. Adaptive design of visual perception experiments

    NASA Astrophysics Data System (ADS)

    O'Connor, John D.; Hixson, Jonathan; Thomas, James M., Jr.; Peterson, Matthew S.; Parasuraman, Raja

    2010-04-01

    Meticulous experimental design may not always prevent confounds from affecting experimental data acquired during visual perception experiments. Although experimental controls reduce the potential effects of foreseen sources of interference, interaction, or noise, they are not always adequate for preventing the confounding effects of unforeseen forces. Visual perception experimentation is vulnerable to unforeseen confounds because of the nature of the associated cognitive processes involved in the decision task. Some confounds are beyond the control of experimentation, such as what a participant does immediately prior to experimental participation, or the participant's attitude or emotional state. Other confounds may occur through ignorance of practical control methods on the part of the experiment's designer. The authors conducted experiments related to experimental fatigue and initially achieved significant results that were, upon re-examination, attributable to a lack of adequate controls. Re-examination of the original results and the processes and events that led to them yielded a second experimental design with more experimental controls and significantly different results. The authors propose that designers of visual perception experiments can benefit from planning to use a test-fix-test or adaptive experimental design cycle, so that unforeseen confounds in the initial design can be remedied.

  3. Fast automated protein NMR data collection and assignment by ADAPT-NMR on Bruker spectrometers.

    PubMed

    Lee, Woonghee; Hu, Kaifeng; Tonelli, Marco; Bahrami, Arash; Neuhardt, Elizabeth; Glass, Karen C; Markley, John L

    2013-11-01

    ADAPT-NMR (Assignment-directed Data collection Algorithm utilizing a Probabilistic Toolkit in NMR) supports automated NMR data collection and backbone and side chain assignment for [U-(13)C, U-(15)N]-labeled proteins. Given the sequence of the protein and data for the orthogonal 2D (1)H-(15)N and (1)H-(13)C planes, the algorithm automatically directs the collection of tilted plane data from a variety of triple-resonance experiments so as to follow an efficient pathway toward the probabilistic assignment of (1)H, (13)C, and (15)N signals to specific atoms in the covalent structure of the protein. Data collection and assignment calculations continue until the addition of new data no longer improves the assignment score. ADAPT-NMR was first implemented on Varian (Agilent) spectrometers [A. Bahrami, M. Tonelli, S.C. Sahu, K.K. Singarapu, H.R. Eghbalnia, J.L. Markley, PLoS One 7 (2012) e33173]. Because of broader interest in the approach, we present here a version of ADAPT-NMR for Bruker spectrometers. We have developed two AU console programs (ADAPT_ORTHO_run and ADAPT_NMR_run) that run under TOPSPIN Versions 3.0 and higher. To illustrate the performance of the algorithm on a Bruker spectrometer, we tested one protein, chlorella ubiquitin (76 amino acid residues), that had been used with the Varian version: the Bruker and Varian versions achieved the same level of assignment completeness (98% in 20 h). As a more rigorous evaluation of the Bruker version, we tested a larger protein, BRPF1 bromodomain (114 amino acid residues), which yielded an automated assignment completeness of 86% in 55 h. Both experiments were carried out on a 500 MHz Bruker AVANCE III spectrometer equipped with a z-gradient 5 mm TCI probe. ADAPT-NMR is available at http://pine.nmrfam.wisc.edu/ADAPT-NMR in the form of pulse programs, the two AU programs, and instructions for installation and use. PMID:24091140

  4. Fast automated protein NMR data collection and assignment by ADAPT-NMR on Bruker spectrometers

    NASA Astrophysics Data System (ADS)

    Lee, Woonghee; Hu, Kaifeng; Tonelli, Marco; Bahrami, Arash; Neuhardt, Elizabeth; Glass, Karen C.; Markley, John L.

    2013-11-01

    ADAPT-NMR (Assignment-directed Data collection Algorithm utilizing a Probabilistic Toolkit in NMR) supports automated NMR data collection and backbone and side chain assignment for [U-13C, U-15N]-labeled proteins. Given the sequence of the protein and data for the orthogonal 2D 1H-15N and 1H-13C planes, the algorithm automatically directs the collection of tilted plane data from a variety of triple-resonance experiments so as to follow an efficient pathway toward the probabilistic assignment of 1H, 13C, and 15N signals to specific atoms in the covalent structure of the protein. Data collection and assignment calculations continue until the addition of new data no longer improves the assignment score. ADAPT-NMR was first implemented on Varian (Agilent) spectrometers [A. Bahrami, M. Tonelli, S.C. Sahu, K.K. Singarapu, H.R. Eghbalnia, J.L. Markley, PLoS One 7 (2012) e33173]. Because of broader interest in the approach, we present here a version of ADAPT-NMR for Bruker spectrometers. We have developed two AU console programs (ADAPT_ORTHO_run and ADAPT_NMR_run) that run under TOPSPIN Versions 3.0 and higher. To illustrate the performance of the algorithm on a Bruker spectrometer, we tested one protein, chlorella ubiquitin (76 amino acid residues), that had been used with the Varian version: the Bruker and Varian versions achieved the same level of assignment completeness (98% in 20 h). As a more rigorous evaluation of the Bruker version, we tested a larger protein, BRPF1 bromodomain (114 amino acid residues), which yielded an automated assignment completeness of 86% in 55 h. Both experiments were carried out on a 500 MHz Bruker AVANCE III spectrometer equipped with a z-gradient 5 mm TCI probe. ADAPT-NMR is available at http://pine.nmrfam.wisc.edu/ADAPT-NMR in the form of pulse programs, the two AU programs, and instructions for installation and use.

  5. Fast automated protein NMR data collection and assignment by ADAPT-NMR on Bruker spectrometers.

    PubMed

    Lee, Woonghee; Hu, Kaifeng; Tonelli, Marco; Bahrami, Arash; Neuhardt, Elizabeth; Glass, Karen C; Markley, John L

    2013-11-01

    ADAPT-NMR (Assignment-directed Data collection Algorithm utilizing a Probabilistic Toolkit in NMR) supports automated NMR data collection and backbone and side chain assignment for [U-(13)C, U-(15)N]-labeled proteins. Given the sequence of the protein and data for the orthogonal 2D (1)H-(15)N and (1)H-(13)C planes, the algorithm automatically directs the collection of tilted plane data from a variety of triple-resonance experiments so as to follow an efficient pathway toward the probabilistic assignment of (1)H, (13)C, and (15)N signals to specific atoms in the covalent structure of the protein. Data collection and assignment calculations continue until the addition of new data no longer improves the assignment score. ADAPT-NMR was first implemented on Varian (Agilent) spectrometers [A. Bahrami, M. Tonelli, S.C. Sahu, K.K. Singarapu, H.R. Eghbalnia, J.L. Markley, PLoS One 7 (2012) e33173]. Because of broader interest in the approach, we present here a version of ADAPT-NMR for Bruker spectrometers. We have developed two AU console programs (ADAPT_ORTHO_run and ADAPT_NMR_run) that run under TOPSPIN Versions 3.0 and higher. To illustrate the performance of the algorithm on a Bruker spectrometer, we tested one protein, chlorella ubiquitin (76 amino acid residues), that had been used with the Varian version: the Bruker and Varian versions achieved the same level of assignment completeness (98% in 20 h). As a more rigorous evaluation of the Bruker version, we tested a larger protein, BRPF1 bromodomain (114 amino acid residues), which yielded an automated assignment completeness of 86% in 55 h. Both experiments were carried out on a 500 MHz Bruker AVANCE III spectrometer equipped with a z-gradient 5 mm TCI probe. ADAPT-NMR is available at http://pine.nmrfam.wisc.edu/ADAPT-NMR in the form of pulse programs, the two AU programs, and instructions for installation and use.

  6. On the engineering design for systematic integration of agent-orientation in industrial automation.

    PubMed

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematical integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems.

  7. Potential of adaptive clinical trial designs in pharmacogenetic research.

    PubMed

    van der Baan, Frederieke H; Knol, Mirjam J; Klungel, Olaf H; Egberts, Antoine Cg; Grobbee, Diederick E; Roes, Kit C B

    2012-04-01

    Adaptive trial designs can be beneficial in pharmacogenetic research when prior uncertainty exists regarding the exact role and clinical relevance of genetic variability in drug response. This type of design enables us to learn about the effect of the genetic variability on drug response and to immediately use this information for the remainder of the study. For different types of adaptive trial designs, we discuss when and how the designs are suitable for pharmacogenetic research: adaptation of randomization, adaptation of patient enrollment and adaptive enrichment. To illustrate the potential benefits of an adaptive design over a fixed design, we simulated an adaptive trial based on the results of the IPASS trial. With a simple model we show that for this example an adaptive enrichment design would have led to a smaller trial, with less EGF receptor mutation-negative patients unnecessarily exposed to the drug, without compromising the α level or reducing power. PMID:22462749

  8. Software Would Largely Automate Design of Kalman Filter

    NASA Technical Reports Server (NTRS)

    Chuang, Jason C. H.; Negast, William J.

    2005-01-01

    Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of error states of the filter and tuning of filter parameters, which are timeconsuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.

  9. Valuation of design adaptability in aerospace systems

    NASA Astrophysics Data System (ADS)

    Fernandez Martin, Ismael

    As more information is brought into early stages of the design, more pressure is put on engineers to produce a reliable, high quality, and financially sustainable product. Unfortunately, requirements established at the beginning of a new project by customers, and the environment that surrounds them, continue to change in some unpredictable ways. The risk of designing a system that may become obsolete during early stages of production is currently tackled by the use of robust design simulation, a method that allows to simultaneously explore a plethora of design alternatives and requirements with the intention of accounting for uncertain factors in the future. Whereas this design technique has proven to be quite an improvement in design methods, under certain conditions, it fails to account for the change of uncertainty over time and the intrinsic value embedded in the system when certain design features are activated. This thesis introduces the concepts of adaptability and real options to manage risk foreseen in the face of uncertainty at early design stages. The method described herein allows decision-makers to foresee the financial impact of their decisions at the design level, as well as the final exposure to risk. In this thesis, cash flow models, traditionally used to obtain the forecast of a project's value over the years, were replaced with surrogate models that are capable of showing fluctuations on value every few days. This allowed a better implementation of real options valuation, optimization, and strategy selection. Through the option analysis model, an optimization exercise allows the user to obtain the best implementation strategy in the face of uncertainty as well as the overall value of the design feature. Here implementation strategy refers to the decision to include a new design feature in the system, after the design has been finalized, but before the end of its production life. 
The ability to do this in a cost efficient manner after the system

  10. [Automated recognition of quasars based on adaptive radial basis function neural networks].

    PubMed

    Zhao, Mei-Fang; Luo, A-Li; Wu, Fu-Chao; Hu, Zhan-Yi

    2006-02-01

    Recognizing and certifying quasars through the research on spectra is an important method in the field of astronomy. This paper presents a novel adaptive method for the automated recognition of quasars based on the radial basis function neural networks (RBFN). The proposed method is composed of the following three parts: (1) The feature space is reduced by the PCA (the principal component analysis) on the normalized input spectra; (2) An adaptive RBFN is constructed and trained in this reduced space. At first, the K-means clustering is used for the initialization, then based on the sum of squares errors and a gradient descent optimization technique, the number of neurons in the hidden layer is adaptively increased to improve the recognition performance; (3) The quasar spectra recognition is effectively carried out by the above trained RBFN. The author's proposed adaptive RBFN is shown to be able to not only overcome the difficulty of selecting the number of neurons in hidden layer of the traditional RBFN algorithm, but also increase the stability and accuracy of recognition of quasars. Besides, the proposed method is particularly useful for automatic voluminous spectra processing produced from a large-scale sky survey project, such as our LAMOST, due to its efficiency.

  11. Designing a VMEbus FDDI adapter card

    NASA Astrophysics Data System (ADS)

    Venkataraman, Raman

    1992-03-01

    This paper presents a system architecture for a VMEbus FDDI adapter card containing a node core, FDDI block, frame buffer memory and system interface unit. Most of the functions of the PHY and MAC layers of FDDI are implemented with National's FDDI chip set and the SMT implementation is simplified with a low cost microcontroller. The factors that influence the system bus bandwidth utilization and FDDI bandwidth utilization are the data path and frame buffer memory architecture. The VRAM based frame buffer memory has two sections - - LLC frame memory and SMT frame memory. Each section with an independent serial access memory (SAM) port provides an independent access after the initial data transfer cycle on the main port and hence, the throughput is maximized on each port of the memory. The SAM port simplifies the system bus master DMA design and the VMEbus interface can be designed with low-cost off-the-shelf interface chips.

  12. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car-brakes to plant-control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warshaw Airbus crash, was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like illegal "cloning" of smart-cards of D2GSM handies, or the extraction of (secret) passwords from German T-online users show that also in this area serious flaws can happen. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to real safe and secure systems. In this paper, we will have a look, in how far automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  13. Designing effective human-automation-plant interfaces: a control-theoretic perspective.

    PubMed

    Jamieson, Greg A; Vicente, Kim J

    2005-01-01

    In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.

  14. Automating the design of informative sequences of sensory stimuli

    PubMed Central

    Lewi, Jeremy; Schneider, David M.; Woolley, Sarah M. N.; Paninski, Liam

    2011-01-01

    Adaptive stimulus design methods can potentially improve the efficiency of sensory neurophysiology experiments significantly; however, designing optimal stimulus sequences in real time remains a serious technical challenge. Here we describe two approximate methods for generating informative stimulus sequences: the first approach provides a fast method for scoring the informativeness of a batch of specific potential stimulus sequences, while the second method attempts to compute an optimal stimulus distribution from which the experimenter may easily sample. We apply these methods to single-neuron spike train data recorded from the auditory midbrain of zebra finches, and demonstrate that the resulting stimulus sequences do in fact provide more information about neuronal tuning in a shorter amount of time than do more standard experimental designs. PMID:20556641
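
    The batch-scoring idea can be caricatured in a few lines: for a simple Poisson neuron model with firing rate exp(w·x), score each candidate stimulus by the Fisher information it would contribute at the current parameter estimate, and present the highest-scoring one. This is a toy sketch under assumed model and function names, not the paper's mutual-information machinery:

```python
import numpy as np

def fisher_score(x, w_hat):
    """Scalar Fisher-information summary of stimulus x for a Poisson
    neuron with rate exp(w . x): I(x) = exp(w_hat . x) * ||x||^2."""
    return float(np.exp(w_hat @ x) * (x @ x))

def pick_most_informative(candidates, w_hat):
    """Greedy 'infomax' choice: present the candidate stimulus whose
    expected response is most informative about the tuning parameters."""
    scores = [fisher_score(x, w_hat) for x in candidates]
    return candidates[int(np.argmax(scores))]
```

    The real methods described in the paper additionally account for posterior uncertainty over w and for constraints on stimulus sequences, which this sketch omits.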

  15. The automated design of materials far from equilibrium

    NASA Astrophysics Data System (ADS)

    Miskin, Marc Z.

    Automated design is emerging as a powerful concept in materials science. By combining computer algorithms, simulations, and experimental data, new techniques are being developed that start with high-level functional requirements and identify the ideal materials that achieve them. This represents a radically different picture of how materials become functional in which technological demand drives material discovery, rather than the other way around. At the frontiers of this field, materials systems previously considered too complicated can start to be controlled and understood. Particularly promising are materials far from equilibrium. Material robustness, high strength, self-healing and memory are properties displayed by several materials systems that are intrinsically out of equilibrium. These and other properties could be revolutionary, provided they can first be controlled. This thesis conceptualizes and implements a framework for designing materials that are far from equilibrium. We show how, even in the absence of a complete physical theory, design from the top down is possible and lends itself to producing physical insight. As a prototype system, we work with granular materials: collections of athermal, macroscopic, identical objects, since these materials function both as an essential component of industrial processes and as a model system for many non-equilibrium states of matter. We show that by placing granular materials in the context of design, benefits emerge simultaneously for fundamental and applied interests. As first steps, we use our framework to design granular aggregates with extreme properties like high stiffness, and softness. We demonstrate control over nonlinear effects by producing exotic aggregates that stiffen under compression. Expanding on our framework, we conceptualize new ways of thinking about material design when automatic discovery is possible. We show how to build rules that link particle shapes to arbitrary granular packing

  16. ARTS: automated randomization of multiple traits for study design

    PubMed Central

    Maienschein-Cline, Mark; Lei, Zhengdeng; Gardeux, Vincent; Abbasi, Taimur; Machado, Roberto F.; Gordeuk, Victor; Desai, Ankit A.; Saraf, Santosh; Bahroos, Neil; Lussier, Yves

    2014-01-01

    Summary: Collecting data from large studies on high-throughput platforms, such as microarray or next-generation sequencing, typically requires processing samples in batches. There are often systematic but unpredictable biases from batch-to-batch, so proper randomization of biologically relevant traits across batches is crucial for distinguishing true biological differences from experimental artifacts. When a large number of traits are biologically relevant, as is common for clinical studies of patients with varying sex, age, genotype and medical background, proper randomization can be extremely difficult to prepare by hand, especially because traits may affect biological inferences, such as differential expression, in a combinatorial manner. Here we present ARTS (automated randomization of multiple traits for study design), which aids researchers in study design by automatically optimizing batch assignment for any number of samples, any number of traits and any batch size. Availability and implementation: ARTS is implemented in Perl and is available at github.com/mmaiensc/ARTS. ARTS is also available in the Galaxy Tool Shed, and can be used at the Galaxy installation hosted by the UIC Center for Research Informatics (CRI) at galaxy.cri.uic.edu. Contact: mmaiensc@uic.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24493035
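
    The core task ARTS automates, assigning samples to batches so that every trait is spread evenly, can be sketched as a small random-restart search. This is a toy stand-in for ARTS's optimizer (the actual tool is implemented in Perl); the imbalance score and function names below are illustrative assumptions:

```python
import random
from collections import Counter
from statistics import pvariance

def imbalance(assignment, samples, trait_names, n_batches):
    """Score how unevenly each trait value is spread across batches
    (lower is better): sum of per-value count variances across batches."""
    total = 0.0
    for trait in trait_names:
        for value in {s[trait] for s in samples}:
            counts = [0] * n_batches
            for batch, s in zip(assignment, samples):
                if s[trait] == value:
                    counts[batch] += 1
            total += pvariance(counts)
    return total

def randomize(samples, trait_names, n_batches, n_iter=500, seed=0):
    """Search for a batch assignment that balances all traits. Equal batch
    sizes are preserved by shuffling a fixed multiset of batch labels."""
    rng = random.Random(seed)
    base = [i % n_batches for i in range(len(samples))]
    best, best_score = base, imbalance(base, samples, trait_names, n_batches)
    for _ in range(n_iter):
        cand = base[:]
        rng.shuffle(cand)
        score = imbalance(cand, samples, trait_names, n_batches)
        if score < best_score:
            best, best_score = cand, score
    return best
```

    A real tool must also handle combinatorial trait interactions and uneven batch sizes, which ARTS addresses and this sketch does not.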

  17. Network inference via adaptive optimal design

    PubMed Central

    2012-01-01

    Background Current research in network reverse engineering for genetic or metabolic networks very often does not include a proper experimental and/or input design. In this paper we address this issue in more detail and suggest a method that includes an iterative design of experiments based on the most recent data that become available. The presented approach allows a reliable reconstruction of the network and addresses an important issue, i.e., the analysis and the propagation of uncertainties as they exist in both the data and in our own knowledge. These two types of uncertainties have their immediate ramifications for the uncertainties in the parameter estimates and, hence, are taken into account from the very beginning of our experimental design. Findings The method is demonstrated for two small networks that include a genetic network for mRNA synthesis and degradation and an oscillatory network describing a molecular network underlying adenosine 3’-5’ cyclic monophosphate (cAMP) as observed in populations of Dictyostelium cells. In both cases a substantial reduction in parameter uncertainty was observed. Extension to larger scale networks is possible but needs a more rigorous parameter estimation algorithm that includes sparsity as a constraint in the optimization procedure. Conclusion We conclude that a careful experiment design very often (but not always) pays off in terms of reliability in the inferred network topology. For large scale networks a better parameter estimation algorithm is required that includes sparsity as an additional constraint. These algorithms are available in the literature and can also be used in an adaptive optimal design setting as demonstrated in this paper. PMID:22999252
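
    One iteration of such an uncertainty-driven design loop can be illustrated with classical D-optimality for a linear model y = Xw + noise: choose the next input that most increases det(XᵀX), i.e. that shrinks the parameter-estimate covariance the most. This is a minimal stand-in for the paper's iterative experiment design, with hypothetical names:

```python
import numpy as np

def d_optimal_next(X_done, candidates):
    """Pick the candidate input row that maximizes det(X^T X) after being
    appended to the design matrix X_done (D-optimal greedy step)."""
    best, best_det = None, -np.inf
    for x in candidates:
        Xn = np.vstack([X_done, x])
        d = np.linalg.det(Xn.T @ Xn)
        if d > best_det:
            best, best_det = x, d
    return best
```

    Intuitively, the rule favors inputs along directions the existing experiments have not yet probed, which is why it selects the orthogonal candidate in the usage below.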

  18. Test Information Targeting Strategies for Adaptive Multistage Testing Designs.

    ERIC Educational Resources Information Center

    Luecht, Richard M.; Burgin, William

    Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…

  19. The paradox of automation and the role of human centered design

    SciTech Connect

    Bennett, C.T.; Banks, W.W.

    1993-01-01

    The purpose of this paper is to examine the philosophy of Human Centered Automation (HCA), and how the Nuclear Power Industry (NPI) can take advantage of the history of HCA in other fields. The integration of humans into complex, automated systems has not received the design priority that it should. Operators seem to be viewed almost like any other problem: design the right automation, and the problem will go away. Such a philosophy prevents the human from being integrated totally into the control process. The nuclear power industry is by no means unique in its need to solve the problem of non-linear human performance. Designers in other fields are struggling with the same issue: "What should the role of humans be in a complex, automated system?" This report presents the philosophy of HCA, examines the paradox of automation, and discusses the role that human centered design has played in dealing with it.

  1. Matters concerned with designing distributed systems for automated control of electrical equipment at power stations

    NASA Astrophysics Data System (ADS)

    Gorozhankin, P. A.; Krasnova, M. E.

    2011-10-01

    Matters concerned with developing the working designs of systems for automated control of electrical equipment are discussed. Basic technical requirements for computerized automation facilities are formulated from the viewpoint of ensuring the required scope of functions and fault tolerance, and proposals for the layout and placement of these facilities are suggested. A special section devoted to protection of automated process control systems from computer viruses is given.

  2. Automated segmentation of retinal pigment epithelium cells in fluorescence adaptive optics images.

    PubMed

    Rangel-Fonseca, Piero; Gómez-Vieyra, Armando; Malacara-Hernández, Daniel; Wilson, Mario C; Williams, David R; Rossi, Ethan A

    2013-12-01

    Adaptive optics (AO) imaging methods allow the histological characteristics of retinal cell mosaics, such as photoreceptors and retinal pigment epithelium (RPE) cells, to be studied in vivo. The high-resolution images obtained with ophthalmic AO imaging devices are rich with information that is difficult and/or tedious to quantify using manual methods. Thus, robust, automated analysis tools that can provide reproducible quantitative information about the cellular mosaics under examination are required. Automated algorithms have been developed to detect the position of individual photoreceptor cells; however, most of these methods are not well suited for characterizing the RPE mosaic. We have developed an algorithm for RPE cell segmentation and show its performance here on simulated and real fluorescence AO images of the RPE mosaic. Algorithm performance was compared to manual cell identification and yielded better than 91% correspondence. This method can be used to segment RPE cells for morphometric analysis of the RPE mosaic and speed the analysis of both healthy and diseased RPE mosaics.

  3. Automated Design of Noise-Minimal, Safe Rotorcraft Trajectories

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Venable, K. Brent; Lindsay, James

    2012-01-01

    NASA and the international community are investing in the development of a commercial transportation infrastructure that includes the increased use of rotorcraft, specifically helicopters and aircraft such as 40-passenger civil tiltrotors. Rotorcraft have a number of advantages over fixed wing aircraft, primarily in not requiring direct access to the primary fixed wing runways. As such they can operate at an airport without directly interfering with major air carrier and commuter aircraft operations. However, there is significant concern over the impact of noise on the communities surrounding the transportation facilities. In this paper we propose to address the rotorcraft noise problem by exploiting powerful search techniques coming from artificial intelligence, coupled with simulation and field tests, to design trajectories that are expected to reduce the amount of ground noise generated. This paper investigates the use of simulation based on predictive physical models to facilitate the search for low-noise trajectories using a class of automated search algorithms called local search. A novel feature of this approach is the ability to incorporate constraints into the problem formulation that address passenger safety and comfort.
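
    The local-search skeleton the paper builds on is simple to state: repeatedly perturb the current trajectory and keep the neighbor only if it satisfies the safety/comfort constraints and the noise model predicts it is quieter. The sketch below uses a hypothetical waypoint representation and cost function standing in for the predictive ground-noise simulation:

```python
import random

def local_search(initial, neighbor, noise_cost, is_safe, n_iter=2000, seed=0):
    """Hill-climbing over trajectories: accept a perturbed trajectory only if
    it is safe and has lower predicted ground-noise cost."""
    rng = random.Random(seed)
    best = list(initial)
    for _ in range(n_iter):
        cand = neighbor(best, rng)
        if is_safe(cand) and noise_cost(cand) < noise_cost(best):
            best = cand
    return best

def perturb_one_waypoint(traj, rng):
    """Toy neighborhood move: nudge one waypoint value by up to +/-0.1."""
    cand = list(traj)
    i = rng.randrange(len(cand))
    cand[i] += rng.uniform(-0.1, 0.1)
    return cand
```

    In the paper the cost comes from physics-based noise prediction and the constraints encode passenger safety and comfort; here both are placeholders.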

  4. The anaerobic SBR process: basic principles for design and automation.

    PubMed

    Ruiz, C; Torrijos, M; Sousbie, P; Lebrato Martinez, J; Moletta, R

    2001-01-01

    This study has determined the purification performance and the basic principles for the design of an anaerobic SBR (ASBR) to be used to treat wastewater generated in the food industries. Two ASBRs were set up: one was fed with slaughterhouse effluent at low concentration, the other with concentrated dairy wastewater. The maximum loading rate applied should not exceed 4.5 g of COD/L/day for the dilute effluent and 6 g of COD/L/day for the concentrated effluent. At higher loading rates, the reactors become difficult to operate, mainly because of sludge removal problems, and purification efficiency declines. A detailed study of the kinetics (TOC, VFA, rate of biogas production) throughout one treatment cycle led to the development of a simple control strategy based on the monitoring of the biogas production rate, which was then applied to the reactor treating the dairy wastewater. After automation, the reactor worked free of problems at an average pollution load of 5.4 g of COD/L/day.
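
    A rate-based control strategy of the kind described can be reduced to a single decision rule: end the react phase once the biogas production rate has fallen well below the peak rate observed in the cycle, signalling that most of the degradable COD has been consumed. The threshold fraction below is an assumption for illustration, not a value from the paper:

```python
def react_phase_done(biogas_rates, frac=0.2, min_samples=3):
    """Simplified cycle control: return True once the latest biogas
    production rate drops below `frac` of this cycle's peak rate."""
    if len(biogas_rates) < min_samples:
        return False
    return biogas_rates[-1] < frac * max(biogas_rates)
```

    In practice such a controller would also enforce minimum and maximum cycle durations as safeguards against sensor noise.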

  5. Adaptive information design for outdoor augmented reality.

    PubMed

    Neuhöfer, Jan A; Govaers, Felix; El Mokni, Hichem; Alexander, Thomas

    2012-01-01

    Augmented Reality focuses on the enrichment of the user's natural field of view by consistent integration of text, symbols and interactive three-dimensional objects in real time. Placing virtual objects directly into the user's view in a natural context empowers highly dynamic applications. On the other hand, this necessitates deliberate choice of information design and density, in particular for deployment in hazardous environments like military combat scenarios. As the amount of information needed is not foreseeable and strongly depends on the individual mission, an appropriate system must offer adequate adaptation capabilities. The paper presents a prototypical, vehicle-mountable Augmented Reality vision system, designed for enhancing situation awareness in stressful urban warfare scenarios. Tracking, as one of the most crucial challenges for outdoor Augmented Reality, is accomplished by means of a Differential-GPS approach while the type of display to attach can be modified, ranging from ocular displays to standard LCD mini-screens. The overall concept also includes the visualization of friendly troops (blue forces), for which a multi-sensor tracking approach has been chosen. As a main feature, the system allows switching between different information categories, focusing on friendly, hostile, unidentified or neutral data. Results of an empirical study on the superiority of an in-view navigation cue approach conclude the paper.

  6. Effects of a psychophysiological system for adaptive automation on performance, workload, and the event-related potential P300 component

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.

    2003-01-01

    The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.
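
    The engagement index used in this line of research is the EEG band-power ratio beta/(alpha+theta). A simplified negative-feedback allocation rule in that spirit is sketched below; the averaging window and the 10% thresholds are assumptions for illustration, not the study's exact switching logic:

```python
def engagement_index(beta_power, alpha_power, theta_power):
    """EEG engagement index: beta / (alpha + theta)."""
    return beta_power / (alpha_power + theta_power)

def next_mode(recent_indices, baseline):
    """Negative-feedback allocation sketch: when engagement drifts low,
    return the task to manual control to re-engage the operator; when it
    runs high, hand the task to automation to shed load."""
    idx = sum(recent_indices) / len(recent_indices)
    if idx < 0.9 * baseline:
        return "manual"
    if idx > 1.1 * baseline:
        return "auto"
    return "unchanged"
```

    A yoked control, as used in the study, would replay one participant's switch sequence to another participant regardless of the second participant's own EEG.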

  7. Assessing Adaptive Instructional Design Tools and Methods in ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project within the Information Society Technologies program that is providing design methods and tools to guide a training designer according to the latest cognitive science and standardization principles. ADAPT[IT] addresses users in two significantly…

  8. Designing and Implementing Effective Adapted Physical Education Programs

    ERIC Educational Resources Information Center

    Kelly, Luke E.

    2011-01-01

    "Designing and Implementing Effective Adapted Physical Education Programs" was written to assist adapted and general physical educators who are dedicated to ensuring that the physical and motor needs of all their students are addressed in physical education. While it is anticipated that adapted physical educators, where available, will typically…

  9. Automated systems for creative processes in scientific research, design, and robotics

    SciTech Connect

    Glushkov, V.M.; Stognii, A.A.; Biba, I.G.; Vashchenko, N.D.; Galagan, N.I.; Gladun, V.P.; Rabinovich, Z.L.; Sakunov, I.A.; Khomenko, L.V.

    1981-11-01

    The authors give a general description of software that was developed to automate the creative processes in scientific research, design and robotics. The systems APROS, SSP, Analizator-ES and Analizator are discussed. 12 references.

  10. Design and Implementation of an Open, Interoperable Automated Demand Response Infrastructure

    SciTech Connect

    Piette, Mary Ann; Kiliccote, Sila; Ghatikar, Girish

    2007-10-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automating demand response (DR). Automating DR allows greater levels of participation and improves the reliability and repeatability of the demand response provided by customer facilities. Automated DR systems have been deployed for critical peak pricing and demand bidding and are being designed for real time pricing. The system is designed to generate, manage, and track DR signals from utilities and Independent System Operators (ISOs) to aggregators and end-use customers and their control systems.

  11. Adaptive trial designs: a review of barriers and opportunities.

    PubMed

    Kairalla, John A; Coffey, Christopher S; Thomann, Mitchell A; Muller, Keith E

    2012-01-01

    Adaptive designs allow planned modifications based on data accumulating within a study. The promise of greater flexibility and efficiency stimulates increasing interest in adaptive designs from clinical, academic, and regulatory parties. When adaptive designs are used properly, efficiencies can include a smaller sample size, a more efficient treatment development process, and an increased chance of correctly answering the clinical question of interest. However, improper adaptations can lead to biased studies. A broad definition of adaptive designs allows for countless variations, which creates confusion as to the statistical validity and practical feasibility of many designs. Determining properties of a particular adaptive design requires careful consideration of the scientific context and statistical assumptions. We first review several adaptive designs that garner the most current interest. We focus on the design principles and research issues that lead to particular designs being appealing or unappealing in particular applications. We separately discuss exploratory and confirmatory stage designs in order to account for the differences in regulatory concerns. We include adaptive seamless designs, which combine stages in a unified approach. We also highlight a number of applied areas, such as comparative effectiveness research, that would benefit from the use of adaptive designs. Finally, we describe a number of current barriers and provide initial suggestions for overcoming them in order to promote wider use of appropriate adaptive designs. Given the breadth of the coverage, all mathematical and most implementation details are omitted for the sake of brevity. However, the interested reader will find that we provide current references to focused reviews and original theoretical sources which lead to details of the current state of the art in theory and practice. PMID:22917111

  12. Model-Based Design of Air Traffic Controller-Automation Interaction

    NASA Technical Reports Server (NTRS)

    Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.

  13. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    PubMed

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    In the last decade or so, interest in adaptive design clinical trials has gradually been directed towards their use in regulatory submissions by pharmaceutical drug sponsors to evaluate investigational new drugs. Methodological advances of adaptive designs are abundant in the statistical literature since the 1970s. The adaptive design paradigm has been enthusiastically perceived to increase the efficiency and to be more cost-effective than the fixed design paradigm for drug development. Much interest in adaptive designs is in those studies with two stages, where stage 1 is exploratory and stage 2 depends upon stage 1 results, but where the data of both stages will be combined to yield statistical evidence for use as that of a pivotal registration trial. It was not until the recent release of the US Food and Drug Administration Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics (2010) that the boundaries of flexibility for adaptive designs were specifically considered for regulatory purposes, including what are exploratory goals, and what are the goals of adequate and well-controlled (A&WC) trials (2002). The guidance carefully described these distinctions in an attempt to minimize the confusion between the goals of preliminary learning phases of drug development, which are inherently substantially uncertain, and the definitive inference-based phases of drug development. In this paper, in addition to discussing some aspects of adaptive designs in a confirmatory study setting, we underscore the value of adaptive designs when used in exploratory trials to improve planning of subsequent A&WC trials. One type of adaptation that is receiving attention is the re-estimation of the sample size during the course of the trial. We refer to this type of adaptation as an adaptive statistical information design. Specifically, a case example is used to illustrate how challenging it is to plan a confirmatory adaptive statistical information

  14. Robust design of configurations and parameters of adaptable products

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Chen, Yongliang; Xue, Deyi; Gu, Peihua

    2014-03-01

    An adaptable product can satisfy different customer requirements by changing its configuration and parameter values during the operation stage. Design of adaptable products aims at reducing the environmental impact through replacement of multiple different products with single adaptable ones. Due to the complex architecture, multiple functional requirements, and changes of product configurations and parameter values in operation, the impact of uncertainties on the functional performance measures needs to be considered in design of adaptable products. In this paper, a robust design approach is introduced to identify the optimal design configuration and parameters of an adaptable product whose functional performance measures are the least sensitive to uncertainties. An adaptable product in this paper is modeled by both configurations and parameters. At the configuration level, methods to model different product configuration candidates in design and different product configuration states in operation to satisfy design requirements are introduced. At the parameter level, four types of product/operating parameters and relations among these parameters are discussed. A two-level optimization approach is developed to identify the optimal design configuration and its parameter values of the adaptable product. A case study is implemented to illustrate the effectiveness of the newly developed robust adaptable design method.
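
    The two-level idea can be illustrated with an exhaustive toy search: the outer level enumerates configurations, the inner level evaluates each parameter setting under a set of uncertainty perturbations, and the winner is the (configuration, parameters) pair whose performance spread is smallest. All names and the spread criterion below are illustrative assumptions, not the paper's formulation:

```python
import itertools

def robust_design(configs, param_grid, perf, perturbations):
    """Two-level robust search sketch: return the (config, params) pair whose
    performance varies least when the parameter is perturbed by each value
    in `perturbations` (a proxy for insensitivity to uncertainty)."""
    best, best_spread = None, float("inf")
    for cfg, p in itertools.product(configs, param_grid):
        vals = [perf(cfg, p + d) for d in perturbations]
        spread = max(vals) - min(vals)
        if spread < best_spread:
            best, best_spread = (cfg, p), spread
    return best
```

    A realistic implementation would trade the exhaustive product for the paper's optimization approach and would combine robustness with nominal performance rather than minimizing spread alone.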

  15. Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2006-01-01

    Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means for obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually non-existent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.

  16. Situation Awareness Implications of Adaptive Automation of Air Traffic Controller Information Processing Functions

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; McClernon, Christopher K.; Perry, Carlene M.; Segall, Noa

    2004-01-01

    The goal of this research was to define a measure of situation awareness (SA) in an air traffic control (ATC) task and to assess the influence of adaptive automation (AA) of various information processing functions on controller perception, comprehension and projection. The measure was also to serve as a basis for defining and developing an approach to triggering dynamic control allocations, as part of AA, based on controller SA. To achieve these objectives, an enhanced version of an ATC simulation (Multitask©) was developed for use in two human factors experiments. The simulation captured the basic functions of Terminal Radar Approach Control (TRACON) and was capable of presenting to operators four different modes of control, including information acquisition, information analysis, decision making and action implementation automation, as well as a completely manual control mode. The SA measure that was developed as part of the research was based on the Situation Awareness Global Assessment Technique (SAGAT), previous goal-directed task analyses of enroute control and TRACON, and a separate cognitive task analysis on the ATC simulation. The results of the analysis on Multitask were used as a basis for formulating SA queries as part of the SAGAT-based approach to measuring controller SA, which was used in the experiments. A total of 16 subjects were recruited for both experiments. Half the subjects were used in Experiment #1, which focused on assessing the sensitivity and reliability of the SA measurement approach in the ATC simulation. Comparisons were made of manual versus automated control. The remaining subjects were used in the second experiment, which was intended to more completely describe the SA implications of AA applied to specific controller information processing functions, and to describe how the measure could ultimately serve as a trigger of dynamic function allocations in the application of AA to ATC. Comparisons were made of the

  17. A Testlet Assembly Design for Adaptive Multistage Tests

    ERIC Educational Resources Information Center

    Luecht, Richard; Brumfield, Terry; Breithaupt, Krista

    2006-01-01

    This article describes multistage tests and some practical test development considerations related to the design and implementation of a multistage test, using the Uniform CPA (certified public accountant) Examination as a case study. The article further discusses the use of automated test assembly procedures in an operational context to produce…

  19. The use of adaptable automation: Effects of extended skill lay-off and changes in system reliability.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain

    2017-01-01

    This experiment aimed to examine how skill lay-off and system reliability would affect operator behaviour in a simulated work environment under wide-range and large-choice adaptable automation comprising six different levels. Twenty-four participants were tested twice during a 2-hr testing session, with the second session taking place 8 months after the first. In the middle of the second testing session, system reliability changed. The results showed that after the retention interval trust increased and self-confidence decreased. Complacency was unaffected by the lay-off period. Diagnostic speed slowed down after the retention interval but diagnostic accuracy was maintained. No difference between experimental conditions was found for automation management behaviour (i.e. level of automation chosen and frequency of switching between levels). There were few effects of system reliability. Overall, the findings showed that subjective measures were more sensitive to the impact of skill lay-off than objective behavioural measures. PMID:27633244

  20. Automated Detection of Microaneurysms Using Scale-Adapted Blob Analysis and Semi-Supervised Learning

    SciTech Connect

    Adal, Kedir M.; Sidebe, Desire; Ali, Sharib; Chaum, Edward; Karnowski, Thomas Paul; Meriaudeau, Fabrice

    2014-01-07

    Despite several attempts, automated detection of microaneurysms (MAs) from digital fundus images remains an open issue. This is due to the subtle appearance of MAs against the surrounding tissue. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs in an image, and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are then introduced to characterize these blob regions. A semi-supervised learning approach, which requires only a few manually annotated learning examples, is also proposed to train a classifier to detect true MAs. The developed system is built using only a few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the-art techniques as well as the applicability of the proposed features to the analysis of fundus images.
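
The local-scale selection idea can be sketched with a scale-normalized Laplacian-of-Gaussian response: a blob of unknown size produces its strongest normalized response near its own scale. This is a generic illustration of the technique, not the authors' implementation, and the synthetic image is invented:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Scale-adapted blob analysis sketch: evaluate the scale-normalized LoG
# (sigma^2 * LoG) over a set of candidate scales and keep the scale with
# the strongest absolute response.

def best_blob_scale(image, sigmas):
    """Return the sigma with the strongest scale-normalized LoG response."""
    responses = [np.abs(sigma**2 * gaussian_laplace(image, sigma)).max()
                 for sigma in sigmas]
    return sigmas[int(np.argmax(responses))]

# Synthetic fundus-like patch: a small dark Gaussian blob (sigma = 3 px)
# on a flat bright background.
y, x = np.mgrid[0:64, 0:64]
img = 1.0 - 0.8 * np.exp(-((x - 32)**2 + (y - 32)**2) / (2 * 3.0**2))
print(best_blob_scale(img, [1.0, 2.0, 3.0, 6.0, 10.0]))
```

Descriptors computed at the selected scale can then be fed to the semi-supervised classifier the abstract describes.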

  1. A framework for automated contour quality assurance in radiation therapy including adaptive techniques

    NASA Astrophysics Data System (ADS)

    Altman, M. B.; Kavanaugh, J. A.; Wooten, H. O.; Green, O. L.; DeWees, T. A.; Gay, H.; Thorstad, W. L.; Li, H.; Mutic, S.

    2015-07-01

    Contouring of targets and normal tissues is one of the largest sources of variability in radiation therapy treatment plans. Contours thus require a time-intensive and error-prone quality assurance (QA) evaluation, limitations which also impair the facilitation of adaptive radiotherapy (ART). Here, an automated system for contour QA is developed using historical data (the ‘knowledge base’). A pilot study was performed with a knowledge base derived from 9 contours each from 29 head-and-neck treatment plans. Size, shape, relative position, and other clinically relevant metrics and heuristically derived rules are determined. Metrics are extracted from input patient data and compared against rules determined from the knowledge base; a computer-learning component allows the metrics to evolve with more input data, including patient-specific data for ART. Nine additional plans containing 42 unique contouring errors were analyzed. 40/42 errors were detected, as were 9 false positives. The results of this study imply that knowledge-based contour QA could enhance the safety and effectiveness of RT treatment plans as well as increase the efficiency of the treatment planning process, reducing labor and the cost of therapy for patients.
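
A minimal sketch of the metric-versus-knowledge-base comparison described above. The metric names, values, and the 3-sigma flagging rule are illustrative assumptions, not the paper's actual rules:

```python
# Knowledge-based contour QA sketch: derive per-metric acceptance intervals
# from historical contours, then flag any new-contour metric that falls
# outside its interval. Metric names and data are invented.

def build_rules(knowledge_base, k=3.0):
    """Map each metric to a (low, high) interval of mean +/- k * std."""
    rules = {}
    for metric, values in knowledge_base.items():
        mean = sum(values) / len(values)
        std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
        rules[metric] = (mean - k * std, mean + k * std)
    return rules

def qa_flags(contour_metrics, rules):
    """Return the metrics of a new contour that violate their rule."""
    return [m for m, v in contour_metrics.items()
            if not (rules[m][0] <= v <= rules[m][1])]

kb = {"volume_cc": [10.2, 11.0, 9.8, 10.5, 10.1],
      "centroid_z_mm": [50, 52, 51, 49, 50]}
rules = build_rules(kb)
print(qa_flags({"volume_cc": 25.0, "centroid_z_mm": 50.5}, rules))  # ['volume_cc']
```

The "computer-learning component" in the abstract would correspond to re-deriving these intervals as new (including patient-specific) contours enter the knowledge base.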

  2. Adaptive Design of Confirmatory Trials: Advances and Challenges

    PubMed Central

    Lai, Tze Leung; Lavori, Philip W.; Tsang, Ka Wai

    2015-01-01

    The past decade witnessed major developments in innovative designs of confirmatory clinical trials, and adaptive designs represent the most active area of these developments. We give an overview of the developments and associated statistical methods in several classes of adaptive designs of confirmatory trials. We also discuss their statistical difficulties and implementation challenges, and show how these problems are connected to other branches of mainstream Statistics, which we then apply to resolve the difficulties and bypass the bottlenecks in the development of adaptive designs for the next decade. PMID:26079372

  3. Designing of smart home automation system based on Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-03-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages, and they are well suited to academic research. This paper proposes a method for implementing a Raspberry Pi-based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access a graph of total power consumption with respect to time from anywhere in the world using their Dropbox account. An Android application has been developed to channel the monitoring and controlling of home appliances remotely. This application controls the operating pins of the Raspberry Pi: pressing the corresponding key turns any desired appliance "on" or "off". Systems can range from simple room lighting control to smart microcontroller-based hybrid systems incorporating several additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through a cloud-based data storage protocol, reliability, energy efficiency, etc.
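
A runnable sketch of the appliance-switching logic described above. The pin numbers, appliance names, and the FakeGPIO stand-in are invented for illustration; a real deployment would drive actual Raspberry Pi GPIO pins through a library such as RPi.GPIO:

```python
# Home-automation switching sketch: map each appliance to a GPIO pin and
# translate an app command ('on'/'off') into a pin level. A plain dict
# stands in for the hardware so the logic runs anywhere.

APPLIANCE_PINS = {"lamp": 17, "fan": 27, "heater": 22}  # hypothetical wiring

class FakeGPIO:
    """Stand-in for a GPIO driver; records the last level set on each pin."""
    def __init__(self):
        self.state = {}
    def output(self, pin, level):
        self.state[pin] = level

def handle_command(gpio, appliance, action):
    """Mirror of an app keypress: 'on' -> logic high, anything else -> low."""
    pin = APPLIANCE_PINS[appliance]
    gpio.output(pin, 1 if action == "on" else 0)
    return pin

gpio = FakeGPIO()
handle_command(gpio, "lamp", "on")
handle_command(gpio, "fan", "off")
print(gpio.state)  # {17: 1, 27: 0}
```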

  4. Application of Adaptive Autopilot Designs for an Unmanned Aerial Vehicle

    NASA Technical Reports Server (NTRS)

    Shin, Yoonghyun; Calise, Anthony J.; Motter, Mark A.

    2005-01-01

    This paper summarizes the application of two adaptive approaches to autopilot design, and presents an evaluation and comparison of the two approaches in simulation for an unmanned aerial vehicle. One approach employs two-stage dynamic inversion and the other employs feedback dynamic inversions based on a command augmentation system. Both are augmented with neural network based adaptive elements. The approaches permit adaptation to both parametric uncertainty and unmodeled dynamics, and incorporate a method that permits adaptation during periods of control saturation. Simulation results for an FQM-117B radio controlled miniature aerial vehicle are presented to illustrate the performance of the neural network based adaptation.

  5. Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.; Roth, Emilie

    2014-01-01

    Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced new classes of failure modes and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex inter-related factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost, 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains, and have resulted in increased efforts to develop training, procedures, regulations and guidance material (CAST, 2008; IAEA, 2001; FAA, 2013; ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being presented. We will describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, and describe some approaches that could be used to address these challenges. We will draw from experience with the aviation, spaceflight and nuclear power domains.

  6. Mission Design Evaluation Using Automated Planning for High Resolution Imaging of Dynamic Surface Processes from the ISS

    NASA Technical Reports Server (NTRS)

    Knight, Russell; Donnellan, Andrea; Green, Joseph J.

    2013-01-01

    A challenge for any proposed mission is to demonstrate convincingly that the proposed systems will in fact deliver the science promised. Funding agencies and mission design personnel are becoming ever more skeptical of the abstractions that form the basis of the current state of the practice with respect to approximating science return. To address this, we have been using automated planning and scheduling technology to provide actual coverage campaigns that provide better predictive performance with respect to science return for a given mission design and set of mission objectives given implementation uncertainties. Specifically, we have applied an adaptation of ASPEN and SPICE to the Eagle-Eye domain that demonstrates the performance of the mission design with respect to coverage of science imaging targets that address climate change and disaster response. Eagle-Eye is an Earth-imaging telescope that has been proposed to fly aboard the International Space Station (ISS).

  7. An automated tool for the design and assessment of space systems

    NASA Technical Reports Server (NTRS)

    Dalcambre, Lois M. L.; Landry, Steve P.

    1990-01-01

    Space systems can be characterized as both large and complex, but they often rely on reusable subcomponents. One problem in the design of such systems is the representation and validation of the system, particularly at the higher levels of management. An automated tool is described for the representation, refinement, and validation of such complex systems based on a formal design theory, the Theory of Plausible Design. In particular, the steps necessary to automate the tool and make it a competent, usable assistant are described.

  8. Multidimensional Adaptive Testing with Optimal Design Criteria for Item Selection

    ERIC Educational Resources Information Center

    Mulder, Joris; van der Linden, Wim J.

    2009-01-01

    Several criteria from the optimal design literature are examined for use with item selection in multidimensional adaptive testing. In particular, it is examined what criteria are appropriate for adaptive testing in which all abilities are intentional, some should be considered as a nuisance, or the interest is in the testing of a composite of the…

  9. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift.

    PubMed

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-12-19

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as a basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration, and thus usage information, is not available on a large scale and (b) product optimization requires considerable effort in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at a rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user-group or individual-user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives.

  10. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift

    PubMed Central

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as a basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration, and thus usage information, is not available on a large scale and (b) product optimization requires considerable effort in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at a rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user-group or individual-user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  12. Designing Microcomputer Networks (and) LANS: A New Technology to Improve Library Automation.

    ERIC Educational Resources Information Center

    Ivie, Evan L.; Farr, Rick C.

    1984-01-01

    Two articles address the design of microcomputer networks and the use of local area computer networks (LAN) to improve library automation. Topics discussed include network design criteria, media for local networks, transmission mode, typical communication protocols, user interface, basic local network architectures, and examples of microcomputer…

  14. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    PubMed

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

    The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is open-source online software based on existing microelectronics tools that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be realized with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. Then, this framework is used for the design of a sequential circuit including a biological state machine. PMID:27322846
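
The digital approach the abstract mentions can be illustrated generically: a promoter repressed by two transcription factors behaves as a NOR gate, and NOR is functionally complete, so any combinational logic can in principle be mapped onto a library of such parts. This sketch shows the abstraction only; it is not GeNeDA's part library:

```python
# Genetic-logic sketch: a two-repressor promoter acts as NOR; other gates
# are composed from NOR parts, mirroring technology mapping in EDA tools.

def genetic_nor(a, b):
    """Output gene is expressed (1) only if neither repressor is present."""
    return int(not (a or b))

def genetic_not(a):
    return genetic_nor(a, a)

def genetic_or(a, b):
    return genetic_not(genetic_nor(a, b))

# Truth table for an OR function built entirely from NOR parts.
print([(a, b, genetic_or(a, b)) for a in (0, 1) for b in (0, 1)])
```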

  15. Reliability-Based Design of a Safety-Critical Automation System: A Case Study

    NASA Technical Reports Server (NTRS)

    Carroll, Carol W.; Dunn, W.; Doty, L.; Frank, M. V.; Hulet, M.; Alvarez, Teresa (Technical Monitor)

    1994-01-01

    In 1986, NASA funded a project to modernize the NASA Ames Research Center Unitary Plan Wind Tunnels, including the replacement of obsolescent controls with a modern, automated distributed control system (DCS). The project effort on this system included an independent safety analysis (ISA) of the automation system. The purpose of the ISA was to evaluate the completeness of the hazard analyses which had already been performed on the Modernization Project. The ISA approach followed a tailoring of the risk assessment approach widely used on existing nuclear power plants. The tailoring of the nuclear industry oriented risk assessment approach to the automation system and its role in reliability-based design of the automation system is the subject of this paper.

  16. 28. 'TOWER DESIGN NO. 11, ADAPTED FROM NO. 9,' drawn ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. 'TOWER DESIGN NO. 11, ADAPTED FROM NO. 9,' drawn by project architect Alfred Eichler, undated, ca. 1934. - Sacramento River Bridge, Spanning Sacramento River at California State Highway 275, Sacramento, Sacramento County, CA

  17. System for the automated design on differential-pressure flowmeters with standard constrictors

    SciTech Connect

    Mamonov, Yu.V.; Markov, A.V.; Partrikeev, V.G.; Frenklakh, M.M.

    1995-08-01

    This article examines features of the computing-search subsystem of an automated system for designing differential-pressure flowmeters with standard constrictors. Recent developments within Russia have moved efforts to automate the design and operation of differential-pressure flowmeters (DPF) having a microprocessor-based counter outside the country. The Russian State Commission on Standards therefore assigned the All-Russian Scientific Research Center for Standardization, Information, and Certification of Raw and Processed Materials and Substances (ASRCSMS) the task of developing and introducing standard DPF software based on ISO standards and existing Russian standards. This work was undertaken in accordance with an approved program. Our goal in this article is to substantiate the expediency of developing a system for the automated design of DPFs (DPFADS) to improve on existing practice.

  18. Automation and Accountability in Decision Support System Interface Design

    ERIC Educational Resources Information Center

    Cummings, Mary L.

    2006-01-01

    When the human element is introduced into decision support system design, entirely new layers of social and ethical issues emerge but are not always recognized as such. This paper discusses those ethical and social impact issues specific to decision support systems and highlights areas that interface designers should consider during design with an…

  19. A Parallel Genetic Algorithm for Automated Electronic Circuit Design

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Colombano, Silvano P.; Haith, Gary L.; Stassinopoulos, Dimitris; Norvig, Peter (Technical Monitor)

    2000-01-01

    We describe a parallel genetic algorithm (GA) that automatically generates circuit designs using evolutionary search. A circuit-construction programming language is introduced and we show how evolution can generate practical analog circuit designs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. We present experimental results as applied to analog filter and amplifier design tasks.
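
A toy illustration of evolving device values with a genetic algorithm. The voltage-divider example and all parameters are invented for this sketch; the system described above also evolves circuit topology and runs in parallel:

```python
import random

# GA sketch: evolve two resistor values so a voltage divider's ratio
# r2 / (r1 + r2) approaches a target. Selection keeps the top half;
# children are mutated copies of surviving parents.

random.seed(0)
TARGET = 0.25  # desired Vout/Vin of the divider

def fitness(ind):
    r1, r2 = ind
    return -abs(r2 / (r1 + r2) - TARGET)  # higher (closer to 0) is better

def evolve(pop_size=40, generations=60):
    pop = [[random.uniform(1, 100), random.uniform(1, 100)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # elitist selection
        children = [[max(1.0, g * random.uniform(0.9, 1.1))
                     for g in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```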

  20. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
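
The fractional-factorial idea can be sketched as follows. The defining relation E = ABCD used here is a standard textbook choice for a 2^(5-1) design, not necessarily the one used in the study:

```python
from itertools import product

# Fractional factorial sketch: for five two-level factors, set the fifth
# factor's level to the product of the first four (defining relation
# E = ABCD). This halves the run count from 32 to 16 at the cost of
# aliasing some higher-order interactions.

def half_fraction(n_factors=5):
    runs = []
    for base in product((-1, 1), repeat=n_factors - 1):
        generated = 1
        for level in base:
            generated *= level
        runs.append(base + (generated,))
    return runs

design = half_fraction()
print(len(design))  # 16 runs instead of 32
```

Each run is a tuple of -1/+1 factor levels (e.g. low/high incubation times and reagent concentrations) that the liquid handler would execute.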

  1. Dynamics of adaptive structures: Design through simulations

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alexander, S.

    1993-01-01

    The use of a helical bi-morph actuator/sensor concept, mimicking the change of helical waveform in bacterial flagella, is perhaps the first application of the motions of living organisms to the longitudinal deployment of space structures. However, no dynamical considerations were analyzed to explain the waveform change mechanisms. The objective is to review various deployment concepts from the dynamics point of view and to introduce dynamical considerations from the outset as part of the design process. Specifically, the impact of incorporating combined static mechanisms and dynamic design considerations on deployment performance during the reconfiguration stage is studied in terms of improved controllability, maneuvering duration, and joint singularity index. It is shown that intermediate configurations during articulation play an important role in improved joint mechanism design and overall structural deployability.

  2. Rise of the Machines: Automated Laser Guide Star Adaptive Optics Observations of Thousands of Objects with Robo-AO

    NASA Astrophysics Data System (ADS)

    Riddle, Reed L.; Baranec, C.; Law, N. M.; Tendulkar, S. P.; Ramaprakash, A. N.; Kulkarni, S. R.; Dekany, R.; Bui, K.; Burse, M.; Das, H.; Punnadi, S.; Chordia, P.

    2013-01-01

    Robo-AO is the first fully automated laser guide star adaptive optics instrument. Robo-AO has completed thousands of automated AO observations at the visible diffraction limit for several scientific programs during its first semester of science observations. These programs include: the Ultimate Binarity Survey to examine stellar binarity properties across the main sequence and beyond; a survey of 1,000 Kepler objects of interest; the multiplicity of solar type stars; and several programs for high precision astrometric observations. A new infrared camera is under development for Robo-AO, and a clone of the system is in the planning stages. This presentation will discuss the Robo-AO instrument capabilities, summarize the science programs undertaken, and discuss the future of Robo-AO.

  3. Application of the Modular Automated Reconfigurable Assembly System (MARAS) concept to adaptable vision gauging and parts feeding

    NASA Technical Reports Server (NTRS)

    By, Andre Bernard; Caron, Ken; Rothenberg, Michael; Sales, Vic

    1994-01-01

    This paper presents the first phase results of a collaborative effort between university researchers and a flexible assembly systems integrator to implement a comprehensive modular approach to flexible assembly automation. This approach, named MARAS (Modular Automated Reconfigurable Assembly System), has been structured to support multiple levels of modularity in terms of both physical components and system control functions. The initial focus of the MARAS development has been on parts gauging and feeding operations for cylinder lock assembly. This phase is nearing completion and has resulted in the development of a highly configurable system for vision gauging functions on a wide range of small components (2 mm to 100 mm in size). The reconfigurable concepts implemented in this adaptive Vision Gauging Module (VGM) are now being extended to applicable aspects of the singulating, selecting, and orienting functions required for the flexible feeding of similar mechanical components and assemblies.

  4. Design of microcontroller based system for automation of streak camera

    SciTech Connect

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-08-15

    A microcontroller-based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high-voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  5. Formulation of design guidelines for automated robotic assembly in outerspace

    NASA Technical Reports Server (NTRS)

    Dwivedi, Suren N.; Jones, Gary; Banerjee, S.; Srivastava, S.

    1989-01-01

    The approach for arriving at design guidelines for assembly by robots in outerspace is illustrated. The use of robots in a zero gravity environment necessitates that extra factors over and above normal design guidelines be taken into account. Besides, many of the guidelines for assembly by robots on earth do not apply in space. However, considering the axioms for normal design and assembly as one set, guidelines for design and robotic assembly as another, and guidelines for design and assembly in space as the third set, unions and intersections of these sets can generate guidelines for two or more of these conditions taken together - say design and manual assembly in space. Therein lies the potential to develop expert systems in the future, which would use an exhaustive database and similar guidelines to arrive at those required by a superposition of these conditions.

  6. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
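
A minimal sketch of the two-step search described above: a cheap surrogate model scores every candidate and prunes most of them, and the expensive model is evaluated only on the survivors. Both models here are invented toy functions, not the paper's circuit models:

```python
# Hierarchical model switching sketch: coarse model for pruning,
# fine model for final ranking of the reduced solution space.

def coarse_score(x):        # cheap, approximate model (fast to evaluate)
    return -(x - 3.1) ** 2

def fine_score(x):          # expensive, accurate model (evaluated rarely)
    return -(x - 3.0) ** 4 + 0.1 * x

def two_step_search(candidates, keep=5):
    survivors = sorted(candidates, key=coarse_score, reverse=True)[:keep]
    return max(survivors, key=fine_score)

candidates = [i / 10 for i in range(0, 100)]
print(two_step_search(candidates))
```

The speed-up comes from calling the fine model `keep` times instead of once per candidate; branch-and-bound tightens this further by discarding branches whose coarse bound cannot beat the incumbent.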

  7. Adaptive two-stage designs in phase II clinical trials.

    PubMed

    Banerjee, Anindita; Tsiatis, Anastasios A

    2006-10-15

    Two-stage designs have been widely used in phase II clinical trials. Such designs are desirable because they allow a decision to be made on whether a treatment is effective or not after the accumulation of the data at the end of each stage. Optimal fixed two-stage designs, where the sample size at each stage is fixed in advance, were proposed by Simon when the primary outcome is a binary response. This paper proposes an adaptive two-stage design which allows the sample size at the second stage to depend on the results at the first stage. Using a Bayesian decision-theoretic construct, we derive optimal adaptive two-stage designs; the optimality criterion being minimum expected sample size under the null hypothesis. Comparisons are made between Simon's two-stage fixed design and the new design with respect to this optimality criterion. PMID:16479547
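
For context, the optimality criterion mentioned above (minimum expected sample size under the null) can be computed directly for a classic fixed two-stage, Simon-type design: enroll n1 patients, stop if at most r1 respond, otherwise continue to n1 + n2 total. The design parameters below are illustrative; an adaptive design would let n2 depend on the stage-1 result:

```python
from math import comb

# Expected sample size under response rate p0:
#   E[N] = n1 + (1 - PET) * n2,  where PET = P(X <= r1), X ~ Binomial(n1, p0)

def binom_cdf(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def expected_n(n1, r1, n2, p0):
    pet = binom_cdf(r1, n1, p0)  # probability of early termination
    return n1 + (1 - pet) * n2

print(expected_n(n1=10, r1=1, n2=19, p0=0.1))
```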

  8. Using IDE in Instructional Design: Encouraging Reflective Instruction Design Through Automated Design Tools.

    ERIC Educational Resources Information Center

    Russell, Daniel M.; Kelley, Loretta

    The Instructional Design Environment (IDE), a computer assisted instruction tool for instructional design, has been incorporated into the curriculum and instructional development in mathematics instruction in the Stanford Teacher Education Program (STEP). (STEP is a 12-month program leading to an M.A. in education which emphasizes content focus…

  9. Modeling biology with HDL languages: a first step toward a genetic design automation tool inspired from microelectronics.

    PubMed

    Gendrault, Yves; Madec, Morgan; Lallement, Christophe; Haiech, Jacques

    2014-04-01

    Nowadays, synthetic biology is a hot research topic. Every day, progress is made toward increasing the complexity of artificial biological functions, tending toward complex biodevices and biosystems. Up to now, these systems have been handmade by bioengineers, which requires strong technical skills and leads to non-reusable developments. Meanwhile, scientific fields that share the same design approach, such as microelectronics, have already overcome several of these issues, and designers succeed in building extremely complex systems with many evolved functions. In systems engineering, and more specifically in microelectronics, the development of the domain has been promoted by both the improvement of technological processes and electronic design automation tools. The work presented in this paper paves the way for the adaptation of microelectronics design tools to synthetic biology. Considering the similarities and differences between synthetic biology and microelectronics, the milestones of this adaptation are described. The first concerns the modeling of biological mechanisms. To this end, a new formalism is proposed, based on an extension of the generalized Kirchhoff laws to biology. In this way, all biological mechanisms can be described with languages widely used in microelectronics. The approach is successfully validated on specific examples drawn from the literature.

  10. Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu; Campbell, Richard L.

    2014-01-01

    The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.

  11. Design, development, test, and evaluation of an automated analytical electrophoresis apparatus

    NASA Technical Reports Server (NTRS)

    Bartels, P. A.; Bier, M.

    1977-01-01

    An Automated Analytical Electrophoresis Apparatus (AAEA) was designed, developed, assembled, and preliminarily tested. The AAEA was demonstrated to be a feasible apparatus for automatically acquiring, displaying, and storing (and eventually analyzing) electrophoresis mobility data from living blood cells. The apparatus and the operation of its major assemblies are described in detail.

  12. Automating checks of plan check automation.

    PubMed

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  13. Designing Adaptive Low Dissipative High Order Schemes

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Sjoegreen, B.; Parks, John W. (Technical Monitor)

    2002-01-01

    Proper control of the numerical dissipation/filter to accurately resolve all relevant multiscales of complex flow problems while still maintaining nonlinear stability and efficiency for long-time numerical integrations poses a great challenge to the design of numerical methods. The required type and amount of numerical dissipation/filter are not only physical problem dependent, but also vary from one flow region to another. This is particularly true for unsteady high-speed shock/shear/boundary-layer/turbulence/acoustics interactions and/or combustion problems since the dynamics of the nonlinear effect of these flows are not well-understood. Even with extensive grid refinement, it is of paramount importance to have proper control on the type and amount of numerical dissipation/filter in regions where it is needed.

  14. Design of Low Complexity Model Reference Adaptive Controllers

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Schaefer, Jacob; Johnson, Marcus; Nguyen, Nhan

    2012-01-01

    Flight research experiments have demonstrated that adaptive flight controls can be an effective technology for improving aircraft safety in the event of failures or damage. However, the nonlinear, time-varying nature of adaptive algorithms continues to challenge traditional methods for the verification and validation testing of safety-critical flight control systems. Increasingly complex adaptive control theories and designs are emerging, but they only make these testing challenges more difficult. A potential first step toward the acceptance of adaptive flight controllers by aircraft manufacturers, operators, and certification authorities is a very simple design that operates as an augmentation to a non-adaptive baseline controller. Three such controllers were developed as part of a National Aeronautics and Space Administration flight research experiment to determine the appropriate level of complexity required to restore acceptable handling qualities to an aircraft that has suffered failures or damage. The controllers share the same basic design but incorporate incrementally increasing levels of complexity. Derivations of the controllers and their adaptive parameter update laws are presented along with details of the controllers' implementations.
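    As a hedged illustration of the simple adaptive-augmentation idea described above, the sketch below simulates a scalar model-reference adaptive controller with Lyapunov-style gain update laws. The plant, gains, and update laws are textbook placeholders, not the NASA flight-test controllers.

```python
def simulate(a=-1.0, b=1.0, am=4.0, gamma=10.0, r=1.0, dt=1e-3, steps=20000):
    # Plant: xdot = a*x + b*u, with a unknown and sign(b) = +1 assumed.
    # Reference model: xmdot = -am*xm + am*r.
    x = xm = 0.0
    kx = kr = 0.0            # adaptive feedback / feedforward gains
    for _ in range(steps):
        u = kx * x + kr * r
        e = x - xm           # tracking error drives adaptation
        # Lyapunov-based update laws (gradient form):
        kx -= gamma * e * x * dt
        kr -= gamma * e * r * dt
        # Forward-Euler integration of plant and reference model:
        x += (a * x + b * u) * dt
        xm += (-am * xm + am * r) * dt
    return abs(x - xm)

print(simulate())  # final tracking error (small after adaptation)
```

    For a constant reference the tracking error converges toward zero even though the individual gains need not reach their ideal values, which is the usual caveat when excitation is poor.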

  15. Automation for pattern library creation and in-design optimization

    NASA Astrophysics Data System (ADS)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    Semiconductor manufacturing technologies are becoming increasingly complex with every passing node. Newer technology nodes are pushing the limits of optical lithography and requiring multiple exposures with exotic material stacks for each critical layer. All of this added complexity usually amounts to further restrictions in what can be designed. Furthermore, the designs must be checked against all these restrictions in verification and sign-off stages. Design rules are intended to capture all the manufacturing limitations such that yield can be maximized for any given design adhering to all the rules. Most manufacturing steps employ some sort of model based simulation which characterizes the behavior of each step. The lithography models play a very big part of the overall yield and design restrictions in patterning. However, lithography models are not practical to run during design creation due to their slow and prohibitive run times. Furthermore, the models are not usually given to foundry customers because of the confidential and sensitive nature of every foundry's processes. The design layout locations where a model flags unacceptable simulated results can be used to define pattern rules which can be shared with customers. With advanced technology nodes we see a large growth of pattern based rules. This is due to the fact that pattern matching is very fast and the rules themselves can be very complex to describe in a standard DRC language. Therefore, the patterns are left as either pattern layout clips or abstracted into pattern-like syntax which a pattern matcher can use directly. The patterns themselves can be multi-layered with "fuzzy" designations such that groups of similar patterns can be found using one description. The pattern matcher is often integrated with a DRC tool such that verification and signoff can be done in one step. The patterns can be layout constructs that are "forbidden", "waived", or simply low-yielding in nature. 
The patterns can also

  16. Scar-less multi-part DNA assembly design automation

    DOEpatents

    Hillson, Nathan J.

    2016-06-07

    The present invention provides a method of designing an implementation of a DNA assembly. In an exemplary embodiment, the method includes (1) receiving a list of DNA sequence fragments to be assembled together and an order in which to assemble the DNA sequence fragments, (2) designing DNA oligonucleotides (oligos) for each of the DNA sequence fragments, and (3) creating a plan for adding flanking homology sequences to each of the DNA oligos. In another exemplary embodiment, step (3) instead creates a plan for adding optimized overhang sequences to each of the DNA oligos.
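    The flanking-homology step can be sketched as a simple string operation: each fragment's design gains overlap sequence copied from its neighbors in the assembly order (Gibson-style), so adjacent parts can anneal. The overlap length and sequences below are illustrative only, not the patented method's actual parameters.

```python
def add_flanking_homology(fragments, overlap=20):
    # fragments: ordered list of DNA sequence strings to be assembled.
    designed = []
    for i, frag in enumerate(fragments):
        # Copy overlap bases from the upstream and downstream neighbors.
        left = fragments[i - 1][-overlap:] if i > 0 else ""
        right = fragments[i + 1][:overlap] if i < len(fragments) - 1 else ""
        designed.append(left + frag + right)
    return designed

parts = ["ATGGCTAGC", "GGATCCAAA", "TTTGAGCTC"]
print(add_flanking_homology(parts, overlap=4))
```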

  17. Automated Design and Optimization of Pebble-bed Reactor Cores

    SciTech Connect

    Hans D. Gougar; Abderrafi M. Ougouag; William K. Terry

    2010-07-01

    We present a conceptual design approach for high-temperature gas-cooled reactors using recirculating pebble-bed cores. The design approach employs PEBBED, a reactor physics code specifically designed to solve for and analyze the asymptotic burnup state of pebble-bed reactors, in conjunction with a genetic algorithm to obtain a core that maximizes a fitness value that is a function of user-specified parameters. The uniqueness of the asymptotic core state and the small number of independent parameters that define it suggest that core geometry and fuel cycle can be efficiently optimized toward a specified objective. PEBBED exploits a novel representation of the distribution of pebbles that enables efficient coupling of the burnup and neutron diffusion solvers. With this method, even complex pebble recirculation schemes can be expressed in terms of a few parameters that are amenable to modern optimization techniques. With PEBBED, the user chooses the type and range of core physics parameters that represent the design space. A set of traits, each with acceptable and preferred values expressed by a simple fitness function, is used to evaluate the candidate reactor cores. The stochastic search algorithm automatically drives the generation of core parameters toward the optimal core as defined by the user. The optimized design can then be modeled and analyzed in greater detail using higher resolution and more computationally demanding tools to confirm the desired characteristics. For this study, the design of pebble-bed high temperature reactor concepts subjected to demanding physical constraints demonstrated the efficacy of the PEBBED algorithm.
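    The stochastic search loop described above can be sketched generically: candidate cores are parameter vectors scored by a user-defined fitness, and a genetic algorithm drives the population toward the optimum. The toy fitness and operators below are placeholders, not PEBBED's actual traits or search algorithm.

```python
import random

def fitness(params):
    # Toy fitness: prefer parameters near a hypothetical target design point.
    target = (0.3, 0.7)
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def evolve(pop_size=30, gens=60, seed=0):
    rng = random.Random(seed)
    pop = [(rng.random(), rng.random()) for _ in range(pop_size)]
    for _ in range(gens):
        # Keep the fitter half as parents (elitist truncation selection).
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # Blend crossover plus small Gaussian mutation.
            child = tuple((x + y) / 2 + rng.gauss(0, 0.02) for x, y in zip(a, b))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # parameters close to the target design point
```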

  18. Automated Verification of Design Patterns with LePUS3

    NASA Technical Reports Server (NTRS)

    Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick

    2009-01-01

    Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.

  19. The Potential of Adaptive Design in Animal Studies.

    PubMed

    Majid, Arshad; Bae, Ok-Nam; Redgrave, Jessica; Teare, Dawn; Ali, Ali; Zemke, Daniel

    2015-10-12

    Clinical trials are the backbone of medical research, and are often the last step in the development of new therapies for use in patients. Prior to human testing, however, preclinical studies using animal subjects are usually performed in order to provide initial data on the safety and effectiveness of prospective treatments. These studies can be costly and time consuming, and may also raise concerns about the ethical treatment of animals when potentially harmful procedures are involved. Adaptive design is a process by which the methods used in a study may be altered while it is being conducted in response to preliminary data or other new information. Adaptive design has been shown to be useful in reducing the time and costs associated with clinical trials, and may provide similar benefits in preclinical animal studies. The purpose of this review is to summarize various aspects of adaptive design and evaluate its potential for use in preclinical research.

  20. CMOS array design automation techniques. [metal oxide semiconductors

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.

    1975-01-01

    A low-cost, quick-turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested, and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology, and required performance considerations, particularly high circuit speed.

  1. The design of digital-adaptive controllers for VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Stengel, R. F.; Broussard, J. R.; Berry, P. W.

    1976-01-01

    Design procedures for VTOL automatic control systems have been developed and are presented. Using linear-optimal estimation and control techniques as a starting point, digital-adaptive control laws have been designed for the VALT Research Aircraft, a tandem-rotor helicopter which is equipped for fully automatic flight in terminal area operations. These control laws are designed to interface with velocity-command and attitude-command guidance logic, which could be used in short-haul VTOL operations. Developments reported here include new algorithms for designing non-zero-set-point digital regulators, design procedures for rate-limited systems, and algorithms for dynamic control trim setting.

  2. Using Adaptive Automation to Increase Operator Performance and Decrease Stress in a Satellite Operations Environment

    ERIC Educational Resources Information Center

    Klein, David C.

    2014-01-01

    As advancements in automation continue to alter the systemic behavior of computer systems in a wide variety of industrial applications, human-machine interactions are increasingly becoming supervisory in nature, with less hands-on human involvement. This maturing of the human role within the human-computer relationship is relegating operations…

  3. Automated design of image operators that detect interest points.

    PubMed

    Trujillo, Leonardo; Olague, Gustavo

    2008-01-01

    This work describes how evolutionary computation can be used to synthesize low-level image operators that detect interesting points on digital images. Interest point detection is an essential part of many modern computer vision systems that solve tasks such as object recognition, stereo correspondence, and image indexing, to name but a few. The design of the specialized operators is posed as an optimization/search problem that is solved with genetic programming (GP), a strategy still mostly unexplored by the computer vision community. The proposed approach automatically synthesizes operators that are competitive with state-of-the-art designs, taking into account an operator's geometric stability and the global separability of detected points during fitness evaluation. The GP search space is defined using simple primitive operations that are commonly found in point detectors proposed by the vision community. The experiments described in this paper extend previous results (Trujillo and Olague, 2006a,b) by presenting 15 new operators that were synthesized through the GP-based search. Some of the synthesized operators can be regarded as improved manmade designs because they employ well-known image processing techniques and achieve highly competitive performance. On the other hand, since the GP search also generates what can be considered as unconventional operators for point detection, these results provide a new perspective to feature extraction research. PMID:19053496

  5. A Multi-Agent Design for Power Distribution Systems Automation

    NASA Astrophysics Data System (ADS)

    Ghorbani, M. Jawad

    A new Multi-Agent System (MAS) design for fault location, isolation, and restoration in power distribution systems is presented. In the proposed approach, when a fault occurs in the Power Distribution System (PDS), the MAS quickly isolates the fault and restores service to fault-free zones. A hierarchical coordination strategy is introduced to manage the agents, integrating the advantages of both centralized and decentralized coordination strategies. In this framework, Zone Agents (ZAs) locate and isolate the fault based on locally available information and assist the Feeder Agent (FA) with reconfiguration and restoration. The FA can solve the restoration problem using existing algorithms for the 0-1 knapsack problem. A novel Q-learning mechanism is also introduced to support the FAs in decision making for restoration. In addition, a distributed MAS-based load shedding (LS) technique is used to supply as many higher-priority customers as possible when demand exceeds generation. The design is illustrated through simulation case studies of fault location, isolation, and restoration on the West Virginia Super Circuit (WVSC) and a hardware implementation of fault location and isolation in a laboratory platform. The results from the case studies demonstrate the performance of the proposed MAS designs.
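    The feeder-level restoration step maps onto the 0-1 knapsack problem the abstract mentions: each fault-free zone has a load (weight) and a customer priority (value), and the spare feeder capacity is the knapsack limit. A standard dynamic-programming sketch, with illustrative numbers:

```python
def knapsack(loads, priorities, capacity):
    # dp[c] = best total priority restorable within capacity c.
    # Loads and capacity are assumed discretized to integers.
    dp = [0] * (capacity + 1)
    for load, prio in zip(loads, priorities):
        # Iterate capacity downward so each zone is used at most once.
        for c in range(capacity, load - 1, -1):
            dp[c] = max(dp[c], dp[c - load] + prio)
    return dp[capacity]

# Three zones with discretized loads and customer priorities:
print(knapsack(loads=[3, 4, 2], priorities=[5, 6, 3], capacity=6))  # -> 9
```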

  6. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    PubMed

    Ker, Dai Fei Elmer; Weiss, Lee E; Junkers, Silvina N; Chen, Mei; Yin, Zhaozheng; Sandbothe, Michael F; Huh, Seung-il; Eom, Sungeun; Bise, Ryoma; Osuna-Highley, Elvira; Kanade, Takeo; Campbell, Phil G

    2011-01-01

    Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and developing robotic cell
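    The precision and recall figures quoted above follow from detection counts in the usual way. The counts below are hypothetical, chosen only to reproduce the reported point estimates of 0.791 and 0.559:

```python
def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)   # fraction of detections that are correct
    recall = tp / (tp + fn)      # fraction of true events that are detected
    return precision, recall

p, r = precision_recall(tp=559, fp=148, fn=441)
print(round(p, 3), round(r, 3))  # -> 0.791 0.559
```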

  7. Design principles and algorithms for automated air traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1995-01-01

    This paper presents design principles and algorithms for building a real-time scheduler. The primary objective of the scheduler is to assign arriving aircraft to a favorable landing runway and schedule them to land at times that minimize delays. A further objective of the scheduler is to allocate delays between high-altitude airspace far from the airport and low-altitude airspace near the airport. A method of delay allocation is described that minimizes the average operating cost in the presence of errors in controlling aircraft to a specified landing time.

  8. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    NASA Astrophysics Data System (ADS)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

    A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design method is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated into the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that distinguishes between high- and low-fitness areas of the design space. The learning process can detect promising directions of evolution, allowing large evolutionary steps for the individuals. The learning phase shortens the evolution process and yields a remarkable reduction in the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process to reduce the design space as well as the design time. Circuit evaluation is performed by the HSPICE simulator. To improve design accuracy, the BSIM3v3 CMOS transistor model is adopted in the proposed design method. The proposed method is tested on three different operational amplifier circuits, and its performance is verified by comparison with an evolutionary strategy algorithm and other similar methods.

  9. Missile guidance law design using adaptive cerebellar model articulation controller.

    PubMed

    Lin, Chih-Min; Peng, Ya-Fu

    2005-05-01

    An adaptive cerebellar model articulation controller (CMAC) is proposed for command to line-of-sight (CLOS) missile guidance law design. In this design, the three-dimensional (3-D) CLOS guidance problem is formulated as a tracking problem of a time-varying nonlinear system. The adaptive CMAC control system is comprised of a CMAC and a compensation controller. The CMAC control is used to imitate a feedback linearization control law and the compensation controller is utilized to compensate the difference between the feedback linearization control law and the CMAC control. The online adaptive law is derived based on the Lyapunov stability theorem to learn the weights of receptive-field basis functions in CMAC control. In addition, in order to relax the requirement of approximation error bound, an estimation law is derived to estimate the error bound. Then the adaptive CMAC control system is designed to achieve satisfactory tracking performance. Simulation results for different engagement scenarios illustrate the validity of the proposed adaptive CMAC-based guidance law.

  10. DESIGN AND PRELIMINARY VALIDATION OF A RAPID AUTOMATED BIODOSIMETRY TOOL FOR HIGH THROUGHPUT RADIOLOGICAL TRIAGE

    PubMed Central

    Chen, Youhua; Zhang, Jian; Wang, Hongliang; Garty, Guy; Xu, Yanping; Lyulko, Oleksandra V.; Turner, Helen C.; Randers-Pehrson, Gerhard; Simaan, Nabil; Yao, Y. Lawrence; Brenner, D. J.

    2010-01-01

    This paper presents the design, hardware, software, and parameter optimization for a novel robotic automation system. RABiT is a Rapid Automated Biodosimetry Tool for high-throughput radiological triage. The design considerations guiding the hardware and software architecture are presented, with a focus on methods of communication, ease of implementation, and the need for real-time control versus soft-time control cycles. The design and parameter determination for a non-contact PVC capillary laser cutting system are presented. A novel approach to lymphocyte concentration estimation based on computer vision is reported. Experimental evaluations of the system components validate the success of our prototype system in achieving a throughput of 6,000 samples in a period of 18 hours. PMID:21258614

  11. Experimental Design to Evaluate Directed Adaptive Mutation in Mammalian Cells

    PubMed Central

    Chiaro, Christopher R; May, Tobias

    2014-01-01

    Background We describe the experimental design for a methodological approach to determine whether directed adaptive mutation occurs in mammalian cells. Identification of directed adaptive mutation would have profound practical significance for a wide variety of biomedical problems, including disease development and resistance to treatment. In adaptive mutation, the genetic or epigenetic change is not random; instead, the presence and type of selection influences the frequency and character of the mutation event. Adaptive mutation can contribute to the evolution of microbial pathogenesis, cancer, and drug resistance, and may become a focus of novel therapeutic interventions. Objective Our experimental approach was designed to distinguish between 3 types of mutation: (1) random mutations that are independent of selective pressure, (2) undirected adaptive mutations that arise when selective pressure induces a general increase in the mutation rate, and (3) directed adaptive mutations that arise when selective pressure induces targeted mutations that specifically influence the adaptive response. The purpose of this report is to introduce an experimental design and describe limited pilot experiment data (not to describe a complete set of experiments); hence, it is an early report. Methods An experimental design based on immortalization of mouse embryonic fibroblast cells is presented that links clonal cell growth to reversal of an inactivating polyadenylation site mutation. Thus, cells exhibit growth only in the presence of both the countermutation and an inducing agent (doxycycline). The type and frequency of mutation in the presence or absence of doxycycline will be evaluated. Additional experimental approaches would determine whether the cells exhibit a generalized increase in mutation rate and/or whether the cells show altered expression of error-prone DNA polymerases or of mismatch repair proteins. 
Results We performed the initial stages of characterizing our system

  12. Engineering Design and Automation in the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory.

    SciTech Connect

    Wantuck, P. J.; Hollen, R. M.

    2002-01-01

    This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. 
As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation

  13. The Adaptive Physical Education Program: Its Design and Curriculum.

    ERIC Educational Resources Information Center

    Henn, Joan M.

    The booklet describes a program designed to improve the play skills of seriously disturbed, multiply handicapped children (5 to 13 years old) through gross motor skill development. A six step process is described for the adaptive physical education program: assessment, development of interdisciplinary goals, interventions, development of goals for…

  14. Adaptive multibeam phased array design for a Spacelab experiment

    NASA Technical Reports Server (NTRS)

    Noji, T. T.; Fass, S.; Fuoco, A. M.; Wang, C. D.

    1977-01-01

    The parametric tradeoff analyses and design for an Adaptive Multibeam Phased Array (AMPA) for a Spacelab experiment are described. This AMPA Experiment System was designed with particular emphasis on maximizing channel capacity and minimizing implementation and cost impacts for future austere maritime and aeronautical users operating with a low-gain hemispherical coverage antenna element, low effective radiated power, and a low antenna gain-to-system-noise-temperature ratio.

  15. Adaptive Liver Stereotactic Body Radiation Therapy: Automated Daily Plan Reoptimization Prevents Dose Delivery Degradation Caused by Anatomy Deformations

    SciTech Connect

    Leinders, Suzanne M.; Breedveld, Sebastiaan; Méndez Romero, Alejandra; Schaart, Dennis; Seppenwoolde, Yvette; Heijmen, Ben J.M.

    2013-12-01

    Purpose: To investigate how dose distributions for liver stereotactic body radiation therapy (SBRT) can be improved by using automated, daily plan reoptimization to account for anatomy deformations, compared with setup corrections only. Methods and Materials: For 12 tumors, 3 strategies for dose delivery were simulated. In the first strategy, computed tomography scans made before each treatment fraction were used only for patient repositioning before dose delivery, to correct detected tumor setup errors. In the second and third (adaptive) strategies, in addition to the isocenter shift, intensity modulated radiation therapy beam profiles were reoptimized, or both intensity profiles and beam orientations were reoptimized, respectively. All optimizations were performed with a recently published algorithm for automated, multicriteria optimization of both beam profiles and beam angles. Results: In 6 of 12 cases, violations of organ-at-risk (ie, heart, stomach, kidney) constraints of 1 to 6 Gy in single fractions occurred when only tumor repositioning was applied. By using the adaptive strategies, these violations could be avoided (<1 Gy). For 1 case, this required slightly underdosing the planning target volume. For 2 cases with tumor dose restricted in the planning phase to avoid organ-at-risk constraint violations, fraction doses could be increased by 1 and 2 Gy because of more favorable anatomy. Daily reoptimization of both beam profiles and beam angles (third strategy) performed slightly better than reoptimization of profiles only, but the latter required only a few minutes of computation time, whereas full reoptimization took several hours. Conclusions: This simulation study demonstrated that replanning based on daily acquired computed tomography scans can improve liver SBRT dose delivery.

  16. Individually designed PALs vs. power optimized PALs adaptation comparison.

    PubMed

    Muždalo, Nataša Vujko; Mihelčič, Matjaž

    2015-03-01

    Everyday practice shows an ever-growing demand for better visual acuity at all viewing distances. The presbyopic population needs correction for far, near, and intermediate distances with different dioptric powers, and PAL lenses seem to be a comfortable solution. The object of the present study is the analysis of the factors determining adaptation to progressive addition lenses (PALs) by first-time users. Only novice test persons were chosen, in order to avoid bias from a previously worn lens design. For optimal results with this type of lens, several individual parameters must be considered: correct refraction, precise ocular and facial measures, and proper mounting of the lenses into the frame. Nevertheless, first-time wearers encounter various difficulties in adapting to this type of glasses, and adaptation time differs greatly between individual users. The question that arises is how much the individual parameters really affect the ease of adaptation and comfort when wearing progressive glasses. To clarify this, individually designed PALs, Rodenstock's Impression FreeSign (with inclusion of all parameters related to the user's eye and spectacle frame: prescription, pupillary distance, fitting height, back vertex distance, pantoscopic angle, and curvature of the frame), were compared to power-optimized PALs, Rodenstock's Multigressiv MyView (respecting only prescription power and pupillary distance). The adaptation process was monitored over a period of four weeks. The collected results represent scores of the users' subjective impressions: the users themselves rated their adaptation to the new progressive glasses and their degree of subjective visual impression. The results show that adaptation to fully individually fitted PALs is easier and quicker.
The information obtained from users is valuable in everyday optometry practice because along with the manufacturer's specifications, the user's experience can

  18. An information theoretic approach of designing sparse kernel adaptive filters.

    PubMed

    Liu, Weifeng; Park, Il; Principe, José C

    2009-12-01

    This paper discusses an information theoretic approach of designing sparse kernel adaptive filters. To determine useful data to be learned and remove redundant ones, a subjective information measure called surprise is introduced. Surprise captures the amount of information a datum contains which is transferable to a learning system. Based on this concept, we propose a systematic sparsification scheme, which can drastically reduce the time and space complexity without harming the performance of kernel adaptive filters. Nonlinear regression, short term chaotic time-series prediction, and long term time-series forecasting examples are presented. PMID:19923047
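
    The admission rule described above can be sketched compactly. The following is an illustrative reading, not the authors' implementation: surprise is computed as the negative log predictive likelihood of a new sample under a Gaussian-process view of regularized kernel regression, and the sample joins the dictionary only when its surprise lies between a redundancy threshold and an abnormality threshold. The class name, kernel width, regularization, and thresholds are all assumed values.

```python
import numpy as np

def gauss_kernel(a, b, width=1.0):
    return np.exp(-np.sum((a - b) ** 2) / (2 * width ** 2))

class SurpriseKRLS:
    """Online kernel regression whose dictionary admits a sample only when
    its surprise (negative log predictive likelihood) is neither too low
    (redundant) nor too high (abnormal)."""

    def __init__(self, reg=1e-2, t_low=-2.0, t_high=20.0):
        self.reg, self.t_low, self.t_high = reg, t_low, t_high
        self.X, self.y = [], []                        # the dictionary

    def predict(self, x):
        if not self.X:
            return 0.0, 1.0 + self.reg                 # prior mean / variance
        k = np.array([gauss_kernel(x, xi) for xi in self.X])
        K = np.array([[gauss_kernel(a, b) for b in self.X] for a in self.X])
        K += self.reg * np.eye(len(self.X))
        mu = k @ np.linalg.solve(K, np.array(self.y))
        var = 1.0 + self.reg - k @ np.linalg.solve(K, k)   # k(x, x) = 1 here
        return mu, max(var, 1e-12)

    def update(self, x, d):
        mu, var = self.predict(x)
        surprise = 0.5 * np.log(var) + (d - mu) ** 2 / (2 * var)
        if self.t_low < surprise < self.t_high:        # learn informative data only
            self.X.append(np.atleast_1d(x))
            self.y.append(d)
        return mu, surprise
```

    On a simple nonlinear regression task the dictionary saturates well below the number of samples seen, which is the point of the sparsification scheme.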

  20. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  1. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; Bi, Changhao; Elsbree, Nick; Jiao, Hong; Kim, Jungkyu; Mathies, Richard; Keasling, Jay D.; Hillson, Nathan J.

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform's capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Taken together, we demonstrate the microfluidic platform's potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  2. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)

    2001-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  3. Covariate-adjusted response-adaptive designs for binary response.

    PubMed

    Rosenberger, W F; Vidyashankar, A N; Agarwal, D K

    2001-11-01

    An adaptive allocation design for phase III clinical trials that incorporates covariates is described. The allocation scheme maps the covariate-adjusted odds ratio from a logistic regression model onto [0, 1]. Simulations assume that both staggered entry and time to response are random and follow a known probability distribution that can depend on the treatment assigned, the patient's response, a covariate, or a time trend. Confidence intervals on the covariate-adjusted odds ratio are slightly anticonservative for the adaptive design under the null hypothesis, but power is similar to equal allocation under various alternatives for n = 200. For similar power, the net savings in terms of expected number of treatment failures is modest, but enough to make this design attractive for certain studies where known covariates are expected to be important, stratification is not desired, and treatment failures have a high ethical cost.
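
    The mapping described in the abstract can be illustrated with a short sketch. This is an assumed reading, not the authors' code: fit a logistic model with treatment and covariate terms, take the adjusted odds ratio OR = exp(b_trt), and allocate the next patient to the first treatment with probability OR/(1 + OR). The function names and the plain Newton-Raphson fit are illustrative.

```python
import numpy as np

def fit_logistic(X, y, iters=50):
    """Newton-Raphson maximum likelihood for logistic regression."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ b))
        grad = X.T @ (y - p)                        # score vector
        H = (X * (p * (1 - p))[:, None]).T @ X      # Fisher information
        b = b + np.linalg.solve(H + 1e-6 * np.eye(X.shape[1]), grad)
    return b

def next_allocation_prob(trt, covariate, success):
    """Probability of assigning the next patient to treatment A, computed
    as OR/(1 + OR) with OR the covariate-adjusted odds ratio."""
    X = np.column_stack([np.ones(len(trt)), trt, covariate])
    b = fit_logistic(X, np.asarray(success, float))
    odds_ratio = np.exp(b[1])                       # adjusted treatment OR
    return odds_ratio / (1.0 + odds_ratio)
```

    With a strongly favorable treatment effect the allocation probability moves well above 1/2, which is the intended ethical skew toward the better arm.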

  4. Design of scheduling and rate-adaptation algorithms for adaptive HTTP streaming

    NASA Astrophysics Data System (ADS)

    Hesse, Stephan

    2013-09-01

    In the adaptive HTTP streaming model, the HTTP server stores multiple representations of the media content, encoded at different rates. It is the function of the streaming client to select and retrieve segments of appropriate representations to enable continuous media playback under varying network conditions. In this paper we describe the design of a control mechanism enabling such selection and retrieval of media data during a streaming session. We also describe the architecture of a streaming client for adaptive HTTP streaming and provide simulation data illustrating the effectiveness of the proposed control mechanism for handling the bandwidth fluctuations typical of TCP traffic.
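
    A minimal throughput-based selection rule of the kind such a client needs can be sketched in a few lines. The safety factor and low-buffer threshold below are illustrative values, not ones from the paper.

```python
def select_representation(bitrates, measured_throughput, buffer_s,
                          safety=0.8, low_buffer_s=5.0):
    """Pick the highest bitrate sustainable at the measured throughput
    (discounted by a safety factor), dropping to the lowest representation
    whenever the playback buffer runs low."""
    if buffer_s < low_buffer_s:            # near underrun: play it safe
        return min(bitrates)
    usable = [b for b in sorted(bitrates) if b <= safety * measured_throughput]
    return usable[-1] if usable else min(bitrates)
```

    A real client would smooth the throughput estimate over several segment downloads before feeding it to such a rule, to avoid oscillating between representations on every TCP burst.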

  5. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs: Part II

    NASA Astrophysics Data System (ADS)

    Straub, J. A.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Gladhill, R.; Nolke, S.; Riddick, J.

    2006-10-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. In the previous paper, it was shown how photomask MRC is used to uncover data related problems prior to automated defect inspection. It was demonstrated how jobs which are likely to have problems at inspection could be identified and separated from those which are not. The use of photomask MRC in production was shown to reduce time lost to aborted runs and troubleshooting due to data issues. In this paper, the effectiveness of this photomask MRC program in a high volume photomask factory over the course of a year as applied to more than ten thousand jobs will be shown. Statistics on the results of the MRC runs will be presented along with the associated impact to the automated defect inspection process. Common design problems will be shown as well as their impact to mask manufacturing throughput and productivity. Finally, solutions to the most common and most severe problems will be offered and discussed.

  6. Design of an adaptive neural network based power system stabilizer.

    PubMed

    Liu, Wenxin; Venayagamoorthy, Ganesh K; Wunsch, Donald C

    2003-01-01

    Power system stabilizers (PSS) are used to generate supplementary control signals for the excitation system in order to damp the low frequency power system oscillations. To overcome the drawbacks of conventional PSS (CPSS), numerous techniques have been proposed in the literature. Based on the analysis of existing techniques, this paper presents an indirect adaptive neural network based power system stabilizer (IDNC) design. The proposed IDNC consists of a neuro-controller, which is used to generate a supplementary control signal to the excitation system, and a neuro-identifier, which is used to model the dynamics of the power system and to adapt the neuro-controller parameters. The proposed method has the features of a simple structure, adaptivity and fast response. The proposed IDNC is evaluated on a single machine infinite bus power system under different operating conditions and disturbances to demonstrate its effectiveness and robustness. PMID:12850048

  7. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    NASA Astrophysics Data System (ADS)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software to automate various engineering activities. In this paper, original software worked out to automate engineering tasks at the stage of designing a product's geometrical shape is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn with the standard tools of specialized CAD systems. This comes from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear, and the base circle, respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing gear wheel modelling time to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
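
    The multi-point involute sampling is easy to reproduce. The sketch below generates n points of the base-circle involute from the base circle out to the addendum circle; the function and parameter names are assumptions, and the Generator module's actual point placement (partly above those circles) may differ.

```python
import numpy as np

def involute_points(base_r, addendum_r, n_points=11):
    """Sample n_points along the involute of the base circle, from the
    base circle (roll angle 0) out to the addendum circle."""
    # Roll angle at which the involute reaches the addendum radius:
    t_max = np.sqrt((addendum_r / base_r) ** 2 - 1.0)
    t = np.linspace(0.0, t_max, n_points)
    x = base_r * (np.cos(t) + t * np.sin(t))   # parametric involute
    y = base_r * (np.sin(t) - t * np.cos(t))
    return np.column_stack([x, y])
```

    The first and last sampled points land exactly on the base and addendum circles, since the involute satisfies |P(t)|^2 = r_b^2 (1 + t^2); intermediate points give a spline far more faithful than a 3-point fit.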

  8. Design of Adaptive Policy Pathways under Deep Uncertainties

    NASA Astrophysics Data System (ADS)

    Babovic, Vladan

    2013-04-01

    The design of large-scale engineering and infrastructural systems today is growing in complexity. Designers need to consider sociotechnical uncertainties, intricacies, and processes in the long-term strategic deployment and operations of these systems. In this context, water and spatial management is increasingly challenged not only by climate-associated changes such as sea level rise and increased spatio-temporal variability of precipitation, but also by pressures due to population growth and particularly the accelerating rate of urbanisation. Furthermore, the high investment costs and long-term nature of water-related infrastructure projects require a long-term planning perspective, sometimes extending over many decades. Adaptation to such changes is not only determined by what is known or anticipated at present, but also by what will be experienced and learned as the future unfolds, as well as by policy responses to social and water events. As a result, a pathway emerges. Instead of responding to 'surprises' and making decisions on an ad hoc basis, exploring adaptation pathways into the future provides indispensable support for water management decision-making. In this contribution, a structured approach for designing a dynamic adaptive policy based on the concepts of adaptive policy making and adaptation pathways is introduced. Such an approach provides flexibility which allows change over time in response to how the future unfolds, what is learned about the system, and changes in societal preferences. The introduced flexibility provides means for dealing with the complexities of adaptation under deep uncertainties. It enables engineering systems to change in the face of uncertainty to reduce impacts from downside scenarios while capitalizing on upside opportunities. This contribution presents a comprehensive framework for the development and deployment of the adaptive policy pathway framework, and demonstrates its performance under deep uncertainties on a case study related to urban

  9. Adaptive Designs for Randomized Trials in Public Health

    PubMed Central

    Brown, C. Hendricks; Ten Have, Thomas R.; Jo, Booil; Dagne, Getachew; Wyman, Peter A.; Muthén, Bengt; Gibbons, Robert D.

    2009-01-01

    In this article, we present a discussion of two general ways in which the traditional randomized trial can be modified or adapted in response to the data being collected. We use the term adaptive design to refer to a trial in which characteristics of the study itself, such as the proportion assigned to active intervention versus control, change during the trial in response to data being collected. The term adaptive sequence of trials refers to a decision-making process that fundamentally informs the conceptualization and conduct of each new trial with the results of previous trials. Our discussion below investigates the utility of these two types of adaptations for public health evaluations. Examples are provided to illustrate how adaptation can be used in practice. From these case studies, we discuss whether such evaluations can or should be analyzed as if they were formal randomized trials, and we discuss practical as well as ethical issues arising in the conduct of these new-generation trials. PMID:19296774
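
    A concrete example of the first kind of adaptation, in which the proportion assigned to each arm changes in response to accumulating data, is the classic randomized play-the-winner urn. The sketch below is illustrative, including its outcome-callback interface; it is not drawn from the article.

```python
import random

def randomized_play_the_winner(outcomes, n_patients, seed=0):
    """Urn-based adaptive allocation: start with one ball per arm; a success
    on an arm (or a failure on the other arm) adds a ball for that arm, so
    allocation drifts toward the better-performing treatment.
    `outcomes[arm]()` returns True on success (hypothetical interface)."""
    rng = random.Random(seed)
    urn = {"A": 1, "B": 1}
    assignments = []
    for _ in range(n_patients):
        total = urn["A"] + urn["B"]
        arm = "A" if rng.random() < urn["A"] / total else "B"
        assignments.append(arm)
        success = outcomes[arm]()
        other = "B" if arm == "A" else "A"
        urn[arm if success else other] += 1        # reward apparent winner
    return assignments
```

    With one arm clearly superior, the allocation fraction drifts toward that arm over the course of the trial, trading some statistical efficiency for fewer expected treatment failures.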

  10. Self-Adaptive Stepsize Search Applied to Optimal Structural Design

    NASA Astrophysics Data System (ADS)

    Nolle, L.; Bland, J. A.

    Structural engineering often involves the design of space frames that are required to resist predefined external forces without exhibiting plastic deformation. The weight of the structure, and hence the weight of its constituent members, has to be as low as possible for economic reasons without violating any of the load constraints. Design spaces are usually vast, and the computational costs for analyzing a single design are usually high. Therefore, not every possible design can be evaluated for real-world problems. In this work, a standard structural design problem, the 25-bar problem, has been solved using self-adaptive stepsize search (SASS), a relatively new search heuristic. This algorithm has only one control parameter and therefore overcomes a drawback of modern search heuristics, i.e., the need to first find a set of optimum control parameter settings for the problem at hand. In this work, SASS outperforms simulated annealing, genetic algorithms, tabu search and ant colony optimization.
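
    A simplified self-adaptive hill climber in the spirit of SASS can be sketched as follows: each candidate perturbs both the solution and its own step size, and an accepted candidate passes its step size on, so no cooling schedule or parameter set has to be tuned beforehand. The perturbation constant and acceptance rule here are illustrative, not the published algorithm's exact details.

```python
import math
import random

def sass_minimize(f, x0, bounds, iters=5000, seed=0):
    """Greedy local search in which the step size is inherited and
    mutated along with the solution (a sketch in the spirit of SASS,
    not the published algorithm)."""
    rng = random.Random(seed)
    lo, hi = bounds
    x = list(x0)
    fx = f(x)
    s = 0.5 * (hi - lo)                              # initial step width
    for _ in range(iters):
        s_new = s * math.exp(rng.gauss(0.0, 0.3))    # self-adapted step size
        cand = [min(hi, max(lo, xi + rng.uniform(-s_new, s_new))) for xi in x]
        fc = f(cand)
        if fc <= fx:                                 # keep improvements only
            x, fx, s = cand, fc, s_new
    return x, fx
```

    On a smooth test function the step size shrinks automatically as the search closes in on the optimum, which is the behavior the single-control-parameter claim rests on.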

  11. Towards automated on-line adaptation of 2-Step IMRT plans: QUASIMODO phantom and prostate cancer cases

    PubMed Central

    2013-01-01

    1) 2-Step generation for the geometry of the day using the relocated isocenter, with MU transfer from the planning geometry; 2) Adaptation of the widths of S2 segments to the geometry of the day; 3) Imitation of DMPO fine-tuning for the geometry of the day. Results and conclusion: We have performed automated 2-Step IMRT adaptation for ten prostate adaptation cases. The adapted plans show statistically significant improvement of the target coverage and of the rectum sparing compared with plans in which only the isocenter is relocated. The 2-Step IMRT method may become the core of the automated adaptive radiation therapy system at our department. PMID:24207129

  12. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It also promotes the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. TEAMWORK, an automated system for structured analysis and design which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  13. Integrated System Design: Promoting the Capacity of Sociotechnical Systems for Adaptation through Extensions of Cognitive Work Analysis.

    PubMed

    Naikar, Neelam; Elix, Ben

    2016-01-01

    This paper proposes an approach for integrated system design, which has the intent of facilitating high levels of effectiveness in sociotechnical systems by promoting their capacity for adaptation. Building on earlier ideas and empirical observations, this approach recognizes that to create adaptive systems it is necessary to integrate the design of all of the system elements, including the interfaces, teams, training, and automation, such that workers are supported in adapting their behavior as well as their structure, or organization, in a coherent manner. Current approaches for work analysis and design are limited in regard to this fundamental objective, especially in cases when workers are confronted with unforeseen events. A suitable starting point is offered by cognitive work analysis (CWA), but while this framework can support actors in adapting their behavior, it does not necessarily accommodate adaptations in their structure. Moreover, associated design approaches generally focus on individual system elements, and those that consider multiple elements appear limited in their ability to facilitate integration, especially in the manner intended here. The proposed approach puts forward the set of possibilities for work organization in a system as the central mechanism for binding the design of its various elements, so that actors can adapt their structure as well as their behavior, in a unified fashion, to handle both familiar and novel conditions. Accordingly, this paper demonstrates how the set of possibilities for work organization in a system may be demarcated independently of the situation, through extensions of CWA, and how it may be utilized in design. This lynchpin, conceptualized in the form of a diagram of work organization possibilities (WOP), is important for preserving a system's inherent capacity for adaptation. Future research should focus on validating these concepts and establishing the feasibility of implementing them in industrial contexts.

  16. Optimal Bayesian Adaptive Design for Test-Item Calibration.

    PubMed

    van der Linden, Wim J; Ren, Hao

    2015-06-01

    An optimal adaptive design for test-item calibration based on Bayesian optimality criteria is presented. The design adapts the choice of field-test items to the examinees taking an operational adaptive test using both the information in the posterior distributions of their ability parameters and the current posterior distributions of the field-test parameters. Different criteria of optimality based on the two types of posterior distributions are possible. The design can be implemented using an MCMC scheme with alternating stages of sampling from the posterior distributions of the test takers' ability parameters and the parameters of the field-test items while reusing samples from earlier posterior distributions of the other parameters. Results from a simulation study demonstrated the feasibility of the proposed MCMC implementation for operational item calibration. A comparison of performances for different optimality criteria showed faster calibration of substantial numbers of items for the criterion of D-optimality relative to A-optimality, a special case of c-optimality, and random assignment of items to the test takers.
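The D-optimality criterion described above can be illustrated with a small sketch: for each candidate field-test item, average the item-parameter Fisher information over posterior draws of the examinee's ability, then assign the item whose updated information matrix has the largest determinant. The 2PL model, the identity prior information, and all numbers below are illustrative assumptions, not the authors' MCMC implementation.

```python
import math

def p2pl(theta, a, b):
    # 2PL response probability for an item with discrimination a and difficulty b
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_info_matrix(theta, a, b):
    # Fisher information of one response w.r.t. the item parameters (a, b)
    p = p2pl(theta, a, b)
    q = 1.0 - p
    d = theta - b
    return [[p * q * d * d, -a * p * q * d],
            [-a * p * q * d, a * a * p * q]]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def d_optimal_choice(theta_samples, items, prior_info):
    # Average each candidate item's information matrix over posterior draws of
    # ability, add it to the information already accumulated for that item, and
    # pick the item whose updated matrix has the largest determinant.
    best, best_det = None, -1.0
    for idx, (a, b) in enumerate(items):
        acc = [[0.0, 0.0], [0.0, 0.0]]
        for t in theta_samples:
            m = item_info_matrix(t, a, b)
            for i in range(2):
                for j in range(2):
                    acc[i][j] += m[i][j] / len(theta_samples)
        cur = prior_info[idx]
        updated = [[cur[i][j] + acc[i][j] for j in range(2)] for i in range(2)]
        if det2(updated) > best_det:
            best, best_det = idx, det2(updated)
    return best
```

In a real calibration the posterior draws would come from the alternating MCMC stages the abstract describes; here they are just a handful of fixed numbers.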

  17. Development of adapted GMR-probes for automated detection of hidden defects in thin steel sheets

    NASA Astrophysics Data System (ADS)

    Pelkner, Matthias; Pohl, Rainer; Kreutzbruck, Marc; Commandeur, Colin

    2016-02-01

    Thin steel sheets with a thickness of 0.3 mm and less are the base materials of many everyday life products (cans, batteries, etc.). Potential inhomogeneities such as non-metallic inclusions inside the steel can lead to rupture when a sheet is formed into a product such as a beverage can. Therefore, there is a need to develop automated NDT techniques to detect hidden defects and inclusions in thin sheets during production. For this purpose, Tata Steel Europe and BAM, the Federal Institute for Materials Research and Testing (Germany), are collaborating to develop an automated NDT system. Defect detection systems have to be robust against external influences, especially when used in an industrial environment. In addition, such a facility has to achieve a high sensitivity and a high spatial resolution in terms of detecting small inclusions in the μm regime. In a first step, we carried out a feasibility study to determine which testing method is promising for detecting hidden defects and inclusions inside ferrous thin steel sheets. Two methods were therefore investigated in more detail: magnetic flux leakage testing (MFL) using giant magnetoresistance (GMR) sensor arrays as receivers [1,2], and eddy current testing (ET). The capabilities of both methods were tested with 0.2 mm-thick steel samples containing small defects with depths ranging from 5 µm up to 60 µm. Only with GMR-MFL testing were we able to reliably detect hidden defects with a depth of 10 µm, at an SNR better than 10 dB. Here, the lift-off between sensor and surface was 250 µm. On this basis, we investigated different testing scenarios, including velocity tests and different lift-offs. In this contribution we present the results of the feasibility study, which led to first prototypes of GMR probes that are now installed as part of a demonstrator inside a production line.
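The 10 dB detection criterion quoted above uses the usual amplitude-ratio definition of signal-to-noise ratio; a minimal sketch (the signal and noise values below are invented for illustration, not measurements from the study):

```python
import math

def snr_db(signal_peak, noise_rms):
    # Amplitude-based SNR in decibels: 20*log10(A_signal / A_noise)
    return 20.0 * math.log10(signal_peak / noise_rms)

# A defect indication of 4 a.u. over 1 a.u. RMS noise clears the 10 dB bar,
# since 20*log10(4) is about 12 dB.
detectable = snr_db(4.0, 1.0) > 10.0
```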

  18. Automated structural design with aeroelastic constraints - A review and assessment of the state of the art

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.

    1974-01-01

    A review and assessment of the state of the art in automated aeroelastic design is presented. Most of the aeroelastic design studies appearing in the literature deal with flutter, and, therefore, this paper also concentrates on flutter. The flutter design problem is divided into three cases: an isolated flutter mode, neighboring flutter modes, and a hump mode which can rise and cause a sudden, discontinuous change in the flutter velocity. Synthesis procedures are presented in terms of techniques that are appropriate for problems of various levels of difficulty. Current trends, which should result in more efficient, powerful, and versatile design codes, are discussed. Approximate analysis procedures and the need for simultaneous consideration of multiple design requirements are emphasized.

  19. Launch vehicle payload adapter design with vibration isolation features

    NASA Astrophysics Data System (ADS)

    Thomas, Gareth R.; Fadick, Cynthia M.; Fram, Bryan J.

    2005-05-01

    Payloads, such as satellites or spacecraft, which are mounted on launch vehicles, are subject to severe vibrations during flight. These vibrations are induced by multiple sources that occur between liftoff and the instant of final separation from the launch vehicle. A direct result of the severe vibrations is that fatigue damage and failure can be incurred by sensitive payload components. For this reason a payload adapter has been designed with special emphasis on its vibration isolation characteristics. The design consists of an annular plate that has top and bottom face sheets separated by radial ribs and close-out rings. These components are manufactured from graphite epoxy composites to ensure a high stiffness to weight ratio. The design is tuned to keep the frequency of the payload's axial vibration mode on the adapter's flexibility low. This is the main strategy adopted for isolating the payload from damaging vibrations in the intermediate to higher frequency range (45 Hz-200 Hz). A design challenge for this type of adapter is to keep the pitch frequency of the payload above a critical value in order to avoid dynamic interactions with the launch vehicle control system. This high frequency requirement conflicts with the low axial mode frequency requirement, and this problem is overcome by innovative tuning of the directional stiffnesses of the composite parts. A second design strategy that is utilized to achieve good isolation characteristics is the use of constrained layer damping. This feature is particularly effective at keeping the responses to a minimum for one of the most important dynamic loading mechanisms. This mechanism consists of the almost-tonal vibratory load associated with the resonant burn condition present in any stage powered by a solid rocket motor. The frequency of such a load typically falls in the 45-75 Hz range and this phenomenon drives the low frequency design of the adapter. Detailed finite element analysis is

  20. Future planning and evaluation for automated adaptive minehunting: a roadmap for mine countermeasures theory modernization

    NASA Astrophysics Data System (ADS)

    Garcia, Gregory A.; Wettergren, Thomas A.

    2012-06-01

    This paper presents a discussion of U.S. naval mine countermeasures (MCM) theory modernization in light of advances in the areas of autonomy, tactics, and sensor processing. The unifying theme spanning these research areas concerns the capability for in situ adaptation of processing algorithms, plans, and vehicle behaviors enabled through run-time situation assessment and performance estimation. Independently, each of these technology developments impact the MCM Measures of Effectiveness [MOE(s)] of time and risk by improving one or more associated Measures of Performance [MOP(s)]; the contribution of this paper is to outline an integrated strategy for realizing the cumulative benefits of these technology enablers to the United States Navy's minehunting capability. An introduction to the MCM problem is provided to frame the importance of the foundational research and the ramifications of the proposed strategy on the MIW community. We then include an overview of current and future adaptive capability research in the aforementioned areas, highlighting a departure from the existing rigid assumption-based approaches while identifying anticipated technology acceptance issues. Consequently, the paper describes an incremental strategy for transitioning from the current minehunting paradigm where tactical decision aids rely on a priori intelligence and there is little to no in situ adaptation or feedback to a future vision where unmanned systems, equipped with a representation of the commander's intent, are afforded the authority and ability to adapt to environmental perturbations with minimal human-in-the-loop supervision. The discussion concludes with an articulation of the science and technology issues which the MCM research community must continue to address.

  1. The VIADUC project: innovation in climate adaptation through service design

    NASA Astrophysics Data System (ADS)

    Corre, L.; Dandin, P.; L'Hôte, D.; Besson, F.

    2015-07-01

    As part of the French National Adaptation to Climate Change Plan, the "Drias, les futurs du climat" service has been developed to provide easy access to French regional climate projections. This is a major step for the implementation of French climate services. The usefulness of this service for the end-users and decision makers involved with adaptation planning at a local scale is investigated. The VIADUC project thus aims to evaluate and enhance Drias, and to imagine future developments in support of adaptation. Climate scientists work together with end-users and a service designer. The designer's role is to propose an innovative approach based on the interaction between scientists and citizens. The chosen end-users are three Natural Regional Parks located in the South West of France. These parks are administrative entities that gather municipalities sharing a common natural and cultural heritage. They are also rural areas in which specific economic activities take place, and therefore are concerned and involved in both protecting their environment and setting up sustainable economic development. The first year of the project was dedicated to investigation, including interviews with relevant representatives. Three key local economic sectors were selected: forestry, pastoral farming, and building. The sectors' needs for climate information have been assessed. The lessons learned led to actions which are presented hereinafter.

  2. Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration

    NASA Technical Reports Server (NTRS)

    Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin

    2011-01-01

    The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and

  3. Effects of an Advanced Reactor’s Design, Use of Automation, and Mission on Human Operators

    SciTech Connect

    Jeffrey C. Joe; Johanna H. Oxstrand

    2014-06-01

    The roles, functions, and tasks of the human operator in existing light water nuclear power plants (NPPs) are based on sound nuclear and human factors engineering (HFE) principles, are well defined by the plant’s conduct of operations, and have been validated by years of operating experience. However, advanced NPPs whose engineering designs differ from existing light-water reactors (LWRs) will impose changes on the roles, functions, and tasks of the human operators. The plans to increase the use of automation, reduce staffing levels, and add to the mission of these advanced NPPs will also affect the operator’s roles, functions, and tasks. We assert that these factors, which do not appear to have received a lot of attention by the design engineers of advanced NPPs relative to the attention given to conceptual design of these reactors, can have significant risk implications for the operators and overall plant safety if not mitigated appropriately. This paper presents a high-level analysis of a specific advanced NPP and how its engineered design, its plan to use greater levels of automation, and its expanded mission have risk-significant implications for operator performance and overall plant safety.

  4. Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Li, Wu; Robinson, Jay

    2016-01-01

    This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify the versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.

  5. CRISPRmap: an automated classification of repeat conservation in prokaryotic adaptive immune systems.

    PubMed

    Lange, Sita J; Alkhnbashi, Omer S; Rose, Dominic; Will, Sebastian; Backofen, Rolf

    2013-09-01

    Central to Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR)-Cas systems are repeated RNA sequences that serve as Cas-protein-binding templates. Classification is based on the architectural composition of associated Cas proteins, but considering repeat evolution is essential to complete the picture. We compiled the largest data set of CRISPRs to date, performed comprehensive, independent clustering analyses and identified a novel set of 40 conserved sequence families and 33 potential structure motifs for Cas-endoribonucleases with some distinct conservation patterns. Evolutionary relationships are presented as a hierarchical map of sequence and structure similarities for both a quick and detailed insight into the diversity of CRISPR-Cas systems. In a comparison with Cas-subtypes, I-C, I-E, I-F and type II were strongly coupled and the remaining type I and type III subtypes were loosely coupled to repeat and Cas1 evolution, respectively. Subtypes with a strong link to CRISPR evolution were almost exclusive to bacteria; nevertheless, we identified rare examples of potential horizontal transfer of I-C and I-E systems into archaeal organisms. Our easy-to-use web server provides an automated assignment of newly sequenced CRISPRs to our classification system and enables more informed choices on future hypotheses in CRISPR-Cas research: http://rna.informatik.uni-freiburg.de/CRISPRmap.

  6. CRISPRmap: an automated classification of repeat conservation in prokaryotic adaptive immune systems

    PubMed Central

    Lange, Sita J.; Alkhnbashi, Omer S.; Rose, Dominic; Will, Sebastian; Backofen, Rolf

    2013-01-01

    Central to Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR)-Cas systems are repeated RNA sequences that serve as Cas-protein–binding templates. Classification is based on the architectural composition of associated Cas proteins, but considering repeat evolution is essential to complete the picture. We compiled the largest data set of CRISPRs to date, performed comprehensive, independent clustering analyses and identified a novel set of 40 conserved sequence families and 33 potential structure motifs for Cas-endoribonucleases with some distinct conservation patterns. Evolutionary relationships are presented as a hierarchical map of sequence and structure similarities for both a quick and detailed insight into the diversity of CRISPR-Cas systems. In a comparison with Cas-subtypes, I-C, I-E, I-F and type II were strongly coupled and the remaining type I and type III subtypes were loosely coupled to repeat and Cas1 evolution, respectively. Subtypes with a strong link to CRISPR evolution were almost exclusive to bacteria; nevertheless, we identified rare examples of potential horizontal transfer of I-C and I-E systems into archaeal organisms. Our easy-to-use web server provides an automated assignment of newly sequenced CRISPRs to our classification system and enables more informed choices on future hypotheses in CRISPR-Cas research: http://rna.informatik.uni-freiburg.de/CRISPRmap. PMID:23863837

  8. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    ERIC Educational Resources Information Center

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strengths and weaknesses in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  9. Reflections on the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) Process—Findings from a Qualitative Study

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Legocki, Laurie J.; Mawocha, Samkeliso; Barsan, William G.; Lewis, Roger J.; Berry, Donald A.; Meurer, William J.

    2015-01-01

    Context: The context for this study was the Adaptive Designs Advancing Promising Treatments Into Trials (ADAPT-IT) project, which aimed to incorporate flexible adaptive designs into pivotal clinical trials and to conduct an assessment of the trial development process. Little research provides guidance to academic institutions in planning adaptive trials. Objectives: The purpose of this qualitative study was to explore the perspectives and experiences of stakeholders as they reflected on the interactive ADAPT-IT adaptive design development process, and to understand their perspectives regarding lessons learned about the design of the trials and trial development. Materials and methods: We conducted semi-structured interviews with ten key stakeholders and observations of the process. We employed qualitative thematic text data analysis to reduce the data into themes about the ADAPT-IT project and adaptive clinical trials. Results: The qualitative analysis revealed four themes: education of the project participants, how the process evolved with participant feedback, procedures that could enhance the development of other trials, and education of the broader research community. Discussion and conclusions: While participants became more likely to consider flexible adaptive designs, additional education is needed to both understand the adaptive methodology and articulate it when planning trials. PMID:26622163

  10. ECO fill: automated fill modification to support late-stage design changes

    NASA Astrophysics Data System (ADS)

    Davis, Greg; Wilson, Jeff; Yu, J. J.; Chiu, Anderson; Chuang, Yao-Jen; Yang, Ricky

    2014-03-01

    One of the most critical factors in achieving a positive return for a design is ensuring the design not only meets performance specifications, but also produces sufficient yield to meet the market demand. The goal of design for manufacturability (DFM) technology is to enable designers to address manufacturing requirements during the design process. While new cell-based, DP-aware, and net-aware fill technologies have emerged to provide the designer with automated fill engines that support these new fill requirements, design changes that arrive late in the tapeout process (as engineering change orders, or ECOs) can have a disproportionate effect on tapeout schedules, due to the complexity of replacing fill. If not handled effectively, the impacts on file size, run time, and timing closure can significantly extend the tapeout process. In this paper, the authors examine changes to design flow methodology, supported by new fill technology, that enable efficient, fast, and accurate adjustments to metal fill late in the design process. We present an ECO fill methodology coupled with the support of advanced fill tools that can quickly locate the portion of the design affected by the change, remove and replace only the fill in that area, while maintaining the fill hierarchy. This new fill approach effectively reduces run time, contains fill file size, minimizes timing impact, and minimizes mask costs due to ECO-driven fill changes, all of which are critical factors to ensuring time-to-market schedules are maintained.
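The incremental strategy described, locating the affected region, stripping only the fill there, and regenerating it in place, can be sketched with axis-aligned rectangles. The grid-fill rule and all geometry below are hypothetical stand-ins, not the authors' tool:

```python
def intersects(r1, r2):
    # Axis-aligned rectangles as (x1, y1, x2, y2); touching edges don't count.
    return not (r1[2] <= r2[0] or r2[2] <= r1[0] or
                r1[3] <= r2[1] or r2[3] <= r1[1])

def eco_refill(fill_cells, change_bbox, blocked, pitch=2, size=1):
    # 1. Keep every fill cell that does not touch the ECO change region.
    kept = [c for c in fill_cells if not intersects(c, change_bbox)]
    # 2. Regenerate grid fill inside the change region only, skipping the
    #    new design geometry ("blocked" shapes).
    x1, y1, x2, y2 = change_bbox
    new = []
    x = x1
    while x + size <= x2:
        y = y1
        while y + size <= y2:
            cell = (x, y, x + size, y + size)
            if not any(intersects(cell, b) for b in blocked):
                new.append(cell)
            y += pitch
        x += pitch
    return kept + new
```

Because only cells inside the change bounding box are touched, run time and output churn scale with the size of the ECO rather than the full design, which is the point of the methodology.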

  11. Validation of the NetSCID: an automated web-based adaptive version of the SCID.

    PubMed

    Brodey, Benjamin B; First, Michael; Linthicum, Jared; Haman, Kirsten; Sasiela, Jordan W; Ayer, David

    2016-04-01

    The present study developed and validated a configurable, adaptive, web-based version of the Structured Clinical Interview for DSM, the NetSCID. The validation included 24 clinicians who administered the SCID and 230 participants who completed the paper SCID and/or the NetSCID. Data-entry errors, branching errors, and clinician satisfaction were quantified. Relative to the paper SCID, the NetSCID resulted in far fewer data-entry and branching errors. Clinicians 'preferred' using the NetSCID and found that the NetSCID was easier to administer. PMID:26995238

  12. An automated multi-modal object analysis approach to coronary calcium scoring of adaptive heart isolated MSCT images

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-02-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where in clinical practice the observer must first identify the presence, and then the location, of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. This can be challenging for a human observer as it is difficult to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. Incorrectly including false-positive plaques or excluding true-positive ones alters the patient's calcium score, thus leading to the possibility of incorrect treatment prescription. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, exposing the patient to a lower radiation dose, which is beneficial for a progressive disease such as atherosclerosis, where multiple scans may be required. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Elimination of the unwanted regions of the cardiac image slices such as lungs, ribs, and vertebrae is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of similar intensity, and their removal aids detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground truth scores of three averaged expert observer results. The results presented here are intended to show the requirement and
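For reference, both scores named above follow simple conventions. A sketch using the standard Agatston density weights (the 130 HU threshold, the density bins, and the 1 mm² lesion cutoff are the usual published conventions, not details taken from this paper's implementation):

```python
def agatston_weight(peak_hu):
    # Standard density weighting for lesions at or above the 130 HU threshold.
    if peak_hu < 130:
        return 0
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    # lesions: (area_mm2, peak_hu) per connected plaque on a slice;
    # lesions smaller than 1 mm^2 are conventionally ignored.
    return sum(area * agatston_weight(hu) for area, hu in lesions if area >= 1.0)

def volume_score(lesions, slice_thickness_mm):
    # Volume score: total calcified plaque area times slice thickness.
    return sum(area for area, hu in lesions if hu >= 130) * slice_thickness_mm
```

An automated pipeline like the one described would feed these functions the plaques surviving heart isolation and aorta removal, then compare the totals against the averaged expert ground truth.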

  13. Automated registration of large deformations for adaptive radiation therapy of prostate cancer

    SciTech Connect

    Godley, Andrew; Ahunbay, Ergun; Peng Cheng; Li, X. Allen

    2009-04-15

    Available deformable registration methods are often inaccurate over the large organ variations encountered, for example, in the rectum and bladder. The authors developed a novel approach to accurately and effectively register large deformations in the prostate region for adaptive radiation therapy. A software tool combining a fast symmetric demons algorithm and the use of masks was developed in C++ based on ITK libraries to register CT images acquired at planning and before treatment fractions. The deformation field determined was subsequently used to deform the delivered dose to match the anatomy of the planning CT. The large deformations involved required that the bladder and rectum volumes be masked with uniform intensities of -1000 and 1000 HU, respectively, in both the planning and treatment CTs. The tool was tested for five prostate IGRT patients. The average rectum planning-to-treatment contour overlap improved from 67% to 93%; the lowest initial overlap was 43%. The average bladder overlap improved from 83% to 98%, with a lowest initial overlap of 60%. Registration regions were set to include a volume receiving 4% of the maximum dose. The average region was 320x210x63, taking approximately 9 min to register on a dual 2.8 GHz Linux system. The prostate and seminal vesicles were correctly placed even though they were not masked. The accumulated doses for multiple fractions with large deformation were computed and verified. The tool developed can effectively supply the previously delivered dose for adaptive planning to correct for interfractional changes.
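The abstract reports "contour overlap" percentages without naming the metric; the Dice coefficient is one common choice for contour agreement and is assumed here purely for illustration:

```python
def dice_overlap(a, b):
    # a, b: sets of voxel indices covered by the same contour on two CTs.
    # Dice = 2*|A ∩ B| / (|A| + |B|), i.e. 1.0 for identical contours.
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))
```

Evaluating this before and after applying the deformation field to one contour is the kind of check that would produce improvement figures like the 67% to 93% quoted above.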

  14. Accelerated optimization and automated discovery with covariance matrix adaptation for experimental quantum control

    SciTech Connect

    Roslund, Jonathan; Shir, Ofer M.; Rabitz, Herschel; Baeck, Thomas

    2009-10-15

    Optimization of quantum systems by closed-loop adaptive pulse shaping offers a rich domain for the development and application of specialized evolutionary algorithms. Derandomized evolution strategies (DESs) are presented here as a robust class of optimizers for experimental quantum control. The combination of stochastic and quasi-local search embodied by these algorithms is especially amenable to the inherent topology of quantum control landscapes. Implementation of DES in the laboratory results in efficiency gains of up to approximately 9 times that of the standard genetic algorithm, and thus is a promising tool for optimization of unstable or fragile systems. The statistical learning upon which these algorithms are predicated also provides the means for obtaining a control problem's Hessian matrix with no additional experimental overhead. The forced optimal covariance adaptive learning (FOCAL) method is introduced to enable retrieval of the Hessian matrix, which can reveal information about the landscape's local structure and dynamic mechanism. Exploitation of such algorithms in quantum control experiments should enhance their efficiency and provide additional fundamental insights.
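As a toy illustration of the adaptive evolution-strategy family the abstract refers to, here is a (1+1)-ES with the classic 1/5th-success step-size rule minimizing a sphere function. The actual work used derandomized ES/CMA variants driving a laboratory pulse shaper, so everything below is a simplified stand-in for the idea of stochastic search with self-adapted step sizes:

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, iters=200, seed=0):
    # Minimal (1+1)-evolution strategy: mutate, keep the better point, and
    # adapt the mutation step size toward a ~1/5 success rate.
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    successes = 0
    for k in range(1, iters + 1):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
            successes += 1
        if k % 20 == 0:
            # 1/5th rule: grow sigma if succeeding often, shrink otherwise.
            sigma *= 1.5 if successes / 20.0 > 0.2 else 0.6
            successes = 0
    return x, fx

# Toy stand-in for a control landscape: a sphere with its optimum at the origin.
sphere = lambda v: sum(t * t for t in v)
```

Derandomized and covariance-adapting strategies replace the scalar sigma with a learned covariance matrix, which is also what FOCAL exploits to recover Hessian information.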

  15. Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.

    PubMed

    Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher

    2011-01-01

    Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume.
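The linear and exponential feed profiles mentioned can be expressed simply. A common exponential-feed law is F(t) = F0·e^(μt); the sketch below integrates it numerically to check delivered volume against the analytic target, echoing the paper's "within 1% of the targeted volume" criterion. The constants and the feed law itself are illustrative assumptions, not the system's actual controller:

```python
import math

def exponential_feed_rate(f0, mu, t):
    # F(t) = F0 * exp(mu * t): feed grows with an assumed specific rate mu.
    return f0 * math.exp(mu * t)

def total_fed(f0, mu, hours, dt=0.01):
    # Left-Riemann integration of the feed profile over the induction phase.
    n = round(hours / dt)
    return sum(exponential_feed_rate(f0, mu, i * dt) * dt for i in range(n))
```

The analytic total is F0/μ·(e^(μT) − 1), so a controller logging pump output can verify its cumulative delivery against that figure at any time.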

  16. Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.

    PubMed

    Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher

    2011-01-01

    Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume. PMID:21485036

  17. Toward new design-rule-check of silicon photonics for automated layout physical verifications

    NASA Astrophysics Data System (ADS)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2015-02-01

    A simple analytical model is developed to estimate the power loss and time delay in photonic integrated circuits fabricated using standard SOI wafers. This model is simple and can be utilized in physical verification of the circuit layout to verify its feasibility for fabrication under given foundry specifications. It allows new design rules to be provided for the layout physical verification process in any electronic design automation (EDA) tool. The model is accurate, as confirmed by comparison with a finite-element-based full-wave electromagnetic (EM) solver. Because the model is closed form, it circumvents the need for an EM solver in the verification process; as such, it dramatically reduces verification time and allows a fast design-rule check.
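    The paper's closed-form model is not reproduced in the abstract, so the placeholder below only illustrates the kind of rule an EDA design-rule check could evaluate: loss linear in route length plus a fixed per-bend penalty, and delay from an assumed group index. All default values are made up, not foundry numbers.

```python
def route_loss_and_delay(length_mm, n_bends, prop_loss_db_per_cm=2.0,
                         bend_loss_db=0.01, group_index=4.2):
    """Estimate power loss (dB) and group delay (ps) of an SOI waveguide route.

    Deliberately simplified placeholder for the paper's analytical model:
    loss is taken linear in length plus a per-bend penalty, and delay uses
    an assumed group index. Default values are illustrative assumptions.
    """
    c_mm_per_ps = 0.29979  # speed of light in mm/ps
    loss_db = prop_loss_db_per_cm * (length_mm / 10.0) + bend_loss_db * n_bends
    delay_ps = group_index * length_mm / c_mm_per_ps
    return loss_db, delay_ps

# A 10 mm route with 4 bends under the assumed defaults.
loss, delay = route_loss_and_delay(10.0, 4)
```

    A layout verifier would compare such estimates against foundry-specified budgets and flag routes that exceed them.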

  18. Optimizing RF gun cavity geometry within an automated injector design system

    SciTech Connect

    Hofler, Alicia; Evtushenko, Pavel

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized cw superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search a large (often nonlinear) parameter space in parallel and, in a relatively short time, identify promising regions of the space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper describes an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization and provides examples of its application to existing RF and SRF gun designs.

  19. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  20. Optical Design for Extremely Large Telescope Adaptive Optics Systems

    SciTech Connect

    Bauman, B J

    2003-11-26

    Designing an adaptive optics (AO) system for extremely large telescopes (ELT's) will present new optical engineering challenges. Several of these challenges are addressed in this work, including first-order design of multi-conjugate adaptive optics (MCAO) systems, pyramid wavefront sensors (PWFS's), and laser guide star (LGS) spot elongation. MCAO systems need to be designed in consideration of various constraints, including deformable mirror size and correction height. The y, ȳ method of first-order optical design is a graphical technique that uses a plot with marginal and chief ray heights as coordinates; the optical system is represented as a segmented line. This method is shown to be a powerful tool in designing MCAO systems. From these analyses, important conclusions about configurations are derived. PWFS's, which offer an alternative to Shack-Hartmann (SH) wavefront sensors (WFS's), are envisioned as the workhorse of layer-oriented adaptive optics. Current approaches use a 4-faceted glass pyramid to create a WFS analogous to a quad-cell SH WFS. PWFS's and SH WFS's are compared and some newly-considered similarities and PWFS advantages are presented. Techniques to extend PWFS's are offered: First, PWFS's can be extended to more pixels in the image by tiling pyramids contiguously. Second, pyramids, which are difficult to manufacture, can be replaced by less expensive lenslet arrays. An approach is outlined to convert existing SH WFS's to PWFS's for easy evaluation of PWFS's. Also, a demonstration of PWFS's in sensing varying amounts of an aberration is presented. For ELT's, the finite altitude and finite thickness of LGS's means that the LGS will appear elongated from the viewpoint of subapertures not directly under the telescope. Two techniques for dealing with LGS spot elongation in SH WFS's are presented. One method assumes that the laser will be pulsed and uses a segmented micro-electromechanical system (MEMS) to track the LGS light subaperture by

  1. Optical design of the adaptive optics laser guide star system

    SciTech Connect

    Bissinger, H.

    1994-11-15

    The design of an adaptive optics package for the 3-meter Lick telescope is presented. This instrument package includes a 69-actuator deformable mirror and a Hartmann-type wavefront sensor operating at visible wavelengths; a quadrant detector serves as the tip-tilt sensor, and a tip-tilt mirror stabilizes first-order atmospheric tip-tilt errors. A high-speed computer drives the deformable mirror to achieve near-diffraction-limited imagery. The different optical components and their individual design constraints are described. Motorized stages and diagnostic tools are used to operate and maintain alignment throughout observation time from a remote control room. The expected performance is summarized, and actual results from astronomical sources are presented.

  2. ZAP! Adapted: Incorporating design in the introductory electromagnetism lab

    NASA Astrophysics Data System (ADS)

    McNeil, J. A.

    2002-04-01

    In the last decade the Accreditation Board of Engineering and Technology(ABET) significantly reformed the criteria by which engineering programs are accredited. The new criteria are called Engineering Criteria 2000 (EC2000). Not surprisingly, engineering design constitutes an essential component of these criteria. The Engineering Physics program at the Colorado School of Mines (CSM) underwent an ABET general review and site visit in the fall of 2000. In preparation for this review and as part of a campus-wide curriculum reform the Physics Department was challenged to include elements of design in its introductory laboratories. As part of the background research for this reform, several laboratory programs were reviewed including traditional and studio modes as well as a course used by Cal Tech and MIT called "ZAP!" which incorporates design activities well-aligned with the EC2000 criteria but in a nontraditional delivery mode. CSM has adapted several ZAP! experiments to a traditional laboratory format while attempting to preserve significant design experiences. The new laboratory forms an important component of the reformed course which attempts to respect the psychological principles of learner-based education. This talk reviews the reformed introductory electromagnetism course and how the laboratories are integrated into the pedagogy along with design activities. In their new form the laboratories can be readily adopted by physics departments using traditional delivery formats.

  3. Feasibility of Automated Adaptive GCA (Ground Controlled Approach) Controller Training System.

    ERIC Educational Resources Information Center

    Feuge, Robert L.; And Others

    An analysis of the conceptual feasibility of using automatic speech recognition and understanding technology in the design of an advanced training system was conducted. The analysis specifically explored application to Ground Controlled Approach (GCA) controller training. A systems engineering approach was followed to determine the feasibility of…

  4. Design of infrasound-detection system via adaptive LMSTDE algorithm

    NASA Technical Reports Server (NTRS)

    Khalaf, C. S.; Stoughton, J. W.

    1984-01-01

    A proposed solution to an aviation safety problem is based on passive detection of turbulent weather phenomena through their infrasonic emission. This thesis describes a system design that is adequate for detection and bearing evaluation of infrasounds. An array of four sensors, with the appropriate hardware, is used for the detection part. Bearing evaluation is based on estimates of time delays between sensor outputs. The generalized cross correlation (GCC), as the conventional time-delay estimation (TDE) method, is first reviewed. An adaptive TDE approach, using the least mean square (LMS) algorithm, is then discussed. A comparison between the two techniques is made and the advantages of the adaptive approach are listed. The behavior of the GCC, as a Roth processor, is examined for the anticipated signals. It is shown that the Roth processor has the desired effect of sharpening the peak of the correlation function. It is also shown that the LMSTDE technique is an equivalent implementation of the Roth processor in the time domain. An LMSTDE lead-lag model, with a variable stability coefficient and a convergence criterion, is designed.
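    The adaptive TDE idea can be sketched with a plain LMS filter: model one sensor's output as a filtered, delayed copy of the other, and read the delay off the dominant tap after convergence. This is a minimal illustration of the LMSTDE concept, not the thesis's variable-coefficient lead-lag model; the signal lengths, tap count, and step size are assumptions.

```python
import numpy as np

def lms_tde(x, y, n_taps=16, mu=0.01, passes=5):
    """Estimate the delay between two sensor signals with an adaptive LMS filter.

    The filter models y[k] ~ sum_i w[i] * x[k-1-i]; after convergence the
    index of the dominant tap approximates the delay in samples.
    """
    w = np.zeros(n_taps)
    for _ in range(passes):
        for k in range(n_taps, len(x)):
            window = x[k - n_taps:k][::-1]  # x[k-1], x[k-2], ..., x[k-n_taps]
            e = y[k] - w @ window           # prediction error
            w += 2.0 * mu * e * window      # LMS weight update
    return int(np.argmax(np.abs(w))) + 1, w  # delay in samples

# Synthetic check: y is x delayed by 5 samples plus measurement noise.
rng = np.random.default_rng(1)
x = rng.standard_normal(4000)
y = np.roll(x, 5) + 0.1 * rng.standard_normal(4000)
delay, _ = lms_tde(x, y)
```

    With four sensors, pairwise delays estimated this way would feed the bearing computation.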

  5. Design of a motion JPEG (M/JPEG) adapter card

    NASA Astrophysics Data System (ADS)

    Lee, D. H.; Sudharsanan, Subramania I.

    1994-05-01

    In this paper we describe a design of a high performance JPEG (Joint Photographic Experts Group) Micro Channel adapter card. The card, tested on a range of PS/2 platforms (models 50 to 95), can complete JPEG operations on a 640 by 240 pixel image within 1/60 of a second, thus enabling real-time capture and display of high quality digital video. The card accepts digital pixels for either a YUV 4:2:2 or an RGB 4:4:4 pixel bus and has been shown to handle up to 2.05 MBytes/second of compressed data. The compressed data is transmitted to a host memory area by Direct Memory Access operations. The card uses a single C-Cube's CL550 JPEG processor that complies with the baseline JPEG. We give broad descriptions of the hardware that controls the video interface, CL550, and the system interface. Some critical design points that enhance the overall performance of the M/JPEG systems are pointed out. The control of the adapter card is achieved by an interrupt driven software that runs under DOS. The software performs a variety of tasks that include change of color space (RGB or YUV), change of quantization and Huffman tables, odd and even field control and some diagnostic operations.

  6. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    PubMed

    Churchill, Nathan W; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.
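    The framework's core loop, choosing the pipeline whose (prediction, reproducibility) metrics lie closest to the ideal point (1, 1), can be sketched as an exhaustive search over pipeline options. The distance-to-ideal score and the toy metric function below are illustrative assumptions, not the paper's actual evaluation machinery.

```python
import itertools
import math

def select_pipeline(metric_fn, step_options):
    """Choose the preprocessing pipeline closest to ideal (P, R) = (1, 1).

    metric_fn(pipeline) must return (prediction, reproducibility), each in
    [0, 1]. The Euclidean distance-to-ideal score is one common choice,
    used here as an assumption.
    """
    best, best_d = None, float("inf")
    for combo in itertools.product(*step_options.values()):
        pipeline = dict(zip(step_options, combo))
        p, r = metric_fn(pipeline)
        d = math.hypot(1.0 - p, 1.0 - r)
        if d < best_d:
            best, best_d = pipeline, d
    return best, best_d

# Toy metric: pretend motion correction helps prediction while detrending
# and smoothing help reproducibility (purely illustrative numbers).
def toy_metric(pl):
    p = 0.6 + (0.3 if pl["motion_correct"] else 0.0)
    r = 0.5 + 0.1 * pl["smooth_fwhm"] / 8.0 + (0.2 if pl["detrend"] else 0.0)
    return p, r

opts = {"motion_correct": [False, True],
        "detrend": [False, True],
        "smooth_fwhm": [0, 4, 8]}
best, dist = select_pipeline(toy_metric, opts)
```

    In practice the metric evaluation dominates the cost, so resampling-based estimates of prediction and reproducibility replace the toy function above.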

  7. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI

    PubMed Central

    Churchill, Nathan W.; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C.

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the “pipeline”) significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard “fixed” preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets. PMID:26161667

  8. Robotic automation for space: planetary surface exploration, terrain-adaptive mobility, and multirobot cooperative tasks

    NASA Astrophysics Data System (ADS)

    Schenker, Paul S.; Huntsberger, Terrance L.; Pirjanian, Paolo; Baumgartner, Eric T.; Aghazarian, Hrand; Trebi-Ollennu, Ashitey; Leger, Patrick C.; Cheng, Yang; Backes, Paul G.; Tunstel, Edward; Dubowsky, Steven; Iagnemma, Karl D.; McKee, Gerard T.

    2001-10-01

    During the last decade, there has been significant progress toward a supervised autonomous robotic capability for remotely controlled scientific exploration of planetary surfaces. While planetary exploration potentially encompasses many elements ranging from orbital remote sensing to subsurface drilling, the surface robotics element is particularly important to advancing in situ science objectives. Surface activities include a direct characterization of geology, mineralogy, atmosphere and other descriptors of current and historical planetary processes and, ultimately, the return of pristine samples to Earth for detailed analysis. Toward these ends, we have conducted a broad program of research on robotic systems for scientific exploration of the Mars surface, with minimal remote intervention. The goal is to enable high productivity semi-autonomous science operations where available mission time is concentrated on robotic operations, rather than up- and down-link delays. Results of our work include prototypes for landed manipulators, long-ranging science rovers, sampling/sample return mobility systems, and more recently, terrain-adaptive reconfigurable/modular robots and closely cooperating multiple rover systems. The last of these are intended to facilitate deployment of planetary robotic outposts for an eventual human-robot sustained scientific presence. We overview our progress in these related areas of planetary robotics R&D, spanning 1995 to the present.

  9. A Non-antisymmetric Tensor Contraction Engine for the Automated Implementation of Spin-Adapted Coupled Cluster Approaches.

    PubMed

    Datta, Dipayan; Gauss, Jürgen

    2013-06-11

    We present a symbolic manipulation algorithm for the efficient automated implementation of rigorously spin-free coupled cluster (CC) theories based on a unitary group parametrization. Due to the lack of antisymmetry of the unitary group generators under index permutations, all quantities involved in the equations are expressed in terms of non-antisymmetric tensors. Given two tensors, all possible contractions are first generated by applying Wick's theorem. Each term is then put down in the form of a non-antisymmetric Goldstone diagram by assigning its contraction topology. The subsequent simplification of the equations by summing up equivalent terms and their factorization by identifying common intermediates is performed via comparison of these contraction topologies. The definition of the contraction topology is completely general for non-antisymmetric Goldstone diagrams, which enables our algorithm to deal with noncommuting excitations in the cluster operator that arises in the unitary group based CC formulation for open-shell systems. The resulting equations are implemented in a new code, in which tensor contractions are performed by successive application of matrix-matrix multiplications. Implementation of the unitary group adapted CC equations for closed-shell systems and for the simplest open-shell case, i.e., doublets, is discussed, and representative calculations are presented in order to assess the efficiency of the generated codes.

  10. Automated design of gravity-assist trajectories to Mars and the outer planets

    NASA Technical Reports Server (NTRS)

    Longuski, James M.; Williams, Steve N.

    1991-01-01

    In this paper, a new approach to planetary mission design is described that automates the search for gravity-assist trajectories. This method finds all conic solutions given a range of launch dates, a range of launch energies and a set of target planets. The new design tool is applied to the problems of finding multiple encounter trajectories to the outer planets and Venus gravity-assist trajectories to Mars. The last four-planet grand tour opportunity (until the year 2153) is identified. It requires an Earth launch in 1996 and encounters Jupiter, Uranus, Neptune, and Pluto. Venus gravity-assist trajectories to Mars for the 30 year period 1995-2024 are examined. It is shown that in many cases these trajectories require less launch energy to reach Mars than direct ballistic trajectories.

  11. Design and Implementation of a Medication Reconciliation Kiosk: the Automated Patient History Intake Device (APHID)

    PubMed Central

    Lesselroth, Blake J.; Felder, Robert S.; Adams, Shawn M.; Cauthers, Phillip D.; Dorr, David A.; Wong, Gordon J.; Douglas, David M.

    2009-01-01

    Errors associated with medication documentation account for a substantial fraction of preventable medical errors. Hence, the Joint Commission has called for the adoption of reconciliation strategies at all United States healthcare institutions. Although studies suggest that reconciliation tools can reduce errors, it remains unclear how best to implement systems and processes that are reliable and sensitive to clinical workflow. The authors designed a primary care process that supported reconciliation without compromising clinic efficiency. This manuscript describes the design and implementation of the Automated Patient History Intake Device (APHID): ambulatory check-in kiosks that allow patients to review the names, dosage, frequency, and pictures of their medications before their appointment. Medication lists are retrieved from the electronic health record and patient updates are captured and reviewed by providers during the clinic session. Results from the roll-in phase indicate the device is easy for patients to use and integrates well with clinic workflow. PMID:19261949

  12. Microfluidic large-scale integration: the evolution of design rules for biological automation.

    PubMed

    Melin, Jessica; Quake, Stephen R

    2007-01-01

    Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.
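    Among the design principles the review discusses, multiplexing has a compact combinatorial core: 2·log2(N) control lines can individually address N flow channels, with each address bit contributing a complementary pair of lines. The sketch below illustrates that standard binary-multiplexer addressing scheme; it is an illustration of the idea, not code from the review.

```python
import math

def multiplexer_controls(n_channels):
    """Valve addressing for a binary microfluidic multiplexer.

    Each address bit has a complementary pair of control lines; a channel
    flows only when, for every bit, the line matching that bit of its
    address is open.
    """
    bits = math.ceil(math.log2(n_channels))
    n_control_lines = 2 * bits

    def open_lines(channel):
        # For each bit position, open the control line matching that bit
        # of the channel address (the complementary line stays closed).
        return [(bit, (channel >> bit) & 1) for bit in range(bits)]

    return n_control_lines, open_lines

n_lines, open_lines = multiplexer_controls(8)  # 8 channels -> 6 control lines
```

    The logarithmic scaling of control lines is what makes thousands of integrated valves practical on a single chip.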

  13. ADS: A FORTRAN program for automated design synthesis: Version 1.10

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1985-01-01

    A new general-purpose optimization program for engineering design is described. ADS (Automated Design Synthesis - Version 1.10) is a FORTRAN program for solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level, several options are available so that a total of over 100 possible combinations can be created. Examples of available strategies are sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming. Available optimizers include variable metric methods and the Method of Feasible Directions as examples, and one-dimensional search options include polynomial interpolation and the Golden Section method as examples. Emphasis is placed on ease of use of the program. All information is transferred via a single parameter list. Default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to over-ride these, if desired.
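    Of the one-dimensional search options the abstract names, the Golden Section method is simple enough to sketch directly. This is a generic textbook implementation in Python, not the FORTRAN routine from ADS itself.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search.

    The interval shrinks by the factor 1/phi ~ 0.618 each iteration, and
    only one new function evaluation per iteration is needed in optimized
    variants (this simple version re-evaluates both interior points).
    """
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2.0

# Minimize (x - 2)^2 + 1 on [0, 5]; the minimum is at x = 2.
x_min = golden_section_search(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

    In a layered optimizer like ADS, such a one-dimensional search is invoked repeatedly by the optimizer level to step along each computed search direction.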

  14. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored in the Common Data Format (CDF) and served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. The CDAWEB and SPDF data repositories were then queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted
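    A Granule description of the kind ADAPT generates can be sketched with the standard library's XML tools. Element names follow the SPASE Granule resource in simplified form, and the identifiers and URL below are hypothetical examples, so treat this as illustrative rather than schema-valid output.

```python
import xml.etree.ElementTree as ET

def granule_xml(resource_id, parent_id, source_url, start, stop):
    """Build a minimal SPASE-like Granule description for one data file.

    Simplified sketch: element names loosely follow the SPASE Granule
    resource, but no namespace or schema validation is attempted.
    """
    spase = ET.Element("Spase")
    granule = ET.SubElement(spase, "Granule")
    ET.SubElement(granule, "ResourceID").text = resource_id
    ET.SubElement(granule, "ParentID").text = parent_id
    ET.SubElement(granule, "StartDate").text = start
    ET.SubElement(granule, "StopDate").text = stop
    source = ET.SubElement(granule, "Source")
    ET.SubElement(source, "URL").text = source_url
    return ET.tostring(spase, encoding="unicode")

# Hypothetical identifiers and file URL, purely for illustration.
xml_text = granule_xml(
    "spase://Example/Granule/MissionX/MAG/2015-01-01",
    "spase://Example/NumericalData/MissionX/MAG/PT16S",
    "https://example.gsfc.nasa.gov/data/missionx_mag_20150101.cdf",
    "2015-01-01T00:00:00Z",
    "2015-01-02T00:00:00Z")
```

    A nightly job would loop such a builder over the changed-file list and register the resulting descriptions.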

  15. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2013-01-08

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.

  16. Automated solar collector installation design including ability to define heterogeneous design preferences

    DOEpatents

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-04-29

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties to these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives. Embodiments may also include definition of one or more design apertures, each of which may correspond to boundaries in which solar collector layouts should comply with distinct sets of user-defined design preferences. Distinct apertures may provide heterogeneous regions of collector layout according to the user-defined design preferences.
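    A toy example of the kind of layout rule such a design engine might apply within one aperture: panels must respect an edge setback and a fixed inter-row spacing. The rectangular geometry and all numbers are illustrative assumptions, not claims about the patented system.

```python
def rows_fit_in_aperture(aperture_w, aperture_h, panel_w, panel_h,
                         row_spacing, edge_setback):
    """Count how many rows and columns of collectors fit in one design aperture.

    Toy design rule: panels must stay inside the aperture shrunk by an
    edge setback, with a fixed spacing between rows. Geometry is assumed
    rectangular and axis-aligned for illustration.
    """
    usable_w = aperture_w - 2 * edge_setback
    usable_h = aperture_h - 2 * edge_setback
    if usable_w < panel_w or usable_h < panel_h:
        return 0, 0
    cols = int(usable_w // panel_w)
    # First row uses panel_h; each additional row needs panel_h + spacing.
    rows = 1 + int((usable_h - panel_h) // (panel_h + row_spacing))
    return rows, cols

# A 10 m x 6 m aperture, 1.6 m x 1.0 m panels, 0.5 m row gap, 0.3 m setback.
rows, cols = rows_fit_in_aperture(10.0, 6.0, 1.6, 1.0, 0.5, 0.3)
```

    A real engine would evaluate many such rules (shading, wiring, roof obstructions) per aperture and rank the resulting alternatives.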

  17. Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate

    NASA Astrophysics Data System (ADS)

    Samaras, C.; Cook, L.

    2015-12-01

    Infrastructure systems are expected to be functional, durable and safe over long service lives - 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, and institutions and regulations will need to be adaptable for a range of future conditions (conditions of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: Involve all stakeholders (owners, financers, insurance, regulators, affected public, climate/weather scientists, etc.) in key decisions; Use low-regret, adaptive strategies, such as robust decision making and the observational method, comply with relevant standards and regulations, and exceed their requirements where appropriate; Publish design studies and performance/failure investigations to extend the body of knowledge for advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities and account rationally for climate change in revised engineering standards and codes. This presentation describes initial research on decision-making under uncertainty for climate resilient infrastructure design.

  18. Microsystem design framework based on tool adaptations and library developments

    NASA Astrophysics Data System (ADS)

    Karam, Jean Michel; Courtois, Bernard; Rencz, Marta; Poppe, Andras; Szekely, Vladimir

    1996-09-01

    Besides foundry facilities, Computer-Aided Design (CAD) tools are also required to move microsystems from research prototypes to an industrial market. This paper describes a Computer-Aided-Design Framework for microsystems, based on selected existing software packages adapted and extended for microsystem technology, assembled with libraries where models are available in the form of standard cells described at different levels (symbolic, system/behavioral, layout). In microelectronics, CAD has already attained a highly sophisticated and professional level, where complete fabrication sequences are simulated and the device and system operation is completely tested before manufacturing. In comparison, the art of microsystem design and modelling is still in its infancy. However, at least for the numerical simulation of the operation of single microsystem components, such as mechanical resonators, thermo-elements, elastic diaphragms, reliable simulation tools are available. For the different engineering disciplines (like electronics, mechanics, optics, etc.) many CAD-tools for the design, simulation and verification of specific devices are available, but there is no CAD-environment within which we could perform a (micro-)system simulation due to the different nature of the devices. In general there are two different approaches to overcome this limitation: the first possibility would be to develop a new framework tailored for microsystem-engineering. The second approach, much more realistic, would be to use the existing CAD-tools which contain the most promising features, and to extend these tools so that they can be used for the simulation and verification of microsystems and of the devices involved. These tools are assembled with libraries in a microsystem design environment allowing a continuous design flow. The approach is driven by the wish to make microsystems accessible to a large community of people, including SMEs and non-specialized academic institutions.

  19. Using dual-energy x-ray imaging to enhance automated lung tumor tracking during real-time adaptive radiotherapy

    SciTech Connect

    Menten, Martin J. Fast, Martin F.; Nill, Simeon; Oelfke, Uwe

    2015-12-15

    Purpose: Real-time, markerless localization of lung tumors with kV imaging is often inhibited by ribs obscuring the tumor and poor soft-tissue contrast. This study investigates the use of dual-energy imaging, which can generate radiographs with reduced bone visibility, to enhance automated lung tumor tracking for real-time adaptive radiotherapy. Methods: kV images of an anthropomorphic breathing chest phantom were experimentally acquired and radiographs of actual lung cancer patients were Monte-Carlo-simulated at three imaging settings: low-energy (70 kVp, 1.5 mAs), high-energy (140 kVp, 2.5 mAs, 1 mm additional tin filtration), and clinical (120 kVp, 0.25 mAs). Regular dual-energy images were calculated by weighted logarithmic subtraction of high- and low-energy images and filter-free dual-energy images were generated from clinical and low-energy radiographs. The weighting factor to calculate the dual-energy images was determined by means of a novel objective score. The usefulness of dual-energy imaging for real-time tracking with an automated template matching algorithm was investigated. Results: Regular dual-energy imaging was able to increase tracking accuracy in left–right images of the anthropomorphic phantom as well as in 7 out of 24 investigated patient cases. Tracking accuracy remained comparable in three cases and decreased in five cases. Filter-free dual-energy imaging was only able to increase accuracy in 2 out of 24 cases. In four cases no change in accuracy was observed and tracking accuracy worsened in nine cases. In 9 out of 24 cases, it was not possible to define a tracking template due to poor soft-tissue contrast regardless of input images. The mean localization errors using clinical, regular dual-energy, and filter-free dual-energy radiographs were 3.85, 3.32, and 5.24 mm, respectively. Tracking success was dependent on tumor position, tumor size, imaging beam angle, and patient size. Conclusions: This study has highlighted the influence of
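The weighted logarithmic subtraction described above can be sketched in a few lines; the function name and the attenuation-coefficient values used in the test are illustrative assumptions, not numbers from the study:

```python
import numpy as np

def dual_energy(im_low, im_high, w):
    """Weighted logarithmic subtraction of high- and low-energy radiographs.

    Under a simple monoenergetic Beer-Lambert model, bone contrast is
    suppressed when w equals the ratio of the bone attenuation coefficients
    at the two energies (mu_bone_high / mu_bone_low).
    """
    return np.log(im_high) - w * np.log(im_low)
```

With that choice of w the bone-thickness term cancels exactly, leaving only soft-tissue contrast; in practice the study selects the weighting factor via an objective score rather than from tabulated coefficients.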

  20. Application of the H-Mode, a Design and Interaction Concept for Highly Automated Vehicles, to Aircraft

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Flemisch, Frank O.; Schutte, Paul C.; Williams, Ralph A.

    2006-01-01

    Driven by increased safety, efficiency, and airspace capacity, automation is playing an increasing role in aircraft operations. As aircraft become increasingly able to autonomously respond to a range of situations with performance surpassing human operators, we are compelled to look for new methods that help us understand their use and guide their design using new forms of automation and interaction. We propose a novel design metaphor to aid the conceptualization, design, and operation of highly-automated aircraft. Design metaphors transfer meaning from common experiences to less familiar applications or functions. A notable example is the "Desktop metaphor" for manipulating files on a computer. This paper describes a metaphor for highly automated vehicles known as the H-metaphor and a specific embodiment of the metaphor known as the H-mode as applied to aircraft. The fundamentals of the H-metaphor are reviewed followed by an overview of an exploratory usability study investigating human-automation interaction issues for a simple H-mode implementation. The envisioned application of the H-mode concept to aircraft is then described as are two planned evaluations.

  1. Designs and concept reliance of a fully automated high-content screening platform.

    PubMed

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2012-10-01

    High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs, and it requires slightly different logistics for execution. Our stand-alone HCS microscopes were an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and an IN Cell Analyzer 2000 (INCA2000), in which up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor. To automate them, we opted for a 4 m linear track system harboring both microscopes, a plate washer, bulk dispensers, and a high-capacity incubator, allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Design considerations included the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world.

  2. From consent to institutions: designing adaptive governance for genomic biobanks.

    PubMed

    O'Doherty, Kieran C; Burgess, Michael M; Edwards, Kelly; Gallagher, Richard P; Hawkins, Alice K; Kaye, Jane; McCaffrey, Veronica; Winickoff, David E

    2011-08-01

    Biobanks are increasingly hailed as powerful tools to advance health research. The social and ethical challenges associated with the implementation and operation of biobanks are equally well-documented. One of the proposed solutions to these challenges involves trading off a reduction in the specificity of informed consent protocols with an increased emphasis on governance. However, little work has gone into formulating what such governance might look like. In this paper, we suggest four general principles that should inform biobank governance and illustrate the enactment of these principles in a proposed governance model for a particular population-scale biobank, the British Columbia (BC) Generations Project. We begin by outlining four principles that we see as necessary for informing sustainable and effective governance of biobanks: (1) recognition of research participants and publics as a collective body, (2) trustworthiness, (3) adaptive management, and (4) fit between the nature of a particular biobank and the specific structural elements of governance adopted. Using the BC Generations Project as a case study, we then offer as a working model for further discussion the outlines of a proposed governance structure enacting these principles. Ultimately, our goal is to design an adaptive governance approach that can protect participant interests as well as promote effective translational health sciences. PMID:21726926

  3. Wireless thermal sensor network with adaptive low power design.

    PubMed

    Lee, Ho-Yin; Chen, Shih-Lun; Chen, Chiung-An; Huang, Hong-Yi; Luo, Ching-Hsing

    2007-01-01

    There is an increasing need to develop flexible, reconfigurable, and intelligent low-power wireless sensor network (WSN) systems for healthcare applications. Technical advancements in micro-sensors, MEMS devices, low-power electronics, and radio-frequency circuits have enabled the design and development of such highly integrated systems. In this paper, we present our proposed wireless thermal sensor network system, which is separated into control and data paths, each with its own transmission frequency. The control path sends power and function commands from the computer to each sensor element over 2.4GHz RF circuits, and the data path transmits measured data at 2.4GHz in the sensor layer and 60GHz in higher layers. This hierarchical architecture makes reconfigurable mapping and pipelined applications on the WSN possible, and the adaptive technique efficiently reduces average power consumption by about 60%. PMID:18003354

  4. Decentralized adaptive control designs and microstrip antennas for smart structures

    NASA Astrophysics Data System (ADS)

    Khorrami, Farshad; Jain, Sandeep; Das, Nirod K.

    1996-05-01

    Smart structures lend themselves naturally to a decentralized control design framework, especially with adaptation mechanisms, mainly because it is highly undesirable to connect all the sensors and actuators in a large structure to a central processor. It is preferable to have local decision-making at each smart patch. Furthermore, these local controllers should be easily `expandable' or `contractible': the addition or deletion of several smart patches should not require a total redesign of the control system. The decentralized control strategies advocated in this paper are of this expandable/contractible type. On another front, we are considering the use of microstrip antennas for power transfer to and from smart structures; we have made preliminary contributions in this direction, and further developments are underway. These approaches are being pursued for active vibration damping and noise cancellation via piezoelectric ceramics, although the methodology is general enough to be applicable to other types of active structures.

  5. Design, realization and structural testing of a compliant adaptable wing

    NASA Astrophysics Data System (ADS)

    Molinari, G.; Quack, M.; Arrieta, A. F.; Morari, M.; Ermanni, P.

    2015-10-01

    This paper presents the design, optimization, realization and testing of a novel wing morphing concept, based on distributed compliance structures, and actuated by piezoelectric elements. The adaptive wing features ribs with a selectively compliant inner structure, numerically optimized to achieve aerodynamically efficient shape changes while simultaneously withstanding aeroelastic loads. The static and dynamic aeroelastic behavior of the wing, and the effect of activating the actuators, is assessed by means of coupled 3D aerodynamic and structural simulations. To demonstrate the capabilities of the proposed morphing concept and optimization procedure, the wings of a model airplane are designed and manufactured according to the presented approach. The goal is to replace conventional ailerons, thus to achieve controllability in roll purely by morphing. The mechanical properties of the manufactured components are characterized experimentally, and used to create a refined and correlated finite element model. The overall stiffness, strength, and actuation capabilities are experimentally tested and successfully compared with the numerical prediction. To counteract the nonlinear hysteretic behavior of the piezoelectric actuators, a closed-loop controller is implemented, and its capability of accurately achieving the desired shape adaptation is evaluated experimentally. Using the correlated finite element model, the aeroelastic behavior of the manufactured wing is simulated, showing that the morphing concept can provide sufficient roll authority to allow controllability of the flight. The additional degrees of freedom offered by morphing can be also used to vary the plane lift coefficient, similarly to conventional flaps. The efficiency improvements offered by this technique are evaluated numerically, and compared to the performance of a rigid wing.

  6. Impacting patient outcomes through design: acuity adaptable care/universal room design.

    PubMed

    Brown, Katherine Kay; Gallant, Dennis

    2006-01-01

    To succeed in today's challenging healthcare environment, hospitals must examine their impact on their customers (patients and families), staff, and physicians. By using competitive facility design and incorporating evidence-based concepts such as the acuity adaptable care delivery model and the universal room, the hospital will realize an impact on patient satisfaction that will enhance market share, on physician satisfaction that will foster loyalty, and on staff satisfaction that will decrease turnover. At the same time, clinical outcomes such as a reduction in mortality and complications, and efficiencies such as a reduction in length of stay and minimization of hospital costs through the elimination of transfers, can be gained. The results achieved depend on the principles used in designing the patient room, which should focus on maximizing patient safety and improving healing. This article reviews key design elements that support the success of an acuity adaptable unit, such as a private room with zones dedicated to patients, families, and staff; a healing environment; technology; and decentralized nursing stations. Outcomes of institutions currently utilizing the acuity adaptable concept are also reviewed.

  7. Design of an automated algorithm for labeling cardiac blood pool in gated SPECT images of radiolabeled red blood cells

    SciTech Connect

    Hebert, T.J.; Moore, W.H.; Dhekne, R.D.; Ford, P.V.; Wendt, J.A.; Murphy, P.H.; Ting, Y.

    1996-08-01

    The design of an automated computer algorithm for labeling the cardiac blood pool within gated 3-D reconstructions of the radiolabeled red blood cells is investigated. Due to patient functional abnormalities, limited resolution, and noise, certain spatial and temporal features of the cardiac blood pool that one would anticipate finding in every study are not present in certain frames or with certain patients. The labeling of the cardiac blood pool requires an algorithm that only relies upon features present in all patients. The authors investigate the design of a fully-automated region growing algorithm for this purpose.

  8. Pilot opinions on high level flight deck automation issues: Toward the development of a design philosophy

    NASA Technical Reports Server (NTRS)

    Tenney, Yvette J.; Rogers, William H.; Pew, Richard W.

    1995-01-01

    There has been much concern in recent years about the rapid increase in automation on commercial flight decks. To gather pilot opinions on these issues, a survey was conducted, composed of three major sections. The first section asked pilots to rate different automation components that exist on the latest commercial aircraft regarding their obtrusiveness and the attention and effort required in using them. The second section addressed general 'automation philosophy' issues. The third section focused on issues related to levels and amount of automation. The results indicate that pilots of advanced aircraft like their automation, use it, and would welcome more automation. However, they also believe that automation has many disadvantages, especially fully autonomous automation. They want their automation to be simple and reliable and to produce predictable results. The biggest needs for higher levels of automation were in pre-flight, communication, systems management, and task management functions, in planning as well as response tasks, and in high workload situations. There is an irony and a challenge in the implications of these findings: on the one hand pilots would like new automation to be simple and reliable, but they need it to support the most complex part of the job--managing and planning tasks in high workload situations.

  9. Design and development of a microarray processing station (MPS) for automated miniaturized immunoassays.

    PubMed

    Pla-Roca, Mateu; Altay, Gizem; Giralt, Xavier; Casals, Alícia; Samitier, Josep

    2016-08-01

    Here we describe the design and evaluation of a fluidic device for the automatic processing of microarrays, called the microarray processing station or MPS. The microarray processing station, once installed on a commercial microarrayer, allows automating the washing and drying steps, which are often performed manually. The substrate where the assay occurs remains in place during the microarray printing, incubation and processing steps; therefore the addressing of nL volumes of the distinct immunoassay reagents, such as capture and detection antibodies and samples, can be performed on the same coordinate of the substrate with perfect alignment, without requiring any additional mechanical or optical re-alignment methods. This allows independent immunoassays to be performed in a single microarray spot. PMID:27405464

  10. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  11. Design and development of a PC based data acquisition system for automating thermal impact reporting

    SciTech Connect

    Garman, B.K.; Carter, P.B.; Davis, J.A.

    1992-01-01

    The objective of this paper is to describe the design and development of an automated personal computer (PC) based data acquisition system for reporting the thermal impact of a fossil fueled power plant on its circulating water source. The system's prime functions are to collect and archive data and perform thermal hydraulic calculations necessary for reporting the plant's thermal impact on Waters of the United States to the Illinois Environmental Protection Agency (IEPA). The main objectives of the monitoring project were to reduce the labor required in the reporting process and to improve the accuracy in determining the circulating water flow rates through each of the station's three generating units. Additional efforts concentrated on enhancing condenser and circulating water pump performance information and providing an interface with the existing plant performance monitoring system.

  12. Design and implementation of an automated secondary cooling system for the continuous casting of billets.

    PubMed

    Chaudhuri, Subhasis; Singh, Rajeev Kumar; Patwari, Kuntal; Majumdar, Susanta; Ray, Asim Kumar; Singh, Arun Kumar Prasad; Neogi, Nirbhar

    2010-01-01

    This paper describes a heat-transfer-model-based automatic secondary cooling system for a billet caster. The model aims to minimize the variation in surface temperature and excessive reheating of the billet strands. It is also used to avoid the low-ductility trough of the solidifying steel, which aggravates the tendency of the steel to crack. The system has been designed and implemented in an integrated steel plant. A Programmable Logic Controller (PLC) based automation system has been developed to control the water flow in the secondary cooling zones of the strand. Field trials showed complete elimination of internal and off-corner cracks for the fifty billet samples that were monitored.

  13. Research Initiatives and Preliminary Results In Automation Design In Airspace Management in Free Flight

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    NASA and the FAA have entered into a joint venture to explore, define, design and implement a new airspace management operating concept. The fundamental premise of that concept is that technologies and procedures need to be developed for flight deck and ground operations to improve the efficiency, predictability, flexibility and safety of airspace management and operations. To that end, NASA Ames has undertaken an initial development and exploration of "key concepts" in free flight airspace management technology. Work has addressed human factors issues in automation aiding design, coupled aiding systems between air and ground, communication protocols in distributed decision making, and analytic techniques for defining concepts of airspace density and operator cognitive load. This paper reports the progress of these efforts, which are not intended to definitively solve the many evolving issues of design for future ATM systems, but to provide preliminary results that chart the parameters of performance and the topology of the required analytic effort. The preliminary research on cockpit display of traffic information, dynamic density definition, distributed decision making, situation awareness models and human performance models is discussed as it bears on the theme of "design requirements".

  14. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative, object-oriented, multidisciplinary, and distributed environment that uses the Adaptive Modeling Language (AML) as its framework and supports configuration design and parametric CFD grid generation. This report focuses on parametric CFD grid generation, using novel concepts for defining the interaction between the mesh topology and the geometry so as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  15. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    NASA Astrophysics Data System (ADS)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  16. ADS: A FORTRAN program for automated design synthesis, version 1.00

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1984-01-01

    A new general-purpose optimization program for engineering design is described. ADS-1 (Automated Design Synthesis - Version 1) is a FORTRAN program for the solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level several options are available, so that over 100 possible combinations can be created. Available strategies include sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming; available optimizers include variable metric methods and the Method of Feasible Directions; one-dimensional search options include polynomial interpolation and the Golden Section method. Emphasis is placed on ease of use: all information is transferred via a single parameter list, default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to override these if desired. The program is demonstrated with a simple structural design example.
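ADS itself is a FORTRAN program; as a language-neutral illustration of one of its one-dimensional search options, here is a Golden Section search sketched in Python (the function name and tolerance are our choices, not part of ADS):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    # Golden Section one-dimensional search: the bracketing interval
    # [a, b] shrinks by the golden ratio each iteration, so only the
    # interior evaluation points move.
    invphi = (math.sqrt(5) - 1) / 2  # ~0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b = d          # minimum lies in [a, d]
        else:
            a = c          # minimum lies in [c, b]
        c = b - invphi * (b - a)
        d = a + invphi * (b - a)
    return (a + b) / 2
```

For example, golden_section(lambda x: (x - 2.0)**2, 0.0, 5.0) converges to x ≈ 2.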

  17. Data fusion-based design for automated fingerprint identification systems (AFIS)

    NASA Astrophysics Data System (ADS)

    Reisman, James G.; Thomopoulos, Stelios C.

    1998-07-01

    This paper presents a data fusion-based approach to designing an Automated Fingerprint Identification System (AFIS). Fingerprint matching methods vary from pattern matching, using ridge structure, orientation, or even the entire fingerprint itself, to point-based matching, using localized features such as ridge discontinuities (e.g., minutiae) or porous structures. Localized matching methods, such as minutiae, tend to yield more compact templates than pattern-based methods. However, the reliability of localized features may be an issue, since they are adversely affected by the quality of the captured fingerprint, i.e., the degree of noise. Minutiae-based matching methods tend to be slower, albeit more accurate, than pattern-based methods. The trade-off in designing a cost-effective AFIS, in terms of processing power (CPU), matching speed, and accuracy, lies in choosing the matching methods that maximize matching accuracy while minimizing search time. In this paper we present a systematic design and study of a fusion-based AFIS that uses a multiplicity of matching methods to optimize system performance and minimize required CPU cost.
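A common realization of such fusion, consistent with (though not necessarily identical to) the system described, is score-level fusion: each matcher produces a normalized similarity score, and a weighted combination drives the final decision. The function below is an illustrative sketch; the weights are made up:

```python
def fuse_scores(scores, weights):
    # Score-level fusion: weighted average of normalized matcher scores
    # (e.g., a fast pattern matcher and a slower minutiae matcher).
    # Weights trade off each matcher's accuracy against its CPU cost.
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, scores)) / total
```

For example, weighting a minutiae matcher twice as heavily as a pattern matcher, fuse_scores([0.8, 0.4], [2.0, 1.0]) yields 2/3.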

  18. Beyond the Design of Automated Writing Evaluation: Pedagogical Practices and Perceived Learning Effectiveness in EFL Writing Classes

    ERIC Educational Resources Information Center

    Chen, Chi-Fen Emily; Cheng, Wei-Yuan Eugene

    2008-01-01

    Automated writing evaluation (AWE) software is designed to provide instant computer-generated scores for a submitted essay along with diagnostic feedback. Most studies on AWE have been conducted on psychometric evaluations of its validity; however, studies on how effectively AWE is used in writing classes as a pedagogical tool are limited. This…

  19. Intensification of the Learning Process: Automated Instructional Resources Retrieval System. A Series of Reports Designed for Classroom Use.

    ERIC Educational Resources Information Center

    Bucks County Public Schools, Doylestown, PA.

    The problem of finding relevant material to answer a classroom need is the focus of this report. The Automated Instructional Resources Retrieval System (AIRR) is designed to assist teachers by storing information in a number of categories, including the following: media type, maturity level, length, producer or publisher, main curriculum area,…

  20. Designing for Change: Interoperability in a scaling and adapting environment

    NASA Astrophysics Data System (ADS)

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.

  1. Optical Design and Optimization of Translational Reflective Adaptive Optics Ophthalmoscopes

    NASA Astrophysics Data System (ADS)

    Sulai, Yusufu N. B.

    The retina serves as the primary detector for the biological camera that is the eye. It is composed of numerous classes of neurons and support cells that work together to capture and process an image formed by the eye's optics, which is then transmitted to the brain. Loss of sight due to retinal or neuro-ophthalmic disease can prove devastating to one's quality of life, and the ability to examine the retina in vivo is invaluable in the early detection and monitoring of such diseases. Adaptive optics (AO) ophthalmoscopy is a promising diagnostic tool in early stages of development, still facing significant challenges before it can become a clinical tool. The work in this thesis is a collection of projects with the overarching goal of broadening the scope and applicability of this technology. We begin by providing an optical design approach for AO ophthalmoscopes that reduces the aberrations that degrade the performance of the AO correction. Next, we demonstrate how to further improve image resolution through the use of amplitude pupil apodization and non-common path aberration correction. This is followed by the development of a viewfinder which provides a larger field of view for retinal navigation. Finally, we conclude with the development of an innovative non-confocal light detection scheme which improves the non-invasive visualization of retinal vasculature and reveals the cone photoreceptor inner segments in healthy and diseased eyes.

  2. Accelerated search for materials with targeted properties by adaptive design

    PubMed Central

    Xue, Dezhen; Balachandran, Prasanna V.; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-01-01

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ∼800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set. PMID:27079901
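The exploitation-exploration balance described here can be illustrated with a toy acquisition rule. The sketch below uses a 1-nearest-neighbor surrogate with distance-to-data as a stand-in for predictive uncertainty; this is our simplification for illustration, not the inference and global-optimization machinery the authors used:

```python
import numpy as np

def select_next(X_obs, y_obs, X_pool, kappa=1.0):
    """Pick the index of the next candidate composition to synthesize.

    Exploitation: prefer candidates whose nearest measured neighbor has a
    low property value (we minimize, e.g., thermal hysteresis).
    Exploration: prefer candidates far from all measured points; kappa
    scales the exploration bonus.
    """
    # Pairwise distances between pool candidates and observed compositions.
    d = np.linalg.norm(X_pool[:, None, :] - X_obs[None, :, :], axis=2)
    pred = y_obs[d.argmin(axis=1)]  # 1-NN surrogate prediction
    unc = d.min(axis=1)             # distance to data as a proxy for uncertainty
    return int((pred - kappa * unc).argmin())
```

Each feedback loop would measure the selected composition, append it to (X_obs, y_obs), and repeat.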

  3. Accelerated search for materials with targeted properties by adaptive design

    NASA Astrophysics Data System (ADS)

    Xue, Dezhen; Balachandran, Prasanna V.; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-04-01

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ~800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set.

  4. Accelerated search for materials with targeted properties by adaptive design.

    PubMed

    Xue, Dezhen; Balachandran, Prasanna V; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-01-01

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ∼800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set. PMID:27079901

  5. Evidence for adaptive design in human gaze preference.

    PubMed

    Conway, C A; Jones, B C; DeBruine, L M; Little, A C

    2008-01-01

    Many studies have investigated the physical cues that influence face preferences. By contrast, relatively few studies have investigated the effects of facial cues to the direction and valence of others' social interest (i.e. gaze direction and facial expressions) on face preferences. Here we found that participants demonstrated stronger preferences for direct gaze when judging the attractiveness of happy faces than that of disgusted faces, and that this effect of expression on the strength of attraction to direct gaze was particularly pronounced for judgements of opposite-sex faces (study 1). By contrast, no such opposite-sex bias in preferences for direct gaze was observed when participants judged the same faces for likeability (study 2). Collectively, these findings of a context-sensitive opposite-sex bias in preferences for perceiver-directed smiles, but not perceiver-directed disgust, suggest that gaze preference functions, at least in part, to facilitate efficient allocation of mating effort, and evince adaptive design in the perceptual mechanisms that underpin face preferences. PMID:17986435

  6. Accelerated search for materials with targeted properties by adaptive design.

    PubMed

    Xue, Dezhen; Balachandran, Prasanna V; Hogden, John; Theiler, James; Xue, Deqing; Lookman, Turab

    2016-04-15

    Finding new materials with targeted properties has traditionally been guided by intuition, and trial and error. With increasing chemical complexity, the combinatorial possibilities are too large for an Edisonian approach to be practical. Here we show how an adaptive design strategy, tightly coupled with experiments, can accelerate the discovery process by sequentially identifying the next experiments or calculations, to effectively navigate the complex search space. Our strategy uses inference and global optimization to balance the trade-off between exploitation and exploration of the search space. We demonstrate this by finding very low thermal hysteresis (ΔT) NiTi-based shape memory alloys, with Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 possessing the smallest ΔT (1.84 K). We synthesize and characterize 36 predicted compositions (9 feedback loops) from a potential space of ∼800,000 compositions. Of these, 14 had smaller ΔT than any of the 22 in the original data set.

  7. Design of a Tool Integrating Force Sensing With Automated Insertion in Cochlear Implantation

    PubMed Central

    Schurzig, Daniel; Labadie, Robert F.; Hussong, Andreas; Rau, Thomas S.; Webster, Robert J.

    2012-01-01

    The quality of hearing restored to a deaf patient by a cochlear implant in hearing preservation cochlear implant surgery (and possibly also in routine cochlear implant surgery) is believed to depend on preserving delicate cochlear membranes while accurately inserting an electrode array deep into the spiral cochlea. Membrane rupture forces, and possibly, other indicators of suboptimal placement, are below the threshold detectable by human hands, motivating a force sensing insertion tool. Furthermore, recent studies have shown significant variability in manual insertion forces and velocities that may explain some instances of imperfect placement. Toward addressing this, an automated insertion tool was recently developed by Hussong et al. By following the same insertion tool concept, in this paper, we present mechanical enhancements that improve the surgeon’s interface with the device and make it smaller and lighter. We also present electromechanical design of new components enabling integrated force sensing. The tool is designed to be sufficiently compact and light that it can be mounted to a microstereotactic frame for accurate image-guided preinsertion positioning. The new integrated force sensing system is capable of resolving forces as small as 0.005 N, and we provide experimental illustration of using forces to detect errors in electrode insertion. PMID:23482414

  8. Current Practice in Designing Training for Complex Skills: Implications for Design and Evaluation of ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Schuver-van Blanken, Marian J.; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training-Interactive Tools) is a European project coordinated by the Dutch National Aerospace Laboratory. The aim of ADAPT[IT] is to create and validate an effective training design methodology, based on cognitive science and leading to the integration of advanced technologies, so that the…

  9. A Practical Approach for Integrating Automatically Designed Fixtures with Automated Assembly Planning

    SciTech Connect

    Calton, Terri L.; Peters, Ralph R.

    1999-07-20

    This paper presents a practical approach for integrating automatically designed fixtures with automated assembly planning. Product assembly problems vary widely; here the focus is on assemblies that are characterized by a single base part to which a number of smaller parts and subassemblies are attached. This method starts with three-dimensional CAD descriptions of an assembly whose assembly tasks require a fixture to hold the base part. It then combines algorithms that automatically design assembly pallets to hold the base part with algorithms that automatically generate assembly sequences. The designed fixtures rigidly constrain and locate the part, obey task constraints, are robust to part shape variations, are easy to load, and are economical to produce. The algorithm is guaranteed to find the global optimum solution that satisfies these and other pragmatic conditions. The assembly planner consists of four main elements: a user interface, a constraint system, a search engine, and an animation module. The planner expresses all constraints at a sequencing level, specifying orders and conditions on part mating operations in a number of ways. Fast replanning enables an interactive plan-view-constrain-replan cycle that aids in constraint discovery and documentation. The combined algorithms guarantee that the fixture will hold the base part without interfering with any of the assembly operations. This paper presents an overview of the planners, the integration approach, and the results of the integrated algorithms applied to several practical manufacturing problems. For these problems initial high-quality fixture designs and assembly sequences are generated in a matter of minutes with global optimum solutions identified in just over an hour.

  10. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  11. Biomarker-Guided Adaptive Trial Designs in Phase II and Phase III: A Methodological Review

    PubMed Central

    Antoniou, Miranta; Jorgensen, Andrea L; Kolamunnage-Dona, Ruwanthi

    2016-01-01

    Background Personalized medicine is a growing area of research which aims to tailor the treatment given to a patient according to one or more personal characteristics. These characteristics can be demographic such as age or gender, or biological such as a genetic or other biomarker. Prior to utilizing a patient’s biomarker information in clinical practice, robust testing in terms of analytical validity, clinical validity and clinical utility is necessary. A number of clinical trial designs have been proposed for testing a biomarker’s clinical utility, including Phase II and Phase III clinical trials which aim to test the effectiveness of a biomarker-guided approach to treatment; these designs can be broadly classified into adaptive and non-adaptive. While adaptive designs allow planned modifications based on accumulating information during a trial, non-adaptive designs are typically simpler but less flexible. Methods and Findings We have undertaken a comprehensive review of biomarker-guided adaptive trial designs proposed in the past decade. We have identified eight distinct biomarker-guided adaptive designs and nine variations from 107 studies. Substantial variability has been observed in terms of how trial designs are described and particularly in the terminology used by different authors. We have graphically displayed the current biomarker-guided adaptive trial designs and summarised the characteristics of each design. Conclusions Our in-depth overview provides future researchers with clarity in definition, methodology and terminology for biomarker-guided adaptive trial designs. PMID:26910238

  12. Case Studies in Computer Adaptive Test Design through Simulation.

    ERIC Educational Resources Information Center

    Eignor, Daniel R.; And Others

    The extensive computer simulation work done in developing the computer adaptive versions of the Graduate Record Examinations (GRE) Board General Test and the College Board Admissions Testing Program (ATP) Scholastic Aptitude Test (SAT) is described in this report. Both the GRE General and SAT computer adaptive tests (CATs), which are fixed length…

  13. Adaptation Patterns as a Conceptual Tool for Designing the Adaptive Operation of CSCL Systems

    ERIC Educational Resources Information Center

    Karakostas, Anastasios; Demetriadis, Stavros

    2011-01-01

    While adaptive collaboration support has become the focus of increasingly intense research efforts in the CSCL domain, research-based evidence on pedagogically useful ideas about what and how to adapt during the collaborative learning activity remains scarce. Based principally on two studies, this work presents a compilation of…

  14. An adaptive optics imaging system designed for clinical use.

    PubMed

    Zhang, Jie; Yang, Qiang; Saito, Kenichi; Nozato, Koji; Williams, David R; Rossi, Ethan A

    2015-06-01

    Here we demonstrate a new imaging system that addresses several major problems limiting the clinical utility of conventional adaptive optics scanning light ophthalmoscopy (AOSLO), including its small field of view (FOV), reliance on patient fixation for targeting imaging, and substantial post-processing time. We previously showed an efficient image-based eye tracking method for real-time optical stabilization and image registration in AOSLO. However, in patients with poor fixation, eye motion causes the FOV to drift substantially, causing this approach to fail. We solve that problem here by tracking eye motion at multiple spatial scales simultaneously by optically and electronically integrating a wide FOV SLO (WFSLO) with an AOSLO. This multi-scale approach, implemented with fast tip/tilt mirrors, has a large stabilization range of ± 5.6°. Our method consists of three stages implemented in parallel: 1) coarse optical stabilization driven by a WFSLO image, 2) fine optical stabilization driven by an AOSLO image, and 3) sub-pixel digital registration of the AOSLO image. We evaluated system performance in normal eyes and diseased eyes with poor fixation. Residual image motion with incremental compensation after each stage was: 1) ~2-3 arc minutes (arcmin), 2) ~0.5-0.8 arcmin, and 3) ~0.05-0.07 arcmin for normal eyes. Performance in eyes with poor fixation was: 1) ~3-5 arcmin, 2) ~0.7-1.1 arcmin, and 3) ~0.07-0.14 arcmin. We demonstrate that this system is capable of reducing image motion by a factor of ~400, on average. This new optical design provides additional benefits for clinical imaging, including a steering subsystem for AOSLO that can be guided by the WFSLO to target specific regions of interest such as retinal pathology and real-time averaging of registered images to eliminate image post-processing.
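    The reported factor of ~400 can be sanity-checked with simple arithmetic: the overall reduction is the product of the per-stage gains. The uncorrected motion amplitude below is an assumed round number, and each residual is the midpoint of the range reported for normal eyes.

```python
uncorrected = 25.0                # arcmin; assumed uncorrected fixational motion
residuals = {                     # midpoints of the reported ranges (normal eyes)
    "coarse optical (WFSLO)": 2.5,
    "fine optical (AOSLO)": 0.65,
    "digital registration": 0.06,
}
prev = uncorrected
for stage, r in residuals.items():
    print(f"{stage}: residual {r} arcmin (stage gain ~{prev / r:.1f}x)")
    prev = r
print(f"overall reduction ~{uncorrected / prev:.0f}x")   # consistent with ~400x
```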

  15. An adaptive optics imaging system designed for clinical use.

    PubMed

    Zhang, Jie; Yang, Qiang; Saito, Kenichi; Nozato, Koji; Williams, David R; Rossi, Ethan A

    2015-06-01

    Here we demonstrate a new imaging system that addresses several major problems limiting the clinical utility of conventional adaptive optics scanning light ophthalmoscopy (AOSLO), including its small field of view (FOV), reliance on patient fixation for targeting imaging, and substantial post-processing time. We previously showed an efficient image-based eye tracking method for real-time optical stabilization and image registration in AOSLO. However, in patients with poor fixation, eye motion causes the FOV to drift substantially, causing this approach to fail. We solve that problem here by tracking eye motion at multiple spatial scales simultaneously by optically and electronically integrating a wide FOV SLO (WFSLO) with an AOSLO. This multi-scale approach, implemented with fast tip/tilt mirrors, has a large stabilization range of ± 5.6°. Our method consists of three stages implemented in parallel: 1) coarse optical stabilization driven by a WFSLO image, 2) fine optical stabilization driven by an AOSLO image, and 3) sub-pixel digital registration of the AOSLO image. We evaluated system performance in normal eyes and diseased eyes with poor fixation. Residual image motion with incremental compensation after each stage was: 1) ~2-3 arc minutes (arcmin), 2) ~0.5-0.8 arcmin, and 3) ~0.05-0.07 arcmin for normal eyes. Performance in eyes with poor fixation was: 1) ~3-5 arcmin, 2) ~0.7-1.1 arcmin, and 3) ~0.07-0.14 arcmin. We demonstrate that this system is capable of reducing image motion by a factor of ~400, on average. This new optical design provides additional benefits for clinical imaging, including a steering subsystem for AOSLO that can be guided by the WFSLO to target specific regions of interest such as retinal pathology and real-time averaging of registered images to eliminate image post-processing. PMID:26114033

  16. Program user's manual for optimizing the design of a liquid or gaseous propellant rocket engine with the automated combustor design code AUTOCOM

    NASA Technical Reports Server (NTRS)

    Reichel, R. H.; Hague, D. S.; Jones, R. T.; Glatt, C. R.

    1973-01-01

    This computer program manual describes in two parts the automated combustor design optimization code AUTOCOM. The program code is written in the FORTRAN 4 language. The input data setup and the program outputs are described, and a sample engine case is discussed. The program structure and programming techniques are also described, along with AUTOCOM program analysis.

  17. Time domain and frequency domain design techniques for model reference adaptive control systems

    NASA Technical Reports Server (NTRS)

    Boland, J. S., III

    1971-01-01

    Some problems associated with the design of model-reference adaptive control systems are considered and solutions to these problems are advanced. The stability of the adapted system is a primary consideration in the development of both the time-domain and the frequency-domain design techniques. Consequentially, the use of Liapunov's direct method forms an integral part of the derivation of the design procedures. The application of sensitivity coefficients to the design of model-reference adaptive control systems is considered. An application of the design techniques is also presented.

  18. Human factors design of automated highway systems: First generation scenarios. Final report, 1 November 1992-1 May 1993

    SciTech Connect

    Tsao, H.S.J.; Hall, R.W.; Shladover, S.E.; Plocher, T.A.; Levitan, L.J.

    1994-12-01

    Attention to driver acceptance and performance issues during system design will be key to the success of the Automated Highway System (AHS). A first step in the process of defining driver roles and driver-system interface requirements for AHS is the definition of system visions and operational scenarios. These scenarios then become the basis for first identifying driver functions and information requirements and, later, designing the driver's interface to the AHS. In addition, the scenarios provide a framework within which variables that potentially impact the driver can be explored systematically. Seven AHS operational scenarios, each describing a different AHS vision, were defined by varying three system dimensions with special significance for the driver. These three dimensions are: (1) the degree to which automated and manual traffic is separated, (2) the rules for vehicle following and spacing, and (3) the level of automation in traffic flow control. The seven scenarios vary in the complexity of the automated and manual driving maneuvers required, the physical space allowed for maneuvers, and the nature of the resulting demands placed on the driver. Each scenario describes the physical configuration of the system, operational events from entry to exit, and high-level driver functions.

  19. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers.

    PubMed

    Espah Borujeni, Amin; Mishler, Dennis M; Wang, Jingzhi; Huso, Walker; Salis, Howard M

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription-translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications.
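    As a rough illustration of how a switching free energy translates into fold-activation, here is a two-state Boltzmann sketch. This is a simplification of the statistical thermodynamic modeling described above, not the authors' model; RT, both ΔG values, and the two-state assumption are illustrative.

```python
import math

RT = 2.48  # kJ/mol at ~298 K

def fraction_on(dG_switch):
    """Two-state Boltzmann occupancy of the translation-ON conformation."""
    return 1.0 / (1.0 + math.exp(dG_switch / RT))

dG_no_ligand = 10.0   # kJ/mol, assumed: OFF state favored without ligand
dG_binding = -15.0    # kJ/mol, assumed: ligand binding stabilizes ON

off = fraction_on(dG_no_ligand)              # basal ON fraction
on = fraction_on(dG_no_ligand + dG_binding)  # ligand-bound ON fraction
print(f"activation ratio ~{on / off:.0f}-fold")
```

    Under these assumed numbers a 15 kJ/mol ligand-induced shift yields roughly 50-fold activation, showing how modest free energy changes produce large expression ratios.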

  20. Design and utilization of the drug-excipient chemical compatibility automated system.

    PubMed

    Thomas, V Hayden; Naath, Maryanne

    2008-07-01

    To accelerate clinical formulation development, an excipient compatibility screen should be conducted as early as possible, and it must be rapid, robust and resource-sparing. This, however, does not describe the traditional excipient compatibility testing approach, requiring many tedious and labor intensive manual operations. This study focused on transforming traditional practices into a completely automated screening process to increase sample throughput and realign resources to more urgent areas, while maintaining quality. Using the developed system, a complete on-line performance study was conducted whereby drug-excipient mixtures were weighed, blended and subjected to accelerated stress stability for up to 1 month, followed by sample extraction and HPLC analysis. Compared to off-line traditional study protocols, the system provided similar relative rank order results with equivalent precision and accuracy, while increasing sample throughput. The designed system offers a resource-sparing primary screen for drug-excipient chemical compatibility for solid dosage form development. This approach allows risk assessment analysis, based upon formulation complexity, to be conducted prior to the commitment of resources and candidate selection for clinical development. PMID:18486368

  1. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers

    PubMed Central

    Espah Borujeni, Amin; Mishler, Dennis M.; Wang, Jingzhi; Huso, Walker; Salis, Howard M.

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription–translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications. PMID:26621913

  2. Accuracy assessment and automation of free energy calculations for drug design.

    PubMed

    Christ, Clara D; Fox, Thomas

    2014-01-27

    As the free energy of binding of a ligand to its target is one of the crucial optimization parameters in drug design, its accurate prediction is highly desirable. In the present study we have assessed the average accuracy of free energy calculations for a total of 92 ligands binding to five different targets. To make this study and future larger scale applications possible we automated the setup procedure. Starting from user defined binding modes, the procedure decides which ligands to connect via a perturbation based on maximum common substructure criteria and produces all necessary parameter files for free energy calculations in AMBER 11. For the systems investigated, errors due to insufficient sampling were found to be substantial in some cases whereas differences in estimators (thermodynamic integration (TI) versus multistate Bennett acceptance ratio (MBAR)) were found to be negligible. Analytical uncertainty estimates calculated from a single free energy calculation were found to be much smaller than the sample standard deviation obtained from two independent free energy calculations. Agreement with experiment was found to be system dependent ranging from excellent to mediocre (RMSE = [0.9, 8.2, 4.7, 5.7, 8.7] kJ/mol). When restricting analyses to free energy calculations with sample standard deviations below 1 kJ/mol agreement with experiment improved (RMSE = [0.8, 6.9, 1.8, 3.9, 5.6] kJ/mol).
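    The abstract compares the thermodynamic integration (TI) and MBAR estimators. A minimal numeric sketch of the TI side follows; the per-window averages of dU/dλ below are a toy linear profile under assumed units, not values from the study.

```python
import numpy as np

lambdas = np.linspace(0.0, 1.0, 11)   # alchemical windows
dU = 10.0 * (1.0 - 2.0 * lambdas)     # toy window averages of <dU/dλ>, kJ/mol

# Thermodynamic integration: trapezoidal quadrature of <dU/dλ> over λ.
delta_G = float(((dU[:-1] + dU[1:]) / 2.0 * np.diff(lambdas)).sum())
print(f"TI estimate of the free energy change: {delta_G:.2f} kJ/mol")
```

    In practice each window average comes from a sampled simulation, and the sampling errors the authors highlight enter through the per-window means.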

  3. A novel automated instrument designed to determine photosensitivity thresholds (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Aguilar, Mariela C.; Gonzalez, Alex; Rowaan, Cornelis; De Freitas, Carolina; Rosa, Potyra R.; Alawa, Karam; Lam, Byron L.; Parel, Jean-Marie A.

    2016-03-01

    As there is no clinically available instrument to systematically and reliably determine the photosensitivity thresholds of patients with dry eyes, blepharospasms, migraines, traumatic brain injuries, and genetic disorders such as Achromatopsia, retinitis pigmentosa and other retinal dysfunctions, a computer-controlled optoelectronics system was designed. The BPEI Photosensitivity System provides light stimuli emitted from a bi-cupola concave, 210 white LED array with varying intensity ranging from 1 to 32,000 lux. The system can utilize either a normal or an enhanced testing mode for subjects with low light tolerance. The automated instrument adjusts the intensity of each light stimulus. The subject is instructed to indicate discomfort by pressing a hand-held button. Reliability of the responses is tracked during the test. The photosensitivity threshold is then calculated after 10 response reversals. In a preliminary study, we demonstrated that subjects suffering from Achromatopsia experienced lower photosensitivity thresholds than normal subjects. Hence, the system can safely and reliably determine the photosensitivity thresholds of healthy and light sensitive subjects by detecting and quantifying the individual differences. Future studies will be performed with this system to determine the photosensitivity threshold differences between normal subjects and subjects suffering from other conditions that affect light sensitivity.
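    The reversal-counting procedure described above resembles a classic adaptive staircase. A minimal sketch follows, assuming a multiplicative step rule and a mean-of-reversals threshold estimate; neither detail is specified in the abstract.

```python
def staircase(discomfort_at, start=1.0, step=2.0, n_reversals=10, max_lux=32000.0):
    """Estimate a photosensitivity threshold from a subject model
    discomfort_at(lux) -> bool, reversing direction on each change of response."""
    lux, direction = start, +1
    reversals = []
    while len(reversals) < n_reversals:
        new_direction = -1 if discomfort_at(lux) else +1
        if new_direction != direction:
            reversals.append(lux)           # a reversal: the response flipped
        direction = new_direction
        lux = min(max_lux, max(start, lux * step if direction > 0 else lux / step))
    return sum(reversals) / len(reversals)  # mean of reversal intensities

# Simulated subject whose true discomfort threshold is 500 lux:
est = staircase(lambda lux: lux >= 500.0)
print(f"estimated threshold ~{est:.0f} lux")
```

    A finer step ratio would converge closer to the true threshold; the coarse factor-of-two step here keeps the sketch short.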

  4. Automated physics-based design of synthetic riboswitches from diverse RNA aptamers.

    PubMed

    Espah Borujeni, Amin; Mishler, Dennis M; Wang, Jingzhi; Huso, Walker; Salis, Howard M

    2016-01-01

    Riboswitches are shape-changing regulatory RNAs that bind chemicals and regulate gene expression, directly coupling sensing to cellular actuation. However, it remains unclear how their sequence controls the physics of riboswitch switching and activation, particularly when changing the ligand-binding aptamer domain. We report the development of a statistical thermodynamic model that predicts the sequence-structure-function relationship for translation-regulating riboswitches that activate gene expression, characterized inside cells and within cell-free transcription-translation assays. Using the model, we carried out automated computational design of 62 synthetic riboswitches that used six different RNA aptamers to sense diverse chemicals (theophylline, tetramethylrosamine, fluoride, dopamine, thyroxine, 2,4-dinitrotoluene) and activated gene expression by up to 383-fold. The model explains how aptamer structure, ligand affinity, switching free energy and macromolecular crowding collectively control riboswitch activation. Our model-based approach for engineering riboswitches quantitatively confirms several physical mechanisms governing ligand-induced RNA shape-change and enables the development of cell-free and bacterial sensors for diverse applications. PMID:26621913

  5. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  6. An automated classification scheme designed to better elucidate the dependence of ozone on meteorology

    SciTech Connect

    Eder, B.K.; Davis, J.M.; Bloomfield, P.

    1994-10-01

    This paper utilizes a two-stage (average linkage then convergent k-means) clustering approach as part of an automated meteorological classification scheme designed to better elucidate the dependence of ozone on meteorology. When applied to 10 years (1981-90) of meteorological data for Birmingham, Alabama, the classification scheme identified seven statistically distinct meteorological regimes, the majority of which exhibited significantly different daily 1-h maximum ozone concentration distributions. Results from this two-stage clustering approach were then used to develop seven "refined" stepwise regression models designed to (1) identify the optimum set of independent meteorological parameters influencing the O₃ concentrations within each meteorological cluster, and (2) weigh each independent parameter according to its unique influence within that cluster. Large differences were noted in the number, order, and selection of independent variables found to significantly contribute (α = 0.10) to the variability of O₃. When this unique dependence was taken into consideration through the development and subsequent amalgamation of the seven individual regression models, a better parameterization of O₃'s dependence on meteorology was achieved. This "composite" model exhibited a significantly larger R² (0.59) and a smaller rmse (12.80 ppb) when compared to results achieved from an "overall" model (R² = 0.53, rmse = 13.85) in which the meteorological data were not clustered. 52 refs., 15 figs., 9 tabs.
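    The two-stage scheme (average linkage, then convergent k-means) can be sketched as follows on synthetic data. Seeding k-means with the hierarchical centroids is one reading of "convergent k-means", and the data are illustrative stand-ins, not the Birmingham meteorological record.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for the observations: three well-separated 2-D groups.
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(20, 2))
               for c in [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]])

# Stage 1: naive average-linkage agglomeration down to k clusters.
k = 3
clusters = [[i] for i in range(len(X))]
while len(clusters) > k:
    best, pair = np.inf, None
    for a in range(len(clusters)):
        for b in range(a + 1, len(clusters)):
            # Average pairwise distance between members of the two clusters.
            d = np.mean(np.linalg.norm(
                X[clusters[a]][:, None] - X[clusters[b]][None, :], axis=2))
            if d < best:
                best, pair = d, (a, b)
    a, b = pair
    clusters[a] += clusters.pop(b)           # merge the closest pair

# Stage 2: k-means iterated to convergence, seeded by the stage-1 centroids.
seeds = np.array([X[c].mean(axis=0) for c in clusters])
for _ in range(100):
    labels = np.argmin(np.linalg.norm(X[:, None] - seeds[None, :], axis=2), axis=1)
    new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new, seeds):
        break
    seeds = new
print("cluster sizes:", np.bincount(labels, minlength=k))
```

    The hierarchical pass supplies stable starting centroids so that the k-means refinement is not hostage to a random initialization.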

  7. Design of an ultra-portable field transfer radiometer supporting automated vicarious calibration

    NASA Astrophysics Data System (ADS)

    Anderson, Nikolaus; Thome, Kurtis; Czapla-Myers, Jeffrey; Biggar, Stuart

    2015-09-01

    The University of Arizona Remote Sensing Group (RSG) began outfitting the radiometric calibration test site (RadCaTS) at Railroad Valley, Nevada, in 2004 for automated vicarious calibration of Earth-observing sensors. RadCaTS was upgraded to use RSG custom 8-band ground-viewing radiometers (GVRs) beginning in 2011, and currently four GVRs are deployed, providing an average reflectance for the test site. This measurement of ground reflectance is the most critical component of vicarious calibration using the reflectance-based method. In order to ensure the quality of these measurements, RSG has been exploring more efficient and accurate methods of on-site calibration evaluation. This work describes the design of, and initial results from, a small portable transfer radiometer for the purpose of GVR calibration validation on site. Prior to deployment, RSG uses high-accuracy laboratory calibration methods in order to provide radiance calibrations with low uncertainties for each GVR. After deployment, a solar-radiation-based calibration has typically been used. That method is highly dependent on a clear, stable atmosphere, requires at least two people to perform, is time-consuming in post-processing, and depends on several large pieces of equipment. In order to provide more regular and more accurate calibration monitoring, the small portable transfer radiometer is designed for quick, one-person operation and on-site field calibration comparison results. The radiometer is also suited for laboratory calibration use and thus could be used as a transfer radiometer calibration standard for ground-viewing radiometers of a RadCalNet site.

  8. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    PubMed Central

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner

  9. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    SciTech Connect

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of the DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and description of the demand response automation server (DRAS), the client/server architecture-based middle-ware used to automate the interactions between the utilities or any DR serving entity and their customers for DR programs. Use case diagrams are presented to show the role of the DRAS between utility/ISO and the clients at the facilities.

  10. Adapting Dam and Reservoir Design and Operations to Climate Change

    NASA Astrophysics Data System (ADS)

    Roy, René; Braun, Marco; Chaumont, Diane

    2013-04-01

    In order to identify the potential initiatives that dam, reservoir, and water resources system owners and operators may undertake to cope with climate change, it is essential to determine the current state of knowledge of its impacts on hydrological variables at regional and local scales. Future climate scenarios derived from climate model simulations can be combined with operational hydrological modeling tools and historical observations to evaluate realistic pathways of future hydrological conditions for specific drainage basins. In the case of hydropower production, those changes in hydrological conditions may have significant economic impacts. For over a decade, the state-owned hydropower producer Hydro-Québec has been exploring the physical impacts on its watersheds by relying on climate services in collaboration with Ouranos, a consortium on regional climatology and adaptation to climate change. Previous climate change impact analyses have drawn on different sources of climate simulation data, explored different post-processing approaches, and used hydrological impact models. At a new stage of this collaboration, the operational management of Hydro-Québec aspired to carry out a cost-benefit analysis of considering climate change in the refactoring of hydropower installations. In the process, not only did a set of scenarios of future runoff regimes have to be defined to support the long-term planning decisions of a dam and reservoir operator, but the significance of the uncertainties also needed to be communicated and understood. We provide insight into a case study that took some unexpected turns and leaps by bringing together climate scientists, hydrologists, and hydropower operation managers. The study includes the selection of appropriate climate scenarios, the correction of biases, the application of hydrological models, and the assessment of uncertainties. However, it turned out that communicating the science properly and

  11. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    NASA Astrophysics Data System (ADS)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings in computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems.

  12. Issues and Challenges in the Design of Culturally Adapted Evidence-Based Interventions

    PubMed Central

    Castro, Felipe González; Barrera, Manuel; Holleran Steiker, Lori K.

    2014-01-01

    This article examines issues and challenges in the design of cultural adaptations that are developed from an original evidence-based intervention (EBI). Recently emerging multistep frameworks or stage models are examined, as these can systematically guide the development of culturally adapted EBIs. Critical issues are also presented regarding whether and how such adaptations may be conducted, and empirical evidence is presented regarding the effectiveness of such cultural adaptations. Recent evidence suggests that these cultural adaptations are effective when applied with certain subcultural groups, although they are less effective when applied with other subcultural groups. Generally, current evidence regarding the effectiveness of cultural adaptations is promising but mixed. Further research is needed to obtain more definitive conclusions regarding the efficacy and effectiveness of culturally adapted EBIs. Directions for future research and recommendations are presented to guide the development of a new generation of culturally adapted EBIs. PMID:20192800

  13. Passive versus active operator work in automated process control--a job design case study in a control centre.

    PubMed

    Persson, A; Wanek, B; Johansson, A

    2001-10-01

    Methods of avoiding common problems associated with operator work in automated process control, such as understimulation and difficulties in achieving and maintaining necessary skills and competence, are addressed in this paper. The source of these problems is deduced here to be that monitoring tasks are a predominant part of the job. This case study shows how work in a highly automated process can be designed not only to avoid the traditional problems, but also provide a stimulating job within a good work situation at the same time as fulfilling efficiency demands. A new definition of active/passive operator jobs is made which is based on a categorisation of the types of work tasks that make up the job. The definition gives an explanation of how different designs of operator jobs result in more or less active/passive work situations.

  14. Application of Hybrid Real-Time Power System Simulator for Designing and Researching of Relay Protection and Automation

    NASA Astrophysics Data System (ADS)

    Borovikov, Yu S.; Sulaymanov, A. O.; Andreev, M. V.

    2015-10-01

    Development, research, and operation of smart grids (SGs) with active-adaptive networks (AAS) are pressing tasks today. The planned integration of high-speed FACTS devices greatly complicates the dynamic properties of power systems, significantly changing the operating conditions of their equipment. This situation creates the new problem of developing and researching relay protection and automation (RPA) that can operate adequately in SGs and adapt to their regimes. The effectiveness of any solution to this problem depends on the tools used: simulators of electric power systems. Analysis of the best-known and most widely used simulators led to the conclusion that they cannot be used to solve this problem. At Tomsk Polytechnic University, a prototype hybrid multiprocessor software and hardware system has been developed: the Hybrid Real-Time Power System Simulator (HRTSim). Because of its unique features, this simulator can be used for the tasks mentioned above. This article introduces the concept of developing and researching relay protection and automation using HRTSim.

  15. An Automatic Online Calibration Design in Adaptive Testing

    ERIC Educational Resources Information Center

    Makransky, Guido; Glas, Cees A. W.

    2010-01-01

    An accurately calibrated item bank is essential for a valid computerized adaptive test. However, in some settings, such as occupational testing, there is limited access to test takers for calibration. As a result of the limited access to possible test takers, collecting data to accurately calibrate an item bank in an occupational setting is…

  16. Automated registration of laser Doppler perfusion images by an adaptive correlation approach: application to focal cerebral ischemia in the rat.

    PubMed

    Riyamongkol, Panomkhawn; Zhao, Weizhao; Liu, Yitao; Belayev, Ludmila; Busto, Raul; Ginsberg, Myron D

    2002-12-31

    Hemodynamic changes are extremely important in analyzing responses of a brain subjected to a stimulus or treatment. The laser Doppler technique has emerged as an important tool in neuroscience research. This non-invasive method scans a low-power laser beam in a raster pattern over a tissue surface to generate a time course of images in units of relative flux change. The laser Doppler imager (LDI) records cerebral perfusion not only in the temporal but also in the spatial domain. Traditional analysis of LD images has focused on the region-of-interest (ROI) approach, whose accuracy is weakened, in any experiment that necessitates relative repositioning between the LDI and the scanned tissue area, by the operator's subjective decisions during data collection. This report describes a robust image registration method, based on an adaptive correlation approach, designed to obviate this problem. The assumption in mapping corresponding pixels in two images is that the regions in which these pixels are centered are correlated. Based on this assumption, correlation coefficients are calculated between two regions by moving one region over the other in all possible combinations. To avoid ambiguity in distinguishing maximum correlation coefficients, an adaptive algorithm is adopted. Correspondences are then used to estimate the transformation by linear regression. We used a pair of phantom LD images to test this algorithm. A reliability test was also performed on each of the 15 sequential LD images derived from an actual experiment by imposing rotation and translation. The results show that the calculated transformation parameters (rotation: θ = 7.7 ± 0.5°; translation: Δx = 2.8 ± 0.3, Δy = 4.7 ± 0.4) are very close to the preset parameters (rotation: θ = 8°; translation: Δx = 3, Δy = 5). This result indicates that this approach is a valuable adjunct to LD perfusion monitoring. An
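
    The core matching step described in this abstract (slide one region over the other and keep the offset with the highest correlation coefficient) can be sketched as follows. This is a hedged illustration on synthetic data, not the published implementation, and it omits the adaptive window-size logic:

```python
# Exhaustive correlation matching: find the translation of a template
# within an image that maximizes the Pearson correlation coefficient.
def corr(a, b):
    """Pearson correlation of two equal-length flat lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / ((va * vb) ** 0.5 or 1.0)  # guard zero-variance patches

def best_offset(image, template):
    """Search every placement of the template for maximum correlation."""
    th, tw = len(template), len(template[0])
    flat_t = [v for row in template for v in row]
    best, best_r = (0, 0), -2.0
    for dy in range(len(image) - th + 1):
        for dx in range(len(image[0]) - tw + 1):
            patch = [image[dy + i][dx + j]
                     for i in range(th) for j in range(tw)]
            r = corr(patch, flat_t)
            if r > best_r:
                best_r, best = r, (dy, dx)
    return best

# A 10x10 synthetic image with a bright 3x3 gradient blob at (4, 6);
# the template is that same blob.
image = [[0.0] * 10 for _ in range(10)]
for i in range(3):
    for j in range(3):
        image[4 + i][6 + j] = 5.0 + i + j
template = [[5.0 + i + j for j in range(3)] for i in range(3)]
print(best_offset(image, template))  # → (4, 6)
```

Linear regression over many such point correspondences then yields the rotation and translation parameters of the kind reported above.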

  17. Advanced Air Traffic Management Research (Human Factors and Automation): NASA Research Initiatives in Human-Centered Automation Design in Airspace Management

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Condon, Gregory W. (Technical Monitor)

    1996-01-01

    NASA has initiated a significant thrust of research and development focused on providing the flight crew and air traffic managers automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. The core processes of control, and the distribution of decision making in that control, are undergoing extensive analysis. From our perspective, the human operators and the procedures by which they interact are the fundamental determinants of the safe, efficient, and flexible operation of the system. With that perspective, we have begun to explore what our experience has taught us will be the most challenging aspects of designing and integrating human-centered automation in the advanced system. We have performed a full-mission simulation examining the role shift to self-separation on board the aircraft, with the rules of the air guiding behavior and the provision of a cockpit display of traffic information and an on-board traffic alert system that integrates seamlessly into TCAS operations. We have also performed an initial investigation of the operational impact of "Dynamic Density" metrics on controllers relinquishing and reestablishing full separation authority. (We follow the assumption that responsibility at all times resides with the controller.) This presentation describes those efforts, as well as the process by which we will guide the development of error-tolerant systems that are sensitive to shifts in operator workload and dynamic shifts in the operating point of air traffic management.

  18. Context-Adaptive Learning Designs by Using Semantic Web Services

    ERIC Educational Resources Information Center

    Dietze, Stefan; Gugliotta, Alessio; Domingue, John

    2007-01-01

    IMS Learning Design (IMS-LD) is a promising technology aimed at supporting learning processes. IMS-LD packages contain the learning process metadata as well as the learning resources. However, the allocation of resources--whether data or services--within the learning design is done manually at design-time on the basis of the subjective appraisals…

  19. Design and implementation of adaptive PI control schemes for web tension control in roll-to-roll (R2R) manufacturing.

    PubMed

    Raul, Pramod R; Pagilla, Prabhakar R

    2015-05-01

    In this paper, two adaptive Proportional-Integral (PI) control schemes are designed and discussed for control of web tension in Roll-to-Roll (R2R) manufacturing systems. R2R systems are used to transport continuous materials (called webs) on rollers from the unwind roll to the rewind roll. Maintaining web tension at the desired value is critical to many R2R processes such as printing, coating, and lamination. Existing fixed-gain PI tension control schemes currently used in industrial practice require extensive tuning and do not provide the desired performance under changing operating conditions and material properties. The first adaptive PI scheme utilizes the model reference approach, where the controller gains are estimated based on matching the actual closed-loop tension control system with an appropriately chosen reference model. The second adaptive PI scheme utilizes the indirect adaptive control approach together with a relay feedback technique to automatically initialize the adaptive PI gains. These adaptive tension control schemes can be implemented on any R2R manufacturing system. The key features of the two adaptive schemes are that they are simple for practicing engineers to design, easy to implement in real time, and automate the tuning process. Extensive experiments are conducted on a large experimental R2R machine which mimics many features of an industrial R2R machine. These experiments include trials with two different polymer webs and a variety of operating conditions. Implementation guidelines are provided for both adaptive schemes. Experimental results comparing the two adaptive schemes and a fixed-gain PI tension control scheme used in industrial practice are provided and discussed.
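
    As context for the abstract above, a minimal discrete PI loop on an assumed first-order plant looks like the following sketch. The plant coefficients and gains are hypothetical, and the paper's contribution is precisely to adapt kp and ki online rather than fix them as done here:

```python
# Illustrative fixed-gain PI loop on an assumed first-order plant
# standing in for web tension dynamics. All numbers are hypothetical.
def simulate(kp, ki, setpoint=10.0, steps=200, dt=0.01):
    a, b = 2.0, 1.5            # assumed plant: dx/dt = -a*x + b*u
    x, integral = 0.0, 0.0
    for _ in range(steps):
        e = setpoint - x
        integral += e * dt
        u = kp * e + ki * integral    # PI control law
        x += (-a * x + b * u) * dt    # Euler step of the plant
    return x

final = simulate(kp=4.0, ki=8.0)
print(round(final, 2))  # settles near the 10.0 setpoint
```

A fixed (kp, ki) pair like this must be re-tuned whenever the plant changes; the model-reference and relay-feedback schemes in the paper automate exactly that step.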

  20. A mathematical basis for the design and design optimization of adaptive trusses in precision control

    NASA Technical Reports Server (NTRS)

    Das, S. K.; Utku, S.; Chen, G.-S.; Wada, B. K.

    1991-01-01

    A mathematical basis for the optimal design of adaptive trusses to be used in supporting precision equipment is provided. The general theory of adaptive structures is introduced, and the global optimization problem of placing a limited number, q, of actuators, so as to maximally achieve precision control and provide prestress, is stated. Two serialized optimization problems, namely, optimal actuator placement for prestress and optimal actuator placement for precision control, are addressed. In the case of prestressing, the computation of a 'desired' prestress is discussed, the interaction between actuators and redundants in conveying the prestress is shown in its mathematical form, and a methodology for arriving at the optimal placement of actuators and additional redundants is discussed. With regard to precision control, an optimal placement scheme (for q actuators) for maximum 'authority' over the precision points is suggested. The results of the two serialized optimization problems are combined to give a suboptimal solution to the global optimization problem. A method for improving this suboptimal actuator placement scheme by iteration is presented.

  1. Decentralized adaptive control of robot manipulators with robust stabilization design

    NASA Technical Reports Server (NTRS)

    Yuan, Bau-San; Book, Wayne J.

    1988-01-01

    Due to geometric nonlinearities and complex dynamics, a decentralized technique for adaptive control for multilink robot arms is attractive. Lyapunov-function theory for stability analysis provides an approach to robust stabilization. Each joint of the arm is treated as a component subsystem. The adaptive controller is made locally stable with servo signals including proportional and integral gains. This results in the bound on the dynamical interactions with other subsystems. A nonlinear controller which stabilizes the system with uniform boundedness is used to improve the robustness properties of the overall system. As a result, the robot tracks the reference trajectories with convergence. This strategy makes computation simple and therefore facilitates real-time implementation.

  2. Biomimetic rules for design of complex adaptive structures

    NASA Astrophysics Data System (ADS)

    Dry, Carolyn M.

    2001-10-01

    Nature builds by 1) use of local, inexpensive, available often recycled materials which 2) are self-ordering or growing by attributes shared between the material and environment, 3) repair themselves, 4) sense and adapt to changes in the environment daily, seasonally, and yearly; 5) easily disintegrate and recycle back into the material sink when their usefulness is at an end; and 6) do not harm the environment, but perhaps enhance it or resolve problems.

  3. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis.

    PubMed

    Vasdev, Neil; Collier, Thomas Lee

    2016-01-01

    Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer. PMID:27548189

  4. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis

    PubMed Central

    Vasdev, Neil; Collier, Thomas Lee

    2016-01-01

    Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer. PMID:27548189

  5. On efficient two-stage adaptive designs for clinical trials with sample size adjustment.

    PubMed

    Liu, Qing; Li, Gang; Anderson, Keaven M; Lim, Pilar

    2012-01-01

    Group sequential designs are rarely used for clinical trials with substantial overrunning due to fast enrollment or long duration of treatment and follow-up. Traditionally, such trials rely on fixed sample size designs. Recently, various two-stage adaptive designs have been introduced to allow sample size adjustment to increase statistical power or avoid unnecessarily large trials. However, these adaptive designs can be seriously inefficient. To address this well-known problem, we propose a likelihood-based two-stage adaptive design where sample size adjustment is derived from a pseudo group sequential design using cumulative conditional power. We show through numerical examples that this design cannot be improved by group sequential designs. In addition, the approach may uniformly improve any existing two-stage adaptive design with sample size adjustment. For statistical inference, we provide methods for sequential p-values and confidence intervals, as well as median unbiased and minimum variance unbiased estimates. We show that the claim of inefficiency of adaptive designs by Tsiatis and Mehta (2003) is logically flawed, and thereby provide a strong defense of Cui et al. (1999). PMID:22651105
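
    The sample-size adjustment step can be illustrated with standard normal-theory conditional power and square-root-information stage weights. This sketch is a generic textbook construction, not the authors' likelihood-based design, and all design parameters below are invented:

```python
# Interim sample-size re-estimation via conditional power (generic
# two-stage construction with sqrt-information combination weights).
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def conditional_power(z1, n1, n2, delta, sigma, z_alpha=1.96):
    """P(reject at final analysis) given interim z1, true effect delta.
    n1, n2 are per-arm stage sample sizes for a two-arm comparison."""
    drift = delta / sigma * math.sqrt(n2 / 2.0)   # expected stage-2 z
    w1 = math.sqrt(n1 / (n1 + n2))                # stage weights
    w2 = math.sqrt(n2 / (n1 + n2))
    threshold = (z_alpha - w1 * z1) / w2          # stage-2 z needed to win
    return 1.0 - norm_cdf(threshold - drift)

# Interim looks weaker than hoped: grow stage 2 until power >= 0.8
z1, n1, delta, sigma = 1.0, 50, 0.4, 1.0
n2 = 50
while conditional_power(z1, n1, n2, delta, sigma) < 0.8:
    n2 += 10
print(n2)  # stage-2 per-arm size after re-estimation
```

Fixing the combination weights at their planned values is what protects the type I error rate when n2 is changed at the interim.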

  6. Design and development of an automated flow injection instrument for the determination of arsenic species in natural waters

    PubMed Central

    Hanrahan, Grady; Fan, Tina K.; Kantor, Melanie; Clark, Keith; Cardenas, Steven; Guillaume, Darrell W.; Khachikian, Crist S.

    2009-01-01

    The design and development of an automated flow injection instrument for the determination of arsenite [As(III)] and arsenate [As(V)] in natural waters is described. The instrument incorporates solenoid-activated self-priming micropumps and electronic switching valves for controlling the fluidics of the system, and a miniature charge-coupled device spectrometer operating in a graphical programming environment. The limits of detection were found to be 0.79 and 0.98 μM for As(III) and As(V), respectively, with a linear range of 1–50 μM. Spiked ultrapure water samples were analyzed, and recoveries were found to be 97%–101% for As(III) and 95%–99% for As(V). Future directions in terms of automation, optimization, and field deployment are discussed. PMID:19895074

  7. A Web-Based Adaptive Tutor to Teach PCR Primer Design

    ERIC Educational Resources Information Center

    van Seters, Janneke R.; Wellink, Joan; Tramper, Johannes; Goedhart, Martin J.; Ossevoort, Miriam A.

    2012-01-01

    When students have varying prior knowledge, personalized instruction is desirable. One way to personalize instruction is by using adaptive e-learning to offer training of varying complexity. In this study, we developed a web-based adaptive tutor to teach PCR primer design: the PCR Tutor. We used part of the Taxonomy of Educational Objectives (the…

  8. The Design and the Formative Evaluation of an Adaptive Educational System Based on Cognitive Styles

    ERIC Educational Resources Information Center

    Triantafillou, Evangelos; Pomportsis, Andreas; Demetriadis, Stavros

    2003-01-01

    Adaptive Hypermedia Systems (AHS) can be developed to accommodate a variety of individual differences, including learning style and cognitive style. The current research is an attempt to examine some of the critical variables, which may be important in the design of an Adaptive Educational System (AES) based on student's cognitive style. Moreover,…

  9. Design of fuzzy system by NNs and realization of adaptability

    NASA Technical Reports Server (NTRS)

    Takagi, Hideyuki

    1993-01-01

    The issue of designing and tuning fuzzy membership functions by neural networks (NNs) was started by NN-driven Fuzzy Reasoning in 1988. NN-driven fuzzy reasoning involves an NN embedded in the fuzzy system which generates membership values. In conventional fuzzy system design, the membership functions are hand-crafted by trial and error for each input variable. In contrast, NN-driven fuzzy reasoning considers several variables simultaneously and can design a multidimensional, nonlinear membership function for the entire subspace.

  10. Response-adaptive decision-theoretic trial design: operating characteristics and ethics.

    PubMed

    Lipsky, Ari M; Lewis, Roger J

    2013-09-20

    Adaptive randomization is used in clinical trials to increase statistical efficiency. In addition, some clinicians and researchers believe that using adaptive randomization necessarily leads to more ethical treatment of subjects in a trial. We develop Bayesian, decision-theoretic, clinical trial designs with response-adaptive randomization and a primary goal of estimating treatment effect, and then contrast these designs with designs that also include in their loss function a cost for poor subject outcome. When the loss function did not incorporate a cost for poor subject outcome, the gains in efficiency from response-adaptive randomization were accompanied by ethically concerning subject allocations. Conversely, including a cost for poor subject outcome demonstrated a more acceptable balance between the competing needs in the trial. A subsequent, parallel set of trials designed to explicitly control type I and type II error rates showed that much of the improvement achieved through modification of the loss function was essentially negated. Therefore, gains in efficiency from the use of a decision-theoretic, response-adaptive design using adaptive randomization may only be assumed to apply to those goals that are explicitly included in the loss function. Trial goals, including ethical ones, which do not appear in the loss function are ignored and may even be compromised; it is thus inappropriate to assume that all adaptive trials are necessarily more ethical. Controlling type I and type II error rates largely negates the benefit of including competing needs in favor of the goal of parameter estimation.
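
    The kind of response-adaptive randomization examined above can be illustrated with Thompson sampling on Beta posteriors. This is a generic sketch with invented success rates, not the authors' decision-theoretic design, and it includes no explicit loss function at all:

```python
# Response-adaptive randomization via Thompson sampling: each subject
# is randomized toward the arm whose Beta posterior draw looks better.
import random

random.seed(1)
p_true = [0.3, 0.6]              # assumed true success rates per arm
succ, fail = [1, 1], [1, 1]      # Beta(1, 1) priors for each arm
alloc = [0, 0]                   # subjects allocated per arm
for _ in range(500):
    draws = [random.betavariate(succ[i], fail[i]) for i in range(2)]
    arm = draws.index(max(draws))
    alloc[arm] += 1
    if random.random() < p_true[arm]:   # simulate the subject's outcome
        succ[arm] += 1
    else:
        fail[arm] += 1
print(alloc)  # most subjects drift to the better arm
```

The drift toward the better arm is exactly the efficiency/ethics trade-off the paper formalizes: which arm "should" receive subjects depends on what the loss function rewards.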

  11. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional missions, markets, and technologies. However, when new markets (space tourism), new constraints (environmental), or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This raises the question: how does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges? The answer is rapid and flexible parametric sizing at the conceptual design stage. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration, and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations that meet combinations of mission and technology. This research contributes to the state of the art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic, and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composites, natural laminar flow, thrust-vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to

  12. Design of an Automated Essay Grading (AEG) System in Indian Context

    ERIC Educational Resources Information Center

    Ghosh, Siddhartha; Fatima, Sameen S.

    2007-01-01

    Automated essay grading or scoring systems are no longer a myth; they are a reality. Today, typed (not handwritten) essays are corrected not only by examiners/teachers but also by machines. The TOEFL exam is one of the best examples of this application. The students' essays are evaluated both by humans and by web-based automated…

  13. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer.

    PubMed

    Bondar, M Luiza; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-01

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.
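The Dice similarity coefficient reported above (2|A∩B| / (|A| + |B|)) is a standard overlap score between an automated and a manual segmentation. A minimal sketch of its computation on binary masks, with a hypothetical toy example (not the study's data):

```python
import numpy as np

def dice_coefficient(seg_a: np.ndarray, seg_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a = seg_a.astype(bool)
    b = seg_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy example: two overlapping 6x6 squares on a 10x10 grid
auto = np.zeros((10, 10), dtype=bool); auto[2:8, 2:8] = True      # 36 voxels
manual = np.zeros((10, 10), dtype=bool); manual[3:9, 3:9] = True  # 36 voxels
print(round(dice_coefficient(auto, manual), 3))  # → 0.694 (2*25 / 72)
```

A value of 1.0 means identical masks; the 85 ± 6% reported for the shape-deformation method corresponds to Dice ≈ 0.85.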

  14. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer

    NASA Astrophysics Data System (ADS)

    Luiza Bondar, M.; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-01

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.

  15. Adapting Cognitive Walkthrough to Support Game Based Learning Design

    ERIC Educational Resources Information Center

    Farrell, David; Moffat, David C.

    2014-01-01

    For any given Game Based Learning (GBL) project to be successful, the player must learn something. Designers may base their work on pedagogical research, but actual game design is still largely driven by intuition. People are famously poor at unsupported methodical thinking, and relying so much on instinct is an obvious weak point in GBL design…

  16. Adapting the Mathematical Task Framework to Design Online Didactic Objects

    ERIC Educational Resources Information Center

    Bowers, Janet; Bezuk, Nadine; Aguilar, Karen

    2011-01-01

    Designing didactic objects involves imagining how students can conceive of specific mathematical topics and then imagining what types of classroom discussions could support these mental constructions. This study investigated whether it was possible to design Java applets that might serve as didactic objects to support online learning where…

  17. ADAPT[IT]: Tools for Training Design and Evaluation.

    ERIC Educational Resources Information Center

    de Croock, Marcel B. M.; Paas, Fred; Schlanbusch, Henrik; van Merrienboer, Jeroen J. G.

    2002-01-01

    Describes a set of computerized tools that support the instructional design and evaluation of competency-based training programs. Discusses authentic whole-task practice situations; the 4C/ID* methodology; an evaluation tool that supports the subsequent revision of the competency-based training design; and future plans. (Author/LRW)

  18. Adapting the mathematical task framework to design online didactic objects

    NASA Astrophysics Data System (ADS)

    Bowers, Janet; Bezuk, Nadine; Aguilar, Karen

    2011-06-01

    Designing didactic objects involves imagining how students can conceive of specific mathematical topics and then imagining what types of classroom discussions could support these mental constructions. This study investigated whether it was possible to design Java applets that might serve as didactic objects to support online learning where 'discussions' are broadly defined as the conversations students have with themselves as they interact with the dynamic mathematical representations on the screen. Eighty-four pre-service elementary teachers enrolled in hybrid mathematics courses were asked to interact with a series of applets designed to support their understanding of qualitative graphing. The results of the surveys indicate that various design features of the applets did in fact cause perturbations and opportunities for resolutions that enabled the users to 'discuss' their learning by reflecting on their in-class discussions and online activities. The discussion includes four design features for guiding future applet creation.

  19. Equipment development for automated assembly of solar modules

    NASA Technical Reports Server (NTRS)

    Hagerty, J. J.

    1982-01-01

    Prototype equipment was developed which allows for totally automated assembly in the three major areas of module manufacture: cell stringing; encapsulant layup and cure; and edge sealing. The equipment is designed to be used in conjunction with a standard Unimate 2000B industrial robot, although the design is adaptable to other transport systems.

  20. Designing Forest Adaptation Experiments through Manager-Scientist Partnerships

    NASA Astrophysics Data System (ADS)

    Nagel, L. M.; Swanston, C.; Janowiak, M.

    2014-12-01

    Three common forest adaptation options discussed in the context of an uncertain future climate are creating resistance, promoting resilience, and enabling forests to respond to change. Though there is consensus on the broad management goals addressed by each of these options, translating these concepts into management plans specific to individual forest types that vary in structure, composition, and function remains a challenge. We will describe a decision-making framework that we employed within a manager-scientist partnership to develop a suite of adaptation treatments for two contrasting forest types as part of a long-term forest management experiment. The first, in northern Minnesota, is a red pine-dominated forest with components of white pine, aspen, paper birch, and northern red oak, with a hazel understory. The second, in southwest Colorado, is a warm-dry mixed conifer forest dominated by ponderosa pine, white fir, and Douglas-fir, with scattered aspen and an understory of Gambel oak. Current conditions at both sites are characterized by overstocking, moderate-to-high fuel loading, and vulnerability to numerous forest health threats, and are generally uncharacteristic of historic structure and composition. The desired future conditions articulated by managers for each site included elements of historic structure and natural range of variability, but were greatly tempered by known vulnerabilities and projected changes to climate and disturbance patterns. The resultant range of treatments we developed is distinct for each forest type and addresses a wide range of management objectives.

  1. Adaptive optoelectronic camouflage systems with designs inspired by cephalopod skins

    PubMed Central

    Yu, Cunjiang; Li, Yuhang; Zhang, Xun; Huang, Xian; Malyarchuk, Viktor; Wang, Shuodao; Shi, Yan; Gao, Li; Su, Yewang; Zhang, Yihui; Xu, Hangxun; Hanlon, Roger T.; Huang, Yonggang; Rogers, John A.

    2014-01-01

    Octopus, squid, cuttlefish, and other cephalopods exhibit exceptional capabilities for visually adapting to or differentiating from the coloration and texture of their surroundings, for the purpose of concealment, communication, predation, and reproduction. Long-standing interest in and emerging understanding of the underlying ultrastructure, physiological control, and photonic interactions has recently led to efforts in the construction of artificial systems that have key attributes found in the skins of these organisms. Despite several promising options in active materials for mimicking biological color tuning, existing routes to integrated systems do not include critical capabilities in distributed sensing and actuation. Research described here represents progress in this direction, demonstrated through the construction, experimental study, and computational modeling of materials, device elements, and integration schemes for cephalopod-inspired flexible sheets that can autonomously sense and adapt to the coloration of their surroundings. These systems combine high-performance, multiplexed arrays of actuators and photodetectors in laminated, multilayer configurations on flexible substrates, with overlaid arrangements of pixelated, color-changing elements. The concepts provide realistic routes to thin sheets that can be conformally wrapped onto solid objects to modulate their visual appearance, with potential relevance to consumer, industrial, and military applications. PMID:25136094

  2. Design of smart composite platforms for adaptive thrust vector control and adaptive laser telescope for satellite applications

    NASA Astrophysics Data System (ADS)

    Ghasemi-Nejhad, Mehrdad N.

    2013-04-01

    This paper presents the design of smart composite platforms for adaptive thrust vector control (TVC) and an adaptive laser telescope for satellite applications. To eliminate disturbances, the proposed adaptive TVC and telescope systems will be mounted on two analogous smart composite platforms with simultaneous precision positioning (pointing) and vibration suppression (stabilizing), SPPVS, with micro-radian pointing resolution, and then mounted on a satellite in two different locations. The adaptive TVC system provides SPPVS with large tip-tilt to potentially eliminate the gimbal systems. The smart composite telescope will be mounted on a smart composite platform with SPPVS and then mounted on a satellite. The laser communication is intended for geosynchronous orbit. The high degree of directionality increases the security of the laser communication signal (as opposed to a diffused RF signal), but also requires sophisticated subsystems for transmission and acquisition. The shorter wavelength of the optical spectrum increases the data transmission rates, but laser systems require large amounts of power, which increases the mass and complexity of the supporting systems. In addition, laser communication in geosynchronous orbit requires an accurate platform with SPPVS capabilities. Therefore, this work also addresses the design of an active composite platform to be used to simultaneously point and stabilize an intersatellite laser communication telescope with micro-radian pointing resolution. The telescope is a Cassegrain receiver that employs two mirrors, one convex (primary) and the other concave (secondary). The distance, as well as the horizontal and axial alignment of the mirrors, must be precisely maintained or else the optical properties of the system will be severely degraded. The alignment will also have to be maintained during thruster firings, which will require vibration suppression capabilities of the system as well. The innovative platform has been

  3. Adaption of a fragment analysis technique to an automated high-throughput multicapillary electrophoresis device for the precise qualitative and quantitative characterization of microbial communities.

    PubMed

    Trotha, René; Reichl, Udo; Thies, Frank L; Sperling, Danuta; König, Wolfgang; König, Brigitte

    2002-04-01

    The analysis of microbial communities is of increasing importance in the life sciences and bioengineering. Traditional investigation techniques, such as culture or cloning methods, suffer from many disadvantages: they are unable to give a complete qualitative and quantitative view of the microorganisms themselves, their interactions with each other, and their interactions with their environment. The determination of static or dynamic balances among microorganisms is therefore of fast-growing interest. The generation of species-specific, fluorescently labeled 16S ribosomal DNA (rDNA) fragments by the terminal restriction fragment length polymorphism (T-RFLP) technique is a suitable tool to overcome the problems of other methods. For the separation of these fragments, polyacrylamide gel sequencers have until now been preferred over capillary sequencers using linear polymers because of their higher electrophoretic resolution and therefore sizing accuracy. However, modern capillary sequencers, especially multicapillary sequencers, offer a higher grade of automation and the increased throughput necessary for the investigation of complex communities in long-term studies. Therefore, we adapted a T-RFLP technique to an automated high-throughput multicapillary electrophoresis device (ABI 3100 Genetic Analyzer) with regard to a precise qualitative and quantitative characterization of microbial communities. PMID:11981854

  4. Adaptive vessel tracking: automated computation of vessel trajectories for improved efficiency in 2D coronary MR angiography.

    PubMed

    Saranathan, M; Ho, V B; Hood, M N; Foo, T K; Hardy, C J

    2001-10-01

    A new method was investigated for improving the efficiency of ECG-gated coronary magnetic resonance angiography (CMRA) by accurate, automated tracking of the vessel motion over the cardiac cycle. Vessel tracking was implemented on a spiral gradient-echo pulse sequence with sub-millimeter in-plane spatial resolution as well as a high image signal-to-noise ratio. Breath-hold 2D CMRA was performed in 18 healthy adult subjects (mean age 46 +/- 14 years). Imaging efficiency, defined as the percentage of the slices where more than 30 mm of the vessel is visualized, was computed in multi-slice spiral scans with and without vessel tracking. There was a significant improvement in the efficiency of the vessel tracking sequence compared to the multi-slice sequence (56% vs. 32%, P < 0.001). The imaging efficiency increased further when the true motion of the coronary arteries (determined using a cross correlation algorithm) was used for vessel tracking as opposed to a linear model for motion (71% vs. 57%, P < 0.05). The motion of the coronary arteries was generally found to be linear during the systolic phase and nonlinear during the diastolic phase. The use of subject-tailored, automated tracking of vessel positions resulted in improved efficiency of coronary artery depiction on breath-hold 2D CMRA.
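The cross-correlation step mentioned above estimates how far a structure has moved between cardiac phases by finding the lag that maximizes the correlation between two signals. A minimal 1-D sketch with synthetic data (the study's actual algorithm and data are not reproduced here):

```python
import numpy as np

def estimate_shift(ref: np.ndarray, cur: np.ndarray) -> int:
    """Estimate the integer displacement of `cur` relative to `ref`
    by locating the peak of their full (mean-removed) cross-correlation."""
    corr = np.correlate(cur - cur.mean(), ref - ref.mean(), mode="full")
    # In 'full' mode, index j corresponds to lag j - (len(ref) - 1)
    return int(np.argmax(corr)) - (len(ref) - 1)

# Synthetic 1-D intensity profile (a Gaussian bump) shifted by +3 samples
ref = np.exp(-0.5 * ((np.arange(64) - 30) / 4.0) ** 2)
cur = np.roll(ref, 3)
print(estimate_shift(ref, cur))  # 3
```

The same idea applied per cardiac phase yields a subject-specific motion trajectory, rather than the linear model the abstract compares against.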

  5. Water Infrastructure Adaptation in New Urban Design: Possibilities and Constraints

    EPA Science Inventory

    Natural constraints, including climate change and dynamic socioeconomic development, can significantly impact the way we plan, design, and operate water infrastructure, and thus its ability to deliver reliable, quality water supplies and comply with environmental regulations. ...

  6. Automated Multiple-Sample Tray Manipulation Designed and Fabricated for Atomic Oxygen Facility

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Stueber, Thomas J.; Dever, Joyce A.; Banks, Bruce A.; Rutledge, Sharon K.

    2000-01-01

    Extensive improvements to increase testing capacity and flexibility and to automate the in situ Reflectance Measurement System (RMS) are in progress at the Electro-Physics Branch's Atomic Oxygen (AO) beam facility of the NASA Glenn Research Center at Lewis Field. These improvements will triple the system's capacity while placing a significant portion of the testing cycle under computer control for added reliability, repeatability, and ease of use.

  7. An expert system for choosing the best combination of options in a general-purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.; Barthelemy, J. F. M.

    1985-01-01

    An expert system was developed to aid a user of the Automated Design Synthesis (ADS) general-purpose optimization computer program in selecting the best combination of strategy, optimizer, and one-dimensional search options for solving a problem. There are approximately 100 such combinations available in ADS. The knowledge base contains over 200 rules, and is divided into three categories: constrained problems, unconstrained problems, and constrained problems treated as unconstrained problems. The inference engine is written in LISP and is available on DEC-VAX and IBM PC/XT computers.

  8. Design and development of an automated and non-contact sensing system for continuous monitoring of plant health and growth.

    PubMed

    Kacira, M; Ling, P P

    2001-01-01

    An automated system was designed and built to continuously monitor plant health and growth in a controlled environment using a distributed system approach for operational control and data collection. The computer-controlled system consisted of a motorized turntable to present the plants to the stationary sensors and reduce microclimate variability among the plants. Major sensing capabilities of the system included machine vision, infrared thermometry, time domain reflectometry, and micro-lysimeters. The system also maintained precise growth-medium moisture levels through a computer-controlled drip irrigation system. The system was capable of collecting required data continuously to monitor and to evaluate the plant health and growth. PMID:12026934

  9. Implementation of an Automated Grading System with an Adaptive Learning Component to Affect Student Feedback and Response Time

    ERIC Educational Resources Information Center

    Matthews, Kevin; Janicki, Thomas; He, Ling; Patterson, Laurie

    2012-01-01

    This research focuses on the development and implementation of an adaptive learning and grading system with a goal to increase the effectiveness and quality of feedback to students. By utilizing various concepts from established learning theories, the goal of this research is to improve the quantity, quality, and speed of feedback as it pertains…

  10. Context-Aware Design for Process Flexibility and Adaptation

    ERIC Educational Resources Information Center

    Yao, Wen

    2012-01-01

    Today's organizations face continuous and unprecedented changes in their business environment. Traditional process design tools tend to be inflexible and can only support rigidly defined processes (e.g., order processing in the supply chain). This considerably restricts their real-world applications value, especially in the dynamic and…

  11. Adapting Wood Technology to Teach Design and Engineering

    ERIC Educational Resources Information Center

    Rummel, Robert A.

    2012-01-01

    Technology education has changed dramatically over the last few years. The transition of industrial arts to technology education and more recently the pursuit of design and engineering has resulted in technology education teachers often needing to change their curriculum and course activities to meet the demands of a rapidly changing profession.…

  12. Optimal adaptive two-stage designs for early phase II clinical trials.

    PubMed

    Shan, Guogen; Wilding, Gregory E; Hutson, Alan D; Gerstenberger, Shawn

    2016-04-15

    Simon's optimal two-stage design has been widely used in early phase clinical trials for oncology and AIDS studies with binary endpoints. With this approach, the second-stage sample size is fixed when the trial passes the first stage with sufficient activity. Adaptive designs, such as those due to Banerjee and Tsiatis (2006) and Englert and Kieser (2013), are flexible in the sense that the second-stage sample size depends on the response from the first stage, and these designs are often seen to reduce the expected sample size under the null hypothesis as compared with Simon's approach. An unappealing trait of the existing designs is that their second-stage sample size is not a non-increasing function of the first-stage response rate. In this paper, an efficient intelligent process, the branch-and-bound algorithm, is used to search extensively for the optimal adaptive design with the smallest expected sample size under the null, while the type I and II error rates are maintained and the aforementioned monotonicity characteristic is respected. The proposed optimal design is observed to have smaller expected sample sizes compared to Simon's optimal design, and the maximum total sample size of the proposed adaptive design is very close to that from Simon's method. The proposed optimal adaptive two-stage design is recommended for use in practice to improve the flexibility and efficiency of early phase therapeutic development. PMID:26526165
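The quantities this literature optimizes (probability of early termination, expected sample size, and type I/II error rates of a two-stage design) are simple binomial sums. A minimal sketch, using as an illustration the design r1/n1 = 0/9, r/n = 2/17, which is commonly cited as Simon's optimal design for p0 = 0.05, p1 = 0.25, alpha = 0.05, beta = 0.20 (the adaptive branch-and-bound search of the paper is not reproduced here):

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def binom_cdf(k: int, n: int, p: float) -> float:
    if k < 0:
        return 0.0
    return sum(binom_pmf(i, n, p) for i in range(min(k, n) + 1))

def two_stage_characteristics(r1, n1, r, n, p):
    """PET, expected sample size, and rejection probability of a two-stage
    design: stop for futility if stage-1 responses <= r1 out of n1;
    reject the null if total responses > r out of n."""
    pet = binom_cdf(r1, n1, p)          # probability of early termination
    en = n1 + (1 - pet) * (n - n1)      # expected sample size
    reject = sum(
        binom_pmf(x1, n1, p) * (1 - binom_cdf(r - x1, n - n1, p))
        for x1 in range(r1 + 1, n1 + 1)
    )
    return pet, en, reject

pet0, en0, alpha = two_stage_characteristics(0, 9, 2, 17, 0.05)  # under H0
_, _, power = two_stage_characteristics(0, 9, 2, 17, 0.25)       # under H1
print(f"PET0={pet0:.3f}  EN0={en0:.2f}  alpha={alpha:.3f}  power={power:.3f}")
```

Searching over (r1, n1, r, n) for the smallest EN0 subject to alpha and power constraints recovers Simon's optimal design; the paper's contribution is an analogous search in the much larger space of adaptive second-stage sample-size rules.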

  13. A modified varying-stage adaptive phase II/III clinical trial design.

    PubMed

    Dong, Gaohong; Vandemeulebroecke, Marc

    2016-07-01

    Conventionally, adaptive phase II/III clinical trials are carried out with a strict two-stage design. Recently, a varying-stage adaptive phase II/III clinical trial design has been developed. In this design, following the first stage, an intermediate stage can be adaptively added to obtain more data, so that a more informative decision can be made. Therefore, the number of further investigational stages is determined based upon data accumulated to the interim analysis. This design considers two plausible study endpoints, with one of them initially designated as the primary endpoint. Based on interim results, another endpoint can be switched as the primary endpoint. However, in many therapeutic areas, the primary study endpoint is well established. Therefore, we modify this design to consider one study endpoint only so that it may be more readily applicable in real clinical trial designs. Our simulations show that, the same as the original design, this modified design controls the Type I error rate, and the design parameters such as the threshold probability for the two-stage setting and the alpha allocation ratio in the two-stage setting versus the three-stage setting have a great impact on the design characteristics. However, this modified design requires a larger sample size for the initial stage, and the probability of futility becomes much higher when the threshold probability for the two-stage setting gets smaller. Copyright © 2016 John Wiley & Sons, Ltd.

  14. On Adaptive Extended Compatibility Changing Type of Product Design Strategy

    NASA Astrophysics Data System (ADS)

    Wenwen, Jiang; Zhibin, Xie

    The article uses enterprise localization research and analysis of firms' development trajectories to examine companies' product design and development strategies. It shows that, at different stages of development, different kinds of enterprises adopt different modes of product design and development policy, and it reveals a close causal relationship between a company's development trajectory and its core technology and products. The results indicate that enterprises leading the market in technology and brand adopt a pioneer strategy of product research and development; enterprises relying on large-scale leading enterprises and offering complementary services adopt a passively duplicating strategy; enterprises with partial advantages in technology, market, management, or brand adopt a follow-up strategy; and enterprises in relatively advantageous positions adopt a technology-application strategy centered on optimizing services in brand culture and market service.

  15. Design of the Subaru laser guide star adaptive optics module

    NASA Astrophysics Data System (ADS)

    Watanabe, Makoto; Takami, Hideki; Takato, Naruhisa; Colley, Stephen; Eldred, Michael; Kane, Thomas; Guyon, Olivier; Hattori, Masayuki; Goto, Miwa; Iye, Masanori; Hayano, Yutaka; Kamata, Yukiko; Arimoto, Nobuo; Kobayashi, Naoto; Minowa, Yosuke

    2004-10-01

    The laser guide star adaptive optics (AO) module for the Subaru Telescope will be installed at the f/13.9 IR Nasmyth focus, and provides the compensated image for the science instrument without change of the focal ratio. The optical components are mounted on an optical bench, and the flexure depending on the telescope pointing is eliminated. The transferred field of view for the science instrument is 2 arcmin diameter, but a 2.7 arcmin diameter field is available for tip-tilt sensing. The science path of the AO module contains five mirrors, including a pair of off-axis parabolic mirrors and a deformable mirror. It also has three additional mirrors for an image rotator. The AO module has a visible 188-element curvature-based wavefront sensor (WFS) with photon-counting avalanche photodiode (APD) modules. It measures high-order terms of the wavefront using either a single laser guide star (LGS) or a natural guide star (NGS) within a 2 arcmin diameter field. The AO module also has a visible 2 x 2 sub-aperture Shack-Hartmann WFS with 16 APD modules. It measures tip-tilt and slow defocus terms of the wavefront by using a single NGS within a 2.7 arcmin diameter field when an LGS is used for high-order wavefront sensing. The module also has an infrared 2 x 2 sub-aperture Shack-Hartmann WFS with a HgCdTe array as an option. Both high- and low-order visible WFSs have their own guide star acquisition units with two steering fold mirrors. The AO module also has a source simulator. It simulates LGS and NGS beams simultaneously, with and without atmospheric turbulence, by two turbulent layers at about 0 and 6 km altitudes, and reproduces the isoplanatism and the cone effect for the LGS beam.

  16. A new approach for designing self-organizing systems and application to adaptive control

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.; Zhang, Shi; Lin, Yueqing; Huang, Song

    1993-01-01

    There is tremendous interest in the design of intelligent machines capable of autonomous learning and skillful performance under complex environments. A major task in designing such systems is to make the system plastic and adaptive when presented with new and useful information and stable in response to irrelevant events. A great body of knowledge, based on neuro-physiological concepts, has evolved as a possible solution to this problem. Adaptive resonance theory (ART) is a classical example under this category. The system dynamics of an ART network is described by a set of differential equations with nonlinear functions. An approach for designing self-organizing networks characterized by nonlinear differential equations is proposed.

  17. Group-Work in the Design of Complex Adaptive Learning Strategies

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis

    2013-01-01

    This paper presents a case study where twelve graduate students undertook the demanding role of the adaptive e-course developer and worked collaboratively on an authentic and complex design task in the context of open and distance tertiary education. The students had to work in groups in order to conceptualise and design a learning scenario for…

  18. Impact of New Designs for the Comprehensive High School: Evidence from Two Early Adapters.

    ERIC Educational Resources Information Center

    Copa, George H.

    The New Designs for the Comprehensive High School (NDCHS) project was conducted to develop design processes and specifications for developing new comprehensive high schools, and for restructuring existing schools in accordance with the comprehensive high school model. The project's impact on student learning at two early adapters of the…

  19. A Framework for Adaptive Learning Design in a Web-Conferencing Environment

    ERIC Educational Resources Information Center

    Bower, Matt

    2016-01-01

    Many recent technologies provide the ability to dynamically adjust the interface depending on the emerging cognitive and collaborative needs of the learning episode. This means that educators can adaptively re-design the learning environment during the lesson, rather than purely relying on preemptive learning design thinking. Based on a…

  20. Systematic design and analysis of laser-guide-star adaptive-optics systems for large telescopes

    SciTech Connect

    Gavel, D.T.; Morris, J.R.; Vernon, R.G.

    1994-02-01

    The authors discuss the design of laser-guided adaptive-optics systems for the large, 8-10-m-class telescopes. Through proper choice of system components and optimized system design, the laser power that is needed at the astronomical site can be kept to a minimum. 37 refs., 9 figs., 3 tabs.

  1. Key techniques and applications of adaptive growth method for stiffener layout design of plates and shells

    NASA Astrophysics Data System (ADS)

    Ding, Xiaohong; Ji, Xuerong; Ma, Man; Hou, Jianyun

    2013-11-01

    The application of the adaptive growth method is limited because several key techniques during the design process need manual intervention of designers. Key techniques of the method, including ground structure construction and seed selection, are studied, so as to make it possible to improve the effectiveness and applicability of the adaptive growth method in stiffener layout design optimization of plates and shells. Three schemes of ground structures, which are composed of different shell elements and beam elements, are proposed. It is found that the main stiffener layouts resulting from different ground structures are almost the same, but the ground structure composed of 8-node shell elements and both 3-node and 2-node beam elements results in the clearest stiffener layout, and has good adaptability and low computational cost. An automatic seed selection approach is proposed, based on the selection rules that the seeds should be positioned where the structural strain energy is great for the minimum compliance problem, and should satisfy the dispersancy requirement. The adaptive growth method with the suggested key techniques is integrated into an ANSYS-based program, which provides a design tool for the stiffener layout design optimization of plates and shells. Typical design examples, including plate and shell structures to achieve minimum compliance and maximum buckling stability, are illustrated. In addition, as a practical mechanical structural design example, the stiffener layout of an inlet structure for a large-scale electrostatic precipitator is also demonstrated. The design results show that the adaptive growth method integrated with the suggested key techniques can effectively and flexibly deal with stiffener layout design problems for plates and shells with complex geometrical shapes and loading conditions to achieve various design objectives, thus providing a new solution method for engineering structural topology design optimization.

  2. Simple adaptive control system design for a quadrotor with an internal PFC

    NASA Astrophysics Data System (ADS)

    Mizumoto, Ikuro; Nakamura, Takuto; Kumon, Makoto; Takagi, Taro

    2014-12-01

    The paper deals with an adaptive control system design problem for a four-rotor helicopter, or quadrotor. A simple adaptive control scheme with a parallel feedforward compensator (PFC) in the internal loop of the quadrotor is proposed based on the backstepping strategy. Backstepping is a well-known advanced control strategy for nonlinear systems, but its control algorithm becomes complex when the system has a high relative degree. We show that some design steps of the backstepping method can be skipped by introducing a PFC in the inner loop of the quadrotor, so that the structure of the obtained controller is simplified and a high-gain-based adaptive feedback control system can be designed. The effectiveness of the proposed method is confirmed through numerical simulations.
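
    The core idea, that a parallel feedforward compensator can make a high-relative-degree plant amenable to simple high-gain adaptive output feedback, can be sketched on a toy plant. The double-integrator model, first-order PFC, and adaptation law below are illustrative stand-ins for the quadrotor dynamics and the controller actually derived in the paper.

```python
# Minimal sketch: a relative-degree-2 plant (double integrator) is
# augmented with a first-order PFC in parallel; a simple adaptive gain on
# the *augmented* output then stabilizes it without backstepping.
def simulate(T=20.0, dt=1e-3, gamma=5.0, sigma=0.01, a=1.0, f=1.0):
    y, ydot = 1.0, 0.0   # plant: y'' = u  (relative degree 2)
    yf = 0.0             # PFC state: yf' = -a*yf + f*u (parallel fast path)
    k = 0.0              # adaptive output-feedback gain
    for _ in range(int(T / dt)):
        ya = y + yf                            # augmented output fed back
        u = -k * ya                            # simple high-gain feedback
        k += dt * (gamma * ya**2 - sigma * k)  # gain adaptation with leakage
        ydot += dt * u                         # Euler integration of plant
        y += dt * ydot
        yf += dt * (-a * yf + f * u)           # Euler integration of PFC
    return y, k

y_final, k_final = simulate()
print(round(abs(y_final), 3), k_final > 0.0)
```

    With f, a > 0 the augmented transfer function has relative degree one and stable zeros, which is what lets a single adaptive gain do the job.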

  3. Simple adaptive control system design for a quadrotor with an internal PFC

    SciTech Connect

    Mizumoto, Ikuro; Nakamura, Takuto; Kumon, Makoto; Takagi, Taro

    2014-12-10

    The paper deals with an adaptive control system design problem for a four-rotor helicopter, or quadrotor. A simple adaptive control scheme with a parallel feedforward compensator (PFC) in the internal loop of the quadrotor is proposed based on the backstepping strategy. Backstepping is a well-known advanced control strategy for nonlinear systems, but its control algorithm becomes complex when the system has a high relative degree. We show that some design steps of the backstepping method can be skipped by introducing a PFC in the inner loop of the quadrotor, so that the structure of the obtained controller is simplified and a high-gain-based adaptive feedback control system can be designed. The effectiveness of the proposed method is confirmed through numerical simulations.

  4. The Study and Design of Adaptive Learning System Based on Fuzzy Set Theory

    NASA Astrophysics Data System (ADS)

    Jia, Bing; Zhong, Shaochun; Zheng, Tianyang; Liu, Zhiyong

    Adaptive learning is an effective way to improve learning outcomes; that is, the selection and presentation of learning content should be adapted to each learner's context, level, and ability. An Adaptive Learning System (ALS) can provide effective support for adaptive learning. This paper proposes a new ALS based on fuzzy set theory. It estimates the learner's knowledge level through tests aligned with the learner's goals, and then takes the learner's cognitive ability and preferences into consideration to achieve self-organization and push planning of knowledge. The paper focuses on the design and implementation of the domain model and user model in the ALS. Experiments confirmed that the system's adaptive content can effectively help learners memorize material and improve their comprehension.
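
    The fuzzy estimation of a learner's knowledge level can be sketched with simple membership functions. The triangular functions, level names, and score ranges below are invented for illustration; the paper's domain and user models are richer.

```python
# Illustrative sketch of the fuzzy-set idea: a raw test score maps to
# graded memberships in overlapping knowledge levels, rather than to a
# single crisp level. Functions and level names are invented.

def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def knowledge_levels(score):
    """score in [0, 100] -> fuzzy membership per level."""
    return {
        "novice":       tri(score, -1, 0, 50),
        "intermediate": tri(score, 25, 50, 75),
        "advanced":     tri(score, 50, 100, 101),
    }

m = knowledge_levels(60)
print({k: round(v, 2) for k, v in m.items()})
# → {'novice': 0.0, 'intermediate': 0.6, 'advanced': 0.2}
```

    A score of 60 is mostly "intermediate" but partly "advanced", so the system can blend content from both levels instead of committing to one.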

  5. A new adaptive algorithm for automated feature extraction in exponentially damped signals for health monitoring of smart structures

    NASA Astrophysics Data System (ADS)

    Qarib, Hossein; Adeli, Hojjat

    2015-12-01

    In this paper the authors introduce a new adaptive signal processing technique for feature extraction and parameter estimation in noisy exponentially damped signals. The iterative three-stage method is based on the adroit integration of the strengths of parametric and nonparametric methods such as the multiple signal classification (MUSIC), matrix pencil, and empirical mode decomposition algorithms. The first stage is a new adaptive filtration (noise removal) scheme. The second stage is a hybrid parametric-nonparametric signal parameter estimation technique based on output-only system identification. The third stage optimizes the estimated parameters using a combination of the primal-dual path-following interior point algorithm and a genetic algorithm. The methodology is evaluated using a synthetic signal and a signal obtained experimentally from transverse vibrations of a steel cantilever beam. The method estimates the frequencies accurately, and also estimates the damping exponents. The proposed adaptive filtration method involves no frequency-domain manipulation, so the time-domain signal is not distorted by forward and inverse transformations.
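
    As a simplified stand-in for the parameter-estimation stages, the snippet below recovers the frequency and damping exponent of a single noiseless decaying sinusoid by the classical log-decrement method; the paper's hybrid MUSIC/matrix-pencil/EMD pipeline handles the much harder noisy, multi-component case.

```python
# Log-decrement sketch: frequency from the spacing of successive positive
# peaks, damping exponent from the decay of their amplitudes.
import math

def estimate_damped_params(signal, dt):
    """Estimate (frequency_hz, damping_exponent) of a single decaying
    sinusoid from its successive positive peaks."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1] and signal[i] > 0]
    periods = [(peaks[k + 1] - peaks[k]) * dt for k in range(len(peaks) - 1)]
    period = sum(periods) / len(periods)
    decs = [math.log(signal[peaks[k]] / signal[peaks[k + 1]])
            for k in range(len(peaks) - 1)]
    delta = sum(decs) / len(decs)          # average log decrement
    return 1.0 / period, delta / period    # frequency (Hz), damping exponent

# Synthetic check: a 5 Hz tone decaying with exponent sigma = 2.0.
dt = 1e-4
sig = [math.exp(-2.0 * n * dt) * math.cos(2 * math.pi * 5.0 * n * dt)
       for n in range(int(2.0 / dt))]
f_hat, sigma_hat = estimate_damped_params(sig, dt)
print(round(f_hat, 2), round(sigma_hat, 2))  # ≈ 5.0 and 2.0
```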

  6. Adaptive fuzzy switched control design for uncertain nonholonomic systems with input nonsmooth constraint

    NASA Astrophysics Data System (ADS)

    Li, Yongming; Tong, Shaocheng

    2016-10-01

    In this paper, a fuzzy adaptive switched control approach is proposed for a class of uncertain nonholonomic chained systems with a nonsmooth input constraint. In the control design, an auxiliary dynamic system is designed to address the nonsmooth input constraint, and an adaptive switched control strategy is constructed to overcome the uncontrollability problem associated with x0(t0) = 0. With fuzzy logic systems used to approximate the unknown nonlinear functions, a fuzzy adaptive control approach is developed based on the adaptive backstepping technique. By constructing a combined approximation and applying Young's inequality, the number of online learning parameters is reduced to n and the 'explosion of complexity' problem is avoided. It is proved that the proposed method guarantees that all variables of the closed-loop system converge to a small neighbourhood of zero. Two simulation examples illustrate the effectiveness of the proposed control approach.

  7. Design and Preliminary Testing of the International Docking Adapter's Peripheral Docking Target

    NASA Technical Reports Server (NTRS)

    Foster, Christopher W.; Blaschak, Johnathan; Eldridge, Erin A.; Brazzel, Jack P.; Spehar, Peter T.

    2015-01-01

    The International Docking Adapter's Peripheral Docking Target (PDT) was designed to allow a docking spacecraft to judge its alignment relative to the docking system. The PDT was designed to be compatible with relative sensors using visible cameras, thermal imagers, or Light Detection and Ranging (LIDAR) technologies. The conceptual design team tested prototype designs and materials to determine the contrast requirements for the features. This paper will discuss the design of the PDT, the methodology and results of the tests, and the conclusions pertaining to PDT design that were drawn from testing.

  8. Collaborative design for automated DNA storage that allows for rapid, accurate, large-scale studies.

    PubMed

    Mahan, Scott; Ardlie, Kristin G; Krenitsky, Kevin F; Walsh, Gary; Clough, Graham

    2004-12-01

    Genomics Collaborative, Inc., a division of Sera Care Life Sciences, Inc. (Cambridge, MA), is among the first commercial entities in the world to enable genetic research on an industrial scale via its Large Scale Global Repository, a biobank of human specimens collected for research purposes. With the demand for large-scale DNA studies increasing, decisions about the strategic direction of sample storage and collection must be made to create a sound plan to support continued demands for drug discovery. Reported here is the approach used by Genomics Collaborative to automate its DNA processing, storage, and retrieval.

  9. Advances in light-curing adhesives: II. Solutions designed for automation

    NASA Astrophysics Data System (ADS)

    Bachmann, Andy

    2002-09-01

    The promise of photonics is to provide the next affordable leap in communications and information technology, but low yields and high unit costs are a barrier to mass production. Assembling photonic components economically is among the most demanding manufacturing processes in industry today because of the challenges in yield, cycle time, and material costs. Manual assembly of photonic components cannot support the industry's growth rate; achieving long-term success demands a sound automation strategy. Understanding cycle time and throughput, and focusing on yield, will allow the development of the lowest-cost, most successful assembly of photonic, as well as electronic, components.

  10. Adaptive critic autopilot design of bank-to-turn missiles using fuzzy basis function networks.

    PubMed

    Lin, Chuan-Kai

    2005-04-01

    A new adaptive critic autopilot design for bank-to-turn missiles is presented. The architecture of the adaptive critic learning scheme contains a fuzzy-basis-function-network-based associative search element (ASE), employed to approximate the nonlinear and complex functions of bank-to-turn missiles, and an adaptive critic element (ACE) that generates the reinforcement signal to tune the ASE. In the design of the adaptive critic autopilot, the control law receives signals from a fixed-gain controller, the ASE, and an adaptive robust element, which eliminates approximation errors and disturbances. In traditional adaptive critic reinforcement learning, an agent must learn behavior through trial-and-error interactions with a dynamic environment; the proposed tuning algorithm, however, significantly shortens the learning time by tuning all parameters of the fuzzy basis functions and the weights of the ASE and ACE online. Moreover, the weight-updating law, derived from Lyapunov stability theory, guarantees both tracking performance and stability. Computer simulation results confirm the effectiveness of the proposed adaptive critic autopilot.
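
    The ASE/ACE division of labor, a critic that converts sparse external reward into a per-step internal reinforcement signal used to tune the actor, can be sketched on a trivial task. The three-state chain, linear elements, and learning rates below are illustrative; the paper's elements are fuzzy-basis-function networks tuned by a Lyapunov-derived law.

```python
# Structural sketch of ASE/ACE learning: the critic's TD error is the
# internal reinforcement that tunes both the critic (ACE) and the actor
# (ASE). The "plant" is a toy 3-state chain with reward at state 2.
import math
import random

random.seed(0)
gamma, alpha_v, alpha_w = 0.9, 0.1, 0.05
V = [0.0, 0.0, 0.0]    # critic values per state (ACE)
pref = [0.0, 0.0]      # actor preference for action "right" (ASE)

for episode in range(500):
    s = 0
    while s != 2:                       # state 2 is terminal, reward 1
        p_right = 1.0 / (1.0 + math.exp(-pref[s]))
        go_right = random.random() < p_right
        s2 = s + 1 if go_right else max(s - 1, 0)
        r = 1.0 if s2 == 2 else 0.0
        td = r + (0.0 if s2 == 2 else gamma * V[s2]) - V[s]  # reinforcement
        V[s] += alpha_v * td                                  # critic update
        pref[s] += alpha_w * td * (1.0 if go_right else -1.0) # actor update
        s = s2

print([round(v, 2) for v in V], [p > 0 for p in pref])
```

    After training, the critic's values approximate the discounted return and the actor's preferences favor moving toward the reward.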

  11. Multivariable output feedback robust adaptive tracking control design for a class of delayed systems

    NASA Astrophysics Data System (ADS)

    Mirkin, Boris; Gutman, Per-Olof

    2015-02-01

    In this paper, we develop a model reference adaptive control scheme for a class of multi-input multi-output nonlinearly perturbed dynamic systems with unknown time-varying state delays which is also robust with respect to an external disturbance with unknown bounds. The output feedback adaptive control scheme uses feedback actions only, and thus does not require a direct measurement of the command or disturbance signals. A suitable Lyapunov-Krasovskii type functional is introduced to design the adaptation algorithms and to prove stability.
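
    A minimal scalar sketch of the model-reference-adaptive idea (without the delays, perturbations, or disturbance bounds the paper actually treats) looks as follows; the plant, reference model, and adaptation gain are illustrative choices.

```python
# MRAC sketch: an unstable scalar plant xdot = x + u is made to follow
# the stable reference model xmdot = -xm + r by adapting a feedback gain
# kx and a feedforward gain kr from the tracking error e = x - xm.
def simulate(T=30.0, dt=1e-3, gamma=2.0):
    x, xm = 0.0, 0.0         # plant state, reference-model state
    kx, kr = 0.0, 0.0        # adaptive gains
    r = 1.0                  # constant reference command
    for _ in range(int(T / dt)):
        e = x - xm                    # tracking error
        u = kx * x + kr * r           # adaptive control law
        kx -= dt * gamma * e * x      # gradient/Lyapunov-based adaptation
        kr -= dt * gamma * e * r      # (input gain b = 1 > 0 assumed known)
        x += dt * (x + u)             # plant integration (Euler)
        xm += dt * (-xm + r)          # reference model integration
    return e, kx, kr

e, kx, kr = simulate()
print(round(abs(e), 3))
```

    The paper replaces this textbook error-based adaptation with a Lyapunov-Krasovskii functional so that the same guarantee survives unknown time-varying state delays and bounded disturbances.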

  12. Low Level Waste Conceptual Design Adaption to Poor Geological Conditions

    SciTech Connect

    Bell, J.; Drimmer, D.; Giovannini, A.; Manfroy, P.; Maquet, F.; Schittekat, J.; Van Cotthem, A.; Van Echelpoel, E.

    2002-02-26

    Since the early eighties, several studies have been carried out in Belgium with respect to a repository for the final disposal of low-level radioactive waste (LLW). In 1998, the Belgian Government decided to restrict future investigations to the four existing nuclear sites in Belgium or sites that might show interest. So far, only two existing nuclear sites have been thoroughly investigated from a geological and hydrogeological point of view. These sites are located in the North-East (Mol-Dessel) and in the middle part (Fleurus-Farciennes) of the country. Both sites present poor geological and hydrogeological conditions, which are rather unfavorable for a surface disposal facility for LLW. The underground of the Mol-Dessel site consists of Neogene sand layers about 180 m thick covering a 100-m-thick clay layer. These Neogene sands contain, at 20 m depth, a thin clayey layer. The groundwater level is quite close to the surface (0-2 m), and the topography is almost totally flat. The upper layer of the Fleurus-Farciennes site consists of 10 m of silt with poor geomechanical characteristics, overlying sands (only a few meters thick) and Westphalian shales between 15 and 20 m depth. The Westphalian shales are tectonized and strongly weathered. In the past, coal seams were mined out, an activity that locally induced important surface subsidence. For both investigated nuclear sites, a conceptual design was made to overcome the unfavorable geological and hydrogeological conditions of the site. In Fleurus-Farciennes, for instance, the proposed conceptual design of the repository is quite original: it is composed of a shallow, buried concrete cylinder surrounded by an accessible concrete ring, which allows permanent inspection and control during the whole lifetime of the repository. Stability and drainage systems should be independent of potential differential settlements and subsidences.

  13. An automated instrument for human STR identification: design, characterization, and experimental validation.

    PubMed

    Hurth, Cedric; Smith, Stanley D; Nordquist, Alan R; Lenigk, Ralf; Duane, Brett; Nguyen, David; Surve, Amol; Hopwood, Andrew J; Estes, Matthew D; Yang, Jianing; Cai, Zhi; Chen, Xiaojia; Lee-Edghill, John G; Moran, Nina; Elliott, Keith; Tully, Gillian; Zenhausern, Frederic

    2010-10-01

    The microfluidic integration of an entire DNA analysis workflow on a fully integrated miniaturized instrument is reported using lab-on-a-chip automation to perform DNA fingerprinting compatible with CODIS standards relevant to the forensic community. The instrument aims to reduce the cost and duration, and improve the ease of use, of performing a "sample-to-profile" analysis with no need for human intervention. The present publication describes the operation of the three major components of the system: the electronic control components, the microfluidic cartridge and CE microchip, and the optical excitation/detection module. Experimental details are given to characterize the performance, stability, reliability, accuracy, and sensitivity of the prototype system. A typical temperature profile from a PCR amplification process and an electropherogram of a commercial size standard (GeneScan 500™, Applied Biosystems) separation are shown to assess the relevance of the instrument to forensic applications. Finally, we present a profile from an automated integrated run where lysed cells from a buccal swab were introduced into the system and no further human intervention was required to complete the analysis. PMID:20931618

  14. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    SciTech Connect

    Oxstrand, Johanna Helene; Ahmad Al Rashdan; Le Blanc, Katya Lee; Bly, Aaron Douglas; Agarwal, Vivek

    2015-07-01

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background to the need for AWP research is provided, then two human factors field evaluation studies are described. These studies focus on the user experience of conducting a task (in this case a preventive maintenance and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  15. Adaptation of NASA technology for the optimum design of orthopedic knee implants.

    PubMed

    Saravanos, D A; Mraz, P J; Davy, D T; Hopkins, D A

    1991-03-01

    NASA technology originally developed for designing aircraft turbine-engine blades has been adapted and applied to orthopedic knee implants. This article describes a method for tailoring an implant for optimal interaction with the environment of the tibia. The implant components are designed to control stresses in the bone for minimizing bone degradation and preventing failures. Engineers expect the tailoring system to improve knee prosthesis design and allow customized implants for individual patients. PMID:10150099

  16. Design of adaptive steganographic schemes for digital images

    NASA Astrophysics Data System (ADS)

    Filler, Tomás; Fridrich, Jessica

    2011-02-01

    Most steganographic schemes for real digital media embed messages by minimizing a suitably defined distortion function. In practice, this is often realized by syndrome codes which offer near-optimal rate-distortion performance. However, the distortion functions are designed heuristically and the resulting steganographic algorithms are thus suboptimal. In this paper, we present a practical framework for optimizing the parameters of additive distortion functions to minimize statistical detectability. We apply the framework to digital images in both spatial and DCT domain by first defining a rich parametric model which assigns a cost of making a change at every cover element based on its neighborhood. Then, we present a practical method for optimizing the parameters with respect to a chosen detection metric and feature space. We show that the size of the margin between support vectors in soft-margin SVMs leads to a fast detection metric and that methods minimizing the margin tend to be more secure w.r.t. blind steganalysis. The parameters obtained by the Nelder-Mead simplex-reflection algorithm for spatial and DCT-domain images are presented and the new embedding methods are tested by blind steganalyzers utilizing various feature sets. Experimental results show that as few as 80 images are sufficient for obtaining good candidates for parameters of the cost model, which allows us to speed up the parameter search.
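
    The notion of an additive distortion function, a per-element cost of change derived from the element's neighborhood, can be sketched as follows. The cost formula here (inverse of the local intensity range, so flat regions are expensive to modify) is a toy stand-in for the rich parametric model the paper optimizes.

```python
# Sketch of an additive distortion function: each pixel gets a cost of
# being changed based on its 3x3 neighborhood; the distortion of an
# embedding is the sum of costs over the changed pixels.

def change_costs(img):
    """img: 2-D list of ints. Cost of modifying each interior pixel =
    1 / (1 + local range), so flat neighborhoods are expensive."""
    h, w = len(img), len(img[0])
    costs = {}
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            nb = [img[i + di][j + dj] for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            costs[(i, j)] = 1.0 / (1.0 + max(nb) - min(nb))
    return costs

def total_distortion(costs, changed):
    """Additive model: distortion of an embedding = sum of per-pixel costs."""
    return sum(costs[p] for p in changed)

img = [[10, 10, 10, 90],
       [10, 10, 10, 90],
       [10, 10, 50, 90],
       [10, 10, 10, 90]]
costs = change_costs(img)
print(costs[(1, 1)] > costs[(2, 2)])  # flat pixel costlier to change → True
```

    The paper's contribution is to parameterize such costs and tune the parameters against a detection metric, rather than fixing a heuristic like this one.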

  17. GASICA: generic automated stress induction and control application - design of an application for controlling the stress state

    PubMed Central

    van der Vijgh, Benny; Beun, Robbert J.; van Rood, Maarten; Werkhoven, Peter

    2014-01-01

    In a multitude of research and therapy paradigms it is relevant to know, and desirable to control, the stress state of a patient or participant. Examples include research paradigms in which the stress state is the dependent or independent variable, or therapy paradigms where this state indicates the boundaries of the therapy. To our knowledge, no application currently exists that focuses specifically on the automated control of the stress state while at the same time being generic enough to be used for various therapy and research purposes. Therefore, we introduce GASICA, an application aimed at the automated control of the stress state in a multitude of therapy and research paradigms. The application consists of three components: a digital stressor game, a set of measurement devices, and a feedback model. These three components form a closed loop (called a biocybernetic loop by Pope et al. (1995) and Fairclough (2009)) that continuously presents an acute psychological stressor, measures several physiological responses to this stressor, and adjusts the stressor intensity based on these measurements by means of the feedback model, thereby aiming to control the stress state. In this manner GASICA presents multidimensional and ecologically valid stressors while remaining continuously in control of the form and intensity of the presented stressors, aiming at the automated control of the stress state. Furthermore, the application is designed as a modular open-source application in which different therapy and research tasks can easily be implemented using a high-level programming interface and configuration file, and it allows for the addition of (existing) measurement equipment, making it usable for various paradigms. PMID:25538554
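
    The biocybernetic loop itself, measure a stress proxy, compare it to a target, adjust the stressor, can be sketched as a simple feedback iteration. The first-order "participant" model and proportional gain below are invented; GASICA's feedback model and physiological measurements are far richer.

```python
# Minimal sketch of the closed (biocybernetic) loop: a toy first-order
# stress response tracks the stressor intensity, and a proportional
# controller drives the measured stress toward the target level.

def run_loop(target=0.6, steps=60, k_p=0.5):
    intensity, stress = 0.0, 0.0
    for _ in range(steps):
        stress += 0.3 * (intensity - stress)   # toy physiological response
        error = target - stress                # measured vs desired state
        intensity = max(0.0, min(1.0, intensity + k_p * error))  # adjust stressor
    return intensity, stress

intensity, stress = run_loop()
print(round(stress, 2))  # → 0.6, the loop settles at the target
```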

  18. IMPACT OF CANAL DESIGN LIMITATIONS ON WATER DELIVERY OPERATIONS AND AUTOMATION

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Irrigation canals are often designed for water transmission. The design engineer simply ensures that the canal will pass the maximum design discharge. However, irrigation canals are frequently operated far below design capacity. Because demands and the distribution of flow at bifurcations (branch points...

  19. Multiobjective control design including performance robustness for gust alleviation of a wing with adaptive material actuators

    NASA Astrophysics Data System (ADS)

    Layton, Jeffrey B.

    1997-06-01

    The goal of this paper is to examine the use of covariance control to directly design reduced-order multiobjective controllers for gust alleviation using adaptive materials as the control effectors. Piezoelectric actuators serve as the control effectors in a finite element model of a full-size wing, specifically the F-16 Agile Falcon/Active Flexible Wing modified to use piezoelectric actuation. The paper also examines the interacting roles of important control design constraints and objectives in designing an aeroservoelastic system, and presents multiobjective control design results for the model, illustrating the benefits and complexity of modern practical control design for aeroservoelastic systems that use adaptive materials for actuation.

  20. Tools for Designing, Evaluating, and Certifying NextGen Technologies and Procedures: Automation Roles and Responsibilities

    NASA Technical Reports Server (NTRS)

    Kanki, Barbara G.

    2011-01-01

    Barbara Kanki from NASA Ames Research Center will discuss research that focuses on the collaborations between pilots, air traffic controllers and dispatchers that will change in NextGen systems as automation increases and roles and responsibilities change. The approach taken by this NASA Ames team is to build a collaborative systems assessment template (CSAT) based on detailed task descriptions within each system to establish a baseline of the current operations. The collaborative content and context are delineated through the review of regulatory and advisory materials, policies, procedures and documented practices as augmented by field observations and interviews. The CSAT is developed to aid the assessment of key human factors and performance tradeoffs that result from considering different collaborative arrangements under NextGen system changes. In theory, the CSAT product may be applied to any NextGen application (such as Trajectory Based Operations) with specified ground and aircraft capabilities.

  1. Design of a Model Reference Adaptive Controller for an Unmanned Air Vehicle

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Matsutani, Megumi; Annaswamy, Anuradha M.

    2010-01-01

    This paper presents the "Adaptive Control Technology for Safe Flight (ACTS)" architecture, which consists of a non-adaptive controller that provides satisfactory performance under nominal flying conditions, and an adaptive controller that provides robustness under off-nominal ones. The design and implementation procedures of both controllers are presented. The aim of these procedures, which encompass both theoretical and practical considerations, is to develop a controller suitable for flight. The ACTS architecture is applied to the Generic Transport Model developed by NASA-Langley Research Center. The GTM is a dynamically scaled test model of a transport aircraft for which a flight-test article and a high-fidelity simulation are available. The nominal controller at the core of the ACTS architecture has a multivariable LQR-PI structure while the adaptive one has a direct, model reference structure. The main control surfaces as well as the throttles are used as control inputs. The inclusion of the latter alleviates the pilot's workload by eliminating the need for cancelling the pitch coupling generated by changes in thrust. Furthermore, the independent usage of the throttles by the adaptive controller enables their use for attitude control. Advantages and potential drawbacks of adaptation are demonstrated by performing high fidelity simulations of a flight-validated controller and of its adaptive augmentation.

  2. Adaptive Tracker Design with Identifier for Pendulum System by Conditional LMI Method and IROA

    NASA Astrophysics Data System (ADS)

    Hwang, Jiing-Dong; Tsai, Zhi-Ren

    This paper proposes a robust adaptive fuzzy PID control scheme augmented with a supervisory controller for unknown systems. In this scheme, a generalized fuzzy model describes a class of unknown systems. The control strategy allows each part of the control law, i.e., the supervisory controller, a compensator, and an adaptive fuzzy PID controller, to be designed incrementally according to different guidelines. The supervisory controller in the outer loop aims at enhancing system robustness in the face of external disturbances, variation in system parameters, and parameter drift in the adaptation law. Furthermore, an H∞ control design method using a fuzzy Lyapunov function is presented for the design of the initial control gains, guaranteeing transient performance at the start of closed-loop control, which is generally overlooked in many adaptive control systems. The initial gains are found by a compound search strategy, the conditional linear matrix inequality (CLMI) approach combined with an improved random optimal algorithm (IROA), which leads to less complex designs than a standard LMI method with a fuzzy Lyapunov function. Numerical studies of the tracking control of an uncertain inverted pendulum system demonstrate the effectiveness of the control strategy. The simulation results also show that the generalized fuzzy model indeed reduces the number of rules of the T-S fuzzy model.

  3. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
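
    The advantage of concentrating effort in good years can be reproduced in a few lines of simulation. The detection probabilities, visit counts, and 50% chance of a good year below are invented; the paper's simulations are parameterized from the Massachusetts field data.

```python
# Sketch of the temporally adaptive sampling idea: with the same total
# effort (6 visits over 2 years), spending all visits in the regionally
# "good" year detects a population more often than spreading them evenly.
import random

random.seed(1)

def miss(p, visits):
    """True if all visits fail to detect, each with probability p of success."""
    return all(random.random() >= p for _ in range(visits))

good_p, bad_p, trials = 0.6, 0.05, 10000
fixed_hits = adaptive_hits = 0
for _ in range(trials):
    years = [random.random() < 0.5 for _ in range(2)]  # good/bad, regionally known
    ps = [good_p if g else bad_p for g in years]
    # Non-adaptive: 3 visits in each of the two years.
    fixed_hits += not (miss(ps[0], 3) and miss(ps[1], 3))
    # Adaptive: same total effort, all 6 visits in the better year.
    adaptive_hits += not miss(max(ps), 6)

print(fixed_hits / trials, adaptive_hits / trials)
```

    The gain comes entirely from mixed good/bad pairs of years; when both years are alike the two strategies are equivalent, which is why year-to-year regional correlation is the key empirical finding.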

  4. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    PubMed

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  6. A new adaptive design based on Simon's two-stage optimal design for phase II clinical trials.

    PubMed

    Jin, Hua; Wei, Zhen

    2012-11-01

    Phase II clinical trials are conducted to determine whether a new agent or drug regimen shows sufficient promise in treating cancer to merit further testing in larger groups of patients. Both ethical and practical considerations often require early termination of phase II trials if early results clearly indicate that the new regimen is not active or worthy of further investigation. Simon's two-stage designs (1989) are common methods for conducting phase II studies of new cancer therapies. Banerjee and Tsiatis (2006) proposed an adaptive two-stage design which allows the sample size at the second stage to depend on the results of the first stage. Their design is more flexible than Simon's, but it is somewhat counter-intuitive: as the response in the first stage increases, the second-stage sample size increases up to a certain point and then abruptly drops to zero. In this paper, based on Simon's two-stage optimal design, we propose a new adaptive design in which the second stage depends on the first-stage results through constraints on the conditional type I error and the conditional power. Comparisons are made between Banerjee and Tsiatis' results and our new adaptive designs. PMID:22772088
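
    The operating characteristics that drive such designs, the probability of early termination and the expected sample size, are straightforward binomial computations. The sketch below evaluates an example two-stage design (stage-1 cutoff r1 = 1 of n1 = 12, overall cutoff r = 5 of n = 35, figures chosen for illustration) at a null response rate of 10%.

```python
# Operating characteristics of a Simon-type two-stage design: stop after
# stage 1 if responses <= r1; otherwise continue to n patients total.
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def two_stage_ocs(r1, n1, n, p):
    """Early-termination probability and expected sample size at true
    response rate p for a two-stage design."""
    pet = binom_cdf(r1, n1, p)        # stop if <= r1 responses in stage 1
    en = n1 + (1 - pet) * (n - n1)    # expected number of patients
    return pet, en

# Example design evaluated under the null response rate p0 = 0.10:
pet, en = two_stage_ocs(r1=1, n1=12, n=35, p=0.10)
print(round(pet, 3), round(en, 1))  # → 0.659 19.8
```

    Under the null, roughly two thirds of trials stop at 12 patients, which is the ethical appeal of two-stage designs; adaptive variants like the one proposed here additionally let the stage-2 size vary with the observed stage-1 responses.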

  7. Evaluation of green infrastructure designs using the Automated Geospatial Watershed Assessment Tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In arid and semi-arid regions, green infrastructure (GI) designs can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwater, addressi...

  8. Experiences with an adaptive design for a dose-finding study in patients with osteoarthritis.

    PubMed

    Miller, Frank; Björnsson, Marcus; Svensson, Ola; Karlsten, Rolf

    2014-03-01

    Dose-finding studies in non-oncology areas are usually conducted in Phase II of the development process of a new potential medicine, and choosing a good design for such a study is key, as the results will decide if and how to proceed to Phase III. The present article focuses on the design of a dose-finding study for pain in osteoarthritis patients treated with the TRPV1 antagonist AZD1386. We describe the design alternatives considered in the planning of this study, the reasoning for choosing the adaptive design, and experiences with its conduct and interim analysis. Three alternatives were proposed: a single dose-finding study with parallel design, a programme with a smaller Phase IIa study followed by a Phase IIb dose-finding study, and an adaptive dose-finding study. We describe these alternatives in detail and explain why the adaptive design was chosen. We give insights into design aspects of the adaptive study that need to be pre-planned, such as interim decision criteria, the statistical analysis method, and the setup of a Data Monitoring Committee. Based on the interim analysis, it was recommended to stop the study for futility since AZD1386 showed no significant pain decrease on the primary variable. We discuss results and experiences from the conduct of the study with this novel design approach. Substantial cost savings were achieved compared with the option of a single Phase II dose-finding study; however, we point out several challenges with this approach.

  9. An automated tool for evaluating compliance and providing assistance with building energy standards during design

    SciTech Connect

    Quadrel, R.W.; Brambley, M.R.; Stratton, R.C.

    1992-04-30

    In an effort to encourage the maximum cost-effective level of energy efficiency in new building design, energy-efficiency standards have become more location-specific and performance-based. As a result, standards often provide more than one path for ensuring and demonstrating that a design complies, but at the cost of increased complexity. In addition, the burden of remedying a noncompliant design rests on the designers' knowledge and experience, with only general guidance provided by the standards. As part of efforts in the US Department of Energy's (DOE's) Advanced Energy Design and Operation Technologies (AEDOT) project, a team at DOE's Pacific Northwest Laboratory is developing a computer program known as the Energy Standards Intelligent Design Tool (ES-IDT). The ES-IDT is one component of a prototype computer-based building design environment. It performs automatic compliance checking for parts of ASHRAE/IES Standard 90.1-1989 and provides designers assistance in bringing noncomplying designs into compliance. This paper describes the ES-IDT, the functions it provides, and how it is integrated into the design process via the AEDOT prototype building design environment. 9 refs.

  11. Automated a complex computer aided design concept generated using macros programming

    NASA Astrophysics Data System (ADS)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md

    2013-12-01

    Changing a complex computer-aided design profile such as car and aircraft surfaces has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple configuration of a CAD design can be easily modified without hassle, but that is not the case with complex design configurations. Design changes help users test and explore various configurations of the design concept before the production of a model. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method in which the configurations of the design are produced by recording a script of commands, editing the data values, and adding new command lines to create an element of parametric design. The steps and procedure to create a macro program are discussed, along with difficulties encountered during its creation and the advantages of its use. Generally, the advantages of macros programming as a method of parametric design are: allowing flexibility for design exploration, increasing the usability of the design solution, allowing appropriate elements to be contained in the model while restricting others, and providing real-time feedback on changes.

  12. Design of computer-generated beam-shaping holograms by iterative finite-element mesh adaption.

    PubMed

    Dresel, T; Beyerlein, M; Schwider, J

    1996-12-10

    Computer-generated phase-only holograms can be used for laser beam shaping, i.e., for focusing a given aperture with intensity and phase distributions into a pregiven intensity pattern in their focal planes. A numerical approach based on iterative finite-element mesh adaption permits the design of appropriate phase functions for the task of focusing into two-dimensional reconstruction patterns. Both the hologram aperture and the reconstruction pattern are covered by mesh mappings. An iterative procedure delivers meshes with intensities equally distributed over the constituting elements. This design algorithm adds new elementary focuser functions to what we call object-oriented hologram design. Some design examples are discussed.
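
    The mesh-adaption idea, elements sized so each carries equal integrated intensity, reduces in one dimension to inverting the cumulative energy distribution; a minimal sketch (the Gaussian test beam is illustrative, and the paper's actual algorithm works on 2D meshes):

```python
import numpy as np

def equal_intensity_mesh(x, intensity, n_elements):
    """Node positions such that every element carries the same integrated
    intensity: build the normalized cumulative energy and invert it."""
    cum = np.concatenate([[0.0], np.cumsum(0.5 * (intensity[1:] + intensity[:-1])
                                           * np.diff(x))])   # trapezoid rule
    cum /= cum[-1]                        # normalized cumulative energy in [0, 1]
    targets = np.linspace(0.0, 1.0, n_elements + 1)
    return np.interp(targets, cum, x)     # invert the cumulative distribution

x = np.linspace(0.0, 1.0, 1001)
nodes = equal_intensity_mesh(x, np.exp(-((x - 0.5) / 0.2) ** 2), 8)
# Nodes crowd around x = 0.5, where the Gaussian intensity peaks.
```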

  13. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibration and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following in the NRU reactor inspection at Chalk River.

  14. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and archiving of results are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  15. Automation in immunohematology.

    PubMed

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes, and archiving of results are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  17. Design and construction of a medium-scale automated direct measurement respirometric system to assess aerobic biodegradation of polymers

    NASA Astrophysics Data System (ADS)

    Castro Aguirre, Edgar

    A medium-scale automated direct measurement respirometric (DMR) system was designed and built to assess the aerobic biodegradation of up to 30 materials in triplicate simultaneously. Likewise, a computer application was developed for rapid analysis of the data generated. The developed DMR system was able to simulate different testing conditions by varying temperature and relative humidity, which are the major exposure conditions affecting biodegradation. Two complete tests for determining the aerobic biodegradation of polymers under composting conditions were performed to show the efficacy and efficiency of both the DMR system and the DMR data analyzer. In both cases, cellulose reached 70% mineralization at 139 and 45 days. The difference in time for cellulose to reach 70% mineralization was attributed to the composition of the compost and water availability, which highly affect the biodegradation rate. Finally, among the tested materials, at least 60% of the organic carbon content of the biodegradable polymers was converted into carbon dioxide by the end of the test.
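
    The mineralization percentages above follow from a standard bookkeeping step: net evolved CO2 divided by the theoretical CO2 obtainable from the material's organic carbon. A minimal sketch; the input values in the example below are hypothetical:

```python
def percent_mineralization(co2_sample_g, co2_blank_g, sample_mass_g, carbon_fraction):
    """Percent mineralization in an aerobic biodegradation (DMR) test:
    net CO2 evolved over the theoretical CO2 from the sample's organic
    carbon; 44/12 converts grams of carbon to grams of CO2."""
    theoretical_co2 = sample_mass_g * carbon_fraction * (44.0 / 12.0)
    return 100.0 * (co2_sample_g - co2_blank_g) / theoretical_co2
```

    For cellulose (about 44.4% carbon by mass), 10 g of material corresponds to roughly 16.3 g of theoretical CO2, so 70% mineralization means roughly 11.4 g of net CO2 evolved.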

  18. An expert system for choosing the best combination of options in a general purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Barthelemy, J.-F. M.

    1986-01-01

    An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose. It is difficult for a nonexpert to make this choice. This expert system aids the user in choosing the best combination of options based on the users knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories; constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The inference engine and rules are written in LISP, contains about 200 rules, and executes on DEC-VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.
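
    At its core, such an expert system maps problem attributes to an option triple through condition-action rules. The sketch below mirrors only the three knowledge-base categories named above; the option names are invented placeholders, not actual ADS strategy, optimizer, or search identifiers:

```python
def recommend_options(constrained, treat_as_unconstrained=False):
    """Toy rule-based selector mirroring the three knowledge-base categories.
    Returns a (strategy, optimizer, one-dimensional search) triple; the
    option names are invented placeholders."""
    if constrained and treat_as_unconstrained:
        return ("penalty-function strategy", "quasi-Newton", "golden section")
    if constrained:
        return ("direct constrained strategy", "feasible directions", "polynomial fit")
    return ("no strategy", "quasi-Newton", "golden section")
```

    The real system encodes its roughly 200 rules in LISP and also weighs the user's answers about the problem's characteristics; the hard-coded branches here stand in for that inference step.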

  19. Demonstration/field study of new designs of automated gas chromatographs in Connecticut and other locations, 1992. Final report

    SciTech Connect

    Holdren, M.W.; Smith, D.L.; Pollack, A.J.; Pate, A.D.

    1993-02-01

    The objectives of the study were to install, test and demonstrate two automated gas chromatographic (GC) systems to state and regional EPA groups. The Dynatherm/Hewlett Packard GC system was designed for the measurement of the 41 toxic compounds listed in EPA's Compendium of Methods for Method TO-14. The second system was a Perkin Elmer GC configured for the analysis of the 55 ozone precursor compounds identified in the EPA Technical Assistance Document No. EPA/600-8-91/215. Both GC systems performed well during the field evaluations with data capture of 98 percent. A method quantitation limit (MQL) of 0.5 ppbv was obtained for most of the target compounds. In examining daily control check runs, the variation of corrected GC retention times for each instrument and detector ranged from 0.023 to 0.044 minutes.

  20. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol

    PubMed Central

    Azar, Kristen MJ; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-01

    Background In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost effective and widely scalable are needed to prevent diabetes. Objective Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Methods Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. Results A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. Conclusions The randomized trial will provide rigorous evidence regarding the efficacy of

  1. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    ERIC Educational Resources Information Center

    Burston, Jack; Neophytou, Maro

    2014-01-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2…

  2. Direct and Inverse Problems of Item Pool Design for Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Belov, Dmitry I.; Armstrong, Ronald D.

    2009-01-01

    The recent literature on computerized adaptive testing (CAT) has developed methods for creating CAT item pools from a large master pool. Each CAT pool is designed as a set of nonoverlapping forms reflecting the skill levels of an assumed population of test takers. This article presents a Monte Carlo method to obtain these CAT pools and discusses…
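
    CAT pools like these must support the selection step at the heart of adaptive testing: administer the unused item with maximum Fisher information at the current ability estimate. A minimal two-parameter-logistic (2PL) sketch with made-up item parameters:

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item (discrimination a, difficulty b)."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def next_item(theta, pool, administered):
    """Pick the not-yet-administered item with maximum information at theta."""
    return max((i for i in range(len(pool)) if i not in administered),
               key=lambda i: item_information(theta, *pool[i]))

pool = [(1.0, -1.0), (1.2, 0.0), (0.8, 1.5)]  # made-up (a, b) pairs
# At theta = 0 the middle item (difficulty 0) carries the most information.
```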

  3. Development of an Assistance Environment for Tutors Based on a Co-Adaptive Design Approach

    ERIC Educational Resources Information Center

    Lavoue, Elise; George, Sebastien; Prevot, Patrick

    2012-01-01

    In this article, we present a co-adaptive design approach named TE-Cap (Tutoring Experience Capitalisation) that we applied for the development of an assistance environment for tutors. Since tasks assigned to tutors in educational contexts are not well defined, we are developing an environment which responds to needs which are not precisely…

  4. Can Approaches to Research in Art and Design Be Beneficially Adapted for Research into Higher Education?

    ERIC Educational Resources Information Center

    Trowler, Paul

    2013-01-01

    This paper examines the research practices in Art and Design that are distinctively different from those common in research into higher education outside those fields. It considers whether and what benefit could be derived from their adaptation by the latter. The paper also examines the factors that are conducive and obstructive to adaptive…

  5. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    NASA Astrophysics Data System (ADS)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study aims to propose a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that when the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design; or it will be updated when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960's. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, the Latin Hypercube Sampling method is used
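
    Latin Hypercube Sampling, mentioned above for screening the crucial design variables, stratifies each variable's range into as many intervals as samples and hits every interval exactly once per variable. A minimal sketch on the unit cube:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin Hypercube Sample on the unit cube: each dimension is cut into
    n_samples equal strata and every stratum is sampled exactly once."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])  # decouple strata across dimensions
    return u

pts = latin_hypercube(10, 3, seed=0)
# Each column of pts has exactly one point in each interval [k/10, (k+1)/10).
```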

  6. Automated divertor target design by adjoint shape sensitivity analysis and a one-shot method

    SciTech Connect

    Dekeyser, W.; Reiter, D.; Baelmans, M.

    2014-12-01

    As magnetic confinement fusion progresses towards the development of first reactor-scale devices, computational tokamak divertor design is a topic of high priority. Presently, edge plasma codes are used in a forward approach, where magnetic field and divertor geometry are manually adjusted to meet design requirements. Due to the complex edge plasma flows and large number of design variables, this method is computationally very demanding. On the other hand, efficient optimization-based design strategies have been developed in computational aerodynamics and fluid mechanics. Such an optimization approach to divertor target shape design is elaborated in the present paper. A general formulation of the design problems is given, and conditions characterizing the optimal designs are formulated. Using a continuous adjoint framework, design sensitivities can be computed at a cost of only two edge plasma simulations, independent of the number of design variables. Furthermore, by using a one-shot method the entire optimization problem can be solved at an equivalent cost of only a few forward simulations. The methodology is applied to target shape design for uniform power load, in simplified edge plasma geometry.
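
    The claim that sensitivities cost only two simulations regardless of the number of design variables is the defining property of the adjoint method. A toy linear-system analogue (not an edge-plasma model): for a state equation A(p)u = b and objective J = c^T u, one forward solve and one adjoint solve give dJ/dp_k = -lam^T (dA/dp_k) u for every parameter k:

```python
import numpy as np

def adjoint_gradient(A, dA_list, b, c):
    """Gradient of J = c^T u subject to A u = b, via the adjoint method:
    one forward solve plus one adjoint solve yields dJ/dp_k = -lam^T dA_k u
    for all parameters at once."""
    u = np.linalg.solve(A, b)      # forward 'simulation'
    lam = np.linalg.solve(A.T, c)  # adjoint 'simulation'
    return np.array([-(lam @ (dA @ u)) for dA in dA_list])
```

    Adding more design parameters only adds cheap matrix-vector products, not extra solves; the one-shot method goes further by converging state, adjoint, and design updates together.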

  7. Automation and adaptation: Nurses' problem-solving behavior following the implementation of bar coded medication administration technology.

    PubMed

    Holden, Richard J; Rivera-Rodriguez, A Joy; Faye, Hélène; Scanlon, Matthew C; Karsh, Ben-Tzion

    2013-08-01

    The most common change facing nurses today is new technology, particularly bar coded medication administration technology (BCMA). However, there is a dearth of knowledge on how BCMA alters nursing work. This study investigated how BCMA technology affected nursing work, particularly nurses' operational problem-solving behavior. Cognitive systems engineering observations and interviews were conducted after the implementation of BCMA in three nursing units of a freestanding pediatric hospital. Problem-solving behavior, associated problems, and goals were specifically defined and extracted from observed episodes of care. Three broad themes regarding BCMA's impact on problem solving were identified. First, BCMA allowed nurses to invent new problem-solving behavior to deal with pre-existing problems. Second, BCMA made it difficult or impossible to apply some problem-solving behaviors that were commonly used pre-BCMA, often requiring nurses to use potentially risky workarounds to achieve their goals. Third, BCMA created new problems that nurses were either able to solve using familiar or novel problem-solving behaviors, or unable to solve effectively. Results from this study shed light on hidden hazards and suggest three critical design needs: (1) ecologically valid design; (2) anticipatory control; and (3) basic usability. Principled studies of the actual nature of clinicians' work, including problem solving, are necessary to uncover hidden hazards and to inform health information technology design and redesign.

  8. Automated PET-only quantification of amyloid deposition with adaptive template and empirically pre-defined ROI.

    PubMed

    Akamatsu, G; Ikari, Y; Ohnishi, A; Nishida, H; Aita, K; Sasaki, M; Yamamoto, Y; Sasaki, M; Senda, M

    2016-08-01

    Amyloid PET is useful for early and/or differential diagnosis of Alzheimer's disease (AD). Quantification of amyloid deposition using PET has been employed to improve diagnosis and to monitor AD therapy, particularly in research. Although MRI is often used for segmentation of gray matter and for spatial normalization into standard Montreal Neurological Institute (MNI) space where region-of-interest (ROI) template is defined, 3D MRI is not always available in clinical practice. The purpose of this study was to examine the feasibility of PET-only amyloid quantification with an adaptive template and a pre-defined standard ROI template that has been empirically generated from typical cases. A total of 68 subjects who underwent brain (11)C-PiB PET were examined. The (11)C-PiB images were non-linearly spatially normalized to the standard MNI T1 atlas using the same transformation parameters of MRI-based normalization. The automatic-anatomical-labeling-ROI (AAL-ROI) template was applied to the PET images. All voxel values were normalized by the mean value of cerebellar cortex to generate the SUVR-scaled images. Eleven typical positive images and eight typical negative images were normalized and averaged, respectively, and were used as the positive and negative template. Positive and negative masks which consist of voxels with SUVR  ⩾1.7 were extracted from both templates. Empirical PiB-prone ROI (EPP-ROI) was generated by subtracting the negative mask from the positive mask. The (11)C-PiB image of each subject was non-rigidly normalized to the positive and negative template, respectively, and the one with higher cross-correlation was adopted. The EPP-ROI was then inversely transformed to individual PET images. We evaluated differences of SUVR between standard MRI-based method and PET-only method. We additionally evaluated whether the PET-only method would correctly categorize (11)C-PiB scans as positive or negative. Significant correlation was observed between the
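
    The SUVR scaling and mask arithmetic described above are simple voxelwise operations; a minimal sketch on plain arrays (a real pipeline operates on spatially normalized 3D volumes, and the 1.7 threshold is the one quoted in the abstract):

```python
import numpy as np

def suvr_image(pet, cerebellum_mask):
    """Scale a PET image to SUVR units by the mean cerebellar-cortex value."""
    return pet / pet[cerebellum_mask].mean()

def epp_roi(positive_template, negative_template, threshold=1.7):
    """Empirical PiB-prone ROI: voxels at or above threshold in the positive
    template minus those at or above threshold in the negative template."""
    return (positive_template >= threshold) & ~(negative_template >= threshold)
```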

  9. Automated PET-only quantification of amyloid deposition with adaptive template and empirically pre-defined ROI

    NASA Astrophysics Data System (ADS)

    Akamatsu, G.; Ikari, Y.; Ohnishi, A.; Nishida, H.; Aita, K.; Sasaki, M.; Yamamoto, Y.; Sasaki, M.; Senda, M.

    2016-08-01

    Amyloid PET is useful for early and/or differential diagnosis of Alzheimer’s disease (AD). Quantification of amyloid deposition using PET has been employed to improve diagnosis and to monitor AD therapy, particularly in research. Although MRI is often used for segmentation of gray matter and for spatial normalization into standard Montreal Neurological Institute (MNI) space where region-of-interest (ROI) template is defined, 3D MRI is not always available in clinical practice. The purpose of this study was to examine the feasibility of PET-only amyloid quantification with an adaptive template and a pre-defined standard ROI template that has been empirically generated from typical cases. A total of 68 subjects who underwent brain 11C-PiB PET were examined. The 11C-PiB images were non-linearly spatially normalized to the standard MNI T1 atlas using the same transformation parameters of MRI-based normalization. The automatic-anatomical-labeling-ROI (AAL-ROI) template was applied to the PET images. All voxel values were normalized by the mean value of cerebellar cortex to generate the SUVR-scaled images. Eleven typical positive images and eight typical negative images were normalized and averaged, respectively, and were used as the positive and negative template. Positive and negative masks which consist of voxels with SUVR  ⩾1.7 were extracted from both templates. Empirical PiB-prone ROI (EPP-ROI) was generated by subtracting the negative mask from the positive mask. The 11C-PiB image of each subject was non-rigidly normalized to the positive and negative template, respectively, and the one with higher cross-correlation was adopted. The EPP-ROI was then inversely transformed to individual PET images. We evaluated differences of SUVR between standard MRI-based method and PET-only method. We additionally evaluated whether the PET-only method would correctly categorize 11C-PiB scans as positive or negative. Significant correlation was observed between the SUVRs

  10. An introduction to the BANNING design automation system for shuttle microelectronic hardware development

    NASA Technical Reports Server (NTRS)

    Mcgrady, W. J.

    1979-01-01

    The BANNING MOS design system is presented. It complements rather than supplants the normal design activities associated with the design and fabrication of low-power digital electronic equipment. BANNING is user-oriented and requires no programming experience to use effectively. It provides the user with a simulation capability to aid in circuit design and eliminates most of the manual operations involved in the layout and artwork generation of integrated circuits. An example of its operation is given and some additional background reading is provided.

  11. Flexible design of two-stage adaptive procedures for phase III clinical trials.

    PubMed

    Koyama, Tatsuki

    2007-07-01

    The recent popularity of two-stage adaptive designs has fueled a number of proposals for their use in phase III clinical trials. Many of these designs assign certain restrictive functional forms to the design elements of stage 2, such as the sample size, critical value and conditional power functions. We propose a more flexible method of design without imposing any particular functional forms on these design elements. Our methodology permits specification of a design based on either conditional or unconditional characteristics, and allows accommodation of a sample size limit. Furthermore, we show how to compute the P value, confidence interval and a reasonable point estimate for any design that can be placed under the proposed framework. PMID:17307399

  12. Design of artificial genetic regulatory networks with multiple delayed adaptive responses*

    NASA Astrophysics Data System (ADS)

    Kaluza, Pablo; Inoue, Masayo

    2016-06-01

    Genetic regulatory networks with adaptive responses are widely studied in biology. Usually, models consisting only of a few nodes have been considered. They present one input receptor for activation and one output node where the adaptive response is computed. In this work, we design genetic regulatory networks with many receptors and many output nodes able to produce delayed adaptive responses. This design is performed by using an evolutionary algorithm of mutations and selections that minimizes an error function defined by the adaptive response in signal shapes. We present several examples of network constructions with a predefined required set of adaptive delayed responses. We show that an output node can have different kinds of responses as a function of the activated receptor. Additionally, complex network structures are presented, since processing nodes can be involved in several input-output pathways. Supplementary material in the form of one nets file is available from the Journal web page at http://dx.doi.org/10.1140/epjb/e2016-70172-9
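
    An evolutionary algorithm of mutations and selections reduces, at its simplest, to a greedy accept-if-not-worse loop over an error functional. A minimal sketch with an illustrative quadratic error standing in for the paper's signal-shape error:

```python
import random

def evolve(initial, mutate, error, n_steps=1000, rng=None):
    """Greedy mutation-selection loop: keep a mutant whenever it does not
    increase the error functional."""
    rng = rng or random.Random(0)
    best, best_err = initial, error(initial)
    for _ in range(n_steps):
        cand = mutate(best, rng)
        cand_err = error(cand)
        if cand_err <= best_err:   # selection step
            best, best_err = cand, cand_err
    return best, best_err

# Toy usage: drive a 4-component 'response' toward a target shape.
target = [0.0, 1.0, 0.0, -1.0]
err = lambda v: sum((a - b) ** 2 for a, b in zip(v, target))
mut = lambda v, rng: [a + rng.gauss(0, 0.1) for a in v]
design, final_err = evolve([0.0] * 4, mut, err)
```

    In the paper the "design object" is a network topology with kinetic parameters and the error compares simulated responses against the prescribed delayed adaptive signal shapes; the loop structure is the same.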

  13. Analysis and design of a high power laser adaptive phased array transmitter

    NASA Technical Reports Server (NTRS)

    Mevers, G. E.; Soohoo, J. F.; Winocur, J.; Massie, N. A.; Southwell, W. H.; Brandewie, R. A.; Hayes, C. L.

    1977-01-01

    The feasibility of delivering substantial quantities of optical power to a satellite in low earth orbit from a ground based high energy laser (HEL) coupled to an adaptive antenna was investigated. Diffraction effects, atmospheric transmission efficiency, adaptive compensation for atmospheric turbulence effects, including the servo bandwidth requirements for this correction, and the adaptive compensation for thermal blooming were examined. To evaluate possible HEL sources, atmospheric investigations were performed for the CO2, (C-12)(O-18)2 isotope, CO and DF wavelengths using output antenna locations of both sea level and mountain top. Results indicate that both excellent atmospheric and adaptation efficiency can be obtained for mountain top operation with a (C-12)(O-18)2 isotope laser operating at 9.1 um, or a CO laser operating single line (P10) at about 5.0 um, which was a close second in the evaluation. Four adaptive power transmitter system concepts were generated and evaluated, based on overall system efficiency, reliability, size and weight, advanced technology requirements and potential cost. A multiple source phased array was selected for detailed conceptual design. The system uses a unique adaption technique of phase locking independent laser oscillators, which allows it to be both relatively inexpensive and most reliable, with a predicted overall power transfer efficiency of 53%.

  14. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.
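The plate-handling workflow described above is a natural fit for a discrete-event model. The station names and processing times below are illustrative placeholders, not ORNL's actual configuration; the sketch shows how serial stations with one plate in process at a time determine throughput.

```python
# Each 96-well plate passes through the stations in order; each
# station processes one plate at a time, so plates queue when busy.
STATIONS = [("purification", 30), ("thermal_cycling", 120), ("sequencer", 180)]

def simulate(n_plates):
    """Tiny discrete-event sketch of the sequencing line: returns the
    completion time (minutes) of the last plate in a batch."""
    free_at = {name: 0 for name, _ in STATIONS}
    finish = []
    for _ in range(n_plates):
        t = 0  # every plate is ready at time 0 (batch start)
        for name, dur in STATIONS:
            start = max(t, free_at[name])  # wait if the station is busy
            t = start + dur
            free_at[name] = t
        finish.append(t)
    return max(finish)
```

With these made-up timings the sequencer is the bottleneck, so each additional plate adds its 180-minute cycle to the makespan; a real model would also include the robot arm as a shared resource.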

  15. Design and practices for use of automated drilling and sample handling in MARTE while minimizing terrestrial and cross contamination.

    PubMed

    Miller, David P; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources--whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination)--to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  16. HotSpot Wizard 2.0: automated design of site-specific mutations and smart libraries in protein engineering.

    PubMed

    Bendl, Jaroslav; Stourac, Jan; Sebestova, Eva; Vavra, Ondrej; Musil, Milos; Brezovsky, Jan; Damborsky, Jiri

    2016-07-01

    HotSpot Wizard 2.0 is a web server for automated identification of hot spots and design of smart libraries for engineering proteins' stability, catalytic activity, substrate specificity and enantioselectivity. The server integrates sequence, structural and evolutionary information obtained from 3 databases and 20 computational tools. Users are guided through the processes of selecting hot spots using four different protein engineering strategies and optimizing the resulting library's size by narrowing down a set of substitutions at individual randomized positions. The only required input is a query protein structure. The results of the calculations are mapped onto the protein's structure and visualized with a JSmol applet. HotSpot Wizard lists annotated residues suitable for mutagenesis and can automatically design appropriate codons for each implemented strategy. Overall, HotSpot Wizard provides comprehensive annotations of protein structures and assists protein engineers with the rational design of site-specific mutations and focused libraries. It is freely available at http://loschmidt.chemi.muni.cz/hotspotwizard.

  17. Design and Practices for Use of Automated Drilling and Sample Handling in MARTE While Minimizing Terrestrial and Cross Contamination

    NASA Astrophysics Data System (ADS)

    Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  18. HotSpot Wizard 2.0: automated design of site-specific mutations and smart libraries in protein engineering.

    PubMed

    Bendl, Jaroslav; Stourac, Jan; Sebestova, Eva; Vavra, Ondrej; Musil, Milos; Brezovsky, Jan; Damborsky, Jiri

    2016-07-01

    HotSpot Wizard 2.0 is a web server for automated identification of hot spots and design of smart libraries for engineering proteins' stability, catalytic activity, substrate specificity and enantioselectivity. The server integrates sequence, structural and evolutionary information obtained from 3 databases and 20 computational tools. Users are guided through the processes of selecting hot spots using four different protein engineering strategies and optimizing the resulting library's size by narrowing down a set of substitutions at individual randomized positions. The only required input is a query protein structure. The results of the calculations are mapped onto the protein's structure and visualized with a JSmol applet. HotSpot Wizard lists annotated residues suitable for mutagenesis and can automatically design appropriate codons for each implemented strategy. Overall, HotSpot Wizard provides comprehensive annotations of protein structures and assists protein engineers with the rational design of site-specific mutations and focused libraries. It is freely available at http://loschmidt.chemi.muni.cz/hotspotwizard. PMID:27174934

  19. HotSpot Wizard 2.0: automated design of site-specific mutations and smart libraries in protein engineering

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Sebestova, Eva; Vavra, Ondrej; Musil, Milos; Brezovsky, Jan; Damborsky, Jiri

    2016-01-01

    HotSpot Wizard 2.0 is a web server for automated identification of hot spots and design of smart libraries for engineering proteins’ stability, catalytic activity, substrate specificity and enantioselectivity. The server integrates sequence, structural and evolutionary information obtained from 3 databases and 20 computational tools. Users are guided through the processes of selecting hot spots using four different protein engineering strategies and optimizing the resulting library's size by narrowing down a set of substitutions at individual randomized positions. The only required input is a query protein structure. The results of the calculations are mapped onto the protein's structure and visualized with a JSmol applet. HotSpot Wizard lists annotated residues suitable for mutagenesis and can automatically design appropriate codons for each implemented strategy. Overall, HotSpot Wizard provides comprehensive annotations of protein structures and assists protein engineers with the rational design of site-specific mutations and focused libraries. It is freely available at http://loschmidt.chemi.muni.cz/hotspotwizard. PMID:27174934

  20. Design and progress toward a multi-conjugate adaptive optics system for distributed aberration correction

    SciTech Connect

    Baker, K; Olivier, S; Tucker, J; Silva, D; Gavel, D; Lim, R; Gratrix, E

    2004-08-17

    This article investigates the use of a multi-conjugate adaptive optics system to improve the system's corrected field of view. The emphasis of this research is to develop techniques to improve the performance of optical systems with applications to horizontal imaging. The design and wave optics simulations of the proposed system are given. Preliminary results from the multi-conjugate adaptive optics system are also presented. The experimental system utilizes a liquid-crystal spatial light modulator and an interferometric wave-front sensor for correction and sensing of the phase aberrations, respectively.

  1. Adapting computational optimization concepts from aeronautics to nuclear fusion reactor design

    NASA Astrophysics Data System (ADS)

    Dekeyser, W.; Reiter, D.; Baelmans, M.

    2012-10-01

    Even on the most powerful supercomputers available today, computational nuclear fusion reactor divertor design is extremely CPU demanding, not least due to the large number of design variables and the hybrid micro-macro character of the flows. Therefore, automated design methods based on optimization can greatly assist current reactor design studies. Over the past decades, "adjoint methods" for shape optimization have proven their virtue in the field of aerodynamics. Applications include drag reduction for wing and wing-body configurations. Here we demonstrate that these optimization methods also have large potential for divertor design. Specifically, we apply the continuous adjoint method to the optimization of the divertor geometry in a 2D poloidal cross section of an axisymmetric tokamak device (as, e.g., JET and ITER), using a simplified model for the plasma edge. The design objective is to spread the target material heat load as much as possible by controlling the shape of the divertor, while maintaining the full helium ash removal capabilities of the vacuum pumping system.
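The appeal of adjoint methods is that they supply the gradient of an objective with respect to many shape parameters at roughly the cost of one extra solve. The sketch below only illustrates the surrounding gradient-descent loop with a finite-difference gradient (which costs one solve per variable, the expense adjoints avoid) and a toy "heat-load spread" objective standing in for the plasma-edge model; none of it is the authors' formulation.

```python
def objective(shape):
    """Toy proxy for heat-load spreading: variance of a load profile
    induced by the shape parameters (spreading = minimizing variance)."""
    loads = [1.0 / (1.0 + (s - i) ** 2) for i, s in enumerate(shape)]
    mean = sum(loads) / len(loads)
    return sum((l - mean) ** 2 for l in loads)

def grad_fd(f, x, h=1e-6):
    # Finite-difference gradient; an adjoint solver would deliver the
    # same vector from a single adjoint solve instead of len(x) solves.
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def descend(f, x, lr=0.2, steps=300):
    for _ in range(steps):
        g = grad_fd(f, x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

shape0 = [0.0, 0.0, 0.0, 0.0]
shape = descend(objective, shape0)
```

In the actual divertor problem, `objective` would be evaluated by the plasma-edge code and constrained by the helium pumping requirement.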

  2. Design and inference for the intent-to-treat principle using adaptive treatment.

    PubMed

    Dawson, Ree; Lavori, Philip W

    2015-04-30

    Nonadherence to assigned treatment jeopardizes the power and interpretability of intent-to-treat comparisons from clinical trial data and continues to be an issue for effectiveness studies, despite their pragmatic emphasis. We posit that new approaches to design need to complement developments in methods for causal inference to address nonadherence, in both experimental and practice settings. This paper considers the conventional study design for psychiatric research and other medical contexts, in which subjects are randomized to treatments that are fixed throughout the trial and presents an alternative that converts the fixed treatments into an adaptive intervention that reflects best practice. The key element is the introduction of an adaptive decision point midway into the study to address a patient's reluctance to remain on treatment before completing a full-length trial of medication. The clinical uncertainty about the appropriate adaptation prompts a second randomization at the new decision point to evaluate relevant options. Additionally, the standard 'all-or-none' principal stratification (PS) framework is applied to the first stage of the design to address treatment discontinuation that occurs too early for a midtrial adaptation. Drawing upon the adaptive intervention features, we develop assumptions to identify the PS causal estimand and to introduce restrictions on outcome distributions to simplify expectation-maximization calculations. We evaluate the performance of the PS setup, with particular attention to the role played by a binary covariate. The results emphasize the importance of collecting covariate data for use in design and analysis. We consider the generality of our approach beyond the setting of psychiatric research. PMID:25581413

  3. Design and inference for the intent-to-treat principle using adaptive treatment.

    PubMed

    Dawson, Ree; Lavori, Philip W

    2015-04-30

    Nonadherence to assigned treatment jeopardizes the power and interpretability of intent-to-treat comparisons from clinical trial data and continues to be an issue for effectiveness studies, despite their pragmatic emphasis. We posit that new approaches to design need to complement developments in methods for causal inference to address nonadherence, in both experimental and practice settings. This paper considers the conventional study design for psychiatric research and other medical contexts, in which subjects are randomized to treatments that are fixed throughout the trial and presents an alternative that converts the fixed treatments into an adaptive intervention that reflects best practice. The key element is the introduction of an adaptive decision point midway into the study to address a patient's reluctance to remain on treatment before completing a full-length trial of medication. The clinical uncertainty about the appropriate adaptation prompts a second randomization at the new decision point to evaluate relevant options. Additionally, the standard 'all-or-none' principal stratification (PS) framework is applied to the first stage of the design to address treatment discontinuation that occurs too early for a midtrial adaptation. Drawing upon the adaptive intervention features, we develop assumptions to identify the PS causal estimand and to introduce restrictions on outcome distributions to simplify expectation-maximization calculations. We evaluate the performance of the PS setup, with particular attention to the role played by a binary covariate. The results emphasize the importance of collecting covariate data for use in design and analysis. We consider the generality of our approach beyond the setting of psychiatric research.

  4. Green Infrastructure Design Evaluation Using the Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...

  5. Evaluation of Green Infrastructure Designs Using the Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...

  6. Automated design and optimization of flexible booster autopilots via linear programming. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Hauser, F. D.; Szollosi, G. D.; Lakin, W. S.

    1972-01-01

    COEBRA, the Computerized Optimization of Elastic Booster Autopilots, is an autopilot design program. The bulk of the design criteria is presented in the form of minimum allowed gain/phase stability margins. COEBRA has two optimization phases: (1) a phase to maximize stability margins; and (2) a phase to optimize structural bending moment load relief capability in the presence of minimum requirements on gain/phase stability margins.

  7. A front-end automation tool supporting design, verification and reuse of SOC.

    PubMed

    Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing

    2004-09-01

    This paper describes an in-house developed language tool called VPerl used in developing a 250 MHz 32-bit high-performance low power embedded CPU core. The authors showed that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and reduce the bug rate significantly. This tool can be used to enhance the reusability of an intellectual property model and to facilitate porting designs to different platforms.

  8. Adaptive sampling in two-phase designs: a biomarker study for progression in arthritis

    PubMed Central

    McIsaac, Michael A; Cook, Richard J

    2015-01-01

    Response-dependent two-phase designs are used increasingly often in epidemiological studies to ensure sampling strategies offer good statistical efficiency while working within resource constraints. Optimal response-dependent two-phase designs are difficult to implement, however, as they require specification of unknown parameters. We propose adaptive two-phase designs that exploit information from an internal pilot study to approximate the optimal sampling scheme for an analysis based on mean score estimating equations. The frequency properties of estimators arising from this design are assessed through simulation, and they are shown to be similar to those from optimal designs. The design procedure is then illustrated through application to a motivating biomarker study in an ongoing rheumatology research program. Copyright © 2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:25951124
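The idea of using an internal pilot to approximate an unknown optimal sampling scheme can be illustrated with classical Neyman allocation, a simpler stand-in for the paper's mean-score-based design: the pilot estimates each stratum's outcome variability, and the phase-2 budget is allocated proportionally to stratum size times estimated standard deviation. The strata and their distributions below are hypothetical.

```python
import random
import statistics

def pilot_neyman_allocation(strata, pilot_n, total_n, seed=0):
    """Phase 1: draw a small pilot from each stratum to estimate its
    outcome SD.  Phase 2: allocate the budget proportionally to
    N_h * sd_h (Neyman allocation), approximating the unknown optimum."""
    rng = random.Random(seed)
    sds = {}
    for name, (size, sampler) in strata.items():
        pilot = [sampler(rng) for _ in range(pilot_n)]
        sds[name] = statistics.stdev(pilot)
    weight = {name: strata[name][0] * sds[name] for name in strata}
    total_w = sum(weight.values())
    return {name: round(total_n * w / total_w) for name, w in weight.items()}

# Hypothetical strata: "progressors" are rarer but far more variable,
# so an efficient design oversamples them relative to their size.
strata = {
    "progressor": (200, lambda r: r.gauss(5.0, 4.0)),
    "stable": (800, lambda r: r.gauss(1.0, 0.5)),
}
alloc = pilot_neyman_allocation(strata, pilot_n=30, total_n=300)
```

The adaptive element is the same as in the paper: design parameters that would otherwise have to be guessed are estimated from the pilot before the main phase-2 sample is drawn.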

  9. The Design and Production of a Procedure Training Aid Using the Procedure Learning Format and the Computer Automated Page Layout (PLA) Routine. Technical Note 12-83.

    ERIC Educational Resources Information Center

    Terrell, William R.; And Others

    This report describes a field application of the Computer Automated Page Layout (PLA) system to the development of a procedure training aid for the SH-3D/H Helicopter, as part of the Training Analysis and Evaluation Group's (TAEG) ongoing development effort to provide tools for the design and publication of technical training aids in a format…

  10. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first-part article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second-part article reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) to compare the biosensor with conventional analytical methods.

  11. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first-part article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second-part article reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) to compare the biosensor with conventional analytical methods. PMID:15626603

  12. Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter

    NASA Technical Reports Server (NTRS)

    Aggarwal, Pravin; Hull, Patrick V.

    2015-01-01

    Designing and manufacturing space flight vehicle structures is a skillset that has grown considerably at NASA during the last several years. Beginning with the Ares program and followed by the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multipurpose Crew Vehicle (MPCV) spacecraft adapter (MSA). Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above mentioned hardware. In particular, the experience of this in-house design work led to increased manufacturing infrastructure for both Marshall Space Flight Center (MSFC) and Michoud Assembly Facility (MAF), improved skillsets in both analysis and design, and hands-on experience in building and testing full-scale MSA hardware. The hardware design and development processes from initiation to CDR and finally flight resulted in many challenges and experiences that produced valuable lessons. This paper builds on these NASA experiences of recent years in designing and fabricating flight hardware and examines the design/development processes used, as well as the challenges and lessons learned, i.e., from the initial design, loads estimation, and mass constraints to structural optimization/affordability to release of production drawings to hardware manufacturing. While there are many documented design processes which a design engineer can follow, these unique experiences can offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.

  13. Design Framework for an Adaptive MOOC Enhanced by Blended Learning: Supplementary Training and Personalized Learning for Teacher Professional Development

    ERIC Educational Resources Information Center

    Gynther, Karsten

    2016-01-01

    The research project has developed a design framework for an adaptive MOOC that complements the MOOC format with blended learning. The design framework consists of a design model and a series of learning design principles which can be used to design in-service courses for teacher professional development. The framework has been evaluated by…

  14. Automated procedure for design of wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1973-01-01

    A pilot computer program was developed for the design of minimum mass wing structures under flutter, strength, and minimum gage constraints. The wing structure is idealized by finite elements, and second-order piston theory aerodynamics is used in the flutter calculation. Mathematical programming methods are used for the optimization. Computation times during the design process are reduced by three techniques. First, iterative analysis methods are used to reduce reanalysis times significantly. Second, the number of design variables is kept small by not using a one-to-one correspondence between finite elements and design variables. Third, a technique for using approximate second derivatives with Newton's method for the optimization is incorporated. The program output is compared with previously published results. It is found that some flutter characteristics, such as the flutter speed, can display discontinuous dependence on the design variables (which are the thicknesses of the structural elements). It is concluded that it is undesirable to use such quantities in the formulation of the flutter constraint.
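The third speedup, Newton's method driven by approximate second derivatives, can be sketched on a toy problem: here the Hessian is approximated by differencing the gradient, and the objective is a simple quadratic standing in for the mass-plus-constraint-penalty function, not the flutter-constrained design problem itself.

```python
def grad(f, x, h=1e-5):
    # Central-difference gradient of f at x.
    return [(f([xj + (h if j == i else 0) for j, xj in enumerate(x)])
             - f([xj - (h if j == i else 0) for j, xj in enumerate(x)])) / (2 * h)
            for i in range(len(x))]

def hess_approx(f, x, h=1e-4):
    # Approximate second derivatives by forward-differencing the gradient.
    g0 = grad(f, x)
    H = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        gi = grad(f, xp)
        H.append([(a - b) / h for a, b in zip(gi, g0)])
    return H

def newton_step_2d(f, x):
    # One Newton step: solve H d = -g, here by hand for the 2x2 case.
    g = grad(f, x)
    (a, b), (c, d) = hess_approx(f, x)
    det = a * d - b * c
    dx = [(-g[0] * d + g[1] * b) / det, (g[0] * c - g[1] * a) / det]
    return [xi + di for xi, di in zip(x, dx)]

# Toy "mass" objective with a quadratic penalty standing in for the
# flutter constraint; the minimum is at t = (1, 2) by construction.
f = lambda t: (t[0] - 1.0) ** 2 + 2.0 * (t[1] - 2.0) ** 2

x = [5.0, 5.0]
for _ in range(5):
    x = newton_step_2d(f, x)
```

For a quadratic objective one step already lands on the minimum; the value of approximate second derivatives in the paper's setting is obtaining this fast convergence without the cost of exact flutter-derivative reanalysis.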

  15. A Tool for the Automated Design and Evaluation of Habitat Interior Layouts

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Wilhite, Alan W.

    2013-01-01

    The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
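The multi-objective value function described above can be sketched as a weighted sum of normalized criteria. The criteria names, weights, and bounds below are illustrative placeholders, not the authors' published evaluation set.

```python
def layout_value(metrics, weights, bounds):
    """Weighted-sum value function: each raw metric is normalized to
    [0, 1] against (worst, best) bounds, then combined; higher is better."""
    score = 0.0
    for name, w in weights.items():
        worst, best = bounds[name]
        v = (metrics[name] - worst) / (best - worst)
        score += w * min(max(v, 0.0), 1.0)  # clamp outside the bounds
    return score

# Hypothetical criteria and weights (weights sum to 1).
weights = {"habitable_volume": 0.4, "task_adjacency": 0.3, "safety": 0.3}
bounds = {"habitable_volume": (10.0, 40.0),   # m^3, worst -> best
          "task_adjacency": (0.0, 1.0),
          "safety": (0.0, 1.0)}

layout_a = {"habitable_volume": 30.0, "task_adjacency": 0.8, "safety": 0.9}
layout_b = {"habitable_volume": 15.0, "task_adjacency": 0.4, "safety": 0.9}
```

In the tool itself, the raw metrics would come from the geometry model and collision-detection checks rather than being entered by hand, and the weights would be elicited with systems-engineering methods.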

  16. Optimization and automation of the semi-submersible platforms mooring design

    SciTech Connect

    Ferrari, J.A. Jr.; Morooka, C.K.

    1994-12-31

    There are a few calculation programs around the world used for determining the main aspects of the mooring design of semi-submersible platforms. These programs hold worldwide acknowledgement and their results are actually reliable, but they require many runs to get a solution that complies with the Classification Society requirements. This paper presents some procedures to optimize the semi-submersible mooring design as well as to make it automatic. Regarding the optimization philosophies, the following aspects are treated: (1) the optimization of the platform heading and the mooring pattern based on the spreading of the environmental forces; (2) the search for the optimum mooring line composition in an automatic mode. Basically, the paper's main goal is to introduce some methods to find the lowest cost solution for the mooring system in a short time. All of these methods were computationally implemented, creating the intelligent system named PROANC, which deals with the semi-submersible mooring design in a quasi-static and deterministic approach. It should be noted that the proposed system exerts a strong appeal as a design tool for feasibility studies of a given oil field, and its quasi-static results can be directly applied to a mooring program capable of performing dynamic analysis. Finally, some simulations are executed for different water depths, and their final results, including the time expended to run, are presented in order to prove the wide potential of the PROANC system as a design tool.
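The automatic search for the lowest-cost mooring line composition can be framed as enumeration over candidate segment combinations subject to a strength constraint. The component catalogue, costs, lengths, and safety factor below are entirely hypothetical, and the single breaking-strength check stands in for PROANC's quasi-static tension analysis.

```python
from itertools import product

# Hypothetical catalogue: (name, breaking strength in kN, cost per m).
CHAIN = [("chain_76mm", 4900, 150), ("chain_84mm", 5900, 190)]
WIRE = [("wire_70mm", 3800, 90), ("wire_90mm", 6100, 140)]

def cheapest_composition(design_tension_kn, safety_factor=1.67,
                         chain_len=300.0, wire_len=1200.0):
    """Enumerate chain/wire pairs, keep those whose weakest segment
    meets the factored tension, and return the lowest-cost combination
    as (total cost, chain name, wire name), or None if none qualifies."""
    required = design_tension_kn * safety_factor
    best = None
    for (cn, cs, cc), (wn, ws, wc) in product(CHAIN, WIRE):
        if min(cs, ws) < required:
            continue  # weakest segment fails the factored tension
        cost = cc * chain_len + wc * wire_len
        if best is None or cost < best[0]:
            best = (cost, cn, wn)
    return best
```

A production tool would replace the strength check with the full quasi-static line analysis and enumerate segment lengths as well, but the structure of the search (feasibility filter, then cost minimization) is the same.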

  17. Design and Performance of an Automated Bioreactor for Cell Culture Experiments in a Microgravity Environment

    NASA Astrophysics Data System (ADS)

    Kim, Youn-Kyu; Park, Seul-Hyun; Lee, Joo-Hee; Choi, Gi-Hyuk

    2015-03-01

    In this paper, we describe the development of a bioreactor for a cell-culture experiment on the International Space Station (ISS). The bioreactor is an experimental device for culturing mouse muscle cells in a microgravity environment. The purpose of the experiment was to assess the impact of microgravity on the muscles to address the possibility of long-term human residence in space. After investigation of previously developed bioreactors and analysis of the requirements for microgravity cell culture experiments, a bioreactor design is herein proposed that is able to automatically culture 32 samples simultaneously. This reactor design is capable of automatic control of temperature, humidity, and culture-medium injection rate, and satisfies the interface requirements of the ISS. Since bioreactors are vulnerable to cell contamination, the medium-circulation modules were designed to be completely replaceable, in order to reuse the bioreactor after each experiment. The bioreactor control system is designed to circulate culture media to 32 culture chambers at a maximum speed of 1 ml/min, to maintain the temperature of the reactor at 36°C, and to keep the relative humidity of the reactor above 70%. Because bubbles in the culture media negatively affect cell culture, a de-bubbler unit was provided to eliminate such bubbles. A working model of the reactor was built according to the new design to verify its performance, and was used to perform a cell culture experiment that confirmed the feasibility of this device.
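
    The setpoint control described above (36°C, relative humidity above 70%) can be sketched as a bang-bang loop with a small hysteresis band. The plant model, gains, and band are invented placeholders; real hardware and sensors would replace the crude simulation below.

```python
def control_step(temp, humidity, heater_on, humidifier_on,
                 temp_set=36.0, hum_min=70.0, band=0.5):
    """Bang-bang controller with hysteresis around the temperature setpoint."""
    if temp < temp_set - band:
        heater_on = True
    elif temp > temp_set + band:
        heater_on = False
    humidifier_on = humidity < hum_min
    return heater_on, humidifier_on

# Simulate a chamber warming from 20 degC toward the setpoint.
temp, hum = 20.0, 60.0
heater, humidifier = False, False
for _ in range(500):
    heater, humidifier = control_step(temp, hum, heater, humidifier)
    temp += 0.1 if heater else -0.05      # crude thermal model
    hum += 0.2 if humidifier else -0.1    # crude moisture model
print(round(temp, 1), round(hum, 1))
```

    The hysteresis band keeps the heater from chattering around the setpoint, which matters for actuator life on long-duration ISS hardware.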

  18. An Integrated Systems Approach to Designing Climate Change Adaptation Policy in Water Resources

    NASA Astrophysics Data System (ADS)

    Ryu, D.; Malano, H. M.; Davidson, B.; George, B.

    2014-12-01

    Climate change projections are characterised by large uncertainties, with rainfall variability being the key challenge in designing adaptation policies. Climate change adaptation in water resources shows all the typical characteristics of 'wicked' problems, typified by cognitive uncertainty as new scientific knowledge becomes available, problem instability, knowledge imperfection, and strategic uncertainty due to institutional changes that inevitably occur over time. Planning that is characterised by uncertainties and instability requires an approach that can accommodate flexibility and adaptive capacity for decision-making: an ability to take corrective measures in the event that the scenarios and responses envisaged initially evolve into different forms at some future stage. We present an integrated, multidisciplinary and comprehensive framework designed to interface and inform science and decision making in the formulation of water resource management strategies to deal with climate change in the Musi Catchment of Andhra Pradesh, India. At the core of this framework is a dialogue between stakeholders, decision makers and scientists to define a set of plausible responses to an ensemble of climate change scenarios derived from global climate modelling. The modelling framework used to evaluate the resulting combinations of climate scenarios and adaptation responses includes surface and groundwater assessment models (SWAT & MODFLOW) and a water allocation model (REALM) to determine the water security of each adaptation strategy. Three climate scenarios extracted from downscaled climate models were selected for evaluation, together with four agreed responses: changing cropping patterns, increasing watershed development, changing the volume of groundwater extraction, and improving irrigation efficiency. Water security in this context is represented by the combination of the level of water availability and its associated security of supply for three economic activities (agriculture
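
    The scenario-by-response evaluation at the core of the framework can be sketched as a cross-product scoring loop: every climate scenario is crossed with every agreed adaptation response and scored for water security. The inflow and demand factors below are invented stand-ins for the SWAT/MODFLOW/REALM model chain.

```python
from itertools import product

scenarios = {"dry": 0.7, "median": 1.0, "wet": 1.2}   # relative inflow (invented)
responses = {"crop_change": 1.10, "watershed_dev": 0.95,
             "gw_extraction": 1.05, "irrigation_eff": 1.15}  # demand-side effect (invented)

def water_security(inflow_factor, response_factor, base_supply=100.0,
                   base_demand=90.0):
    # Fraction of demand met, capped at 1.0; purely illustrative.
    supply = base_supply * inflow_factor
    demand = base_demand / response_factor
    return min(1.0, supply / demand)

results = {(s, r): round(water_security(scenarios[s], responses[r]), 2)
           for s, r in product(scenarios, responses)}
worst = min(results, key=results.get)
print(worst, results[worst])
```

    Evaluating the full scenario-response grid, rather than a single forecast, is what lets stakeholders see which adaptation strategies remain secure under the driest plausible future.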

  19. 75 FR 8968 - Draft Guidance for Industry on Adaptive Design Clinical Trials for Drugs and Biologics; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... design clinical trials (i.e., clinical, statistical, regulatory) call for special consideration, when to interact with FDA while planning and conducting adaptive design studies, what information to include in the... study. The draft guidance is intended to assist sponsors in planning and conducting adaptive...

  20. PopupCAD: a tool for automated design, fabrication, and analysis of laminate devices

    NASA Astrophysics Data System (ADS)

    Aukes, Daniel M.; Wood, Robert J.

    2015-05-01

    Recent advances in laminate manufacturing techniques have driven the development of new classes of millimeter-scale sensorized medical devices, robots capable of terrestrial locomotion and sustained flight, and new techniques for sensing and actuation. Recently, the analysis of laminate micro-devices has focused more on manufacturability concerns than on mechanics. Considering the nature of such devices, we draw from existing research in composites, origami kinematics, and finite element methods in order to identify issues related to sequential assembly and self-folding prior to fabrication, as well as the stiffness of composite folded systems during operation. These techniques can be useful for understanding how such devices will bend and flex under normal operating conditions, and when added to new design tools like popupCAD, will give designers another means to develop better devices throughout the design process.

  1. Human-Automation Integration: Principle and Method for Design and Evaluation

    NASA Technical Reports Server (NTRS)

    Billman, Dorrit; Feary, Michael

    2012-01-01

    Future space missions will increasingly depend on integration of complex engineered systems with their human operators. It is important to ensure that the systems that are designed and developed do a good job of supporting the needs of the work domain. Our research investigates methods for needs analysis. We included analysis of work products (plans for regulation of the space station) as well as work processes (tasks using current software), in a case study of Attitude Determination and Control Officers (ADCO) planning work. This allows comparing how well different designs match the structure of the work to be supported. Redesigned planning software that better matches the structure of work was developed and experimentally assessed. The new prototype enabled substantially faster and more accurate performance in plan revision tasks. This success suggests the approach to needs assessment and use in design and evaluation is promising, and merits investigation in future research.

  2. A Camera and Multi-Sensor Automated Station Design for Polar Physical and Biological Systems Monitoring: AMIGOS

    NASA Astrophysics Data System (ADS)

    Bohlander, J. A.; Ross, R.; Scambos, T.; Haran, T. M.; Bauer, R. J.

    2012-12-01

    The Automated Meteorology - Ice/Indigenous species - Geophysics Observation System (AMIGOS) consists of a set of measurement instruments and camera(s) controlled by a single-board computer with a simplified Linux operating system and an Iridium satellite modem supporting two-way communication. Primary features of the system relevant to polar operations are low power requirements, daily data uploading, reprogramming, tolerance for low temperatures, and various approaches for automatic resets and recovery from low power or cold shut-down. Instruments include a compact weather station, C/A or dual-frequency GPS, solar flux and reflectivity sensors, sonic snow gages, a simplified radio-echo-sounder, and a resistance thermometer string in the firn column. In the current state of development, there are two basic designs. One is intended for in situ observations of glacier conditions. The other design supports a high-resolution camera for monitoring biological or geophysical systems from short distances (100 m to 20 km). The stations have been successfully used in several locations for operational support, monitoring rapid ice changes in response to climate change or iceberg drift, and monitoring penguin colony activity. As of June 2012, there are 9 AMIGOS systems installed, all on the Antarctic continent. The stations are a working prototype for a planned series of upgraded stations, currently termed 'Sentinels'. These stations would carry further instrumentation, communications, and processing capability to investigate ice-ocean interaction from ice tongue, ice shelf, or fjord coastline areas.

  3. Dynamic experiment design regularization approach to adaptive imaging with array radar/SAR sensor systems.

    PubMed

    Shkvarko, Yuriy; Tuxpan, José; Santos, Stewart

    2011-01-01

    We consider a problem of high-resolution array radar/SAR imaging formalized in terms of a nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) of the random wavefield scattered from a remotely sensed scene observed through a kernel signal formation operator and contaminated with random Gaussian noise. First, the Sobolev-type solution space is constructed to specify the class of consistent kernel SSP estimators with the reproducing kernel structures adapted to the metrics in such a solution space. Next, the "model-free" variational analysis (VA)-based image enhancement approach and the "model-based" descriptive experiment design (DEED) regularization paradigm are unified into a new dynamic experiment design (DYED) regularization framework. Application of the proposed DYED framework to the adaptive array radar/SAR imaging problem leads to a class of two-level (DEED-VA) regularized SSP reconstruction techniques that aggregate the kernel adaptive anisotropic windowing with the projections onto convex sets to enforce the consistency and robustness of the overall iterative SSP estimators. We also show how the proposed DYED regularization method may be considered as a generalization of the MVDR, APES and other high-resolution nonparametric adaptive radar sensing techniques. A family of the DYED-related algorithms is constructed and their effectiveness is finally illustrated via numerical simulations.
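
    The "projections onto convex sets" ingredient of the estimators above can be shown in its simplest form: alternately project an estimate onto two convex sets until it lies (approximately) in their intersection. The sets here, a non-negativity constraint and a fixed-total-power hyperplane, are toy stand-ins for the consistency sets used in the SSP reconstruction.

```python
def project_nonneg(x):
    """Project onto the non-negative orthant (clip negatives to zero)."""
    return [max(0.0, v) for v in x]

def project_total(x, total):
    """Orthogonal projection onto the hyperplane sum(x) == total."""
    shift = (total - sum(x)) / len(x)
    return [v + shift for v in x]

# Alternating projections from an infeasible starting point.
x = [-1.0, 2.0, 5.0]
for _ in range(100):
    x = project_total(project_nonneg(x), total=6.0)

print([round(v, 3) for v in x])
```

    Because both sets are convex, the alternating projections converge to a point consistent with both constraints, which is the mechanism the DEED-VA techniques use to keep the iterative SSP estimates physically admissible.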

  4. Dynamic Experiment Design Regularization Approach to Adaptive Imaging with Array Radar/SAR Sensor Systems

    PubMed Central

    Shkvarko, Yuriy; Tuxpan, José; Santos, Stewart

    2011-01-01

    We consider a problem of high-resolution array radar/SAR imaging formalized in terms of a nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) of the random wavefield scattered from a remotely sensed scene observed through a kernel signal formation operator and contaminated with random Gaussian noise. First, the Sobolev-type solution space is constructed to specify the class of consistent kernel SSP estimators with the reproducing kernel structures adapted to the metrics in such a solution space. Next, the “model-free” variational analysis (VA)-based image enhancement approach and the “model-based” descriptive experiment design (DEED) regularization paradigm are unified into a new dynamic experiment design (DYED) regularization framework. Application of the proposed DYED framework to the adaptive array radar/SAR imaging problem leads to a class of two-level (DEED-VA) regularized SSP reconstruction techniques that aggregate the kernel adaptive anisotropic windowing with the projections onto convex sets to enforce the consistency and robustness of the overall iterative SSP estimators. We also show how the proposed DYED regularization method may be considered as a generalization of the MVDR, APES and other high-resolution nonparametric adaptive radar sensing techniques. A family of the DYED-related algorithms is constructed and their effectiveness is finally illustrated via numerical simulations. PMID:22163859

  5. An automated system for chromosome analysis. Volume 1: Goals, system design, and performance

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Melnyk, J. H.

    1975-01-01

    The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples are described, together with a basis for statistical analysis of quantitative chromosome measurement data. The prototype was assembled, tested, and evaluated on clinical material and thoroughly documented.

  6. Automated preliminary design of simplified wing structures to satisfy strength and flutter requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.; Dexter, C. B.; Stein, M.

    1972-01-01

    A simple structural model of an aircraft wing is used to show the effects of strength (stress) and flutter requirements on the design of minimum-weight aircraft-wing structures. The wing is idealized as an isotropic sandwich plate with a variable cover thickness distribution and a variable depth between covers. Plate theory is used for the structural analysis, and piston theory is used for the unsteady aerodynamics in the flutter analysis. Mathematical programming techniques are used to find the minimum-weight cover thickness distribution which satisfies flutter, strength, and minimum-gage constraints. The method of solution, some sample results, and the computer program used to obtain these results are presented. The results indicate that the cover thickness distribution obtained when designing for the strength requirement alone may be quite different from the cover thickness distribution obtained when designing for either the flutter requirement alone or for both the strength and flutter requirements concurrently. This conclusion emphasizes the need for designing for both flutter and strength from the outset.
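
    The trade described above, where a strength-only design can differ sharply from a design that also meets flutter, can be caricatured in a few lines. This greedy sketch is not the paper's mathematical-programming method; the panel data and the linear flutter surrogate are invented for illustration.

```python
def design(strength_min, flutter_w, flutter_req, step=0.01):
    """Minimum-weight cover thicknesses meeting strength and flutter bounds."""
    # Start at the strength/minimum-gage bound ...
    t = list(strength_min)
    # ... then thicken the most flutter-effective panel until the linear
    # flutter surrogate sum(w_i * t_i) meets the requirement.
    while sum(w * ti for w, ti in zip(flutter_w, t)) < flutter_req:
        i = max(range(len(t)), key=lambda j: flutter_w[j])
        t[i] += step
    return t

strength_min = [0.10, 0.08, 0.05]       # per-panel strength-driven minima (invented)
flutter_w    = [1.0, 2.0, 3.0]          # stiffness effectiveness per panel (invented)
t = design(strength_min, flutter_w, flutter_req=0.80)
print([round(ti, 2) for ti in t], round(sum(t), 2))
```

    Note that the flutter requirement thickens a panel that strength sizing alone left at minimum gage, echoing the paper's conclusion that both constraints must be designed for from the outset.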

  7. HiVy automated translation of stateflow designs for model checking verification

    NASA Technical Reports Server (NTRS)

    Pingree, Paula

    2003-01-01

    The HiVy tool set enables model checking of finite state machine designs. This is achieved by translating state-chart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format for the tool set.

  8. The Zeus Mission Study — An application of automated collaborative design

    NASA Astrophysics Data System (ADS)

    Doyotte, Romain; Love, Stanley G.; Peterson, Craig E.

    1999-11-01

    The purpose of the Zeus Mission Study was threefold. As an element of a graduate course in spacecraft system engineering, its purpose was primarily educational — to allow the students to apply their knowledge in a real mission study. The second purpose was to investigate the feasibility of applying advanced technology (the power antenna and solar electric propulsion concepts) to a challenging mission. Finally, the study allowed evaluation of the benefits of using quality-oriented techniques (Quality Function Deployment (QFD) and Taguchi Methods) for a mission study. To encourage innovation, several constraints were placed on the study from the outset. While the primary goal was to place at least one lander on Europa, the additional constraint of no nuclear power sources posed a further challenge, particularly when coupled with the mass constraints imposed by using a Delta II class launch vehicle. In spite of these limitations, the team was able to develop a mission and spacecraft design capable of carrying three simple, lightweight, yet capable landers. The science return will more than adequately meet the established science goals. QFD was used to determine the optimal choice of instrumentation. The lander design was selected from several competing lander concepts, including rovers. The carrier design was largely dictated by the needs of the propulsion system required to support the mission, although the development of a Project Trades Model (PTM) in software allowed for rapid recalculation of key system parameters as changes were made. Finally, Taguchi Methods (Design of Experiments) were used in conjunction with the PTM allowing for some limited optimization of design features.

  9. Evaluating the Validity of an Automated Device for Asthma Monitoring for Adolescents: Correlational Design

    PubMed Central

    Belyea, Michael J; Sterling, Mark; Bocko, Mark F

    2015-01-01

    Background Symptom monitoring is a cornerstone of asthma self-management. Conventional methods of symptom monitoring have fallen short in producing objective data and eliciting patients’ consistent adherence, particularly in teen patients. We have recently developed an Automated Device for Asthma Monitoring (ADAM) using a consumer mobile device as a platform to facilitate continuous and objective symptom monitoring in adolescents in vivo. Objective The objectives of the study were to evaluate the validity of the device using spirometer data, fractional exhaled nitric oxide (FeNO), existing measures of asthma symptoms/control and health care utilization data, and to examine the sensitivity and specificity of the device in discriminating asthma cases from nonasthma cases. Methods A total of 84 teens (42 teens with a current asthma diagnosis; 42 without asthma) aged between 13 and 17 years participated in the study. All participants used ADAM for 7 consecutive days during which participants with asthma completed an asthma diary two times a day. ADAM recorded the frequency of coughing for 24 hours throughout the 7-day trial. Pearson correlation and multiple regression were used to examine the relationships between ADAM data and asthma control, quality of life, and health care utilization at the time of the 7-day trial and 3 months later. A receiver operating characteristic (ROC) curve analysis was conducted to examine sensitivity and specificity based on the area under the curve (AUC) as an indicator of the device’s capacity to discriminate between asthma versus nonasthma cases. Results ADAM data (cough counts) were negatively associated with forced expiratory volume in first second of expiration (FEV1) (r=–.26, P=.05), forced vital capacity (FVC) (r=–.31, P=.02), and overall asthma control (r=–.41, P=.009) and positively associated with daily activity limitation (r=.46, P=.01), nighttime (r=.40, P=.02) and daytime symptoms (r=.38, P=.02), and health care
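
    The ROC analysis described above can be reproduced in miniature. The cough counts below are fabricated, and the AUC is computed via the rank-sum (Mann-Whitney) identity, the probability that a random positive case outscores a random negative case, rather than by tracing the full curve.

```python
def auc(pos, neg):
    """AUC as P(random positive score > random negative score), ties = 0.5."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

asthma_counts    = [42, 30, 55, 28, 60, 35]   # hypothetical 24-h cough counts
nonasthma_counts = [10, 22, 5, 30, 12, 8]

print(round(auc(asthma_counts, nonasthma_counts), 3))
```

    An AUC near 1.0 means the device's cough counts separate asthma from non-asthma cases well; 0.5 would mean no better than chance.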

  10. Add-on simple adaptive control improves performance of classical control design

    NASA Astrophysics Data System (ADS)

    Weiss, Haim; Rusnak, Ilan

    2014-12-01

    The Simple Adaptive Control (SAC) controls an augmented plant that comprises the true plant with a parallel feed-forward. The Almost Strictly Positive Real (ASPR) property of the augmented plant leads to asymptotic following. Prior publications have shown that, based only on prior knowledge of the stabilizability properties of systems (which is usually available), the parallel feed-forward configuration (PFC) allows adaptive control of realistic systems, even if they are both unstable and non-minimum phase. However, it was commonly thought that the PFC addition comes at a price compared with good linear time-invariant (LTI) designs that do not use any addition to the plant. The paper shows that the use of SAC with PFC as an Add-On to an LTI system design improves the performance. Although SAC directly controls the augmented error, it always gives improved performance, i.e., smaller tracking error and reduced sensitivity to plant disturbance, with respect to the best LTI controller.
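
    A toy discrete-time sketch of the Add-On idea: a fixed baseline gain is combined with an adaptive term acting on the augmented error e_a = r - (y + d*u), where d*u is the parallel feed-forward. The plant, gains, and feed-forward constant d are all invented; this only illustrates the gain-adaptation mechanism, not the paper's ASPR analysis.

```python
dt, d, gamma = 0.01, 0.05, 5.0
x, u, K = 0.0, 0.0, 0.0          # plant state, last input, adaptive gain
r = 1.0                          # step reference
K_lin = 2.0                      # baseline LTI proportional gain (invented)

errors = []
for _ in range(2000):
    y = x
    e_a = r - (y + d * u)        # augmented tracking error
    K += gamma * e_a * e_a * dt  # SAC-style gain adaptation law
    u = (K_lin + K) * e_a        # baseline controller + adaptive add-on
    x += dt * (-x + u)           # stable first-order plant (invented)
    errors.append(abs(r - y))

print(round(errors[0], 3), round(errors[-1], 4))
```

    The adaptive gain only grows while the augmented error persists, so the add-on tightens tracking beyond what the fixed baseline gain achieves on its own.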

  11. Local Laser Strengthening of Steel Sheets for Load Adapted Component Design in Car Body Structures

    NASA Astrophysics Data System (ADS)

    Jahn, Axel; Heitmanek, Marco; Standfuss, Jens; Brenner, Berndt; Wunderlich, Gerd; Donat, Bernd

    The current trend in car body construction concerning lightweight design and car safety improvement increasingly requires adaptation of the local material properties to the component load. Martensitic hardenable steels, which are typically used in car body components, show a significant hardening effect, for instance in laser-welded seams. This effect can be purposefully used as a local strengthening method. For several steel grades the local strengthening resulting from a laser remelting process was investigated. The strength in the treated zone was determined at crash-relevant strain rates. A load-adapted design of complex reinforcement structures was developed for compression- and bending-loaded tube samples, using numerical simulation of the deformation behavior. Especially for bending-loaded parts, the crash energy absorption can be increased significantly by local laser strengthening.

  12. Design of Unstructured Adaptive (UA) NAS Parallel Benchmark Featuring Irregular, Dynamic Memory Accesses

    NASA Technical Reports Server (NTRS)

    Feng, Hui-Yu; VanderWijngaart, Rob; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We describe the design of a new method for the measurement of the performance of modern computer systems when solving scientific problems featuring irregular, dynamic memory accesses. The method involves the solution of a stylized heat transfer problem on an unstructured, adaptive grid. A Spectral Element Method (SEM) with an adaptive, nonconforming mesh is selected to discretize the transport equation. The relatively high order of the SEM lowers the fraction of wall clock time spent on inter-processor communication, which eases the load balancing task and allows us to concentrate on the memory accesses. The benchmark is designed to be three-dimensional. Parallelization and load balance issues of a reference implementation will be described in detail in future reports.

  13. An automation of design and modelling tasks in NX Siemens environment with original software - cost module

    NASA Astrophysics Data System (ADS)

    Zbiciak, R.; Grabowik, C.; Janik, W.

    2015-11-01

    The design-constructional process is a creative activity which strives to fulfil, as well as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical and technological advances. An engineer's knowledge, skills and inborn abilities have the greatest influence on the final product quality and cost. They also have a deciding influence on the product's technical and economic value. Taking the above into account, it seems advisable to make software tools that support an engineer in the process of manufacturing cost estimation. The Cost module is built with analytical procedures which are used for relative manufacturing cost estimation. As in the case of the Generator module, the Cost module was written in the object programming language C# in the Visual Studio environment. During the research the following eight factors, which have the greatest influence on overall manufacturing cost, were distinguished and defined: (i) the gear wheel tooth type, i.e. straight or helicoidal, (ii) the gear wheel design shape, A or B, with or without a wheel hub, (iii) the gear tooth module, (iv) the number of teeth, (v) the gear rim width, (vi) the gear wheel material, (vii) heat treatment or thermochemical treatment, (viii) the accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in a CAD system environment. These parameters are also processed in the Cost module. The last three parameters, i.e. (vi) to (viii), are used exclusively in the Cost module. The estimation of relative manufacturing cost is based on indexes calculated for each particular parameter. The relative manufacturing cost estimated in this way gives an overview of the influence of the design parameters on the final gear wheel manufacturing cost. This relative manufacturing cost takes values in the range 0.00 to 1.00; the bigger the index value, the higher the relative manufacturing cost. Verification whether the proposed algorithm of relative manufacturing
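
    A hedged sketch of how per-parameter indexes might be combined into a relative cost in the 0.00 to 1.00 range. The factor indexes and weights below are invented; the Cost module's actual analytical procedures are not reproduced here.

```python
def relative_cost(indexes, weights):
    """Weighted mean of per-factor cost indexes, each in [0, 1]."""
    total_w = sum(weights.values())
    return sum(weights[f] * indexes[f] for f in weights) / total_w

# One hypothetical gear wheel scored on the eight factors (i)-(viii).
indexes = {"tooth_type": 0.7, "design_shape": 0.4, "tooth_module": 0.5,
           "teeth_number": 0.6, "rim_width": 0.3, "material": 0.8,
           "heat_treatment": 0.9, "accuracy_class": 0.6}
weights = {f: 1.0 for f in indexes}   # equal weights as a neutral default

cost = relative_cost(indexes, weights)
print(round(cost, 3))
```

    Because the result stays in [0, 1], two candidate gear wheel designs can be compared directly, which is exactly the overview of design-parameter influence the module aims to give.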

  14. Automated design of hammerhead ribozymes and validation by targeting the PABPN1 gene transcript

    PubMed Central

    Kharma, Nawwaf; Varin, Luc; Abu-Baker, Aida; Ouellet, Jonathan; Najeh, Sabrine; Ehdaeivand, Mohammad-Reza; Belmonte, Gabriel; Ambri, Anas; Rouleau, Guy; Perreault, Jonathan

    2016-01-01

    We present a new publicly accessible web-service, RiboSoft, which implements a comprehensive hammerhead ribozyme design procedure. It accepts as input a target sequence (and some design parameters) then generates a set of ranked hammerhead ribozymes, which target the input sequence. This paper describes the implemented procedure, which takes into consideration multiple objectives leading to a multi-objective ranking of the computer-generated ribozymes. Many ribozymes were assayed and validated, including four ribozymes targeting the transcript of a disease-causing gene (a mutant version of PABPN1). These four ribozymes were successfully tested in vitro and in vivo, for their ability to cleave the targeted transcript. The wet-lab positive results of the test are presented here demonstrating the real-world potential of both hammerhead ribozymes and RiboSoft. RiboSoft is freely available at the website http://ribosoft.fungalgenomics.ca/ribosoft/. PMID:26527730

  15. Adaptive Modeling Language and Its Derivatives

    NASA Technical Reports Server (NTRS)

    Chemaly, Adel

    2006-01-01

    Adaptive Modeling Language (AML) is the underlying language of an object-oriented, multidisciplinary, knowledge-based engineering framework. AML offers an advanced modeling paradigm with an open architecture, enabling the automation of the entire product development cycle, integrating product configuration, design, analysis, visualization, production planning, inspection, and cost estimation.

  16. Building Adaptive Game-Based Learning Resources: The Integration of IMS Learning Design and

    ERIC Educational Resources Information Center

    Burgos, Daniel; Moreno-Ger, Pablo; Sierra, Jose Luis; Fernandez-Manjon, Baltasar; Specht, Marcus; Koper, Rob

    2008-01-01

    IMS Learning Design (IMS-LD) is a specification to create units of learning (UoLs), which express a certain pedagogical model or strategy (e.g., adaptive learning with games). However, the authoring process of a UoL remains difficult because of the lack of high-level authoring tools for IMS-LD, even more so when the focus is on specific topics,…

  17. Incremental learning for automated knowledge capture.

    SciTech Connect

    Benz, Zachary O.; Basilico, Justin Derrick; Davis, Warren Leon; Dixon, Kevin R.; Jones, Brian S.; Martin, Nathaniel; Wendt, Jeremy Daniel

    2013-12-01

    People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, require models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effect. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and only secondarily, if at all, concerned with issues such as speed, memory use, or ability to be incrementally updated. Thus, when new data arrives, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill-suited for use in dynamic, time-critical, high-consequence decision making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.
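
    The batch-versus-incremental distinction drawn above, in miniature: a running (incremental) mean absorbs each new observation in O(1), while a batch estimate recomputes from scratch over all data. Both reach the same value; only the cost of updating differs. The statistic and data are illustrative, not the report's actual models.

```python
class IncrementalMean:
    """Running mean updated in O(1) per observation."""
    def __init__(self):
        self.n, self.mean = 0, 0.0
    def update(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n   # no stored history needed
        return self.mean

stream = [4.0, 7.0, 1.0, 8.0]
inc = IncrementalMean()
for x in stream:
    inc.update(x)

batch = sum(stream) / len(stream)               # full recomputation each time
print(inc.mean, batch)
```

    For a knowledge model that must react as a situation unfolds, the incremental form is the one that scales: each new observation costs the same regardless of how much has already been seen.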

  18. Automated Structure- and Sequence-Based Design of Proteins for High Bacterial Expression and Stability.

    PubMed

    Goldenzweig, Adi; Goldsmith, Moshe; Hill, Shannon E; Gertman, Or; Laurino, Paola; Ashani, Yacov; Dym, Orly; Unger, Tamar; Albeck, Shira; Prilusky, Jaime; Lieberman, Raquel L; Aharoni, Amir; Silman, Israel; Sussman, Joel L; Tawfik, Dan S; Fleishman, Sarel J

    2016-07-21

    Upon heterologous overexpression, many proteins misfold or aggregate, thus resulting in low functional yields. Human acetylcholinesterase (hAChE), an enzyme mediating synaptic transmission, is a typical case of a human protein that necessitates mammalian systems to obtain functional expression. We developed a computational strategy and designed an AChE variant bearing 51 mutations that improved core packing, surface polarity, and backbone rigidity. This variant expressed at ∼2,000-fold higher levels in E. coli compared to wild-type hAChE and exhibited 20°C higher thermostability with no change in enzymatic properties or in the active-site configuration as determined by crystallography. To demonstrate broad utility, we similarly designed four other human and bacterial proteins. Testing at most three designs per protein, we obtained enhanced stability and/or higher yields of soluble and active protein in E. coli. Our algorithm requires only a 3D structure and several dozen sequences of naturally occurring homologs, and is available at http://pross.weizmann.ac.il. PMID:27425410

  19. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to

  20. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering.

    PubMed

    Shanechi, Maryam M; Orsborn, Amy L; Carmena, Jose M

    2016-04-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain's behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user's motor intention during CLDA, a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control.
Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter
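The point process decoding component of the record above can be sketched with a minimal spike-based filter. This is an illustrative stand-in, not the paper's decoder: a one-dimensional latent state drives Poisson spiking of a few cells with log-linear rates, and a point-process filter (Gaussian approximation to the spike likelihood) updates the state estimate at every short bin, i.e. at the spike-event time-scale the abstract contrasts with slower batch updates. All parameter values are assumptions.

```python
import numpy as np

# Minimal 1-D point process filter sketch (assumed parameters throughout).
rng = np.random.default_rng(0)
dt = 0.005             # 5 ms bins: the spike-event time-scale
T = 2000               # number of bins
C = 8                  # number of cells
alpha = np.log(10.0)   # baseline log-rate (10 spikes/s)
beta = rng.normal(0.0, 1.0, size=C)   # per-cell tuning coefficients

# Simulate a smooth latent state and spikes from the point process model:
# rate_c(t) = exp(alpha + beta_c * x_t), counts ~ Poisson(rate * dt).
x_true = np.cumsum(rng.normal(0.0, 0.05, size=T))
rates = np.exp(alpha + np.outer(x_true, beta))   # (T, C) in spikes/s
spikes = rng.poisson(rates * dt)                 # spike counts per bin

# Filter with a random-walk prior x_t = x_{t-1} + noise.
q = 0.05 ** 2          # process noise variance, matched to the simulation
x_hat, w = 0.0, 1.0    # posterior mean and variance
estimates = np.empty(T)
for t in range(T):
    # Predict one bin ahead.
    x_pred, w_pred = x_hat, w + q
    # Update: each cell's expected count and its Fisher information
    # contribute at every bin, so the estimate moves with every spike event.
    lam = np.exp(alpha + beta * x_pred) * dt     # expected counts per cell
    w = 1.0 / (1.0 / w_pred + np.sum(beta ** 2 * lam))
    x_hat = x_pred + w * np.sum(beta * (spikes[t] - lam))
    estimates[t] = x_hat

corr = np.corrcoef(estimates, x_true)[0, 1]      # tracking quality
```

The same per-bin innovation term, `spikes[t] - lam`, is what a spike-event-based CLDA scheme can reuse to nudge decoder parameters continuously, rather than waiting to re-fit on an accumulated batch.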